When you say "returning around 150 feeds", I guess you mean 150 items in the feed? Otherwise, if you're really fetching 150 feeds, I think I know why you're hitting a limitation ;)
Joking aside, I stumbled upon the same limitation (and others), so I simply cut my input in two. In your case you could maybe fetch the feed(s) in reverse order in a second pipe? I can't do much more without the pipe link.
Indeed, you do fetch a lot of feeds, so you'll need to take your items 100 at a time. Basically, cut out the first (n-1)*100 items, where n is an input integer >= 1.
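Outside Pipes, that skip-(n-1)*100 step is just ordinary slicing; a rough Python sketch, where the `page` helper and the dummy `items` list are mine, standing in for the pipe's truncate/tail modules:

```python
def page(items, n, page_size=100):
    """Return page n (1-based): skip the first (n-1)*page_size items."""
    if n < 1:
        raise ValueError("n must be an integer >= 1")
    start = (n - 1) * page_size
    return items[start:start + page_size]

feed_items = list(range(350))  # pretend these are 350 feed items
first_page = page(feed_items, 1)   # items 0..99
second_page = page(feed_items, 2)  # items 100..199
last_page = page(feed_items, 4)    # the remaining 50 items
```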
I did it in this example: http://pipes.yahoo.com/luneart/6dd85aa7d5a7f87fc472f68cd66d6cfd
I also noticed you had 2 dead feeds from FT, so I put in an alternative using the general feed plus a regex filter based on keywords: the initial (?i) means the match is case-insensitive, and (keyword1|keyword2|...) means the description field must contain at least one of those keywords for the item to show up in the feed.
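For reference, the same filter expressed with Python's `re` module (the keywords here are placeholders, not the ones I actually used in the pipe):

```python
import re

# (?i) = inline case-insensitive flag; the alternation matches any one keyword
pattern = re.compile(r"(?i)(markets|earnings|bonds)")

items = [
    {"description": "European MARKETS rally on rate news"},
    {"description": "Local weather report"},
]
# keep only items whose description mentions at least one keyword
kept = [it for it in items if pattern.search(it["description"])]
```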
I also added a sort by published time, so the first results are the latest news instead of just one feed after another like before (latest in the sense of universal time, so don't worry about time zones).
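The sort amounts to comparing the published timestamps in universal time; a minimal sketch using the standard library's `email.utils.parsedate_to_datetime` for the RFC 822 dates feeds use (the sample items and dates are invented):

```python
from email.utils import parsedate_to_datetime

items = [
    {"title": "older", "pubDate": "Mon, 01 Apr 2013 08:00:00 +0200"},
    {"title": "newer", "pubDate": "Mon, 01 Apr 2013 07:30:00 +0000"},
]
# parsedate_to_datetime keeps the UTC offset, so the comparison is
# done in universal time: 08:00 +0200 is actually 06:00 UTC
items.sort(key=lambda it: parsedate_to_datetime(it["pubDate"]), reverse=True)
```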
Also, be careful with your Loop/Fetch Page from CNBC: that module alone takes 20+ seconds of runtime, and Yahoo Pipes limits the whole pipe to 30 seconds of execution time. It might cut off some of your content at some point...
Actually, I've got something even better: since you mainly fetch already-formatted feeds, keep an external list of URLs so it's easy to update/add/remove them without having to go into the pipe source to make the changes.
Of course, the feeds you have to operate on individually, since each needs its own particular processing, are still defined in the source.
Check it out: http://pipes.yahoo.com/luneart/afaa6493fd92b0b7b48476138983f685
And with this list as input: http://pastebin.com/858KpnHW
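The point is that the pipe reads the list at runtime, so editing the list takes effect without touching the pipe. In plain Python the equivalent would be something like this (the URLs are placeholders; in the pipe it's a Fetch Data on the list feeding a Loop):

```python
# the external list is just one feed URL per line, possibly with blank lines
raw_list = """http://example.com/feed1.rss
http://example.com/feed2.rss

http://example.com/feed3.rss"""

# strip whitespace and drop empty lines before looping
urls = [line.strip() for line in raw_list.splitlines() if line.strip()]

for url in urls:
    # here the pipe would run Fetch Feed on each url and union the results
    print(url)
```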