Ya, they already do that with things like Bitcoin mining and protein folding. The software basically downloads a chunk of data, processes it, then uploads the result to a central server.
I've heard of several other projects that link computers together for big processing tasks too, but the main problem is that it's not very efficient.
It's the same reason multi-CPU systems took so long to become mainstream: having multiple processors takes effort to coordinate. A big chunk of data that can be processed in any order can use such a system efficiently, but most real-world processing has to happen in a specific order, which limits how finely the data can be broken up and farmed out to multiple processors.
The most efficient way to do that kind of work today is server hardware with 20+ CPU cores and tons of RAM and storage, but machines like that are almost useless for everyday computing.