For most games, you're best off leaving all cores available to all games. That leaves it up to Windows to decide who needs what, and there's a good 30 years (real years, not this man-year tripe) of design gone into the Windows scheduler, so any decision it makes about CPU time-slice allocation is going to be better than anything you can come up with (remember, it also has a shitload more information available to it during execution, whereas your affinity mask is some fixed decision you make up front). Anything you choose will always end up being limiting. On the very odd occasion a game might require you to limit the available cores, but that usually only applies to older games. The only reason I can think of to limit cores these days would be to reserve specific cores for stream encoding and remove those from the games' usage. In that case, all the remaining cores should be allocated to all the games, along the lines of the sketch below.
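If you do go the reserved-cores route, here's a minimal sketch of what that looks like using the third-party psutil package (pip install psutil). The process names ("obs64.exe", "game.exe") are placeholders for whatever encoder and games you actually run, and it may need to run as admin to touch other processes:

```python
# Sketch: reserve two cores for the stream encoder, give a game everything else.
# Assumes the third-party psutil package; "obs64.exe" / "game.exe" are
# placeholder names for whatever you actually run.
import psutil

ENCODER_CORES = [0, 1]
GAME_CORES = [c for c in range(psutil.cpu_count()) if c not in ENCODER_CORES]

for proc in psutil.process_iter(["name"]):
    try:
        if proc.info["name"] == "obs64.exe":
            proc.cpu_affinity(ENCODER_CORES)   # encoder keeps its own cores
        elif proc.info["name"] == "game.exe":
            proc.cpu_affinity(GAME_CORES)      # game gets all the rest
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass                                   # process exited or needs admin rights
```

Even then, as below, you're usually better off just letting the scheduler sort it out.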

Beyond that, others have already posted a few of these, but to recap:
Don't use Wi-Fi; run a wired connection. Streaming will already take a decent chunk of your bandwidth, let alone the games. And if you're using any kind of QoS, chances are your games are suffering, as video traffic tends to take precedence.
Make sure you're using the High Performance power plan, not Balanced or Power Saver.
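If you'd rather script that than dig through Control Panel, a quick sketch (Windows only; SCHEME_MIN is the built-in powercfg alias for the stock High Performance plan):

```python
# Sketch: activate the High Performance power plan via the built-in powercfg
# tool. SCHEME_MIN is Windows' alias for the stock High Performance plan.
import subprocess

subprocess.run(["powercfg", "/setactive", "SCHEME_MIN"], check=True)
```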
Don't set CPU affinity unless the above streaming setup applies (even then it is probably a bad idea).
Don't run all the games at high quality. Set the background ones to lower quality, disable buffering, and cap their background FPS lower.
Run some monitoring to see what is actually eating the CPU (sketch below). Saying it goes to 100% is next to useless. It's a bit like saying my garage is full because there is stuff in it, so my car won't fit; if that stuff happens to be empty boxes, then I could easily remove them or squash them flat. You, on the other hand, need to know WHAT is eating your CPU in order to determine whether it's the games, the streaming, or something else that can be disabled or removed. In other words, you need to know where to focus, and knowing which software is using which resources gets you there.
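Task Manager or Process Explorer will do the job, but if you want something scriptable, here's a minimal sketch using the third-party psutil package that samples for a few seconds and prints the top CPU consumers:

```python
# Sketch: sample per-process CPU usage over a short window and print the
# heaviest consumers. Assumes the psutil package (pip install psutil).
# cpu_percent() measures between calls, hence the priming pass + sleep.
import time
import psutil

procs = list(psutil.process_iter(["name"]))
for p in procs:
    try:
        p.cpu_percent()                    # priming call; first reading is always 0
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

time.sleep(5)                              # sample window

usage = []
for p in procs:
    try:
        usage.append((p.cpu_percent(), p.info["name"] or "?"))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

for pct, name in sorted(usage, reverse=True)[:10]:
    print(f"{pct:6.1f}%  {name}")          # per-process %, can exceed 100 on multi-core
```

Whatever lands at the top of that list is where you focus: game settings, encoder settings, or background junk you can just kill.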