So here's an idea, and maybe someone with more knowledge of deep learning / neural networks can shed some light on whether it's feasible.
Couldn't we teach an AI to distinguish kicks from snares, hats from rides, one-shots from loops, etc., so we could have a really advanced sample browser where you just tell the program you need a kick and it shows you ALL the kicks in your sample library? Maybe you could take this a step further and have the AI tag samples with attributes like hard, soft, long decay, short decay, amount of distortion, compression, noisiness, and so on. It could also analyze the frequency content to tell you what note a sample is. Imagine just clicking "Kick", selecting "D#" and "one-shot", and finding everything you have. The possibilities seem vast.
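The note-detection part is the easy half: once you've estimated a sample's fundamental frequency (e.g. via autocorrelation or an FFT peak), mapping it to a note name is just arithmetic. A stdlib-only sketch, where `frequency_to_note` is my own illustrative helper, not an existing API:

```python
import math

# Pitch classes in MIDI order (C = 0)
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_note(freq_hz: float) -> str:
    """Map a fundamental frequency in Hz to the nearest note name.

    Uses the MIDI convention: note 69 = A4 = 440 Hz, 12 semitones per octave.
    """
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"

print(frequency_to_note(440.0))   # A4
print(frequency_to_note(155.56))  # D#3, a plausible kick tuning
```

Rounding to the nearest semitone also tells you how far off-pitch a sample is, which could be another browser filter.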
Would this be possible with deep learning?
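It should be: drum-sample classification is a fairly standard audio ML task, usually done by training a classifier (e.g. a CNN) on mel spectrograms of labeled samples. Even a crude hand-rolled feature hints at why it's learnable: kicks concentrate spectral energy down low, while hats are broadband noise. A toy numpy sketch on synthetic stand-in sounds (the function names, threshold, and test signals are my own illustrative assumptions, not a real classifier):

```python
import numpy as np

def low_end_ratio(samples: np.ndarray, sr: int, cutoff_hz: float = 200.0) -> float:
    """Fraction of the signal's spectral energy below cutoff_hz."""
    power = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), 1.0 / sr)
    return float(power[freqs < cutoff_hz].sum() / power.sum())

def crude_drum_guess(samples: np.ndarray, sr: int) -> str:
    # Toy heuristic: kicks keep most energy below ~200 Hz, hats don't.
    # A real sample browser would learn this boundary from labeled data
    # instead of hard-coding a threshold.
    return "kick-like" if low_end_ratio(samples, sr) > 0.5 else "hat-like"

# Synthetic stand-ins for real samples, 1 second at 44.1 kHz:
# a decaying 60 Hz sine (kick) and a decaying noise burst (hi-hat).
sr = 44100
t = np.arange(sr) / sr
kick = np.sin(2 * np.pi * 60.0 * t) * np.exp(-8.0 * t)
hat = np.random.default_rng(0).normal(size=sr) * np.exp(-30.0 * t)

print(crude_drum_guess(kick, sr))  # kick-like
print(crude_drum_guess(hat, sr))   # hat-like
```

Hand-crafted features like this break down fast (an 808 kick vs. a deep tom, say), which is exactly where a trained deep model would earn its keep.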
Submitted October 01, 2017 at 06:48AM by Ommmmmmmmmmmmmm https://www.reddit.com/r/edmproduction/comments/73kt6j/idea_the_future_of_workflow/?utm_source=ifttt