Over the last two years I have researched the use of Autonomy's IDOL and DRE technology, along with Softsound's, to enable voice control of services including automatic transcription (although I have started to fancy Nuance as a better contender for the voice-to-text side of this 'kit'). That in turn opens up a raft of services based on the account holder's input: from writing letters just by talking, to asking questions and getting answers... this is the automated switchboard that links profile match with profile match, as in Autonomy's work with the Commonwealth Institute... and much, much more. My questions:
If voice-controlled services were available two years ago, why am I still stuck with keyboards, or having to ask other people and hope they know the right answer?
Why can't I just talk to my phone and have my data stored and worked on, so I can find out how much of a head banger I was three months ago? Why can't I ask my phone a question and have this kit go off and find me the answer?
Why is voice such a little-used part of this new society? Why is there no way to brainstorm on your own and get feedback unless you have more than just a simple phone? Voice control means all the peoples of this earth could, in theory, join in now rather than after they have all learned writing and computers... so why isn't it happening? Why is the ability to respond instantly to world situations so badly coordinated? This kit is designed around the routes from A to B, so that in the event of a disaster it becomes a coordination tool with thousands or millions of real-time coordinated inputs and outputs... Why isn't it here now?

Is this promotional? I suppose it is, in that I need this kit now, and so do many, many disabled people, carers, people on low incomes, and even those of you who get to read this... Imagine talking and having things happen... I can't wait. Help get it going soon, please...
Please get me some answers to the above.
Mark Aldiss projectbrainsaver
December 5, 2005 8:25 PM