I had the opportunity to answer a lot of computing questions for some people I know, and I decided the best way to do it was to write the answers up somewhere I could point to in the future.
How does a computer know how to read and interpret information? How does it recognize new languages?
Computers read and interpret information using programs or, in the modern vernacular, applications. We can be a little more specific if we break the concept of information into instructions and data. Instructions are information that is used by a computer in the context of a program; that is, a program is a collection of instructions. Data, on the other hand, is information that is processed by instructions to accomplish whatever those instructions were written to do.
This means that in order for a computer to process information, it must have a program that was written specifically to read a particular type of data and do something with it. Modern programs can read many different types of data because they contain separate code for each type.
Computers do not “recognize” new languages in the sense that a human recognizes a friend or an old book. Instead, code in a given language is either compiled into a program (a set of instructions) or interpreted by another program as data. The second case may seem strange, but think of it from the perspective of a human reading a list of instructions from their boss: some programs (“interpreters”) read a list of instructions as data and carry them out.
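To make the interpreter idea concrete, here is a minimal sketch (the instruction names and the `run` function are invented for illustration): one program treats a list of instructions as ordinary data and acts on each one in turn.

```python
# A toy interpreter: the "program" it runs is just data, a list of
# instruction strings that this code reads and acts on one at a time.
def run(program):
    value = 0
    for instruction in program:
        op, arg = instruction.split()
        if op == "add":
            value += int(arg)
        elif op == "mul":
            value *= int(arg)
    return value

# The list below is data to Python, but instructions to our interpreter.
print(run(["add 2", "mul 3"]))  # prints 6
```

Real interpreters, like the one that runs Python itself, work on the same principle, just with a far richer set of instructions.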
How did computing get started?
An excellent book on the history of computing is “Turing’s Cathedral” by George Dyson. The history is too rich for me to butcher it here.
Do computers “think?”
That really depends on your definition of thinking. If you define thinking as some mystical thing, then no. However, if you define thinking as processing data (read: stimuli) according to a set of instructions, then, sure, we might as well say computers think.
The much more important question is: do computers realize they are thinking? The answer to that is no, and it is what distinguishes humans from computers. This capacity is called metacognition: we know we process information, and to some degree we can choose what we think about.
What are the capability limits of computers? Is there anything they can’t do?
There are two answers to the first part. In a practical sense, computers currently cannot do many things that humans take for granted, like seeing and recognizing objects or walking. Certainly there are specialized robots and other systems that can do something like this, but they don’t do it very well, or in a way we could really call a general capability.
If we look at this from a theoretical perspective instead of a practical one, the question becomes a lot more interesting. Computers can be programmed to do almost anything short of clairvoyance, so what they cannot yet do in practice is largely a matter of time and more scientific research.
What does “API” mean?
API is an acronym for Application Programming Interface. It is the collection of functions that can be called to interact with a program or library.
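For example, Python’s standard `json` library exposes an API consisting of functions like `json.dumps` and `json.loads`; calling those functions is how a program interacts with the library.

```python
import json

# The json library's API is the set of functions it exposes to callers:
# json.dumps turns a Python object into text, and json.loads reverses it.
text = json.dumps({"answer": 42})
data = json.loads(text)
print(data["answer"])  # prints 42
```

We never need to know how the library works internally; the API is the agreed-upon surface through which we talk to it.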
As for why we have so many programming languages: most of them come from really creative people who needed a language to perform a task that was not so easy to do in another language. Some languages were created by people who simply thought it would be fun or instructive. So while we probably don’t need as many as we have, it is probably a very good thing that we do.
Why do robots have an API?
Communication with robots is currently not possible in the same ways that humans communicate with each other or even the ways in which we communicate with our pets. Robots need APIs so that we can send messages in languages that they understand.
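In practice, a robot’s API often looks like a set of functions we can call instead of sentences we can speak. A hypothetical sketch (the `Robot` class and `move_forward` method here are invented for illustration, not any real robot’s API):

```python
# A hypothetical robot API: rather than telling the robot "go forward
# two meters" in English, we call a function it was programmed to obey.
class Robot:
    def __init__(self):
        self.position = 0  # distance traveled, in meters

    def move_forward(self, meters):
        """Advance the robot by the given distance."""
        self.position += meters

robot = Robot()
robot.move_forward(2)
robot.move_forward(3)
print(robot.position)  # prints 5
```

The API is the vocabulary the robot understands; anything outside it is noise to the machine.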
What do employers look for in software developers and engineers when hiring a prospective employee?
Skill, maturity, the ability to work with a team, and a dedication to continued education tend to be at the top of any employer’s list, especially in the software industry.
Are medical devices without fault? Are they ever inaccurate?
No, even medical devices have faults every now and again. It is rare, because the rules for certifying a medical device are very strict and its capabilities must meet certain standards, but it does happen.
Medical devices can, in some cases, be wildly inaccurate. Most of the time, though, they are very accurate, because the requirements for their construction demand a very low error (amount of inaccuracy), typically much less than 1%.
Which area of computers pays the most and has the best job growth outlook?
Well, not science. 😛
Actually, science pays pretty well, but the best-paid computing jobs are largely the ones in Silicon Valley. Typically these are jobs related to web development for big companies, software services, or entertainment.
The highest-paid computing jobs I have ever heard of, though, are for build engineers. Build engineers are the software engineers who keep code compiling, and they get paid to compile code that, for us mere mortals, would require miracles or human sacrifices on certain days of the year.