Computer & Programming Basics
If you are reading this article, chances are high that you are doing so on an electronic device which is a type of "computer" (such as a "smartphone", "laptop", or "desktop"). In the most basic sense, a "computer" is something that "computes" (i.e.: does math). This doesn't tell us much about what is happening inside of that device though! While it might seem like a complicated subject, we will attempt to reduce it to something that is (hopefully) easy to understand. If any of it feels overwhelming, please take a step back and return to it later...
Contents
• What Is A Computer & What Does It Do?
• What Is Programming & How Is It Done?
• Basic Tools For Programming
• General Programming Concepts
• Good Practices When Programming
• Other Important Considerations
• Resources For Learning
• Conclusion
What Is A Computer & What Does It Do?
Generally, a computer is a machine that receives information (or "Input"), alters it according to some set of instructions, and then gives out a result (or "Output").
• "Input Devices" are things like a keyboard, mouse, digital camera, microphone, scanner, etc. They are all ways of receiving information.
• "Output Devices" are things like a monitor, speakers, printer, etc. They are all ways of giving out information.
In some cases, the thing which receives input also conveys output (e.g.: the touchscreen of a smartphone). Therefore, we will often encounter the term I/O (which is short for Input/Output). These devices are also sometimes called "Peripherals".
The main part of the computer that handles instructions is the "Central Processing Unit" (also known as the CPU or Processor for short). The information that it operates on is usually stored in one of two places:
1. The "Random Access Memory" (or RAM) is where instructions are held for immediate, short-term use
2. The "Hard Drive" is where information that has to be accessed less frequently is held for long-term use
Computers can also be used to transmit information to other computers. A group of computers connected together like this is called a "Computer Network".
What Is Programming & How Is It Done?
"Hardware" refers to all of the physical devices which make up a computer, whereas a "software program" (also known as an "application" or "app") is a set of instructions that tells the hardware how to do some particular task. To write your own software is known as "programming" (or "coding"). In other words, we can get a computer to do pretty much whatever we want so long as we give it the proper set of instructions and the hardware is capable of carrying them out. It can be both a highly creative and logically-stimulating endeavor. The programmer Larry Wall gives the analogy of it being like writing a recipe. Let's explore how this is done...
All information within a computer is represented as numbers, specifically zeros and ones, or "bits". This way of counting is called "Binary" (or "Base-2"). There is often so much information involved that it is helpful to write these long strings of bits more compactly by grouping them together: "Octal" (or "Base-8") represents each group of three bits as a single digit, while "Hexadecimal" (or "Base-16") does the same for each group of four bits. No matter what counting system we use, the representation of information as numbers like this is called "Machine Code".
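Jumping ahead a little, here is a tiny sketch in the Python language (a "High-Level Language" that we will meet shortly) showing one number written out in each of these counting systems:

```python
# One decimal ("Base-10") number, shown in several counting systems:
number = 202

print(bin(number))  # 0b11001010 - Binary (Base-2)
print(oct(number))  # 0o312      - Octal (Base-8)
print(hex(number))  # 0xca       - Hexadecimal (Base-16)

# Converting a string of binary digits back into a decimal number:
print(int("11001010", 2))  # 202
```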
To make the instructions easier for humans to understand, sometimes parts of this information will be represented as short words, abbreviations, or acronyms instead of only numbers. This is "Assembly". Different types of Processors will use a different form of Assembly.
Both Machine Code and Assembly are considered "Low-Level Languages". By "low-level" we mean that the form that the instructions take is closer to how the hardware actually operates. To make it even easier to understand, and to be able to transfer instructions from one type of Processor to another, we can use a "High-Level Language". By "high-level" we mean that these ways of writing instructions are closer to something that a human would normally use to communicate (i.e.: "Natural languages" like English or Mandarin Chinese).
There are many different High-Level Languages (such as C, JavaScript, Python, R, and so on). Each has its own "syntax" or structure. We might choose to use one instead of another depending upon what we are trying to do, or what kind of hardware we are working with. However, there are certain features and practices that are common to almost all of them. Again, this is similar to how "Natural languages" work. For example, English and Mandarin Chinese have very different structures in comparison to one another, but we could talk about the same subjects in either language.
The instructions written in High-Level Languages are called "Source Code". In order to work, Source Code must be "translated" into Machine Code through another piece of software. If that conversion process happens line-by-line, then that piece of software is called an "Interpreter". If it happens all in one chunk, then it is called a "Compiler". Interpreters and Compilers are simply sets of instructions for taking a set of instructions (Source Code) and transforming them into another set of instructions (Machine Code). 😄 Yikes! That might sound confusing...
In summary: High-Level Languages are instructions that are "human readable" (i.e.: easy for people to understand), but they must be turned into Low-Level Languages (i.e.: made "machine readable") in order to be used by a computer.
Basic Tools For Programming
While we could just type it out as a regular textfile, Source Code is often written within a piece of software called an "Integrated Development Environment" (or IDE). This software usually has extra tools that can make programming easier, such as:
• A "Linter", which checks the structure of the code; it might make suggestions as you type or "autocomplete" what you are typing
• A "Debugger", which checks the code for errors or "bugs" that keep it from working properly
• A "Runtime Environment", which allows one to test out code in "real-time" (i.e.: without having to exit the IDE and run the program separately)
• A "Virtual Environment", which allows one to group together all of the settings and extra files (or "Dependencies") that are necessary for the code to run
...etc.
Sometimes writing code takes a lot of extra research and testing, no matter how "fluent" one is in that language. Thankfully, not all instructions have to be written "from scratch". A "Library" or "Framework" is a collection of reusable bits of code for a particular language to carry out certain tasks (see the short example after this list). Some related terms include:
• "Software Development Kits" (or SDKs) - These are like Frameworks, but for a specific kind of hardware (e.g.: one type of smartphone, VR headset, videogame console, etc.).
• "Application Programming Interfaces" (or APIs) - These are ways to connect to some pre-existing infrastructure or service, or for two pieces of software to communicate with one another.
General Programming Concepts
In addition to getting a general idea of what each language is usually used for, one thing that I have found helpful is to get an understanding of some basic concepts that are applicable to most languages. For example:
In mathematics, a "Variable" is a way of showing that a whole range of different numbers is possible within an equation. It has a slightly different meaning in the context of programming. In programming, a "Variable" is a name that we choose to assign to some piece of information (or "value"). Giving a Variable its first value is known as "Initialization". We do this as a kind of shorthand: anytime that we need that information, we don't have to type it all out again. We just refer to it by name.
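For instance, Initialization in Python looks like this (the names "greeting" and "count" are just made-up examples):

```python
# "Initialization": assigning a name to some information
# so that we can refer to it later by that name alone.
greeting = "Hello, world!"
count = 3

print(greeting)   # Hello, world!
print(count * 2)  # 6
```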
Each programming language does Initialization differently, but it is a core concept within programming in general. Again, these ideas are very helpful to know! Here are a few questions that we can ask ourselves when we first encounter a new programming language to try to uncover them...
• Primitive Data Types
How are different types of numbers represented? Mathematically speaking, numbers can be either "Cardinal" (representing quantities), "Ordinal" (representing a place in a sequence - like "first", "second", "third", etc.), or "Nominal" (used as a name for something). In programming, the most basic distinction is usually between whole numbers (or "Integers") and numbers with a decimal point (or "Floats").
How do we create "Alphanumeric Strings" (sequences of letters and/or numbers)? How are these "Concatenated" or "Parsed" (i.e.: combined or separated)?
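Here is how those answers might look in Python (the specific values are placeholders):

```python
# Whole numbers ("Integers") and decimal numbers ("Floats"):
age = 30
price = 4.99

# Strings, created by wrapping characters in quotation marks:
first = "Jane"
last = "Doe"

# "Concatenation" (combining strings together):
full_name = first + " " + last  # "Jane Doe"

# "Parsing" (separating a string into pieces):
words = full_name.split(" ")    # ["Jane", "Doe"]
print(full_name, words)
```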
• Logical and Mathematical Operations
How do we add, subtract, multiply, and divide numbers? How are decimals or remainders handled?
How do we make "Conditional Statements" (e.g.: "true", "false", "if-then")?
How do we start, stop, and repeat a process (i.e.: "Loops", "For Loops", etc.)?
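Sticking with Python as our example language, one set of answers to these questions:

```python
# Basic arithmetic:
print(7 + 3)   # 10
print(7 - 3)   # 4
print(7 * 3)   # 21
print(7 / 3)   # 2.3333... (division keeping the decimal)
print(7 // 3)  # 2 (division discarding the remainder)
print(7 % 3)   # 1 (only the remainder)

# A "Conditional Statement":
temperature = 30
if temperature > 25:
    print("It is warm outside.")
else:
    print("It is cool outside.")

# A "For Loop" that repeats a process three times:
for i in range(3):
    print("Repetition number", i)
```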
• Data Structures
What are the different ways that we can organize and fetch the information that is stored within Memory (e.g.: what is the difference between "FIFO" and "LIFO")?
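For example, "FIFO" (First In, First Out) and "LIFO" (Last In, First Out) can be sketched in Python like so:

```python
from collections import deque

# "FIFO" (First In, First Out) - like a queue of customers:
queue = deque()
queue.append("first")
queue.append("second")
print(queue.popleft())  # "first" comes back out first

# "LIFO" (Last In, First Out) - like a stack of plates:
stack = []
stack.append("first")
stack.append("second")
print(stack.pop())      # "second" comes back out first
```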
• Algorithms
How do we represent more complex step-by-step procedures by combining the above concepts? Getting into this kind of mindset is sometimes referred to as "algorithmic" or "computational thinking". A good way of developing it is to break down a process that we already know how to do. For example, what steps go into making a sandwich or brushing our teeth? People often have a tendency to take for granted all of the information that is required to do a task, but a computer needs all information to be made explicit. Therefore, we must carefully consider how a complex task (or "main routine") can be broken down into a series of simpler tasks (or "subroutines"), and further, how they have to interrelate in order to produce a coherent result.
For example, it is not enough to give the command "put some toothpaste on a toothbrush" when describing how to brush one's teeth. We also have to specify that we need to remove the cap on the toothpaste tube first, and that we should only squeeze out a small amount of toothpaste onto the bristles of a wet toothbrush instead of the whole tube. In other words, we have to consider every fine detail that is involved within each process if we want to write instructions that can be followed by a computer.
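To sketch that decomposition in Python (with steps that are still greatly simplified):

```python
# Each "subroutine" handles one small, explicit step:
def remove_cap():
    print("Remove the cap from the toothpaste tube.")

def wet_toothbrush():
    print("Hold the bristles of the toothbrush under running water.")

def apply_toothpaste():
    print("Squeeze a small amount of toothpaste onto the bristles.")

# The "main routine" combines the subroutines in the proper order:
def brush_teeth():
    remove_cap()
    wet_toothbrush()
    apply_toothpaste()
    print("Brush all of the teeth for two minutes.")

brush_teeth()
```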
Good Practices When Programming
Once these basics are understood, there are also some "good" practices to follow as we write code. Making it a habit to do these things from the outset will create a solid foundation for increasing one's skills in programming...
• "Pseudocode" is a general outline of what we need a program to do written in basic English (i.e.: what "features" will it have?). "Wireframing" is to draw out a flowchart of the logic that a program has to follow (i.e.: how will it carry out those actions?). If the software is going to have a Graphical User Interface (or GUI), then what types of things would people be clicking on? What do menus contain, what do buttons do, etc.? Draw a picture of what someone would see as they use the program.
It is important that we do this kind of planning before we start coding any detailed instructions in a particular language. This makes it clear what we need to know, and if we are writing code with other people, it helps everyone to be on the same page.
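For instance, the Pseudocode for a hypothetical tip calculator might be nothing more than a plain-English outline, written here as comments:

```python
# Pseudocode for a (hypothetical) tip calculator:
#
#   1. Ask the person for the total of their bill.
#   2. Ask what percentage they would like to tip.
#   3. Multiply the bill by that percentage to get the tip amount.
#   4. Show the tip amount and the new total.
```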
• "Commenting" is to type notes directly into our code that give brief, yet clear explanations of what it does. Most programming languages allow us to "comment out" sections of text so that the computer does not interpret them as instructions. Commenting is helpful for refreshing our memory if we have to leave our code alone for awhile. It also helps others to understand our code more easily if they have to look over it. Again, the keys are brevity and clarity. More thorough explanations should be put within their own separate textfile if necessary. This textfile would be known as "Documentation" or a "User Manual".
On a related note, there are also ways of writing the code itself that make it easier to read and to understand. To give a quick example: The name of a Variable should be short, but also give some idea of what it represents within the context of the program. Further, typing these names out in "Camel Case" (i.e.: with a mix of lowercase and uppercase letters) or separating words with an "underscore" symbol can make them easy to read at a glance without using spaces: easyToRead or easy_to_read, instead of noteasytoread.
• "Refactoring" is to rewrite code without actually changing what it is doing. We might have to do this in order to make it easier to read and to maintain (i.e.: to add more code whenever necessary). Ideally, we want all code to be well-organized, structured into small chunks where it is simple to follow what each of them is doing. A tangled mess of instructions that are difficult to understand is known as "Spaghetti Code". As delicious as spaghetti is, that is something that we want to avoid as much as possible!
• "Version Control" is to periodically save the code as a separate file with a new name. Having different "versions" like this makes it easy to restore code if something gets lost (serving as a "backup"), or to revert back to a previously working copy if something no longer works for some reason (i.e.: for "troubleshooting" purposes). There are various pieces of software that can help with Version Control. Some keep track of code changes "locally" (i.e.: only on our computer), while others are "distributed" (i.e.: helping us to share or collaborate on code with other people through the Internet).
We cannot emphasize enough the importance of using a new name on each version, even if it is simply a larger number added to the end of it (e.g.: "version 1", "version 2", etc.). We have to be able to distinguish the files from one another! Another helpful practice is to put the version number as a comment at the very top of our code (in the "header"), along with a couple of words about what has changed since the previous version.
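For example, the header of a hypothetical file named tip_calculator_v3.py might begin:

```python
# tip_calculator, version 3
# Changed since version 2: the tip is now rounded to two decimal places.
```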
• "Optimization" is to make changes to code so that it runs more efficiently. By changing how something is done, sometimes we can do it faster or in a fewer number of steps. How do we determine this? Some IDEs have a "Profiler" that shows how a program is using the computer's resources when it is running.
There is also a bit of mathematics called "Big-O Notation" that is used when doing Optimization. We won't get into the technical details here, but essentially, Big-O Notation relates the size of a task to how long it takes to carry it out. It is a measure of the "complexity" of an Algorithm. We can use this information to compare Algorithms so that we can choose the one that would be most appropriate for the task at hand.
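As a rough sketch of why this matters, here is a Python comparison of two ways of searching for the same value. Checking a list examines items one-by-one (roughly O(n)), while checking a set uses a lookup table (roughly O(1) on average):

```python
import time

items = list(range(1_000_000))
items_as_set = set(items)

# Linear search through a list - time grows with the size of the list:
start = time.perf_counter()
print(999_999 in items)
print("list search took:", time.perf_counter() - start, "seconds")

# Set lookup - time stays roughly constant regardless of size:
start = time.perf_counter()
print(999_999 in items_as_set)
print("set search took: ", time.perf_counter() - start, "seconds")
```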
Other Important Considerations
Part of programming is not only considering how to do something, but also the circumstances in which a program will be used and why. Whether for reasons of safety or security, we may need to approach the writing of code in specific ways.
However, when we are first starting out, it is unlikely that we will have to worry much about those kinds of details. Instead, one might be wondering how much code they will have to write in order to make a program. Programs are often thought of in terms of how many "lines of code" long they are. Of course, this will vary depending on the type of program it is and what it does, and more lines of code does not necessarily mean that a program is more useful. Very useful programs can also be very simple. But it is helpful to look at the "codebases" of various apps to estimate how many "man-hours" probably went into their creation based upon their size.
In general, the average smartphone app is often somewhere between 40,000-50,000 lines of code. That might sound like a lot, but it is roughly equivalent to writing a novel. A single person is capable of doing it with some dedication and some good organizational skills. Anything larger than that would probably require a carefully coordinated team of people though.
Another important factor is knowing when to stop. Sometimes people try to get their programs to "do everything". This is called "feature creep" (i.e.: progressively adding more and more features to the point where basic functionality is compromised). As a "rule of thumb", it is usually better that a program does a specific task well, rather than being capable of doing many tasks poorly.
Resources For Learning
All of the jargon and acronyms associated with computers can be misleading, making it seem like a much more difficult subject than it actually is. We don't need to have a deep understanding of mathematics or of electronics to use a computer well or to be an effective programmer. The above concepts are more than enough to get started.
It is ok if you do not already know how to do "touch typing" (i.e.: using a keyboard without looking down at your fingers). Typing code is a little different from typing English. For example, code has a tendency to use fewer words and a lot more symbols. There are "typing tutor" websites to help one get more efficient at typing code, such as SpeedCoder.
A simple websearch can also bring up useful information on specific programming languages (e.g.: Python). Some languages are intended to be easy to use (e.g.: BASIC).
There are a ton of great resources for learning programming in general, such as (in no particular order)...
• r/learnprogramming FAQ
• EBook Foundation - Free Programming Books
• freeCodeCamp
• Tech With Tim
• thenewboston
• Programming With Mosh
• Neso Academy
• Codecademy
• mycodeschool
• Code.org
• CodersLegacy
• Bro Code
• The Coding Train
...etc.
For those that would like a deeper understanding, be sure to look into the field of study known as "Computer Science". Computer Science goes deeper into why programming languages are structured in the ways that they are, and how the computer hardware that they control works. For example, we might learn to build simple electronic computers "from the ground-up", or we might have some fun designing more sophisticated Processors with tools like "Hardware Description Languages" (or HDLs).
There are a lot of useful websites dedicated to the topic of Computer Science (e.g.: Teach Yourself Computer Science, Geeks For Geeks, Open Source Society University, etc.), as well as many helpful resources that get into how computers work with varying levels of detail:
Books (Free ones are marked with -$)
• Douglas J. Alford - Computer Come-Froms: The Roots of Real-Time -$
• Douglas J. Alford & Pakaket Alford - Science Inside Computers: Flow, Glow, & Stow Data -$
• Ernie Dainow - A Concise History of Computers, Smartphones, and the Internet -$
• Ernie Dainow - Understanding Computers, Smartphones, and the Internet
• David Tarnoff - Computer Organization and Design Fundamentals -$
• Jonathan Bartlett - Programming From The Ground Up -$
• Tom Johnson - The Computer Science Book -$
• J. Clark Scott - But How Do It Know?: The Basic Principles of Computers for Everyone [Accompanying Videos]
• Charles Petzold - Code: The Hidden Language of Computer Hardware and Software [Borrow the eBook]
• Ron White - How Computers Work [Borrow the eBook]
• Jonathan E. Steinhart - The Secret Life of Programs: Understand Computers, Craft Better Code
• Matthew Justice - How Computers Really Work: A Hands-On Guide to the Inner Workings of the Machine
• Clive Maxfield - The Definitive Guide to How Computers Do Math
Videos
• H3Vtux - How Computers Work, Basics Explained
• Crash Course - Computer Science
• Fireship - 100+ Computer Science Concepts Explained
• Domain of Science - Map of Computer Science
• Brian Will
• Sebastian Lague - How Computers Work
• Improbable Matter - How a Computer Works: From Silicon to Apps
• New Mind - The Evolution Of CPU Processing Power
• Ben Eater - Building An 8-Bit Breadboard Computer
• James Sharman - Making An 8-Bit Pipelined CPU
• Forrest Knight - The Open Source Computer Science Degree
• The MIT OpenCourseWare Playlists have a ton of educational material for free, like...
+ Introduction to Computer Science and Programming (Fall 2008; Spring 2011)
+ Introduction to Computer Science and Programming in Python (Fall 2016)
+ Introduction to Electrical Engineering and Computer Science (Spring 2011)
...etc.
Computer Science and "Software Development" often overlap. Therefore, some of the general programming resources linked to above may have things related to Computer Science too (e.g.: freeCodeCamp's Introduction To Programming & Computer Science, the Harvard CS50 Course, etc.).
Conclusion
Computers are so ubiquitous that it can be helpful to develop some level of "computer literacy". If we take the time to learn some simple programming skills, we can get so much more out of our computer use. Computers are incredibly powerful machines. We sincerely hope that this article has assisted you in tapping into that power in a way that is interesting and enjoyable for you.
Thank you for reading! ❤️