Computing Abstraction Information

  • Discuss computing as a discipline and the role of abstraction in advances made in computing

Computing has been defined in various ways, including “the body of knowledge that surrounds computers and computation” used by the Computing Sciences Accreditation Board (Denning et al., 1989), and “any goal-oriented activity requiring, benefiting from, or creating computers” (Shackelford et al., 2006).

According to the Joint Task Force for Computing Curricula’s definition, therefore, computing encompasses the design and implementation of hardware and software systems for a wide range of purposes; the processing, structuring, and management of various kinds of information; scientific studies using computers; making computer systems behave intelligently; creating and using communications and entertainment media; finding and gathering information relevant to any particular purpose; and so on.

By its nature, computing draws knowledge and skills from the fields of engineering, mathematics, and science, in which the discipline itself is rooted.

Abstraction is a mental model that removes complex details and leaves only the information necessary to accomplish a goal (Dale & Lewis, 2006). The concept is widely used in electrical engineering circuit analysis, where Thevenin’s and Norton’s theorems represent complex circuits as equivalent simple circuits.

Abstraction has also been key in the development of computing, as it allowed innovations within individual layers of computing systems to be researched and developed independently of each other; for example, advances in operating system design and user applications proceeded independently of advances in processor construction.
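To make the idea concrete, the following is a minimal Python sketch of abstraction through an interface; the Storage, InMemoryStorage, and remember_greeting names are hypothetical and purely illustrative, not from the sources cited above. Code written against the abstract layer keeps working even if the concrete layer beneath it is replaced, much as applications could evolve independently of processor internals.

    from abc import ABC, abstractmethod

    class Storage(ABC):
        """Abstract layer: callers see only save/load, not how data is stored."""
        @abstractmethod
        def save(self, key: str, value: str) -> None: ...
        @abstractmethod
        def load(self, key: str) -> str: ...

    class InMemoryStorage(Storage):
        """One concrete layer; it could be swapped for a disk- or network-backed
        implementation without changing any code written against Storage."""
        def __init__(self) -> None:
            self._data: dict = {}
        def save(self, key: str, value: str) -> None:
            self._data[key] = value
        def load(self, key: str) -> str:
            return self._data[key]

    def remember_greeting(store: Storage) -> str:
        # Application code works against the abstraction only.
        store.save("greeting", "hello")
        return store.load("greeting")

    print(remember_greeting(InMemoryStorage()))  # prints: hello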

  • Discuss the Unicode standard.

The Unicode standard is a universal means of character encoding, developed by the Unicode Consortium, that is used to represent every character in every language (Wikipedia).

The Unicode character set uses 16 bits per character, enabling it to represent over 65,000 characters (Dale & Lewis, 2006), far more than the 256 characters of the extended ASCII set, which is incorporated as a subset of the Unicode character set.

Each character is represented by a hexadecimal code point, and the characters are grouped by their source script, e.g. ASCII/Latin, Thai, Greek, Chinese, etc.
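As a quick illustration (a minimal sketch using only standard Python, not part of the cited sources), the built-in ord() function exposes the hexadecimal code point behind each character, including the ASCII/Latin subset:

    # Print the Unicode code point (in the usual U+XXXX hexadecimal form)
    # for a few characters from different scripts.
    for ch in ["A", "é", "π", "ก", "中"]:
        print(f"{ch!r} -> U+{ord(ch):04X}")

    # 'A' -> U+0041  (same value as in ASCII, since ASCII/Latin is a subset)
    # 'é' -> U+00E9  (Latin)
    # 'π' -> U+03C0  (Greek)
    # 'ก' -> U+0E01  (Thai)
    # '中' -> U+4E2D  (Chinese)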

The versatility of the standard has made it widely popular, and it is used in many programming languages and computer systems.

Not all of the available code points in the standard have been assigned to characters, and although more than 30 writing systems are currently included, more are constantly being added.

The standard was first published in 1991 and has been in constant revision since then; version 5.1 is expected in March 2008.

  • Who first thought of using the binary number system to create electronic devices that can represent numbers and perform calculations?

The binary number system, as we know and use it in computing today, was first used in the early 20th century on early non-mechanical computers.

The word “bit”, short for “binary digit”, was coined by John Wilder Tukey, an American statistician and early computer scientist.
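As a brief illustrative sketch of binary representation (standard Python only, not drawn from the original essay):

    # Convert a decimal number to its binary representation and back again.
    n = 13
    bits = bin(n)           # '0b1101': 1*8 + 1*4 + 0*2 + 1*1 = 13
    back = int(bits, 2)     # 13
    print(bits, back)

    # Each binary digit (bit) is 0 or 1, so k bits can represent 2**k values;
    # for example, 8 bits give 2**8 = 256 distinct values.
    print(2 ** 8)           # 256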
