
Computers


Desktop Computer. Computer is a machine for performing calculations automatically; the word once also referred to an expert at calculation (or at operating calculating machines). Code

"A Keen Impassioned Beauty of a Great Machine"  "A Bicycle for the Brain"


Everything about Computers

You can learn several different subjects at the same time when you're learning about computers. You can learn Problem Solving, Math, Languages, Communication, Technology, Electricity, Physics and Intelligence, just to name a few.

Basic Computer Skills - Computer Literacy - How Does a Computer Work? - Help

Computer Science is the study of the theory, experimentation, and engineering that form the basis for the design and use of computers. It is the scientific and practical approach to computation and its applications and the systematic study of the feasibility, structure, expression, and mechanization of the methodical procedures (or algorithms) that underlie the acquisition, representation, processing, storage, communication of, and access to information. An alternate, more succinct definition of computer science is the study of automating algorithmic processes that scale. A computer scientist specializes in the theory of computation and the design of computational systems.

Computer Science Books (wiki)
List of Computer Books (wiki)

Theoretical Computer Science is a division or subset of general computer science and mathematics that focuses on the more abstract or mathematical aspects of computing and includes the theory of computation, which is the branch that deals with how efficiently problems can be solved on a model of computation using an algorithm. The field is divided into three major branches: automata theory and languages, computability theory, and computational complexity theory, which are linked by the question: "What are the fundamental capabilities and limitations of computers?"

Doctor of Computer Science is a doctorate in Computer Science by dissertation or multiple research papers.

Computer Movies


The Machine that Changed the World - Episode II - Inventing the Future (youtube)
HyperLand (youtube)
The Virtual Revolution (youtube)
Internet Rising (youtube)
The Code - Linux (film)
All Watched Over by Machines of Loving Grace (vimeo)
Kids Growing Up Online (PBS)

Charles Babbage was an English polymath. A mathematician, philosopher, inventor and mechanical engineer, Babbage is best remembered for originating the concept of a digital programmable computer. (26 December 1791 – 18 October 1871).

List of Pioneers in Computer Science
Great Inventions
Difference Engine (youtube)

Difference Engine is an automatic mechanical calculator designed to tabulate polynomial functions. The name derives from the method of divided differences, a way to interpolate or tabulate functions by using a small set of polynomial coefficients. Most mathematical functions commonly used by engineers, scientists and navigators, including logarithmic and trigonometric functions, can be approximated by polynomials, so a difference engine can compute many useful tables of numbers. The historical difficulty in producing error-free tables by teams of mathematicians and human "computers" spurred Charles Babbage's desire to build a mechanism to automate the process.

Analytical Engine was a proposed mechanical general-purpose computer designed by English mathematician and computer pioneer Charles Babbage. It was first described in 1837 as the successor to Babbage's difference engine, a design for a mechanical computer. The Analytical Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete. In other words, the logical structure of the Analytical Engine was essentially the same as that which has dominated computer design in the electronic era. Babbage was never able to complete construction of any of his machines due to conflicts with his chief engineer and inadequate funding. It was not until the 1940s that the first general-purpose computers were actually built, more than a century after Babbage had proposed the pioneering Analytical Engine in 1837.

Computer History


Hardware


Hardware is the collection of physical components that constitute a computer system. Computer hardware is the physical parts or components of a computer, such as monitor, keyboard, computer data storage, hard disk drive (HDD), graphic card, sound card, memory (RAM), motherboard, and so on, all of which are tangible physical objects. By contrast, software is instructions that can be stored and run by hardware. Hardware is directed by the software to execute any command or instruction. A combination of hardware and software forms a usable computing system.

Hardware Architecture refers to the identification of a system's physical components and their interrelationships. This description, often called a hardware design model, allows hardware designers to understand how their components fit into a system architecture and provides software component designers with important information needed for software development and integration. A clear definition of a hardware architecture allows the various traditional engineering disciplines (e.g., electrical and mechanical engineering) to work more effectively together to develop and manufacture new machines, devices and components.

Computer Architecture is a set of rules and methods that describe the functionality, organization, and implementation of computer systems. Some definitions of architecture define it as describing the capabilities and programming model of a computer but not a particular implementation. In other definitions computer architecture involves instruction set architecture design, microarchitecture design, logic design, and implementation.

Printed Circuit Board mechanically supports and electrically connects electronic components using conductive tracks, pads and other features etched from copper sheets laminated onto a non-conductive substrate. Components (e.g. capacitors, resistors or active devices) are generally soldered on the PCB. Advanced PCBs may contain components embedded in the substrate.

Circuit Board Components
Design

Integrated Circuit - IC

Computer Memory (amazon)
Internal Hard Drives (amazon)
Laptop Computers (amazon)
Desktop Computers (amazon)

Webopedia offers definitions for words, phrases and abbreviations related to computing and information technology.

Computer Motherboard
Mother Board (image)

Motherboard is the main printed circuit board (PCB) found in general purpose microcomputers and other expandable systems. It holds and allows communication between many of the crucial electronic components of a system, such as the central processing unit (CPU) and memory, and provides connectors for other peripherals. Unlike a backplane, a motherboard usually contains significant sub-systems such as the central processor, the chipset's input/output and memory controllers, interface connectors, and other components integrated for general purpose use. Motherboard specifically refers to a PCB with expansion capability and as the name suggests, this board is often referred to as the "mother" of all components attached to it, which often include peripherals, interface cards, and daughtercards: sound cards, video cards, network cards, hard drives, or other forms of persistent storage; TV tuner cards, cards providing extra USB or FireWire slots and a variety of other custom components. By contrast, the term mainboard is applied to devices with a single board and no additional expansions or capability, such as controlling boards in laser printers, televisions, washing machines and other embedded systems with limited expansion abilities.

Circuit Board Components

Processor


Microprocessor accepts digital or binary data as input, processes it according to instructions stored in its memory, and provides results as output.

Central Processing Unit (CPU) carries out the instructions of a computer program by performing the basic arithmetic, logical, control and input/output (I/O) operations specified by the instructions.

Coprocessor is a computer processor used to supplement the functions of the primary processor (the CPU).

Multi-Core Processor can run multiple instructions at the same time, increasing overall speed for programs.

Multiprocessing is a computer system having two or more processing units (multiple processors) each sharing main memory and peripherals, in order to simultaneously process programs. It is the use of two or more central processing units (CPUs) within a single computer system. The term also refers to the ability of a system to support more than one processor or the ability to allocate tasks between them. There are many variations on this basic theme, and the definition of multiprocessing can vary with context, mostly as a function of how CPUs are defined (multiple cores on one die, multiple dies in one package, multiple packages in one system unit, etc.).
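
As a concrete illustration, here is a minimal sketch of splitting work across several processes with Python's standard multiprocessing module; the square function and its inputs are stand-ins for any CPU-bound job.

```python
# Minimal multiprocessing sketch: spread an illustrative CPU-bound
# function across a pool of worker processes (standard library only).
from multiprocessing import Pool, cpu_count

def square(n):
    # Stand-in for any CPU-bound computation.
    return n * n

if __name__ == "__main__":
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(square, range(10))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```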

Multitasking - Batch Process

Process - Processing

Information Processor is a system (be it electrical, mechanical or biological) which takes information (a sequence of enumerated symbols or states) in one form and processes (transforms) it into another form, e.g. to statistics, by an algorithmic process. An information processing system is made up of four basic parts, or sub-systems: input, processor, storage, output.

Processor Affinity enables the binding and unbinding of a process or a thread to a central processing unit.
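
A short sketch of what binding a process to particular cores looks like in practice, using Python's os.sched_setaffinity (available on Linux); the chosen CPU numbers are arbitrary.

```python
# Processor-affinity sketch (Linux only): pin the calling process to
# CPUs 0 and 1, then read the binding back.
import os

pid = 0  # 0 means "the calling process"
print("allowed CPUs before:", os.sched_getaffinity(pid))
os.sched_setaffinity(pid, {0, 1})   # bind this process to cores 0 and 1
print("allowed CPUs after:", os.sched_getaffinity(pid))
```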

555 timer IC is an integrated circuit (chip) used in a variety of timer, pulse generation, and oscillator applications. The 555 can be used to provide time delays, as an oscillator, and as a flip-flop element. Derivatives provide two or four timing circuits in one package.

Semiconductor Design - Standard cell methodology is a method of designing application-specific integrated circuits (ASICs) with mostly digital-logic features.

BIOS (an acronym for Basic Input/Output System, also known as the System BIOS, ROM BIOS or PC BIOS) is a type of firmware used to perform hardware initialization during the booting process (power-on startup) on IBM PC compatible computers, and to provide runtime services for operating systems and programs. The BIOS firmware is built into personal computers (PCs), and it is the first software they run when powered on. The name itself originates from the Basic Input/Output System used in the CP/M operating system in 1975. Originally proprietary to the IBM PC, the BIOS has been reverse engineered by companies looking to create compatible systems, and the interface of that original system serves as a de facto standard.

Crystal Oscillator is an electronic oscillator circuit that uses the mechanical resonance of a vibrating crystal of piezoelectric material to create an electrical signal with a precise frequency.

Clock Speed typically refers to the frequency at which a chip like a central processing unit (CPU), one core of a multi-core processor, is running and is used as an indicator of the processor's speed. It is measured in clock cycles per second or its equivalent, the SI unit hertz (Hz). The clock rate of the first generation of computers was measured in hertz or kilohertz (kHz), but in the 21st century the speed of modern CPUs is commonly advertised in gigahertz (GHz). This metric is most useful when comparing processors within the same family, holding constant other features that may impact performance. Video card and CPU manufacturers commonly select their highest performing units from a manufacturing batch and set their maximum clock rate higher, fetching a higher price.
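
The arithmetic behind these units is simple; a rough sketch for an assumed 3.2 GHz clock:

```python
# Clock-rate arithmetic for an assumed 3.2 GHz core: the clock period
# is the reciprocal of the frequency.
clock_hz = 3.2e9                                          # 3.2 GHz
cycle_time_ns = 1 / clock_hz * 1e9
print(f"one cycle lasts about {cycle_time_ns:.3f} ns")    # ~0.313 ns
print(f"cycles per millisecond: {clock_hz / 1000:,.0f}")  # 3,200,000
```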

Silicon Photonics is the study and application of photonic systems which use silicon as an optical medium.

Analog Chip is a set of miniature electronic analog circuits formed on a single piece of semiconductor material.

Transistor is a semiconductor device used to amplify or switch electronic signals and electrical power. It is composed of semiconductor material usually with at least three terminals for connection to an external circuit. A voltage or current applied to one pair of the transistor's terminals controls the current through another pair of terminals. Because the controlled (output) power can be higher than the controlling (input) power, a transistor can amplify a signal. Today, some transistors are packaged individually, but many more are found embedded in integrated circuits.

Carbon Nanotube Field-effect Transistor refers to a field-effect transistor that utilizes a single carbon nanotube or an array of carbon nanotubes as the channel material instead of bulk silicon in the traditional MOSFET structure. First demonstrated in 1998, there have been major developments in CNTFETs since.

Analog Signal is any continuous signal for which the time varying feature (variable) of the signal is a representation of some other time varying quantity, i.e., analogous to another time varying signal. For example, in an analog audio signal, the instantaneous voltage of the signal varies continuously with the pressure of the sound waves. It differs from a digital signal, in which the continuous quantity is a representation of a sequence of discrete values which can only take on one of a finite number of values. The term analog signal usually refers to electrical signals; however, mechanical, pneumatic, hydraulic, human speech, and other systems may also convey or be considered analog signals. An analog signal uses some property of the medium to convey the signal's information. For example, an aneroid barometer uses rotary position as the signal to convey pressure information. In an electrical signal, the voltage, current, or frequency of the signal may be varied to represent the information.

Digital Signal is a signal that is constructed from a discrete set of waveforms of a physical quantity so as to represent a sequence of discrete values. A logic signal is a digital signal with only two possible values, and describes an arbitrary bit stream. Other types of digital signals can represent three-valued logic or higher valued logics.
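
To make the analog/digital distinction concrete, the sketch below samples a continuous sine wave and quantizes each sample to one of 16 discrete levels; the sample rate and bit depth are illustrative.

```python
# Analog-to-digital sketch: sample a continuous sine wave, then
# quantize each sample to a 4-bit integer (16 discrete levels).
import math

SAMPLES_PER_PERIOD = 8   # illustrative sampling rate
LEVELS = 16              # 4-bit quantization

def analog(t):
    # Continuous-valued signal in the range [-1, 1].
    return math.sin(2 * math.pi * t)

samples = [analog(i / SAMPLES_PER_PERIOD) for i in range(SAMPLES_PER_PERIOD)]
digital = [round((s + 1) / 2 * (LEVELS - 1)) for s in samples]
print(digital)   # [8, 13, 15, 13, 8, 2, 0, 2]
```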

Math Works
Nimbula
Learning Tools
Digikey Electronic Components
Nand 2 Tetris

Interfaces
Brain
Robots
3D Printing
PC Maintenance Tips
 

Software


Software is that part of a computer system that consists of encoded information or computer instructions, in contrast to the physical hardware from which the system is built. Operating System

Software Engineering is the application of engineering to the development of software in a systematic method. Typical formal definitions of software engineering include: the research, design, development, and testing of operating-system-level software, compilers, and network distribution software for medical, industrial, military, communications, aerospace, business, scientific, and general computing applications; the systematic application of scientific and technological knowledge, methods, and experience to the design, implementation, testing, and documentation of software; the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software; an engineering discipline that is concerned with all aspects of software production; and the establishment and use of sound engineering principles in order to economically obtain software that is reliable and works efficiently on real machines.

Software Architecture refers to the high level structures of a software system, the discipline of creating such structures, and the documentation of these structures. These structures are needed to reason about the software system. Each structure comprises software elements, relations among them, and properties of both elements and relations. The architecture of a software system is a metaphor, analogous to the architecture of a building.

Software Development is the process of computer programming, documenting, testing, and bug fixing involved in creating and maintaining applications and frameworks resulting in a software product. Software development is a process of writing and maintaining the source code, but in a broader sense, it includes all that is involved between the conception of the desired software through to the final manifestation of the software, sometimes in a planned and structured process. Therefore, software development may include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities that result in software products.

Software Developer is a person concerned with facets of the software development process, including the research, design, programming, and testing of computer software. Other job titles which are often used with similar meanings are programmer, software analyst, and software engineer. According to developer Eric Sink, the differences between system design, software development, and programming are more apparent. Already in the current market place there can be found a segregation between programmers and developers, being that one who implements is not the same as the one who designs the class structure or hierarchy. Even more so that developers become systems architects, those who design the multi-leveled architecture or component interactions of a large software system. (see also Debate over who is a software engineer).

Software Development Process is splitting of software development work into distinct phases (or stages) containing activities with the intent of better planning and management. It is often considered a subset of the systems development life cycle. The methodology may include the pre-definition of specific deliverables and artifacts that are created and completed by a project team to develop or maintain an application. Common methodologies include waterfall, prototyping, iterative and incremental development, spiral development, rapid application development, extreme programming and various types of agile methodology. Some people consider a life-cycle "model" a more general term for a category of methodologies and a software development "process" a more specific term to refer to a specific process chosen by a specific organization. For example, there are many specific software development processes that fit the spiral life-cycle model.

Agile Software Development describes a set of principles for software development under which requirements and solutions evolve through the collaborative effort of self-organizing cross-functional teams. It advocates adaptive planning, evolutionary development, early delivery, and continuous improvement, and it encourages rapid and flexible response to change. These principles support the definition and continuing evolution of many software development methods.

Software Release Life Cycle

Development Process - Develop Meaning

Software as a Service is a software licensing and delivery model in which software is licensed on a subscription basis and is centrally hosted.

Computer Program is a collection of instructions that performs a specific task when executed by a computer. A computer requires programs to function, and typically executes the program's instructions in a central processing unit.

Computer Code

Instruction Set is the interface between a computer's software and its hardware, and thereby enables the independent development of these two computing realms; it defines the valid instructions that a machine may execute.
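
The sketch below interprets a hypothetical four-instruction set in software, just to show how an instruction set defines the operations a machine will accept; it is not modeled on any real ISA.

```python
# Toy instruction-set sketch: a hypothetical machine with a single
# accumulator and four instructions, interpreted in software.
def run(program, memory):
    acc, pc = 0, 0                   # accumulator, program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":
            acc = memory[arg]        # acc <- mem[arg]
        elif op == "ADD":
            acc += memory[arg]       # acc <- acc + mem[arg]
        elif op == "STORE":
            memory[arg] = acc        # mem[arg] <- acc
        elif op == "HALT":
            break
        pc += 1
    return memory

memory = {0: 2, 1: 3, 2: 0}
program = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)]
print(run(program, memory))   # {0: 2, 1: 3, 2: 5}
```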

Computing Platform means, in a general sense, the environment in which a piece of software is executed. It may be the hardware or the operating system (OS), or even a web browser or other application, as long as the code is executed in it. The term computing platform can refer to different abstraction levels, including a certain hardware architecture, an operating system (OS), and runtime libraries. In total it can be said to be the stage on which computer programs can run. A platform can be seen both as a constraint on the application development process, in that different platforms provide different functionality and restrictions; and as an assistance to the development process, in that they provide low-level functionality ready-made. For example, an OS may be a platform that abstracts the underlying differences in hardware and provides a generic command for saving files or accessing the network.

Scrum is an iterative and incremental agile software development framework for managing product development. It defines "a flexible, holistic product development strategy where a development team works as a unit to reach a common goal", challenges assumptions of the "traditional, sequential approach" to product development, and enables teams to self-organize by encouraging physical co-location or close online collaboration of all team members, as well as daily face-to-face communication among all team members and disciplines involved. A key principle of Scrum is its recognition that during product development, the customers can change their minds about what they want and need (often called requirements volatility), and that unpredicted challenges cannot be easily addressed in a traditional predictive or planned manner. As such, Scrum adopts an evidence-based empirical approach—accepting that the problem cannot be fully understood or defined, focusing instead on maximizing the team's ability to deliver quickly, to respond to emerging requirements and to adapt to evolving technologies and changes in market conditions.

Mobile Application Development is a term used to denote the act or process by which application software is developed for mobile devices, such as personal digital assistants, enterprise digital assistants or mobile phones. These applications can be pre-installed on phones during manufacturing, or delivered as web applications using server-side or client-side processing (e.g., JavaScript) to provide an "application-like" experience within a Web browser. Application software developers also must consider a long array of screen sizes, hardware specifications, and configurations because of intense competition in mobile software and changes within each of the platforms. Mobile app development has been steadily growing, in revenues and jobs created. A 2013 analyst report estimates there are 529,000 direct app economy jobs within the EU 28 members, 60% of which are mobile app developers.

APPS (application software)

Software Testing is an investigation conducted to provide information about the quality of the product or service under test. Software testing can also provide an objective, independent view of the software to allow the business to appreciate and understand the risks of software implementation. Test techniques include the process of executing a program or application with the intent of finding software bugs (errors or other defects), and verifying that the software product is fit for use. Software testing involves the execution of a software component or system component to evaluate one or more properties of interest. In general, these properties indicate the extent to which the component or system under test: Meets the requirements that guided its design and development, responds correctly to all kinds of inputs, performs its functions within an acceptable time, is sufficiently usable, can be installed and run in its intended environments, and achieves the general result its stakeholders desire.

A/B Testing is a term for a randomized experiment with two variants, A and B, which are the control and variation in the controlled experiment. A/B testing is a form of statistical hypothesis testing with two variants leading to the technical term, two-sample hypothesis testing, used in the field of statistics.
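
A minimal sketch of the two-sample test behind an A/B comparison, using made-up conversion counts and only the standard library:

```python
# A/B test sketch: two-proportion z-test on made-up conversion counts
# for variant A (control) and variant B (variation).
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
    return z, p_value

z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(f"z = {z:.2f}, p = {p:.3f}")   # roughly z = 1.88, p = 0.06
```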

Regression Testing is a type of software testing which verifies that software, which was previously developed and tested, still performs correctly after it was changed or interfaced with other software. Changes may include software enhancements, patches, configuration changes, etc. During regression testing, new software bugs or regressions may be uncovered. Sometimes a software change impact analysis is performed to determine what areas could be affected by the proposed changes. These areas may include functional and non-functional areas of the system.

Data-Driven Testing is a term used in the testing of computer software to describe testing done using a table of conditions directly as test inputs and verifiable outputs as well as the process where test environment settings and control are not hard-coded. In the simplest form the tester supplies the inputs from a row in the table and expects the outputs which occur in the same row. The table typically contains values which correspond to boundary or partition input spaces. In the control methodology, test configuration is "read" from a database.
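
In code, data-driven testing often looks like a table of rows feeding one parameterized test; a small sketch with Python's unittest, where the add function stands in for the code under test:

```python
# Data-driven test sketch: each table row supplies inputs and the
# expected output for the (illustrative) function under test.
import unittest

def add(a, b):
    # Stand-in for the real code being tested.
    return a + b

class TestAdd(unittest.TestCase):
    cases = [              # (a, b, expected) -- boundary and typical rows
        (0, 0, 0),
        (1, -1, 0),
        (2, 3, 5),
    ]

    def test_table(self):
        for a, b, expected in self.cases:
            with self.subTest(a=a, b=b):
                self.assertEqual(add(a, b), expected)

if __name__ == "__main__":
    unittest.main()
```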

Benchmark is the act of running a computer program, a set of programs, or other operations, in order to assess the relative performance of an object, normally by running a number of standard tests and trials against it. The term 'benchmark' is also mostly utilized for the purposes of elaborately designed benchmarking programs themselves.

Command Pattern is a behavioral design pattern in which an object is used to encapsulate all information needed to perform an action or trigger an event at a later time. This information includes the method name, the object that owns the method and values for the method parameters.
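
A compact sketch of the pattern: each command object captures the receiver, the method to call, and its arguments, so an invoker can store the request and run it later. The Light receiver and the levels used are illustrative.

```python
# Command-pattern sketch: a command object bundles everything needed
# to perform an action later (receiver, method, parameters).
class Light:                                   # the receiver
    def set_level(self, level):
        print(f"light set to {level}%")

class SetLevelCommand:                         # the command
    def __init__(self, light, level):
        self.light, self.level = light, level
    def execute(self):
        self.light.set_level(self.level)

# The invoker only knows about execute(); it can queue, log or replay
# commands without knowing what they do.
queue = [SetLevelCommand(Light(), 100), SetLevelCommand(Light(), 0)]
for command in queue:
    command.execute()
```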

Iterative and incremental Development is any combination of both iterative design or iterative method and incremental build model for software development. The combination is of long standing and has been widely suggested for large development efforts. For example, the 1985 DOD-STD-2167 mentions (in section 4.1.2): "During software development, more than one iteration of the software development cycle may be in progress at the same time." and "This process may be described as an 'evolutionary acquisition' or 'incremental build' approach." The relationship between iterations and increments is determined by the overall software development methodology and software development process. The exact number and nature of the particular incremental builds and what is iterated will be specific to each individual development effort.

OSI Model is a conceptual model that characterizes and standardizes the communication functions of a telecommunication or computing system without regard to their underlying internal structure and technology. Its goal is the interoperability of diverse communication systems with standard protocols. The model partitions a communication system into abstraction layers. The original version of the model defined seven layers. A layer serves the layer above it and is served by the layer below it. For example, a layer that provides error-free communications across a network provides the path needed by applications above it, while it calls the next lower layer to send and receive packets that comprise the contents of that path. Two instances at the same layer are visualized as connected by a horizontal connection in that layer.
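
The sketch below imitates that layering with toy, string-based "headers": each layer wraps whatever the layer above handed down, which is the essence of encapsulation. The real protocols and header formats are far richer than this.

```python
# Layering sketch: each toy "layer" wraps the data from the layer
# above with its own header before passing it down.
def application(message):
    return f"APPHDR|{message}"

def transport(segment):
    return f"PORT=80|{segment}"

def network(packet):
    return f"DST=192.0.2.1|{packet}"

def link(frame):
    return f"MAC=aa:bb:cc|{frame}"

on_the_wire = link(network(transport(application("hello"))))
print(on_the_wire)
# MAC=aa:bb:cc|DST=192.0.2.1|PORT=80|APPHDR|hello
```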

Technology Stack is a set of software subsystems or components needed to create a complete platform such that no additional software is needed to support applications. Applications are said to "run on" or "run on top of" the resulting platform. Some definitions of a platform overlap with what is known as system software.

Abstraction Layer is a way of hiding the implementation details of a particular set of functionality, allowing the separation of concerns to facilitate interoperability and platform independence. Software models that use layers of abstraction include the OSI 7-layer model for computer network protocols, the OpenGL graphics drawing library, and the byte stream input/output (I/O) model originated from Unix and adopted by DOS, Linux, and most other modern operating systems.

Open Systems Interconnection is an effort to standardize computer networking that was started in 1977 by the International Organization for Standardization (ISO), along with the ITU-T.

Enterprise Architecture Framework defines how to create and use an enterprise architecture. An architecture framework provides principles and practices for creating and using the architecture description of a system. It structures architects' thinking by dividing the architecture description into domains, layers or views, and offers models - typically matrices and diagrams - for documenting each view. This allows for making systemic design decisions on all the components of the system and making long-term decisions around new design, requirements, sustainability and support.

Enterprise Architecture is "a well-defined practice for conducting enterprise analysis, design, planning, and implementation, using a holistic approach at all times, for the successful development and execution of strategy." Enterprise architecture applies architecture principles and practices to guide organizations through the business, information, process, and technology changes necessary to execute their strategies. These practices utilize the various aspects of an enterprise to identify, motivate, and achieve these changes.

International Organization for Standardization is an international standard-setting body composed of representatives from various national standards organizations.

Conceptual Model is a representation of a system, made of the composition of concepts which are used to help people know, understand, or simulate a subject the model represents. Some models are physical objects; for example, a toy model which may be assembled, and may be made to work like the object it represents.

Model-Driven Engineering is a software development methodology that focuses on creating and exploiting domain models, which are conceptual models of all the topics related to a specific problem. Hence, it highlights and aims at abstract representations of the knowledge and activities that govern a particular application domain, rather than the computing (e.g., algorithmic) concepts.

Model-Based Design is a mathematical and visual method of addressing problems associated with designing complex control, signal processing and communication systems. It is used in many motion control, industrial equipment, aerospace, and automotive applications. Model-based design is a methodology applied in designing embedded software.

Architectural Pattern is a general, reusable solution to a commonly occurring problem in software architecture within a given context. Architectural patterns are similar to software design pattern but have a broader scope. The architectural patterns address various issues in software engineering, such as computer hardware performance limitations, high availability and minimization of a business risk. Some architectural patterns have been implemented within software frameworks.

Software Design Pattern is a general reusable solution to a commonly occurring problem within a given context in software design. It is not a finished design that can be transformed directly into source or machine code. It is a description or template for how to solve a problem that can be used in many different situations. Design patterns are formalized best practices that the programmer can use to solve common problems when designing an application or system.

Resource-Oriented Architecture is a style of software architecture and programming paradigm for designing and developing software in the form of resources with "RESTful" interfaces. These resources are software components (discrete pieces of code and/or data structures) which can be reused for different purposes. ROA design principles and guidelines are used during the phases of software development and system integration.

Representational State Transfer or RESTful Web services are one way of providing interoperability between computer systems on the Internet. REST-compliant Web services allow requesting systems to access and manipulate textual representations of Web resources using a uniform and predefined set of stateless operations. Other forms of Web service exist, which expose their own arbitrary sets of operations such as WSDL and SOAP. (REST)
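
A REST interaction is just a stateless request for a resource representation; a sketch with the standard library, where the URL is a placeholder rather than a real service:

```python
# REST sketch: fetch a JSON representation of a resource with a
# stateless GET request. The URL is a placeholder, not a real API.
import json
import urllib.request

url = "https://api.example.com/users/42"      # hypothetical resource
with urllib.request.urlopen(url) as response:
    user = json.load(response)                # e.g. {"id": 42, "name": "..."}
print(user)
```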

Software Configuration Management is the task of tracking and controlling changes in the software, part of the larger cross-disciplinary field of configuration management. SCM practices include revision control and the establishment of baselines. If something goes wrong, SCM can determine what was changed and who changed it. If a configuration is working well, SCM can determine how to replicate it across many hosts.

Cucumber is a software tool that computer programmers use for testing other software.

Selenium

Apache Maven
JWebUnit is a Java-based testing framework for web applications.

Apache JMeter is an Apache project that can be used as a load testing tool for analyzing and measuring the performance of a variety of services, with a focus on web applications.

Free Software
Structure
Interfaces
Matrix
Communications Protocol
Data
Learn to Code


Computing Types


Bio-inspired Computing is a field of study that loosely knits together subfields related to the topics of connectionism, social behaviour and emergence. It is often closely related to the field of artificial intelligence, as many of its pursuits can be linked to machine learning. It relies heavily on the fields of biology, computer science and mathematics. Briefly put, it is the use of computers to model the living phenomena, and simultaneously the study of life to improve the usage of computers. Biologically inspired computing is a major subset of natural computation.

Biological Computation can refer to several things: the study of the computations performed by natural biota, including the subject matter of systems biology; the design of algorithms inspired by the computational methods of biota; the design and engineering of manufactured computational devices using synthetic biology components; or computer methods for the analysis of biological data, elsewhere called computational biology. When biological computation refers to using biology to build computers, it is a subfield of computer science and is distinct from the interdisciplinary science of bioinformatics, which simply uses computers to better understand biology.

Computational Biology involves the development and application of data-analytical and theoretical methods, mathematical modeling and computational simulation techniques to the study of biological, behavioral, and social systems. The field is broadly defined and includes foundations in computer science, applied mathematics, animation, statistics, biochemistry, chemistry, biophysics, molecular biology, genetics, genomics, ecology, evolution, anatomy, neuroscience, and visualization. Computational biology is different from biological computation, which is a subfield of computer science and computer engineering using bioengineering and biology to build computers, but is similar to bioinformatics, which is an interdisciplinary science using computers to store and process biological data. Information

Model of Computation is the definition of the set of allowable operations used in computation and their respective costs. It is used for measuring the complexity of an algorithm in execution time and or memory space: by assuming a certain model of computation, it is possible to analyze the computational resources required or to discuss the limitations of algorithms or computers.

Computer Simulation - Virtual Reality

Ubiquitous Computing is a concept in software engineering and computer science where computing is made to appear anytime and everywhere. In contrast to desktop computing, ubiquitous computing can occur using any device, in any location, and in any format. A user interacts with the computer, which can exist in many different forms, including laptop computers, tablets and terminals in everyday objects such as a fridge or a pair of glasses. The underlying technologies to support ubiquitous computing include Internet, advanced middleware, operating system, mobile code, sensors, microprocessors, new I/O and user interfaces, networks, mobile protocols, location and positioning and new materials.

Parallel Computing is a type of computation in which many calculations or the execution of processes are carried out simultaneously. Large problems can often be divided into smaller ones, which can then be solved at the same time. There are several different forms of parallel computing: bit-level, instruction-level, data, and task parallelism. Parallelism has been employed for many years, mainly in high-performance computing, but interest in it has grown lately due to the physical constraints preventing frequency scaling. As power consumption (and consequently heat generation) by computers has become a concern in recent years, parallel computing has become the dominant paradigm in computer architecture, mainly in the form of multi-core processors.

Task Parallelism is a form of parallelization of computer code across multiple processors in parallel computing environments. Task parallelism focuses on distributing tasks—concurrently performed by processes or threads—across different processors. It contrasts to data parallelism as another form of parallelism.
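
A small sketch of task parallelism with the standard library: two different, illustrative tasks are submitted and run concurrently, in contrast to the data-parallel example earlier where one function was mapped over many inputs.

```python
# Task-parallelism sketch: two *different* illustrative tasks run
# concurrently on separate threads.
from concurrent.futures import ThreadPoolExecutor

def build_report():       # stand-in for one independent task
    return "report built"

def resize_images():      # stand-in for another independent task
    return "images resized"

with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(build_report), pool.submit(resize_images)]
    for future in futures:
        print(future.result())
```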

Human Brain Parallel Processing

Human Centered Computing studies the design, development, and deployment of mixed-initiative human-computer systems. It emerged from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts. Human-centered computing is closely related to human-computer interaction and information science. Human-centered computing is usually concerned with systems and practices of technology use, while human-computer interaction is more focused on ergonomics and the usability of computing artifacts, and information science is focused on practices surrounding the collection, manipulation, and use of information.

Cloud Computing is a type of Internet-based computing that provides shared computer processing resources and data to computers and other devices on demand. It is a model for enabling ubiquitous, on-demand access to a shared pool of configurable computing resources (e.g., computer networks, servers, storage, applications and services), which can be rapidly provisioned and released with minimal management effort. Cloud computing and storage solutions provide users and enterprises with various capabilities to store and process their data in either privately owned, or third-party data centers that may be located far from the user–ranging in distance from across a city to across the world. Cloud computing relies on sharing of resources to achieve coherence and economy of scale, similar to a utility (like the electricity grid) over an electricity network.
Cloud Computing Tools

Reversible Computing is a model of computing where the computational process to some extent is reversible, i.e., time-invertible. In a computational model that uses deterministic transitions from one state of the abstract machine to another, a necessary condition for reversibility is that the relation of the mapping from states to their successors must be one-to-one. Reversible computing is generally considered an unconventional form of computing.

Adaptable - Compatible

Natural Computing is a terminology introduced to encompass three classes of methods: 1) those that take inspiration from nature for the development of novel problem-solving techniques; 2) those that are based on the use of computers to synthesize natural phenomena; and 3) those that employ natural materials (e.g., molecules) to compute. The main fields of research that compose these three branches are artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing, among others.

DNA Computing is a branch of computing which uses DNA, biochemistry, and molecular biology hardware, instead of the traditional silicon-based computer technologies. Research and development in this area concerns theory, experiments, and applications of DNA computing. The term "molectronics" has sometimes been used, but this term had already been used for an earlier technology, a then-unsuccessful rival of the first integrated circuits; this term has also been used more generally, for molecular-scale electronic technology.


Digital Displays


Digital Signage is a sub-segment of signage. Digital signage uses technologies such as LCD, LED and projection to display content such as digital images, video, streaming media, and information. It can be found in public spaces, transportation systems, museums, stadiums, retail stores, hotels, restaurants, and corporate buildings, to provide wayfinding, exhibitions, marketing and outdoor advertising. The digital signage market is expected to grow from USD 15 billion to over USD 24 billion by 2020.

Colors - Eyes (sight)

LED Display is a flat panel display, which uses an array of light-emitting diodes as pixels for a video display. Their brightness allows them to be used outdoors in store signs and billboards, and in recent years they have also become commonly used in destination signs on public transport vehicles. LED displays are capable of providing general illumination in addition to visual display, as when used for stage lighting or other decorative (as opposed to informational) purposes.

Organic Light-Emitting Diode (OLED) is a light-emitting diode (LED) in which the emissive electroluminescent layer is a film of organic compound that emits light in response to an electric current. This layer of organic semiconductor is situated between two electrodes; typically, at least one of these electrodes is transparent. OLEDs are used to create digital displays in devices such as television screens, computer monitors, portable systems such as mobile phones, handheld game consoles and PDAs. A major area of research is the development of white OLED devices for use in solid-state lighting applications.

AMOLED is a display technology used in smartwatches, mobile devices, laptops, and televisions. OLED describes a specific type of thin-film-display technology in which organic compounds form the electroluminescent material, and active matrix refers to the technology behind the addressing of pixels.

High-Dynamic-Range Imaging is a high dynamic range (HDR) technique used in imaging and photography to reproduce a greater dynamic range of luminosity than is possible with standard digital imaging or photographic techniques. The aim is to present a similar range of luminance to that experienced through the human visual system. The human eye, through adaptation of the iris and other methods, adjusts constantly to adapt to a broad range of luminance present in the environment. The brain continuously interprets this information so that a viewer can see in a wide range of light conditions.

Graphics Display Resolution is the width and height dimensions of an electronic visual display device, such as a computer monitor, in pixels. Certain combinations of width and height are standardized and typically given a name and an initialism that is descriptive of its dimensions. A higher display resolution in a display of the same size means that displayed content appears sharper.

4K Resolution refers to a horizontal resolution on the order of 4,000 pixels and a vertical resolution on the order of 2,000 pixels.
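
For the common 3840 x 2160 UHD variant, the pixel arithmetic works out as follows:

```python
# Resolution arithmetic for the common 3840 x 2160 "4K UHD" format.
width, height = 3840, 2160
print("total pixels:", width * height)                       # 8294400 (~8.3 MP)
print("aspect ratio:", round(width / height, 3))             # 1.778 (16:9)
print("pixels vs 1080p:", (width * height) / (1920 * 1080))  # 4.0
```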

Smartphones

Computer Monitor is an electronic visual display for computers. A monitor usually comprises the display device, circuitry, casing, and power supply. The display device in modern monitors is typically a thin film transistor liquid crystal display (TFT-LCD) or a flat panel LED display, while older monitors used a cathode ray tube (CRT). It can be connected to the computer via VGA, DVI, HDMI, DisplayPort, Thunderbolt, LVDS (Low-voltage differential signaling) or other proprietary connectors and signals.

Durable Monitor Screens (computers)

Liquid-Crystal Display is a flat-panel display or other electronically modulated optical device that uses the light-modulating properties of liquid crystals. Liquid crystals do not emit light directly, instead using a backlight or reflector to produce images in color or monochrome. LCDs are available to display arbitrary images (as in a general-purpose computer display) or fixed images with low information content, which can be displayed or hidden, such as preset words, digits, and 7-segment displays, as in a digital clock. They use the same basic technology, except that arbitrary images are made up of a large number of small pixels, while other displays have larger elements. LCDs are used in a wide range of applications including computer monitors, televisions, instrument panels, aircraft cockpit displays, and indoor and outdoor signage. Small LCD screens are common in portable consumer devices such as digital cameras, watches, calculators, and mobile telephones, including smartphones. LCD screens are also used on consumer electronics products such as DVD players, video game devices and clocks. LCD screens have replaced heavy, bulky cathode ray tube (CRT) displays in nearly all applications. LCD screens are available in a wider range of screen sizes than CRT and plasma displays, with LCD screens available in sizes ranging from tiny digital watches to huge, big-screen television sets. Since LCD screens do not use phosphors, they do not suffer image burn-in when a static image is displayed on a screen for a long time (e.g., the table frame for an aircraft schedule on an indoor sign). LCDs are, however, susceptible to image persistence. The LCD screen is more energy-efficient and can be disposed of more safely than a CRT can. Its low electrical power consumption enables it to be used in battery-powered electronic equipment more efficiently than CRTs can be. By 2008, annual sales of televisions with LCD screens exceeded sales of CRT units worldwide, and the CRT became obsolete for most purposes.
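
Fixed-content displays such as 7-segment digits are driven by simply switching the right segments on; a sketch of the conventional digit-to-segment mapping (segments labeled a through g):

```python
# 7-segment sketch: map each decimal digit to the segments (a-g) that
# must be switched on, using the conventional segment labeling.
SEGMENTS = {
    "0": "abcdef",  "1": "bc",     "2": "abdeg",  "3": "abcdg",
    "4": "bcfg",    "5": "acdfg",  "6": "acdefg", "7": "abc",
    "8": "abcdefg", "9": "abcdfg",
}

def lit_segments(number):
    return [SEGMENTS[digit] for digit in str(number)]

print(lit_segments(2025))   # ['abdeg', 'abcdef', 'abdeg', 'acdfg']
```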

Touchscreen is an input and output device normally layered on the top of an electronic visual display of an information processing system. A user can give input or control the information processing system through simple or multi-touch gestures by touching the screen with a special stylus and/or one or more fingers. Some touchscreens use ordinary or specially coated gloves to work while others may only work using a special stylus/pen. The user can use the touchscreen to react to what is displayed and to control how it is displayed; for example, zooming to increase the text size. The touchscreen enables the user to interact directly with what is displayed, rather than using a mouse, touchpad, or any other such device (other than a stylus, which is optional for most modern touchscreens). Touchscreens are common in devices such as game consoles, personal computers, tablet computers, electronic voting machines, point of sale systems, and smartphones. They can also be attached to computers or, as terminals, to networks. They also play a prominent role in the design of digital appliances such as personal digital assistants (PDAs) and some e-readers.

Interfaces

7-Segment Display
9-Segment Display
14-Segment Display

Operating Systems
Code - Programming
Computer Courses
Online Dictionary of Computer - Technology Terms
Variable (cs)
CS Unplugged
Computer Science without using computers
Technology Education
Robots
Engineering
Technology Addiction
Technical Competitions
Internet

Computer Standards
IPv6 is the most recent version of the Internet Protocol.
Web 2.0

Troubleshooting PCs
Fixing PCs
 

Networks


Computer Network is a telecommunications network which allows nodes to share resources. In computer networks, networked computing devices exchange data with each other using a data link. The connections between nodes are established using either cable media or wireless media. The best-known computer network is the Internet.

Server is a computer program or a device that provides functionality for other programs or devices, called "clients". This architecture is called the client–server model, and a single overall computation is distributed across multiple processes or devices. Servers can provide various functionalities, often called "services", such as sharing data or resources among multiple clients, or performing computation for a client. A single server can serve multiple clients, and a single client can use multiple servers. A client process may run on the same device or may connect over a network to a server on a different device. Typical servers are Database Servers, file servers, mail servers, print servers, web servers, game servers, and application servers.
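
A minimal client-server sketch with standard-library sockets: a one-shot echo server and a client, both on localhost (the port number is arbitrary).

```python
# Client-server sketch: a one-shot echo server and its client, both
# running on localhost. The port number is arbitrary.
import socket
import threading

HOST, PORT = "127.0.0.1", 50007
ready = threading.Event()

def serve_once():
    with socket.create_server((HOST, PORT)) as server:
        ready.set()                        # now listening
        conn, _ = server.accept()          # wait for one client
        with conn:
            conn.sendall(conn.recv(1024))  # echo the request back

threading.Thread(target=serve_once, daemon=True).start()
ready.wait()                               # don't connect before the server listens

with socket.create_connection((HOST, PORT)) as client:
    client.sendall(b"ping")
    print(client.recv(1024))               # b'ping'
```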

Distributed Computing is a field of computer science that studies distributed systems. A distributed system is a model in which components located on networked computers communicate and coordinate their actions by passing messages. The components interact with each other in order to achieve a common goal. Three significant characteristics of distributed systems are: concurrency of components, lack of a global clock, and independent failure of components. Examples of distributed systems vary from SOA-based systems to massively multiplayer online games to peer-to-peer applications.

Proxy Server is a server (a computer system or an application) that acts as an intermediary for requests from clients seeking resources from other servers. A client connects to the proxy server, requesting some service, such as a file, connection, web page, or other resource available from a different server and the proxy server evaluates the request as a way to simplify and control its complexity. Proxies were invented to add structure and encapsulation to distributed systems. Today, most proxies are web proxies, facilitating access to content on the World Wide Web, providing anonymity and may be used to bypass IP address blocking.

Automated Server Infrastructures

Network Science is an academic field which studies complex networks such as telecommunication networks, computer networks, biological networks, cognitive and semantic networks, and social networks, considering distinct elements or actors represented by nodes (or vertices) and the connections between the elements or actors as links (or edges). The field draws on theories and methods including graph theory from mathematics, statistical mechanics from physics, data mining and information visualization from computer science, inferential modeling from statistics, and social structure from sociology. The United States National Research Council defines network science as "the study of network representations of physical, biological, and social phenomena leading to predictive models of these phenomena."

Network Science
Network Cultures
Coreos
Omega
Mesos

Network Theory is the study of graphs as a representation of either symmetric relations or, more generally, of asymmetric relations between discrete objects. In computer science and network science, network theory is a part of graph theory: a network can be defined as a graph in which nodes and/or edges have attributes (e.g. names).

Network Monitoring is the use of a system that constantly monitors a computer network for slow or failing components and that notifies the network administrator (via email, SMS or other alarms) in case of outages or other trouble. Network monitoring is part of network management.

Network Management is the process of administering and managing the computer networks of one or many organisations. Various services provided by network managers include fault analysis, performance management, provisioning of network and network devices, maintaining the quality of service, and so on. Software that enables network administrators or network managers to perform their functions is called network management software.

Network Packet is a formatted unit of data carried by a packet-switched network. Computer communications links that do not support packets, such as traditional point-to-point telecommunications links, simply transmit data as a bit stream. When data is formatted into packets, packet switching is possible and the bandwidth of the communication medium can be better shared among users than with circuit switching.
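
A sketch of what "formatting data into packets" can look like: a made-up three-field header packed in front of the payload with the standard struct module (not any real protocol's layout).

```python
# Packet sketch: pack a made-up header (version, flags, payload length)
# in front of the payload bytes, then parse it back out.
import struct

HEADER = "!BBH"    # network byte order: two 1-byte fields + one 16-bit length

def build_packet(payload, version=1, flags=0):
    return struct.pack(HEADER, version, flags, len(payload)) + payload

def parse_packet(packet):
    version, flags, length = struct.unpack_from(HEADER, packet)
    body = packet[struct.calcsize(HEADER):][:length]
    return version, flags, body

packet = build_packet(b"hello")
print(parse_packet(packet))   # (1, 0, b'hello')
```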

Cluster Manager usually is a backend graphical user interface (GUI) or command-line software that runs on one or all cluster nodes (in some cases it runs on a different server or cluster of management servers.) The cluster manager works together with a cluster management agent. These agents run on each node of the cluster to manage and configure services, a set of services, or to manage and configure the complete cluster server itself (see super computing.) In some cases the cluster manager is mostly used to dispatch work for the cluster (or cloud) to perform. In this last case a subset of the cluster manager can be a remote desktop application that is used not for configuration but just to send work and get back work results from a cluster. In other cases the cluster is more related to availability and load balancing than to computational or specific service clusters.

Node - Ai

Network Administrator maintains computer infrastructures with emphasis on networking. Responsibilities may vary between organizations, but on-site servers, software-network interactions as well as network integrity/resilience are the key areas of focus.

Downstream Networking refers to data sent from a network service provider to a customer.

Upstream Networking refers to the direction in which data can be transferred from the client to the server (uploading).

Network Operating System is a specialized operating system for a network device such as a router, switch or firewall. Historically, the term also referred to an operating system oriented to computer networking, designed to allow shared file and printer access among multiple computers in a network and to enable the sharing of data, users, groups, security, applications, and other networking functions, typically over a local area network (LAN) or private network. This second sense is now largely historical, as common operating systems generally include such features.

Professional Services Networks are networks of independent firms who come together to cost-effectively provide services to clients through an organized framework.

Social Networks
Collaborations

Value Network Analysis is a methodology for understanding, using, visualizing, optimizing internal and external value networks and complex economic ecosystems. The methods include visualizing sets of relationships from a dynamic whole systems perspective. Robust network analysis approaches are used for understanding value conversion of financial and non-financial assets, such as intellectual capital, into other forms of value.

Value Network is a business analysis perspective that describes social and technical resources within and between businesses. The nodes in a value network represent people (or roles). The nodes are connected by interactions that represent tangible and intangible deliverables. These deliverables take the form of knowledge or other intangibles and/or financial value. Value networks exhibit interdependence. They account for the overall worth of products and services. Companies have both internal and external value networks.

Encapsulation Networking is a method of designing modular communication protocols in which logically separate functions in the network are abstracted from their underlying structures by inclusion or information hiding within higher level objects.

Dynamic Network Analysis is an emergent scientific field that brings together traditional social network analysis (SNA), link analysis (LA), social simulation and multi-agent systems (MAS) within network science and network theory.

Link Aggregation applies to various methods of combining (aggregating) multiple network connections in parallel in order to increase throughput beyond what a single connection could sustain, and to provide redundancy in case one of the links should fail. A Link Aggregation Group (LAG) combines a number of physical ports together to make a single high-bandwidth data path, so as to implement the traffic load sharing among the member ports in the group and to enhance the connection reliability.

Artificial Neural Network (ai)
Matrix (construct)
Cross Linking is a bond that links one polymer chain to another. They can be covalent bonds or ionic bonds.
Virtual Private Network
Internet
Internet Connection Types
Fiber Optics
Search Engines
Levels of Thinking
Information Technology

Network Topology is the arrangement of the various elements (links, nodes, etc.) of a computer network. Essentially, it is the topological structure of a network and may be depicted physically or logically.
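A minimal sketch of how a topology can be written down as data: the same four nodes arranged as a star and as a ring, with a helper that counts the links each node carries (node names are illustrative):

star = {"hub": ["A", "B", "C"], "A": ["hub"], "B": ["hub"], "C": ["hub"]}
ring = {"A": ["B", "D"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C", "A"]}

def degree(topology):
    # number of links attached to each node
    return {node: len(neighbors) for node, neighbors in topology.items()}

print(degree(star))   # the hub carries every link: {'hub': 3, 'A': 1, 'B': 1, 'C': 1}
print(degree(ring))   # every node carries exactly two links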

Routing is the process of selecting a path for traffic in a network, or between or across multiple networks. Routing is performed for many types of networks, including circuit-switched networks, such as the public switched telephone network (PSTN), computer networks, such as the Internet, as well as in networks used in public and private transportation, such as the system of streets, roads, and highways in national infrastructure.
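A minimal sketch of path selection: Dijkstra's algorithm, which many routing protocols build on, picks the lowest-cost path between two nodes (the link costs below are made up):

import heapq

links = {
    "A": {"B": 1, "C": 4},
    "B": {"A": 1, "C": 2, "D": 5},
    "C": {"A": 4, "B": 2, "D": 1},
    "D": {"B": 5, "C": 1},
}

def shortest_path(source, target):
    queue, seen = [(0, source, [source])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == target:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in links[node].items():
            if neighbor not in seen:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return None

print(shortest_path("A", "D"))   # (4, ['A', 'B', 'C', 'D'])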

Asymmetric Digital Subscriber Line is a type of digital subscriber line (DSL) technology, a data communications technology that enables faster data transmission over copper telephone lines than a conventional voiceband modem can provide. ADSL differs from the less common symmetric digital subscriber line (SDSL). In ADSL, bandwidth and bit rate are said to be asymmetric, meaning greater toward the customer premises (downstream) than the reverse (upstream). Providers usually market ADSL as a service for consumers for Internet access, primarily for downloading content from the Internet rather than serving content accessed by others. (ADSL).

Cellular Network is a communication network where the last link is wireless. The network is distributed over land areas called cells, each served by at least one fixed-location transceiver, known as a cell site or base station. This base station provides the cell with the network coverage which can be used for transmission of voice, data and others. A cell might use a different set of frequencies from neighboring cells, to avoid interference and provide guaranteed service quality within each cell.

Telephone is a telecommunications device that permits two or more users to conduct a conversation when they are too far apart to be heard directly. A Telephone converts sound, typically and most efficiently the human voice, into electronic signals suitable for transmission via cables or other transmission media over long distances, and replays such signals simultaneously in audible form to its user.

Tin Can Telephone (wiki)
Phone Network

Landline refers to a phone that uses a metal wire or fibre optic telephone line for transmission as distinguished from a mobile cellular line, which uses radio waves for transmission.

Ethernet is a family of computer networking technologies commonly used in local area networks (LAN), metropolitan area networks (MAN) and wide area networks (WAN).

HomePNA is an incorporated non-profit industry association of companies that develops and standardizes technology for home networking over the existing coaxial cables and telephone wiring within homes, so new wires do not need to be installed.

Communication Law is dedicated to the proposition that freedom of speech is relevant and essential to every aspect of the communication discipline.

Communications Act of 1934 was created for the purpose of regulating interstate and foreign commerce in communication by wire and radio so as to make available, so far as possible, to all the people of the United States a rapid, efficient, nationwide, and worldwide wire and radio communication service with adequate facilities at reasonable charges, for the purpose of the national defense, and for the purpose of securing a more effective execution of this policy by centralizing authority theretofore granted by law to several agencies and by granting additional authority with respect to interstate and foreign commerce in wire and radio communication. For these purposes the Act created a commission known as the 'Federal Communications Commission' to execute and enforce its provisions.

Telecommunications Policy is a framework of law directed by government and the Regulatory Commissions, most notably the Federal Communications Commission.

Communications Protocol is a system of rules that allow two or more entities of a communications system to transmit information via any kind of variation of a physical quantity. These are the rules or standard that defines the syntax, semantics and synchronization of communication and possible error recovery methods. Protocols may be implemented by hardware, software, or a combination of both.
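A minimal sketch of what a protocol pins down in practice: syntax (the frame layout), semantics (what the fields mean) and error recovery (a checksum). The 4-byte length plus 4-byte CRC32 layout here is an illustrative convention, not any standard protocol:

import struct
import zlib

def encode(payload: bytes) -> bytes:
    # frame = length field + CRC32 checksum + payload
    return struct.pack("!II", len(payload), zlib.crc32(payload)) + payload

def decode(frame: bytes) -> bytes:
    length, crc = struct.unpack("!II", frame[:8])
    payload = frame[8:8 + length]
    if zlib.crc32(payload) != crc:
        raise ValueError("checksum mismatch - ask the sender to retransmit")
    return payload

print(decode(encode(b"hello")))   # b'hello'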

Signal Corps develops, tests, provides, and manages communications and information systems support for the command and control of combined arms forces.

International Communications Law consists primarily of a number of bilateral and multilateral communications treaties.

Outline of Communication (pdf)

Information and Communications Technology is an extended term for information technology (IT) which stresses the role of unified communications and the integration of telecommunications (telephone lines and wireless signals), computers as well as necessary enterprise software, middleware, storage, and audio-visual systems, which enable users to access, store, transmit, and manipulate information. (ICT)

Unified Communications is a marketing buzzword describing the integration of real-time enterprise communication services such as instant messaging (chat), presence information, voice (including IP telephony), mobility features (including extension mobility and single number reach), audio, web & video conferencing, fixed-mobile convergence (FMC), desktop sharing, data sharing (including web connected electronic interactive whiteboards), call control and speech recognition with non-real-time communication services such as unified messaging (integrated voicemail, e-mail, SMS and fax). UC is not necessarily a single product, but a set of products that provides a consistent unified user interface and user experience across multiple devices and media types. In its broadest sense, UC can encompass all forms of communications that are exchanged via a network to include other forms of communications such as Internet Protocol Television (IPTV) and digital signage Communications as they become an integrated part of the network communications deployment and may be directed as one-to-one communications or broadcast communications from one to many. UC allows an individual to send a message on one medium and receive the same communication on another medium. For example, one can receive a voicemail message and choose to access it through e-mail or a cell phone. If the sender is online according to the presence information and currently accepts calls, the response can be sent immediately through text chat or a video call. Otherwise, it may be sent as a non-real-time message that can be accessed through a variety of media.

Communicating Knowledge



Super Computers


Supercomputer is a computer with a high level of computing performance compared to a general-purpose computer. Performance of a supercomputer is measured in floating-point operations per second (FLOPS) instead of million instructions per second (MIPS). As of 2015, there are supercomputers which can perform up to quadrillions of FLOPS.

Floating Point Operations Per Second or FLOPS, is a measure of computer performance, useful in fields of scientific computations that require floating-point calculations. For such cases it is a more accurate measure than measuring instructions per second.
Petascale Computing is one quadrillion Floating Point operations per second.
Exascale Computing is a billion billion calculations per second.
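For a sense of scale, a little unit arithmetic (a minimal sketch; the 1-gigaFLOPS desktop figure is just an illustrative baseline):

tera = 10 ** 12    # trillion
peta = 10 ** 15    # quadrillion: petascale = 1e15 floating-point operations per second
exa  = 10 ** 18    # a billion billion: exascale = 1e18 operations per second

# A 10-petaFLOPS machine does in one second what a 1-gigaFLOPS desktop
# would need about ten million seconds (roughly 116 days) to finish.
desktop = 10 ** 9
print(10 * peta / desktop)            # 10,000,000 seconds of desktop work per second
print(10 * peta / desktop / 86400)    # about 115.7 days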

Titan is an upgrade of Jaguar, a previous supercomputer at Oak Ridge, that uses graphics processing units (GPUs) in addition to conventional central processing units (CPUs). Titan is the first such hybrid to perform over 10 petaFLOPS.

K Computer, named after the Japanese word "kei" meaning 10 quadrillion, is a supercomputer manufactured by Fujitsu which is based on a distributed memory architecture with over 80,000 computer nodes.

Quantum Computer studies theoretical computation systems (quantum computers) that make direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers are different from binary digital electronic computers based on transistors. Whereas common digital computing requires that the data be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses quantum bits, which can be in superpositions of states. A quantum Turing machine is a theoretical model of such a computer, and is also known as the universal quantum computer. The field of quantum computing was initiated by the work of Paul Benioff and Yuri Manin in 1980, Richard Feynman in 1982, and David Deutsch in 1985. A quantum computer with spins as quantum bits was also formulated for use as a quantum space–time in 1968.

A transistor stores a single “bit” of information. If the transistor is “on,” it holds a 1, and if it’s “off,” it holds a 0.
A “Qubit” can store a zero and a one simultaneously.
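A minimal numpy sketch of that difference: a classical bit is either 0 or 1, while a qubit's state is a weighted combination (superposition) of both, which collapses to a single outcome when measured:

import numpy as np

zero = np.array([1, 0])                 # the |0> state
one  = np.array([0, 1])                 # the |1> state

qubit = (zero + one) / np.sqrt(2)       # an equal superposition of |0> and |1>
probabilities = np.abs(qubit) ** 2      # measurement picks one outcome at random
print(probabilities)                    # [0.5 0.5] -> reads as 0 or 1, each half the time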

Superposition Principle
Quantum Annealing
Qubit Oxford Quantum

The Tianhe-2, the most powerful supercomputer built to date, demands 24 megawatts of power, while the human brain runs on just 10 watts.

Biological Neuron-Based Computer Chips (wetchips)
Artificial Intelligence

TOP 500 list of the World’s Top Supercomputers

ASC Sequoia will have 1.6 petabytes of memory, 96 racks, 98,304 compute nodes, and 1.6 million cores. Though orders of magnitude more powerful than such predecessor systems as ASC Purple and BlueGene/L, Sequoia will be 160 times more power efficient than Purple and 17 times more than BlueGene/L. It is expected to be one of the most powerful supercomputers in the world, equivalent to the 6.7 billion people on earth using hand calculators and working together on a calculation 24 hours per day, 365 days a year, for 320 years, to do what Sequoia will do in one hour.
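A rough back-of-the-envelope check of that comparison (a sketch assuming Sequoia's roughly 20-petaFLOPS peak and one calculation per person per second; both figures land in the same ballpark):

people  = 6.7e9
seconds = 320 * 365 * 24 * 3600           # 320 years of round-the-clock work
by_hand = people * seconds                # about 6.8e19 hand calculations
sequoia = 20e15 * 3600                    # about 7.2e19 operations in one hour
print(f"{by_hand:.2e} vs {sequoia:.2e}")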

DARPA or Defense Advanced Research Projects Agency, is an agency of the U.S. Department of Defense responsible for the development of emerging technologies for use by the military. Darpa

IARPA or Intelligence Advanced Research Projects Activity, invests in high-risk, high-payoff research programs to tackle some of the most difficult challenges of the agencies and disciplines in the Intelligence Community (IC).

Institute for Computational Cosmology is a research institute whose aim is to advance fundamental knowledge in cosmology. Topics of active research include: the nature of dark matter and dark energy, the evolution of cosmic structure, the formation of galaxies, and the determination of fundamental parameters.

Fiber Optics



Virtual PC


Virtual Machine is an emulation of a computer system. Virtual machines are based on computer architectures and provide functionality of a physical computer. Their implementations may involve specialized hardware, software, or a combination. There are different kinds of virtual machines, each with different functions: System virtual machines (also termed full virtualization VMs) provide a substitute for a real machine. They provide functionality needed to execute entire operating systems. A hypervisor uses native execution to share and manage hardware, allowing for multiple environments which are isolated from one another, yet exist on the same physical machine. Modern hypervisors use hardware-assisted virtualization, virtualization-specific hardware, primarily from the host CPUs. Process virtual machines are designed to execute computer programs in a platform-independent environment. Some virtual machines, such as QEMU, are designed to also emulate different architectures and allow execution of software applications and operating systems written for another CPU or architecture. Operating-system-level virtualization allows the resources of a computer to be partitioned via the kernel's support for multiple isolated user space instances, which are usually called containers and may look and feel like real machines to the end users.

Virtual Desktop is a term used with respect to user interfaces, usually within the WIMP paradigm, to describe ways in which the virtual space of a computer's desktop environment is expanded beyond the physical limits of the screen's display area through the use of software. This compensates for a limited desktop area and can also be helpful in reducing clutter. There are two major approaches to expanding the virtual area of the screen. Switchable virtual desktops allow the user to make virtual copies of their desktop view-port and switch between them, with open windows existing on single virtual desktops. Another approach is to expand the size of a single virtual screen beyond the size of the physical viewing device. Typically, scrolling/panning a subsection of the virtual desktop into view is used to navigate an oversized virtual desktop.
Desktops v2.0 allows you to organize your applications on up to four virtual desktops.

Hardware Virtualization is the virtualization of computers as complete hardware platforms, certain logical abstractions of their componentry, or only the functionality required to run various operating systems. Virtualization hides the physical characteristics of a computing platform from the users, presenting instead another abstract computing platform. At its origins, the software that controlled virtualization was called a "control program", but the terms "hypervisor" or "virtual machine monitor" became preferred over time.

Virtualization Software specifically emulators and hypervisors, are software packages that emulate the whole physical computer machine, often providing multiple virtual machines on one physical platform. The table below compares basic information about platform virtualization hypervisors.

Hypervisor is computer software, firmware, or hardware, that creates and runs virtual machines. A computer on which a hypervisor runs one or more virtual machines is called a host machine, and each virtual machine is called a guest machine. The hypervisor presents the guest operating systems with a virtual operating platform and manages the execution of the guest operating systems. Multiple instances of a variety of operating systems may share the virtualized hardware resources: for example, Linux, Windows, and OS X instances can all run on a single physical x86 machine. This contrasts with operating-system-level virtualization, where all instances (usually called containers) must share a single kernel, though the guest operating systems can differ in user space, such as different Linux distributions with the same kernel.

Virtual Box

Sandbox is a security mechanism for separating running programs. It is often used to execute untested or untrusted programs or code, possibly from unverified or untrusted third parties, suppliers, users or websites, without risking harm to the host machine or operating system. A sandbox typically provides a tightly controlled set of resources for guest programs to run in, such as scratch space on disk and memory. Network access, the ability to inspect the host system or read from input devices are usually disallowed or heavily restricted. In the sense of providing a highly controlled environment, sandboxes may be seen as a specific example of virtualization. Sandboxing is frequently used to test unverified programs that may contain a virus or other malicious code, without allowing the software to harm the host device.
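One ingredient of sandboxing, sketched for Unix-like systems: run an untrusted program in a child process with a hard cap on the CPU time it may consume (the script name is hypothetical, and a real sandbox would also restrict memory, files and network access):

import resource
import subprocess
import sys

def limit_resources():
    # applied inside the child process, before the untrusted code starts
    resource.setrlimit(resource.RLIMIT_CPU, (1, 1))   # at most 1 second of CPU time

subprocess.run(
    [sys.executable, "untrusted_script.py"],   # hypothetical untrusted program
    preexec_fn=limit_resources,
    timeout=5,                                 # also kill it after 5 wall-clock seconds
)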

Operating System Sandbox: Virtual PC (youtube)
VM Ware

Hyper-V formerly known as Windows Server Virtualization, is a native hypervisor; it can create virtual machines on x86-64 systems running Windows. Starting with Windows 8, Hyper-V supersedes Windows Virtual PC as the hardware virtualization component of the client editions of Windows NT. A server computer running Hyper-V can be configured to expose individual virtual machines to one or more networks.

Parallels

Virtual Private Server is a virtual machine sold as a service by an Internet hosting service. A VPS runs its own copy of an operating system, and customers may have superuser-level access to that operating system instance, so they can install almost any software that runs on that OS. For many purposes they are functionally equivalent to a dedicated physical server, and being software-defined, are able to be much more easily created and configured. They are priced much lower than an equivalent physical server. However, as they share the underlying physical hardware with other VPSs, performance may be lower, depending on the workload of any other executing virtual machines. 

Virtual Private Network enables users to send and receive data across shared or public networks as if their computing devices were directly connected to the private network. Applications running across the VPN may therefore benefit from the functionality, security, and management of the private network.

Artificial Neural Network
Safe Internet Use

Dedicated Hosting Service is a type of Internet hosting in which the client leases an entire server not shared with anyone else. This is more flexible than shared hosting, as organizations have full control over the server(s), including choice of operating system, hardware, etc. There is also another level of dedicated or managed hosting commonly referred to as complex managed hosting. Complex Managed Hosting applies to both physical dedicated servers, Hybrid server and virtual servers, with many companies choosing a hybrid (combination of physical and virtual) hosting solution.

Virtualization refers to the act of creating a Virtual (rather than actual) version of something, including virtual computer hardware platforms, storage devices, and computer network resources.

Windows Virtual PC is a virtualization program for Microsoft Windows. In July 2006 Microsoft released the Windows version as a free product.

Virtual PC
Virtual Reality


Remote PC to PC


Teaching via Video Conference - Remote IT Services

Remote PC to PC Services
Log Me In 
Team Viewer
Go to Assist
Pogo Plug
Dyn DNS
Tight VNC
Web Conferencing

Tutoring

Operating Systems


Operating System is system software that manages computer hardware and software resources and provides common services for computer programs. All computer programs, excluding firmware, require an operating system to function.
Time-sharing operating systems schedule tasks for efficient use of the system and may also include accounting software for cost allocation of processor time, mass storage, printing, and other resources.
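A minimal sketch of the time-sharing idea: the scheduler hands each task a short slice of processor time in turn, so every task keeps making progress (the task names and times are illustrative):

from collections import deque

def round_robin(tasks, quantum=2):
    queue = deque(tasks.items())                    # task name -> time units still needed
    while queue:
        name, remaining = queue.popleft()
        ran = min(quantum, remaining)
        print(f"run {name} for {ran} time units")
        if remaining - ran > 0:
            queue.append((name, remaining - ran))   # not finished: back of the line

round_robin({"editor": 3, "print spooler": 5, "compiler": 4})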

Timeline of Operating Systems (wiki)
History of Operating Systems (wiki) 

OS Types
OS Operating Systems (wiki)
Android Operating System (wiki)
Red Hat Linux
Linux (wiki)
GNU
Ubuntu
Server OS
How to Dual Boot Linux on your PC
CloudReady lightweight operating system
Backup Operating System
Substitute Alternate Operating Systems

Human Operating System (HOS)

Server Operating System
A server operating system, also called a server OS, is an Operating System specifically designed to run on servers, which are specialized computers that operate within a client/server architecture to serve the requests of client computers on the network.
The server operating system, or server OS, is the software layer on top of which other software programs, or applications, can run on the server hardware. Server operating systems help enable and facilitate typical server roles such as Web server, mail server, file server, database server, application server and print server.
Popular server operating systems include Windows Server, Mac OS X Server, and variants of Linux such as Red Hat Enterprise Linux (RHEL) and SUSE Linux Enterprise Server. The server edition of Ubuntu Linux is free.


Open Source


Open-source Software is computer software with its source code made available with a license in which the copyright holder provides the rights to study, change, and distribute the software to anyone and for any purpose. Open-source software may be developed in a collaborative public manner. According to scientists who studied it, open-source software is a prominent example of open collaboration.

Open Source is a decentralized development model that encourages open collaboration. A main principle of open-source software development is peer production, with products such as source code, blueprints, and documentation freely available to the public. The open-source movement in software began as a response to the limitations of proprietary code. The model is used for projects such as in open-source appropriate technologies, and open-source drug discovery.

Business Software Tools and Apps
Open Source Software
Open Source Education
Open Source Initiative 
Open Source

Asterisk open source framework for building communications applications
Alfresco software built on open standards

Open-Source Electronics
Arduino
Raspberry Pi
Massimo Banzi (video)
Arduino 3D Printer
Science Kits
Open Source Hardware
Freeware Files

Computer Rentals


Rent Solutions
Vernon Computer Source
Smart Source Rentals
Google Chromebook

Miles Technologies Technology Solutions

Maximum PC
Knowledge Management
Artificial Intelligence
Science
Ideas
Innovation

Word Processors


Open Office Suite
Libre Office
Abi Source
Word Processors List (PDF)
Google Docs
Google Business Tools and Apps
Zoho
Photo Editing Software

Scraper Wiki getting data from the web, spreadsheets, PDFs.

Comet Docs Convert, Store and Share your documents.

Computer Courses


W3 Schools
Webmaster Tutorials
Technology Terms
Creator Academy by Google
J Learning
Lynda
Compucert
Learning Tree
IT Training
Building a Search Engine

More Online Schools

Learn to Code


Apps


Application Program, or app for short, is a computer program designed to perform a group of coordinated functions, tasks, or activities for the benefit of the user.

Application Software is a computer program designed to perform a group of coordinated functions, tasks, or activities for the benefit of the user. Examples of an application include a word processor, a spreadsheet, an accounting application, a web browser, a media player, an aeronautical flight simulator, a console game or a photo editor. The collective noun application software refers to all applications collectively. This contrasts with system software, which is mainly involved with running the computer. Applications may be bundled with the computer and its system software or published separately, and may be coded as proprietary, open-source or university projects. Apps built for mobile platforms are called mobile apps.

Authoring System is a program that has pre-programmed elements for the development of interactive multimedia software titles. Authoring systems can be defined as software that allows its user to create multimedia applications for manipulating multimedia objects.

Application Performance Management is the monitoring and management of performance and availability of software applications. APM strives to detect and diagnose complex application performance problems to maintain an expected level of service. APM is "the translation of IT metrics into business meaning ([i.e.] value)."

User Testing from concept to launch, User Testing provides actionable insights enabling you to create great experiences.

Validately recruit testers, launch tests, and analyze results.

Lookback designer & research.

Prototypes (engineering)

Applications Interface

Create Mobile Apps

Phone Gap
Como
Sweb Apps
App Breeder
My App Builder
I Build App
Mobile Roadie
Yapp
App Makr 
Best App Makers
Build Your Own Business Apps in 3 Minutes
Gigster building your app
Google Developer Apps
Thing Space Verizon
App Management Interface
AppCenter: The Pay-What-You-Want App Store
Health Medical Apps
Apps from Amazon
Car Finder App
Visual Travel Tours
Audio Travel
Gate Guru App
App Brain
Trip It
Field Tripper App
Test Flight App
App Shopper
Red Laser
Portable Apps
I-nigma Bar Code Reader
More Apps
M-Pesa
Language Translators
Wikitude
Yellow Pages App
Portable Apps
What's App
Apps for Plant Lovers
Press Pad App for Digital Magazines and Publishers
Tech Fetch
Travel Tools
Cell Phones & Tools
Next Juggernaut
Rethink DB
Big in Japan
Near by Now
The Find
Milo
Apple
X Code
Quixey
Just in Mind

HyperCard is application software and a programming tool for Apple Macintosh and Apple IIGS computers. It is among the first successful hypermedia systems before the World Wide Web. It combines database abilities with a graphical, flexible, user-modifiable interface. HyperCard also features HyperTalk, a programming language for manipulating data and the user interface.

Enable Cognitive Computing Features In Your App Using IBM Watson's Language, Vision, Speech and Data APIs


Computer Maintenance


Computer Hope
How to Geek
Stack Overflow
PC Mag
Data Doctors
Repairs 4 Laptop
Maintain Your Computer (wiki how)
PC User
Maintain PC (ehow)
Open Source Ultra Defrager
Data Recovery Software
Dmoz Computers Websites
Inter-Hacktive

Hackerspace
Technology Tools
Math
Games
Information Management
Computer History
Laptops for Learning
Flash Drive Knowledge
Engineering Design
Technology News

Self-Help Computer Resources
Thanks to the millions of people sharing their knowledge and experiences online, you can pretty much learn anything you want on your own.  So over the years I have collected some great resources that come in handy. Sharing is awesome!
Information Sources

Surfing the Internet Tips
First, a Little Warning: When visiting other websites, be very careful what you click on, because some software downloads are very dangerous to your computer, so be absolutely sure of what you are downloading. Read the ".exe" file name, and search the internet to learn more about it or to verify the '.exe' executable file. It's a good idea to always get a second opinion on what software you might need.

Free Virus Protection
Internet Browsers
Internet Safety Info
Internet Connections

Computer Quick Fix Tips
Make sure that your Computer's System Restore is turned on. This can sometimes be used to fix a bad computer virus or malfunction. It's a good idea to run a System Restore and a Virus Scan in Safe Mode (during the computer restart, hit the F8 key and then follow the instructions; F2 is Setup and F12 is the Boot Menu). Warning: the System Restore found under Start/Programs/Accessories/System Tools is not the same as PC Restore, Factory Settings or Image Restore, which will delete all your personal files and software from the PC. If you don't have the OS Setup Disk that came with your PC, then sometimes the PC will have a Factory Settings copy installed. This you need to run while your PC is rebooting: press 'Ctrl', then press F11, and then release both at the same time. You should see something like Symantec Ghost, where you will be prompted to reinstall Factory Settings. This will delete all your personal files and software from the PC, so please back up first.

Always Have your Operating System Restore Disk or Recovery Disc handy because not all computer problems can be fixed. You also need your Drivers and Applications Disks too. Always backup your most important computer files because reinstalling the operating system will clear your data.

Kingston DataTraveler 200 - 128 GB USB 2.0 Flash Drive DT200/128GB (Black)
Western Digital 2 TB USB 2.0 Desktop External Hard Drive

Sending Large Files
Bit Torrent Protocol (wiki)
Lime Wire P2P
Send Large Files
Zip Files
Stuffit File Compression
File-Zilla
Dropbox
File Sharing for Professionals
We Transfer

You can try some of these free programs to help keep your computer safe: (might be outdated)
Lava Soft Ad-Ware
Spybot Search & Destroy
CCleaner
Malwarebytes
Hijack This
Spyware Blaster

Download.com has the software above but be very careful not to click on the wrong download item.
Please verify the correct ".exe" file name.

Free Software?
As the saying goes, "Nothing is Free." Free software sometimes comes loaded with other software programs that you don't need. So always check or uncheck the appropriate boxes, and read everything carefully. But even then, they might sneak unwanted programs by you, so you will have to remove those programs manually. With the internet, dangers are always lurking around the corner, so please be careful, be aware and educate yourself. When our computer systems and the internet are running smoothly, the beauty of this machine becomes very evident. This is the largest collaboration of people in human history. With so many contributors from all over the world, we now have more knowledge and information at our fingertips than ever before; our potential is limitless.

Free Software Info
Free Software Foundation
General Public License
Free BSD
Jolla
Hadoop Apache
Open Mind
Software Geek

New Computers
Sadly, new PCs are loaded with a lot of bogus software and programs that you don't need. Removing them can be a challenge, but it's absolutely necessary if you want your PC to run smoothly without all those annoying distractions that slow your PC down.

Lojack For Laptops (amazon)

Tired and disgusted with Windows 8's dysfunctional operating system interface? Download Classic Shell to make your computer work like XP and find things again, or you can just update your Windows 8.0 to Windows 8.1, because 8.1 is definitely better than 8.0, but still not perfect yet.

Oasis Websoft advances software by providing superior solutions for web applications, web sites and enterprise software, and is committed to building infrastructure that will ensure that the West African sub-region is not left behind in the continuous evolution of information technology.

FAWN is a fast, scalable, and energy-efficient cluster architecture for data-intensive computing.

BlueStacks is currently the best way to run Android apps on Windows. It doesn’t replace your entire operating system. Instead, it runs Android apps within a window on your Windows desktop. This allows you to use Android apps just like any other program.

Utility Software is system software designed to help analyze, configure, optimize or maintain a computer. Utility software, along with operating system software, is a type of system software used to support the computer infrastructure, distinguishing it from application software which is aimed at directly performing tasks that benefit ordinary users.

Service-Oriented Architecture is an architectural pattern in computer software design in which application components provide services to other components via a communications protocol, typically over a network. The principles of service-orientation are independent of any vendor, product or technology. A service is a self-contained unit of functionality, such as retrieving an online bank statement. By that definition, a service is an operation that may be discretely invoked. However, in the Web Services Description Language (WSDL), a service is an interface definition that may list several discrete services/operations. And elsewhere, the term service is used for a component that is encapsulated behind an interface. This widespread ambiguity is reflected in what follows. Services can be combined to provide the functionality of a large software application. SOA makes it easier for software components on computers connected over a network to cooperate. Every computer can run any number of services, and each service is built in a way that ensures that the service can exchange information with any other service in the network without human interaction and without the need to make changes to the underlying program itself. A paradigm for organizing and utilizing distributed capabilities that may be under the control of different ownership domains. It provides a uniform means to offer, discover, interact with and use capabilities to produce desired effects consistent with measurable preconditions and expectations.
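A minimal sketch of the service-orientation idea: each service is a self-contained unit behind a small interface, and services are combined to build an application. In practice the calls would travel over a network protocol such as HTTP; plain Python calls stand in for that here, and the service names and data are illustrative:

class StatementService:
    # a self-contained unit of functionality, such as retrieving a bank statement
    def get_statement(self, account_id):
        return {"account": account_id, "balance": 125.50}

class NotificationService:
    # another independent service, combined with the first to form an application
    def notify(self, account_id, message):
        print(f"to {account_id}: {message}")

def monthly_statement_app(account_id, statements, notifications):
    statement = statements.get_statement(account_id)     # one discretely invoked operation
    notifications.notify(account_id, f"Your balance is {statement['balance']}")

monthly_statement_app("acct-42", StatementService(), NotificationService())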



The First Integrated Circuit  (September 12th, 1958)

The First Microchip Handmade in 1958 by Jack Kilby
And now almost 60 years later...

Integrated Circuit is a set of electronic circuits on one small flat piece (or "chip") of semiconductor material, normally silicon. The integration of large numbers of tiny transistors into a small chip resulted in circuits that are orders of magnitude smaller, cheaper, and faster than those constructed of discrete electronic components.

Integrated Circuit Layout is the representation of an integrated circuit in terms of planar geometric shapes which correspond to the patterns of metal, oxide, or semiconductor layers that make up the components of the integrated circuit. When using a standard process—where the interaction of the many chemical, thermal, and photographic variables is known and carefully controlled—the behaviour of the final integrated circuit depends largely on the positions and interconnections of the geometric shapes. Using a computer-aided layout tool, the layout engineer—or layout technician—places and connects all of the components that make up the chip such that they meet certain criteria—typically: performance, size, density, and manufacturability. This practice is often subdivided between two primary layout disciplines: Analog and digital. The generated layout must pass a series of checks in a process known as physical verification. The most common checks in this verification process are design rule checking (DRC), layout versus schematic (LVS), parasitic extraction, antenna rule checking, and electrical rule checking (ERC). When all verification is complete, the data is translated into an industry-standard format, typically GDSII, and sent to a semiconductor foundry. The process of sending this data to the foundry is called tapeout because the data used to be shipped out on a magnetic tape. The foundry converts the data into another format and uses it to generate the photomasks used in a photolithographic process of semiconductor device fabrication. In the earlier, simpler, days of IC design, layout was done by hand using opaque tapes and films, much like the early days of printed circuit board (PCB) design. Modern IC layout is done with the aid of IC layout editor software, mostly automatically using EDA tools, including place and route tools or schematic-driven layout tools. The manual operation of choosing and positioning the geometric shapes is informally known as "polygon pushing".

World's first 1,000-Processor Chip: a microchip containing 1,000 independent programmable processors has been designed. The energy-efficient 'KiloCore' chip has a maximum computation rate of 1.78 trillion instructions per second and contains 621 million transistors. It is the highest clock-rate processor ever designed.

Atomically Thin Transistors that are Two-Dimensional
Berkeley Lab-led research breaks a major barrier with the Smallest Transistor Ever by creating a gate only 1 nanometer long; high-end transistors now on the market have 20-nanometer gates.
Molybdenum Disulfide

Chip-sized, high-speed terahertz modulator raises possibility of faster data transmission

Computers Made of Genetic Material? HZDR researchers conduct electricity using DNA-based nanowires.
Semiconductor-free microelectronics are now possible, thanks to metamaterials
Metamaterial is a material engineered to have a property that is not found in nature.
Semiconductor-free microelectronics (youtube)

2D materials could make devices faster, smaller, and more efficient; these nanomaterials are only a few atoms in thickness.
Polaritons in layered two-dimensional materials

Researchers pave the way for Ionotronic Nanodevices. Discovery helps develop new kinds of electrically switchable memories. Ionotronic devices rely on charge effects based on ions, instead of electrons or in addition to electrons.

Carbon Nanotube Transistors Outperform Silicon for the first time.

Engineers use Graphene as a “copy machine” to produce cheaper Semiconductor Wafers. In 2016, annual global semiconductor sales reached their highest-ever point, at $339 billion worldwide. In that same year, the semiconductor industry spent about $7.2 billion worldwide on wafers that serve as the substrates for microelectronics components, which can be turned into transistors, light-emitting diodes, and other electronic and photonic devices. MIT engineers may vastly reduce the overall cost of wafer technology and enable devices made from more exotic, higher-performing semiconductor materials than conventional silicon. The technique uses graphene -- single-atom-thin sheets of graphite -- as a sort of "copy machine" to transfer intricate crystalline patterns from an underlying semiconductor wafer to a top layer of identical material.

Neuromorphic Engineering also known as neuromorphic computing, is a concept describing the use of very-large-scale integration (VLSI) systems containing electronic analog circuits to mimic neuro-biological architectures present in the nervous system. Very-Large-Scale Integration is the current level of computer microchip miniaturization and refers to microchips containing hundreds of thousands of transistors. LSI (large-scale integration) meant microchips containing thousands of transistors. Earlier, MSI (medium-scale integration) meant a microchip containing hundreds of transistors, and SSI (small-scale integration) meant transistors in the tens.

Reconfigurable Chaos-Based Microchips Offer Possible Solution to Moore’s Law. Nonlinear, chaos-based integrated circuits that enable computer chips to perform multiple functions with fewer transistors. The transistor circuit can be programmed to implement different instructions by morphing between different operations and functions. The potential of 100 morphable nonlinear chaos-based circuits doing work equivalent to 100 thousand circuits, or of 100 million transistors doing work equivalent to three billion transistors holds promise for extending Moore’s law.

Redox-Based Resistive Switching Random Access Memory (ReRAM)
A team of international scientists has found a way to make memory chips perform computing tasks, which is traditionally done by computer processors like those made by Intel and Qualcomm. Currently, all computer processors on the market use the binary system, which is composed of two states -- either 0 or 1. For example, the letter A will be processed and stored as 01000001, an 8-bit character. However, the prototype ReRAM circuit built by Asst Prof Chattopadhyay and his collaborators processes data in four states instead of two. For example, it can store and process data as 0, 1, 2, or 3 (a base-4, or quaternary, number system). Because ReRAM uses different electrical resistance to store information, it could be possible to store the data in an even higher number of states, speeding up computing tasks beyond current limitations. In current computer systems, all information has to be translated into a string of zeros and ones before it can be processed.
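A small illustration of that point (a sketch; the letter 'A' is just an example): the same value written with eight two-state cells versus four four-state cells, which is why a memory element with more states can hold the same data in fewer cells:

def to_base4(n):
    digits = []
    while n:
        n, d = divmod(n, 4)
        digits.append(str(d))
    return "".join(reversed(digits)) or "0"

value = ord("A")                 # 65
print(format(value, "08b"))      # '01000001' - eight binary (two-state) cells
print(to_base4(value))           # '1001'     - four base-4 (four-state) cells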

Parallel Computing: 18-core credit card sized computer

Memristor or memory resistor, is a hypothetical non-linear passive two-terminal electrical component relating electric charge and magnetic flux linkage. According to the characterizing mathematical relations, the memristor would hypothetically operate in the following way: The memristor's electrical resistance is not constant but depends on the history of current that had previously flowed through the device, i.e., its present resistance depends on how much electric charge has flowed in what direction through it in the past; the device remembers its history — the so-called non-volatility property. When the electric power supply is turned off, the memristor remembers its most recent resistance until it is turned on again.

Illinois team advances GaN-on-Silicon technology towards scalable high electron mobility transistors

Small tilt in Magnets makes them viable Memory Chips - Nano Technology

T-rays will “speed up” computer memory by a factor of 1,000

Germanium Tin Laser Could Increase Processing Speed of Computer Chips

Silicon Photonics is the study and application of photonic systems which use silicon as an optical medium. The silicon is usually patterned with sub-micrometre precision, into microphotonic components. These operate in the infrared, most commonly at the 1.55 micrometre wavelength used by most fiber optic telecommunication systems. The silicon typically lies on top of a layer of silica in what (by analogy with a similar construction in microelectronics) is known as silicon on insulator (SOI).

Silicon Carbide is a compound of silicon and carbon with chemical formula SiC. It occurs in nature as the extremely rare mineral moissanite. Synthetic silicon carbide powder has been mass-produced since 1893 for use as an abrasive. Grains of silicon carbide can be bonded together by sintering to form very hard ceramics that are widely used in applications requiring high endurance, such as car brakes, car clutches and ceramic plates in bulletproof vests. Electronic applications of silicon carbide such as light-emitting diodes (LEDs) and detectors in early radios were first demonstrated around 1907. SiC is used in semiconductor electronics devices that operate at high temperatures or high voltages, or both. Large single crystals of silicon carbide can be grown by the Lely method; they can be cut into gems known as synthetic moissanite. Silicon carbide with high surface area can be produced from SiO2 contained in plant material.

ORNL Researchers Break Data Transfer Efficiency Record: the transfer of information via superdense coding, a process by which the properties of particles like photons, protons and electrons are used to store as much information as possible.

Superdense Coding is a technique used to send two bits of classical information using only one qubit, which is a unit of quantum information.
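A minimal numpy sketch of the idea: Alice and Bob share an entangled pair, Alice encodes two classical bits by acting only on her own qubit, and Bob recovers both bits with a single Bell measurement after receiving it (a textbook-style illustration, not the ORNL experiment):

import numpy as np

zero = np.array([1, 0], dtype=complex)
one  = np.array([0, 1], dtype=complex)

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# The shared entangled pair: |Phi+> = (|00> + |11>) / sqrt(2)
phi_plus = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Alice's local operation for each pair of classical bits
encodings = {(0, 0): I, (0, 1): X, (1, 0): Z, (1, 1): X @ Z}

# The four Bell states Bob distinguishes in his measurement
bell = {
    (0, 0): (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2),
    (0, 1): (np.kron(zero, one) + np.kron(one, zero)) / np.sqrt(2),
    (1, 0): (np.kron(zero, zero) - np.kron(one, one)) / np.sqrt(2),
    (1, 1): (np.kron(zero, one) - np.kron(one, zero)) / np.sqrt(2),
}

def send(bits):
    state = np.kron(encodings[bits], I) @ phi_plus            # Alice acts on her qubit only
    overlaps = {b: abs(np.vdot(v, state)) ** 2 for b, v in bell.items()}
    return max(overlaps, key=overlaps.get)                    # Bob's Bell measurement

for bits in encodings:
    print(bits, "->", send(bits))                             # every bit pair comes back intact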

Quantum Computing makes direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers are different from binary digital electronic computers based on transistors. Whereas common digital computing requires that the data are encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses quantum bits, which can be in superpositions of states.

A Single Atom can store one bit of binary information
When the holmium atoms were placed on a special surface made of magnesium oxide, they naturally oriented themselves with a magnetic north and south pole, just like regular magnets have, pointing either straight up or down, and remained that way in a stable condition. What's more, the researchers could make the atoms flip by giving them a zap with a scanning tunneling microscope that has a needle with a tip just one atom wide. The orientation conveys binary information, either a one or a zero. The experiment shows that they could store one bit of information in just one atom. If this kind of technology could be scaled up, it theoretically could hold 80,000 gigabytes of information in just a square inch. A credit-card-size device could hold 35 million songs. Atoms could be placed within just about a nanometer of each other without interfering with their neighbors, meaning they could be packed densely. This tech won't show up in your smartphone anytime soon. For starters, the experiment required a very, very chilly temperature: 1 degree Kelvin, which is colder than -450 Fahrenheit. That's pretty energy intensive, and not exactly practical in most data storage settings.
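A back-of-the-envelope check of that 80,000-gigabyte figure (a sketch assuming one bit per atom and atoms spaced about one nanometer apart):

atoms_per_inch = 2.54e7                  # 1 inch = 2.54 cm = 25,400,000 nanometers
bits = atoms_per_inch ** 2               # about 6.5e14 one-bit atoms per square inch
gigabytes = bits / 8 / 1e9
print(f"{gigabytes:,.0f} GB")            # roughly 80,000 GB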


Computer Chip Close-up Macro Photo

The moment you turn on your PC, what you see is the work of thousands and thousands of people, educated in the fields of engineering, science, math and physics, just to name a few. And that's just the software. The hardware also took the work of thousands of skilled people, covering many different industries, which adds the work of thousands more people. I'm also a product that took millions of people over thousands of years to make, just to get me here in this moment in time.

Computer Industry is the range of businesses involved in designing computer hardware and computer networking infrastructures, developing computer software, manufacturing computer components, and providing information technology (IT) services. Software Industry includes businesses for development, maintenance and publication of software that are using different business models, also includes software services, such as training, documentation, consulting and data recovery.



The First Computer

Antikythera Mechanism is an ancient analog computer, about 2,100 years old. An international team of scientists has now read about 3,500 characters of explanatory text -- a quarter of the original -- in the innards of its remains.

Antikythera Mechanism

Analog Computer is a form of computer that uses the continuously changeable aspects of physical phenomena, such as electrical, mechanical, or hydraulic quantities, to model the problem being solved. Digital computers represent varying quantities symbolically, as their numerical values change. As an analog computer does not use discrete values, but rather continuous values, processes cannot be reliably repeated with exact equivalence, as they can with Turing machines. Unlike digital signal processing, analog computers do not suffer from quantization noise, but are limited by analog noise.

Turing Machine is an abstract machine that manipulates symbols on a strip of tape according to a table of rules; to be more exact, it is a mathematical model of computation that defines such a device. Despite the model's simplicity, given any computer algorithm, a Turing machine can be constructed that is capable of simulating that algorithm's logic.
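A minimal sketch of the idea in code: a tiny table of rules, a tape, and a head, here wired up to increment a binary number (the machine and its rules are illustrative toys, not any historical design):

def run(rules, tape, state="start", head=0, halt="halt", max_steps=1000):
    cells = dict(enumerate(tape))                   # the tape; blank cells read as '_'
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells.get(head, "_")
        write, move, state = rules[(state, symbol)] # look up the rule for (state, symbol)
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, "_") for i in range(lo, hi + 1)).strip("_")

rules = {
    ("start", "0"): ("0", "R", "start"),   # scan right to the end of the number
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),   # step back onto the last digit
    ("carry", "1"): ("0", "L", "carry"),   # 1 plus a carry becomes 0; carry moves left
    ("carry", "0"): ("1", "L", "halt"),    # 0 plus a carry becomes 1; done
    ("carry", "_"): ("1", "L", "halt"),    # the carry ran past the leftmost digit
}

print(run(rules, "1011"))   # prints 1100, which is 1011 + 1 in binary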

Curta is a small mechanical calculator developed by Curt Herzstark in the 1930s in Vienna, Austria. By 1938, he had filed a key patent, covering his complemented stepped drum, Deutsches Reichspatent (German National Patent) No. 747073. This single drum replaced the multiple drums, typically around 10 or so, of contemporary calculators, and it enabled not only addition, but subtraction through nines complement math, essentially subtracting by adding. The nines' complement math breakthrough eliminated the significant mechanical complexity created when "borrowing" during subtraction. This drum would prove to be the key to the small, hand-held mechanical calculator the Curta would become. Curtas were considered the best portable calculators available until they were displaced by electronic calculators in the 1970s.
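A small worked example of that "subtracting by adding" trick (the numbers and the three-digit width are illustrative, and this sketch assumes the first number is the larger one):

def nines_complement(n, digits):
    return int("".join(str(9 - int(d)) for d in str(n).zfill(digits)))

def subtract_by_adding(a, b, digits=3):
    total = a + nines_complement(b, digits)      # add the complement instead of subtracting
    carry, rest = divmod(total, 10 ** digits)    # split off the overflow digit
    return rest + carry                          # the end-around carry finishes the job

print(subtract_by_adding(725, 341))   # 384, the same as 725 - 341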

Abstract Machine also called an abstract computer, is a theoretical model of a computer hardware or software system used in automata theory. Abstraction of computing processes is used in both the computer science and computer engineering disciplines and usually assumes a discrete time paradigm.
Autonomous Machines

Computer Programming in the Punched Card Era: from the invention of computer programming languages up to the mid-1980s, many if not most computer programmers created, edited and stored their programs line by line on punched cards. The practice was nearly universal with IBM computers in the era. A punched card is a flexible write-once medium that encodes data, most commonly 80 characters. Groups or "decks" of cards form programs and collections of data. Users could create cards using a desk-sized keypunch with a typewriter-like keyboard. A typing error generally necessitated repunching an entire card. In some companies, programmers wrote information on special forms called coding sheets, taking care to distinguish the digit zero from the letter O, the digit one from the letter I, eight from B, two from Z, and so on. These forms were then converted to cards by keypunch operators, and in some cases, checked by verifiers. The editing of programs was facilitated by reorganizing the cards, and removing or replacing the lines that had changed; programs were backed up by duplicating the deck, or writing it to magnetic tape.

Keypunch is a device for precisely punching holes into stiff paper cards at specific locations as determined by keys struck by a human operator.

Punched Card is a piece of stiff paper that can be used to contain digital information represented by the presence or absence of holes in predefined positions. The information might be data for data processing applications or, in earlier examples, used to directly control automated machinery. The terms IBM card, or Hollerith card specifically refer to punched cards used in semiautomatic data processing. Punched cards were widely used through much of the 20th century in what became known as the data processing industry, where specialized and increasingly complex unit record machines, organized into data processing systems, used punched cards for data input, output, and storage. Many early digital computers used punched cards, often prepared using keypunch machines, as the primary medium for input of both computer programs and data. While punched cards are now obsolete as a recording medium, as of 2012, some voting machines still use punched cards to record votes.

The First Apple Computer (1976)


Monochrome Monitor or Green screen was the common name for a monochrome monitor using a green "P1" phosphor screen. CRT computer monitor which was very common in the early days of computing, from the 1960s through the 1980s, before color monitors became popular. Monochrome monitors have only one color of phosphor (mono means "one", and chrome means "color"). Pixel for pixel, monochrome monitors produce sharper text and images than color CRT monitors. This is because a monochrome monitor is made up of a continuous coating of phosphor and the sharpness can be controlled by focusing the electron beam; whereas on a color monitor, each pixel is made up of three phosphor dots (one red, one blue, one green) separated by a mask. Monochrome monitors were used in almost all dumb terminals and are still widely used in text-based applications such as computerized cash registers and point of sale systems because of their superior sharpness and enhanced readability.

In 1983, Compaq came out with a portable computer that sold over 50,000 units in its first year, while IBM sold 750,000 PCs that same year. In a few short years, Compaq became a billion dollar company. IBM tried to make Intel sell a new chip only to them, but Intel refused so they could sell their chips to more companies. The Intel 80386, a 32-bit microprocessor introduced in 1985, had 275,000 transistors. In May 2006, Intel announced that 80386 production would stop at the end of September 2007.
Embedded System is a computer system with a dedicated function within a larger mechanical or electrical system, often with real-time computing constraints. It is embedded as part of a complete device often including hardware and mechanical parts. Embedded systems control many devices in common use today. Ninety-eight percent of all microprocessors are manufactured as components of embedded systems.
Then the Extended Industry Standard Architecture was announced in September 1988 by a consortium of PC clone vendors (the "Gang of Nine") as a counter to IBM's use of its proprietary Micro Channel architecture (MCA) in its PS/2 series and to end IBM's monopoly. But that only gave rise to Microsoft's monopoly. Exclusive Right
But by 1991, things got worse for Compaq. Other computer companies came into the market, and IBM's patent trolls attacked Compaq. In 2002, Compaq merged with Hewlett-Packard.


Organic computers are coming: scientists have found a molecule that will help to make organic electronic devices.

Radialene are alicyclic organic compounds containing n cross-conjugated exocyclic double bonds.

World's Smallest Computer: Michigan Micro Mote (M3)

History's Greatest Inventions

Computer History Chart

Computer History Films



The Thinker Man