What is a computer? Mobile computers, tablet computers, smartphones, wearable computers, and other types

1. Introduction to Computers


A computer is a device that carries out tasks by following instructions. It does far more than just read, write, or calculate, although all three are useful in the right context.
A computer is also a way of thinking about the world and what to do with it. It is a tool that helps you reason about your surroundings (something you are doing right now by reading this sentence), form your own ideas and thoughts, and frame your own questions and problems.
Not only will computers have the power to help people solve complicated problems that have never been solved before (such as climate change, or how to feed 9 billion people by 2050); they will also help people do things they never thought possible, such as generating new ideas for how to solve old problems.

2. Computer Types

For this section, assume the computer is what matters to you: you have one, or are working toward building one, and you have a particular kind of computer in mind that you want to be different from the others. You are not interested in solving one particular problem with one specific computer, but in finding an easy way to handle the whole category of problems you care about most (for example, web app development).

There are two schools of thought on what it means for computers to be “different”. The first is that the difference between two types of computer is just the software they run, and has nothing to do with their hardware or other characteristics. The second position holds that different hardware can make computers more powerful or faster, but that even then they are not fundamentally distinct kinds of machine.
So what is a computer? At its core it is a surprisingly simple machine: it consists mainly of arithmetic logic units (ALUs) that perform arithmetic and logic operations on bits. Out of such simple operations, carried out one bit at a time at enormous speed, very complicated tasks can be built. Many people therefore see computers as simple machines that take bits and perform arithmetic on them, much as one would with ordinary numbers like 0 and 1.
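To make the “one bit at a time” idea concrete, here is a minimal Python sketch (the function name alu_add is my own) that adds two non-negative integers using only the bitwise operations a simple ALU provides: XOR for the partial sum and AND for the carry.

```python
def alu_add(a: int, b: int) -> int:
    """Add two non-negative integers using only bitwise operations,
    the way a simple ALU combines bits."""
    while b:
        carry = (a & b) << 1  # bits that carry into the next position
        a = a ^ b             # partial sum, ignoring carries
        b = carry             # repeat until no carries remain
    return a

print(alu_add(13, 29))  # 42, same as 13 + 29
```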
That picture fits binary data well, but what about data that is not naturally numeric? Are we still talking about ALUs then? The answer depends on what exactly we mean by “computation”: a computation consists of running some algorithm on data to produce output values. This can be done with any data source (it does not have to be numeric) and any format, though binary representations are usually more convenient to process than formats such as text or XML.
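As a small illustration that the data need not be numeric, here is a sketch of a computation over plain text: the algorithm is “count word occurrences”, the input is a string, and the output is a mapping rather than a number.

```python
def word_frequencies(text: str) -> dict[str, int]:
    """A computation over non-numeric data: count how often
    each word occurs in a piece of text."""
    counts: dict[str, int] = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts

print(word_frequencies("the quick brown fox jumps over the lazy dog"))
```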

What does this mean for programming? When we say “algorithms” we do not mean rules for performing calculations with one specific procedure; we mean procedures that produce output values from input values according to some set of rules, and those rules can vary depending on what kind of algorithm you want to use.
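A tiny Python sketch of that idea: the same sorting algorithm, handed three different rules, produces three different results. The rule is just another input.

```python
words = ["banana", "Cherry", "apple"]

print(sorted(words))                 # rule: strict lexicographic order
print(sorted(words, key=str.lower))  # rule: case-insensitive order
print(sorted(words, key=len))        # rule: shortest word first
```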

3. Mobile Computers

Computers are not like other appliances: they do not explain themselves, and you have to learn how to use them. So when you buy a computer, you also get a user manual, with instructions for how to use it and some basic information about what it is and which parts are compatible with which software.
The user manual always includes two categories: "how-to" (the stuff that took me a while to figure out) and "everything else". How-to instructions usually include a step-by-step process with examples, such as the following:
• For example, in Windows: "To install this program, insert the disc into your disc drive, then open it from the 'Computer' folder."
• For example, in Mac OS X: "Start up Disk Utility on your computer."
What this looks like differs from one operating system to another.
One of the most important things users take away from using computers is simply that they have to learn how to use them, and the exact steps differ between Windows and macOS.
On Linux and Mac OS X, users can often boot straight into the operating system and get to work without first downloading and installing extra software, the way Windows traditionally required; these operating systems are similar enough that the experience carries over between them. People who are familiar with Windows will recognize the idea, though it may take Linux or Mac OS X users some time to get used to it. The practical result is that a user can work reliably on his documents everywhere, in his home office, at work, or wherever he chooses (say, at an airport coffee shop), without needing to have his own PC or Mac nearby. This is the kind of experience users can expect when they build their first application with Electron-based tools.
One benefit of this approach is that the software runs 100% locally on your own computer or device, without needing internet access during development or deployment. This means that no matter where anyone works, it will continue functioning normally regardless of connectivity issues.
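As a hedged sketch of what "runs 100% locally" can mean in practice (the documents.json file and both helper functions are my own invention, not any particular framework's API), here is a tiny app that keeps all of its data in a local file, so it keeps working with no network connection at all:

```python
import json
from pathlib import Path

DOC_PATH = Path("documents.json")  # hypothetical local document store

def load_documents() -> dict:
    """Read all saved documents from the local file, if any exist."""
    if DOC_PATH.exists():
        return json.loads(DOC_PATH.read_text())
    return {}

def save_document(name: str, body: str) -> None:
    """Add or update one document, then write everything back to disk."""
    docs = load_documents()
    docs[name] = body
    DOC_PATH.write_text(json.dumps(docs, indent=2))

save_document("notes", "Written offline, readable anywhere the file goes.")
print(load_documents())
```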

4. Tablet Computers

Computers started out as hobbyist curiosities, the kind of thing you might find in an attic. It wasn't until they became fast and cheap that they entered the mainstream. And if you've spent any time browsing the Mac App Store, you know how hard it is for even Apple to keep up with the pace of development.
Applications like iMessage and iWork were conceived under Steve Jobs, and in spite of everything, Apple still sells an enormous number of Macs.
So why are we so obsessed with tablets?
One reason is that a handful of companies have dominated them for the past decade: Apple with the iPad, Amazon with its Fire tablets, Microsoft with the Surface, and Google and Samsung on the Android side. These companies have the resources to do whatever they want, whether that means subsidizing hardware or paying top engineers to work for them and only them. What really differentiates them from one another is not just money; it's software. Notice that none of them uses hardware as the #1 selling point: when hardware alone isn't enough to win a sale, why lead with it? You can design your own hardware or buy someone else's (which gives you less control than owning the design), but either way the business case rests on the software.
The other reason tablets have exploded into such prominence is demand. Everyone wants an iPhone; everyone wants an Android phone, a Windows device, an iPad, or a piece of Linux, and in each category there are only so many choices out there. You could say that computers are now one step closer than ever to being ubiquitous; even with so many choices available, people will always want something more.

5. Smartphones

Smartphones are the quintessential Internet of Things (IoT) device and a ubiquitous way to communicate with the rest of it (a minimal sketch of that communication follows the list below). This is true for all sorts of devices, but smartphones are especially interesting for two reasons:
• They’re already so widely used that there’s little need to change them.
• They’re one of the few devices that can execute complex tasks, thanks to their powerful hardware.
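Here is the minimal sketch promised above, assuming a hypothetical HTTP endpoint (real devices would use whatever API or broker their platform provides, often MQTT rather than plain HTTP): a phone-like device publishing one sensor reading.

```python
import json
import urllib.request

ENDPOINT = "http://example.com/api/readings"  # hypothetical endpoint

def publish_reading(sensor: str, value: float) -> int:
    """POST one sensor reading as JSON and return the HTTP status code."""
    payload = json.dumps({"sensor": sensor, "value": value}).encode()
    request = urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Would send one accelerometer sample if the endpoint existed:
# publish_reading("accelerometer", 9.81)
```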
So, in the next few posts we will take a look at how computational devices work and how people use them. We start with an introduction to computational devices, then move on to a close look at smartphones (including how they work), then to computers, and finally to tablets.

6. Wearable Computers

Computers are like the human body: they have a lot of complicated internal components, and the relationships between them are hard to understand and hard to control. In other words, computers are very, very complicated, and a computer is only as good as its weakest link. For example, if you're in charge of building the display code for a device, you'll want to make sure your code isn't burning CPU time redrawing the screen even when nothing needs to be displayed. You do this by making sure nothing draws unless something has actually changed.
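A minimal sketch of that idea (the dirty flag and the loop are illustrative, not any particular framework's API): the main loop only pays for a redraw when something has actually changed.

```python
import time

def redraw() -> None:
    print("redrawing screen")  # stand-in for real drawing work

dirty = True  # would be set to True by input handlers when state changes

for _ in range(100):     # stand-in for the application's main loop
    if dirty:
        redraw()         # spend CPU time only when there is new state
        dirty = False    # nothing more to do until the next change
    time.sleep(0.016)    # ~60 ticks per second without busy-waiting
```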

Batteries drain as they are used, and some of that energy is lost as heat (which cooling can reduce). Heat also comes from processors and displays, as in laptops and tablets. For a wearable computer this matters even more: the device sits against the wearer's body, so it has to do its work without generating uncomfortable amounts of heat.
Voice recognition software works by recognizing speech and translating it into text for later processing by the software itself. The more difficult part involves not just recognizing voices but also learning how they sound in different environments: how pitch patterns change depending on what is being said, how different speakers' voices vary, and so on. It takes many hours of training data for voice recognition software to become good enough for most purposes.
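As a hedged sketch of turning recorded speech into text, assuming the third-party SpeechRecognition package is installed and a hypothetical speech.wav recording exists:

```python
import speech_recognition as sr  # third-party: pip install SpeechRecognition

recognizer = sr.Recognizer()

# "speech.wav" is a hypothetical recording; any WAV file would do.
with sr.AudioFile("speech.wav") as source:
    audio = recognizer.record(source)  # read the whole file into memory

# Hands the audio to a hosted recognizer and prints its best transcript.
print(recognizer.recognize_google(audio))
```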
This is just one example of an aspect of computer technology that needs practice before it becomes good enough for most purposes; there are many others that will need even longer periods of hardening before they are good enough for most uses, or usable at all in some applications. We may never completely master them all, and we will certainly never master everything about computers. But we do have a decent chance at mastering the aspects that matter most to us and that we can see clearly. So if the future requires acquiring skills in these areas, I believe we should invest time in understanding them now, so that we have a sufficiently strong foundation when we need those skills later on.


7. Distributed Computing

Computers have been with us for a long time now, and large-scale computers have been commonplace for many decades. But what is a computation in this setting? And how does a distributed network work?
A computation, in this setting, is an abstraction for a computationally intensive task, such as solving a particularly hard mathematical problem.
Let’s imagine we have a math problem so hard that it can only be solved on a “supercomputer” (which most of us will never build, though someday someone might). A typical supercomputer would look something like this:
It would be built with high-end hardware and software: powerful accelerators, clusters of processors, and other advanced technologies. The goal is to run the problem at full speed, with all of its complexity executing in parallel across thousands of processors.
This is what computer scientists call a “distributed computing environment” (DCE), where computation can be done by multiple computers at once.
A DCE consists of several modules (sometimes called “layers”), each of which performs a different part of the overall computation:

Each layer implements part of the computational domain (the set of things you are trying to solve) in parallel and communicates with the other layers to coordinate its actions. Each layer keeps state for its part of the domain, along with metadata about how well it has performed over time: how much time it has spent on each part, and whether there have been any errors or deadlocks (a common failure mode) among them. Layers report their progress by sending messages over network connections and by updating metadata in local memory, so that other layers can plan their next steps when they return from their own phase. Communication therefore happens at multiple levels: explicitly between modules within a single layer, between modules across different layers, and sometimes only implicitly, with little or no dedicated coordination channel at all, because the network itself already provides a means of moving data between modules.
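A minimal sketch of that message-passing pattern in Python (the worker function and progress queue are illustrative, not a real DCE framework): each worker computes its slice of the domain and reports result metadata back to a coordinator.

```python
from multiprocessing import Process, Queue

def worker(layer_id: int, chunk: list, progress: Queue) -> None:
    """One 'layer': compute a slice of the domain, then send
    progress metadata back over the message channel."""
    total = sum(x * x for x in chunk)  # stand-in for real work
    progress.put({"layer": layer_id, "result": total, "done": True})

if __name__ == "__main__":
    progress = Queue()
    data = list(range(1000))
    chunks = [data[i::4] for i in range(4)]  # split the domain four ways
    procs = [Process(target=worker, args=(i, c, progress))
             for i, c in enumerate(chunks)]
    for p in procs:
        p.start()
    reports = [progress.get() for _ in procs]  # coordinator gathers metadata
    for p in procs:
        p.join()
    print(sum(r["result"] for r in reports))   # combined result
```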


8. Supercomputers

Computers are awesome. I mean, look at them! But in order to really be awesome, you need to know what they are and why they are awesome:
The most important thing about computers at this scale is their data storage; you need to think about that, and not just about how fast the machine computes.
This is what happens if you want your code to run on my computer or my phone:
You put your code there, no matter how big or small it is. It lives on the computer's storage, and when it runs it is loaded into an area called "memory" (RAM), where running programs can access it.
That's what programmers call memory, and it means your program gets loaded into memory when you run it. Anything the program has not written back out, to a saved file, a database, or an encoded video, exists only in memory; if the program stops running or gets unloaded, all of that in-memory state is lost. That's not good.
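Here is a hedged sketch of the usual fix (state.json and the helper names are my own): write the state you care about to disk, and read it back on startup, so it survives the program being unloaded.

```python
import json
from pathlib import Path

STATE_PATH = Path("state.json")  # hypothetical location for saved state

def save_state(state: dict) -> None:
    """Write in-memory state to disk so it outlives the process."""
    STATE_PATH.write_text(json.dumps(state))

def load_state() -> dict:
    """Restore saved state on startup, or start fresh if there is none."""
    if STATE_PATH.exists():
        return json.loads(STATE_PATH.read_text())
    return {}

state = load_state()
state["runs"] = state.get("runs", 0) + 1  # state that survives restarts
save_state(state)
print(f"this program has run {state['runs']} time(s)")
```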
You can't keep this program just anywhere, either, because if any of the places where it runs gets shut down for any reason (maintenance, say), things get out of sync between the copies. For example, say my main computer runs Windows 10 and I want to use my MacBook Pro as a sort of backup for it. Every time one of the two machines shuts down, the other carries on without reading anything from its copy, so the two copies quietly diverge. And worse: if I turn off a server somewhere else, leave all my files in place, and just tell other people what to do with them, those files will eventually be forgotten entirely. So...
So there are several ways we could go about solving this problem: we could make storage centralized, so nobody has trouble keeping their data synced with everyone else; we could make storage local, so we don't lose our history; we could make storage central but allow the areas that are needed to be kept locally; and so on. Each solution has its pros and cons: local storage is cheaper than centralized storage, doesn't require much disk space, and isn't vulnerable to loss due to a failure at the central site, but it leaves the job of keeping copies in sync to you.
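One common way to notice that two copies have drifted out of sync is to compare content hashes. A minimal sketch (both file paths are hypothetical):

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Hash a file's contents; two copies agree iff their digests match."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Hypothetical local copy and backup copy of the same file.
local, backup = Path("report.txt"), Path("backup/report.txt")
if file_digest(local) != file_digest(backup):
    print("copies have drifted out of sync; one side needs updating")
```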


9. The Future of Computer Types

Computers are evolving from general-purpose machines to specialized ones. Looking back over the last few decades, many of the advances we now take for granted, such as word processing, spreadsheets, and web browsing, were invented for the then-specialized domain of personal computers. The same is true today: the growing sophistication and complexity of those systems, with an ever larger network of peripherals, has led to a new era in which computing is far more specialized than it was even 30 years ago.
As we’ve talked about in previous posts, computer types have existed for a long time: monolithic machines with centralized memory (think “big iron”); large, fast multiprocessor systems (think supercomputers); desktop, laptop, and notebook computers; and mobile devices (think smartphones). These types traditionally offered different technical capabilities in different markets: monolithic machines were powerful and capable in the largest markets, desktops and laptops paired the fastest processors with generous storage, and mobile devices traded some of that power for portability and connectivity.
Over time, these distinctions have increasingly blurred, leading to a situation where there are no clear lines between market segments; tools that offer all of these capabilities are becoming commonplace across hardware platforms as well as software applications.
One thing that seems likely is that as these categories continue to blur further (technology helps this along through an increasing amount of interoperability between platforms, so users can work from nearly any device they can get their hands on), we will see a sea change. More people will realize that one device may not be right for them or their needs, creating new markets for the multiple types of computers that might otherwise be lumped together under one category: laptop vs. notebook vs. desktop vs. tablet vs. smartphone, or something else entirely.
The big question is whether this will happen at a sufficiently rapid pace to matter. Do I really want to buy two new computers within two weeks? How long do I want to pay for the extra convenience? Will technology allow me to keep using all my current devices? If there are no longer clear differences between market segments (which seems very likely), there won't be enough customers willing to pay extra money for even more convenience. In fact, in some cases it could make sense to choose a less advanced type of computer instead of buying a new one, or simply to trade your current computer for another model.
