Wednesday, August 26, 2020

Mental Models About a Person’s World Essay

Introduction: Meeting a person for the first time can be either a positive or a negative experience, and the way someone interacts with this person can likewise display both positive and negative behaviours. The question, then, is how mental models about a person's world can both guide them and also limit their perceptions when meeting a person for the first time. Through exploring how and why these perceptions can be aided and limited, we can begin to question the reasoning behind our mental models.

Mental Models: Over the years, academic literature has defined a mental model in many ways; however, the best way to understand what a mental model is, is as the deeply embedded views, or even images, that trigger assumptions and generalizations, ultimately influencing the way a person responds to or behaves in the world, be it towards a person or a life situation (Senge 2006). A good example of a mental model is the generalization that only rich people live in the eastern suburbs of Sydney. This generalization may be true in some cases, but in other cases people may live there because they have lived there their whole lives; thus we can see that this particular generalization or "mental model" has not been thought through. Failing to question mental models can often lead to false generalizations, and this situation can also arise when meeting a person for the first time. When meeting a person for the first time, our mental models can help us both understand and ultimately get along with the person, or they can limit our perceptions, meaning we make assumptions or generalizations that eventually alter our perceptions of this person or how we act towards them. Often, we find that we are not consciously aware of our mental models and the effects that they can have on our behaviour (Chermack 2003); this, in turn, limits our perceptions. Mental models are often vague, incomplete and imprecisely expressed (Karp 2005); however, once established, mental models are extremely difficult to change (Chermack 2003). This is largely due to the fact that people are unaware of their own mental models, and the only way for a person to change their mental model is for them to acknowledge that they have one to begin with. Mental models can be useful, as they can help us process information and make decisions quickly (Unknown 1997), and they can also be essential foundations for building knowledge about the world we live in (Karp 2005). For example, when a person has a mental model that all takeaway food is bad for their health and wellbeing, when presented with the choice of either having takeaway food or a healthy meal at home, the person's mental model will lead them to decide quickly to eat a healthy meal at home. However, strong mental models can hinder active thinking and the acceptance of new ideas (Unknown 1997), and they often create problems when they are tacit, meaning that they sit below the level of awareness (Senge 1992).
The example of the Detroit carmakers, who did not recognize that they held the mental model that all customers cared about was styling, believing that "all people care about is styling", clearly shows us that their mental model had become tacit. This mental model continued to go unexamined, and because it remained unexamined, the model remained unchanged; thus, as the world changed, the gap grew between the mental model of the Detroit carmakers and the world (Senge 1992). Clearly, mental models can act as filters that screen the incoming information that comes to us, limiting our views and also our perceptions (Unknown 1997). An individual's mental model represents their view of the world; it also provides them with the context in which to view and interpret new material and the new people they meet for the first time (Kim 1993). It not only helps us to understand what is happening around us, it can also limit our understanding of a particular situation. For example, when someone has been labelled as not a nice person, without anyone ever questioning the label's validity, people form a mental model that that person is not nice, and so when the person does or says something nice it goes unnoticed, because the behaviour does not fit the mental model people hold about this person. These untested assumptions or mental models can eventually cause conflict and misunderstanding between people. Developing skills of reflection and inquiry can help us understand our mental models and also deal with other people. When we use skills of reflection, we slow down our own thinking processes and recognize how our mental models are formed and how they influence our behaviour. Skills of inquiry, on the other hand, concern how we operate in face-to-face situations with others, especially when we are dealing with complex and conflictual issues (Senge 2006). Together with the tools and methods used to develop these skills, these constitute the core of the discipline of mental models, which consists of: the distinctions between espoused theories and theories-in-use, recognizing "leaps of abstraction", exposing the "left-hand column", and balancing inquiry and advocacy (Senge 2006). When a person says that they value or desire something, that is known as espoused theory; however, what they actually say or do is known as theory-in-use (Bocham 2010). Recognizing the gaps between what we say and what we do can be seen as an effective reflective skill for becoming more aware of our mental models. Someone may declare their view (espoused theory) that people generally are trustworthy, but their actions (theory-in-use) show otherwise, as they never lend out money and keep their possessions to themselves (Senge 2006). As is evident in the example above, there is a gap between the individual's espoused theory and their theory-in-use. By recognizing the gap between the espoused theory and the theory-in-use, learning can occur, as we as individuals question whether we truly value our espoused theory (Senge 2006). When we meet a person for the first time, we can quickly jump to generalizations, as we never think to question them.
For instance, when we meet a person and they state that they are a doctor, we automatically assume that they are smart; since it is a generalization that all doctors are smart, we never seem to question this mental model. These are known as "leaps of abstraction". "Leaps of abstraction" occur when we move from direct observations to generalizations without questioning them; this ultimately impedes learning because it becomes axiomatic, as what was once an assumption is now treated as a fact (Senge 2006). In this way, this becomes another limitation that mental models can place on our perceptions when we meet people for the first time. However, these "leaps of abstraction" can easily be identified when people ask what their generalization is based on and whether the generalization is inaccurate or misleading (Senge 2006). Senge (2006) identifies the "left-hand column" as a powerful technique whereby individuals begin to see how their mental models operate in different situations. This exercise can show individuals that they do indeed have mental models, and show them how those models play an active part in sometimes negative interactions with people; not only do these individuals become aware of their mental models, they begin to recognize why dealing with these assumptions is essential (Senge 2006). For good communication between individuals to arise, people need to recognize that for the communication process to be effective, mental models must be managed properly; this is done by balancing advocacy and inquiry (Peggy and Bronn 2003). Advocacy is the process of communicating an individual's views and reasoning in a manner that makes them clear to others (Peggy and Bronn 2003). When there is advocacy without inquiry, it only leads to more advocacy, and thus to two individuals stating their ways of seeing and reasoning; both are keen to have the other hear their views, but neither inquires into what the other is saying, because each believes that what they are saying is ultimately the best point of view. A way to tackle this is through the process of inquiry. Inquiry engages two individuals in the communication process in a joint learning exercise (Peggy and Bronn 2003). Here the objective is to understand the reasoning and thinking of the other person; this can be done by asking them questions so that they can identify the root of their conclusions and statements (Peggy and Bronn 2003). Individuals can do this by asking questions such as "What is it that leads you to that position?" and "Can you illustrate your point for me?" (Senge 2006). It is evident, then, that grasping the skill of balancing advocacy and inquiry is highly advantageous in interacting with others, especially those you meet for the first time.

Conclusion: It is therefore essential, and highly advantageous, for us to question our mental models in everyday situations, such as meeting people for the first time, as it will stop us from automatically making assumptions and generalizations.
Through recognizing "leaps of abstraction", using the "left-hand column" technique and also explicitly mastering the skill of balancing advocacy and inquiry, we can learn to question these mental models, and in turn question whether they truly hold their value in our world. Thus, when we meet a person for the first time, before we make assumptions and generalizations, we may need to recognize our embedded mental models and learn to question them, thereby helping the process of communication to be a positive experience.

Saturday, August 22, 2020

Foundations for Information Organization

Question: Discuss the Foundations for Information Organization.

Answer:

Introduction: Organization systems are mechanisms used to systematically sort, classify and store information for easier retrieval. Information sources come in different varieties, and we use them in practically every routine of our lives. They range from important documents received and sent in our daily office routines to email records and files stored on our computers for future retrieval and use. Accordingly, my preferred personal collection is the collection of important paper documents that I use for daily activities.

A paper document collection has a wide scope. The collection includes a wide variety of documents, ranging from letters, newspapers, memos and magazines, among other documents. They are documents used daily, weekly or occasionally, depending on when they are needed. Their organization is significant, as it saves time by making their access and retrieval much easier. Having information sources well stored according to their categories improves their quality and usability when they are needed. A collection of paper documents is a significant part of most organizational setups, with categories such as the newspapers circulated daily in public offices, the internal and external memos sent within organizations, and the official and informal letters distributed for communication purposes. However, it is important to note that a paper document collection is a form of information organization system which falls under the type of classifications and categories. It is a scheme type which provides a pattern of bounded rules to represent the phenomena in one large collection. A paper document collection is, accordingly, an umbrella hosting a variety of paper items with different purposes and uses, but with the same characteristics: they can be filed, they can be physically stored, and they are all in the form of hard copies.

The Map of the Paper Document Collection

Letters: Formal, Informal
Memos: Internal, External
Newspapers: Daily, Weekly
Magazines: Monthly, Yearly
Brochures: Academic, General

The table above outlines the organization scheme for the paper document collection. From the table, an item such as a letter is divided into formal and informal categories, and memos are divided into external and internal memos. Magazines and newspapers are divided by the periods in which they are printed, and brochures are classified into the academic and general brochures used for general information. A paper document collection is a useful asset when it comes to information management. Since information never expires, it is important to keep it for future reference. Moreover, different organizational situations require past information for reference, and in this case careful storage of the documents is mandatory. The organization of these documents is generally in the form of files, stored according to shared characteristics. That is, paper documents such as letters are stored in one file, and the same applies to memos, newspapers, magazines and many more. My paper document collection offers a wide range of uses, from being an important source of knowledge to a good source of reference. The collection is characterized by the wide range of information items under it. It is an umbrella housing diverse information sources serving different functions.
The scope of the documents under the collection ranges from stored and sent copies of formal letters, copies of received and sent internal memos, numerous copies of daily and weekly newspapers, copies of various monthly and yearly magazines and, finally, two sets of brochures: one for academic use and the other for general information. However, my paper document collection does not include confidential organizational paper documents, as they are restricted to specific individuals, nor copies of my academic certificates and curriculum vitae, as these are kept separately in a special place because of their personal significance. The organization of the items in my personal collection follows specific criteria based on the type of the item, the significance of the item, the date and appropriateness. For instance, all formal letters received are kept in a single file labelled Received, while all the copies of sent letters are kept in a different document file labelled Sent. All the daily newspapers are kept in a Dailies newspaper file, while the weekly newspapers are kept in a Weeklies file. All the monthly and yearly magazines are kept in Monthly and Yearly document files respectively, and finally the brochures are classified into two files, namely General and Academic. The categories in this case are inclusive and effective; they are not too many, yet they are enough to cater for all the items. Admittedly, memos could perfectly well fit in the same category as letters, but for clarity purposes they have their own Received and Sent files in which they are stored. In organizing these items into different categories, several factors influenced my outcomes. Generally, it is known that accuracy is key to success, and common sense states that classifying items into one group according to their characteristics makes the identification process much easier. The organization was also influenced partly by the fact that historical evidence shows that, in classification, it is quite acceptable to group items or phenomena with extremely similar characteristics together. Furthermore, the most significant aspects of my collection that I considered were the type of the items, for example letters and newspapers; their immediate significance and importance; the dates of their creation, with items arranged by date in the case of newspapers and magazines; and finally dependability, that is, the degree of reliability when it comes to references. One important aspect to note is that this collection is not represented in shorthand or abbreviated forms.

Evaluation of the Paper Document Collection

The paper document collection is a stable organization system. Instances of breakdown are practically impossible. The collection consists of tangible paper documents, which means that changing the already printed information on the paper is impossible, or any attempt to do so would easily be noticed. The only security concerns that could cause breakdowns in this information organization are the theft or removal of certain important documents to conceal information.
However, this system could best be improved by enhancing ways of accessing information much faster, because it is quite unwieldy when it comes to retrieving information. Aspects of the paper document collection organization system such as the relevance of the items can generate considerable confusion, as the term relevance differs from one person to the next. It therefore means that sometimes arranging information based on your own assumption of importance may not suit another person who has to use the same organization scheme. However, this may not be a serious issue to deal with beyond basic arrangements; it is understandable why there may always exist differences in what each person values as significant, and adjustments in this case are unnecessary. The collection of paper documents uses a variety of techniques that can be likened to the term technology. With the items stored in well-organized individual files, the filing system supports a systematic arrangement which makes access to the information easier. Furthermore, the files within which the items are stored are divided by date, that is, day, month and year, which means that when retrieving the information you know where to start in terms of the period of the materials being accessed from the organization system. The adoption of these techniques has polished the organization scheme, adding to the other attractive attributes already ascribed to it; technology is simply an added advantage that makes the organization scheme much better. Technology has always played a major role in transforming and strengthening organization systems, and technological adoptions that strengthen the system are needed all the more for the development of the organization system. The organization system described here can be termed a unique blend of innovation and existing organization methods. The system does not follow one particular established information organization system; rather, there is a mix of new techniques and the use of some of the established methods to form a stronger combined organization system. It is considered a good move in attempting to create a dependable information organization system. The elaborated organization system is relatively new and has not been in use elsewhere, but it is a promising organization system that places a great deal of emphasis on security and easy access to the information through effective indexing and classifying of the items. It is a smart organizing scheme that can be used to complement advanced technological organization schemes, hence its suitability in this modern technological era.

References

Cordella, A., & Iannacci, F. (2011). Information Systems and Organisations. Undergraduate study in Economics, Management, Finance and the Social Sciences, 1-33.
Glushko, R. J. (2010). Chapter 1: Foundations for Information Organization, Retrieval, and Use, 1-20.
Taylor, A. G., & Joudrey, D. N. (2008). The Organization of Information. Library and Information Science Text Series.

Thursday, August 20, 2020

Taking Some of the Stress out of Publishing in a Literary Magazine

Literary magazines have served as gatekeepers for new writers since the first one was published in 1684 (Nouvelles de la république des lettres). Some writers, such as T.S. Eliot, were first discovered through publishing in a literary magazine, and most well-known writers have published in them at some point. From webzines to university-funded publications, to The Paris Review, literary magazines have only increased in popularity over the past few decades, especially with the growth of online publishing. As periodicals devoted to literature, literary magazines (also known as literary journals) typically publish essays, poetry, short stories, interviews with authors, letters, literary criticism, book reviews and more.

Why writers should publish in literary magazines

As we mentioned before, many great writers have either gotten their start or increased their publishing opportunities significantly through publishing in literary magazines. Publishing in a literary magazine, especially if it's a well-known one, significantly increases exposure of a writer's work and will open up other publishing opportunities as well. Additionally, many literary magazines offer contests, allowing new or un-agented writers the opportunity to get their work out there and earn credibility in the publishing world at large.

First things first: Find the right literary magazine

With a wide scope of literary magazines available, most have a niche market as their readership, and look for a very specific genre or format of writing to include in their publication. That being said, one of the first steps you should take to ensure better odds of getting your writing published in a literary magazine is to find the right one out of the hundreds available. Some cater to very specific crowds (like mothers of young children or green living enthusiasts) while others have wider audiences. Some publish only a few times a year, while others publish quarterly.

There are several online resources available to help writers sift through what's available and find the literary magazine(s) that best suits their intended submission. Although it's a paid service, DuoTrope is one great resource to help writers find everything from the best literary magazines to publish in for their specific niche or genre to agents interested in potentially representing their work. With this narrowed scope, writers have a much better opportunity of getting published in the literary magazine of their choice.

Second: Follow the rules and don't submit blindly

Literary magazines are generally very good about being specific regarding the exact type of writing they want, how to submit it, and what to expect. Following the publication's rules regarding submission plays a big role in increasing the writer's chances of getting accepted. Many literary magazines are run as a side project, which means their editors often don't have time to sift through manuscripts that don't follow submission guidelines or don't sync with the publication's overall feel and purpose.

Most respected literary magazines and webzines provide detailed submission guidelines on their website. Some allow email submission of a manuscript while others want a hard copy and SASE (self-addressed stamped envelope).
Some literary magazines will charge a submission fee as well, so it's important to look at all of the submission guidelines before making a choice regarding which ones you want to submit your work to.

Next, learn the lingo

Next, it's important to learn the lingo of the literary magazine market. Here are a few terms you might encounter in your search for the right publication for your work:

Simultaneous submissions - Simultaneous submissions are when a writer sends out his or her work to several magazines at once. Literary magazine editors will vary in their rules about simultaneous submissions: some will allow it, while others are very clear they don't want a writer to do it. There are multiple reasons why an editor might not want simultaneous submissions, including issues of publishing rights, which we'll cover later in this article. If submission guidelines advise against simultaneous submissions, don't be tempted to do it anyway. Many editors know other editors within the world of literary magazine publishing, so you don't want to get started on the wrong foot with any of them by not following this request.

Withdrawal - This is the process you will need to go through if simultaneous submissions are acceptable and a literary magazine decides to publish your work. Usually, you can submit withdrawals of your manuscript via email or online, but some literary magazines have more formal ways of doing it. Refer to a magazine's submission guidelines for more details about their preferences.

Reprints - While most literary magazines prefer to be the first to publish a particular piece, some will offer publication of reprints, or work that has been previously published elsewhere.

First serial rights - First serial rights are the rights held by a publication to publish a piece for the first time. After publication, the writer may then resell the piece to another publication.

Non-exclusive / exclusive rights - Non-exclusive rights are rights held by the publisher to publish your work while acknowledging that your work can also be printed elsewhere. Exclusive rights are the opposite, in that the literary magazine or publication owns exclusive rights to your work and it cannot be published anywhere else, including on your author website.

Know the slush pile and how to get out of it

Brigid Hughes, former Executive Editor of The Paris Review, stated in an interview that the publication receives between 15,000 and 20,000 submissions in a year. Considering these numbers, it's important to understand the dreaded literary magazine slush pile and what to expect of it. The slush pile is the pile (whether a literal pile of paper or a digital pile) of unsolicited manuscripts sent in by writers wishing to be published in the literary magazine or webzine. Especially at the most well-known and exclusive publications, this slush pile is not the editors' primary concern, and it will often take a while for them to get to any manuscript within it. Further, the larger publications have readers who go through the slush pile, which means the editors might never see your manuscript in the first place.

To end up in the non-slush pile at these exclusive publications, you'll need to either have been solicited by the editors to submit your work, have an agent, or have published with the magazine before.
But since most writers seek first-time publication in these literary magazines, it's important to a) be patient as your manuscript makes it through the slush pile process and b) follow submission guidelines and magazine content style to the letter to increase your chances of surviving the slush pile. Simply put, busy slush pile readers might pass over great writing simply because it's not formatted correctly, doesn't fit with the publication's scope of content, or wasn't submitted following submission guidelines.

If your work doesn't get accepted, keep trying

With so many literary magazines and webzines currently in print or online, getting published in a literary magazine has never been easier. However, most editors of literary magazines have a very specific type of piece or writing style they're looking for. If your manuscript is rejected, or worse, you just don't hear back at all, take heart in knowing that the more manuscripts you send out, the better your chances are of getting accepted for publication in a literary magazine.

Another benefit of querying multiple publications (that are likely to reject you) is that you'll have several different opportunities to receive feedback on your work from experts in the industry. This type of feedback is invaluable for a writer and should always be received graciously. This is especially true since some editors will simply reject your work without explaining why, while others will give you a general excuse, such as: "Your work does not fit our publication's goals at this time."

Sunday, May 24, 2020

Pornography Consumption and Sexual Behaviors as Correlates of Erotic Dreams and Nocturnal Emissions

The article I chose to review is "Pornography consumption and sexual behaviors as correlates of erotic dreams and nocturnal emissions." The author of this interesting article is Calvin Kai-Ching Yu of Hong Kong Shue Yan University, North Point, Hong Kong. The document type is a journal article, and it was published by the Educational Publishing Foundation in October of 2012. The type of study that was done for this article was an empirical, quantitative study. "The study presented here was geared toward exploring the degree to which the frequencies of sex and wet dreams are modulated by sexual behaviors during wakefulness, including viewing pornography." The study examined the degree to which erotic or wet dreams are modulated by sexual behaviors and the use of pornography in waking life. Since it had been noted that erotic dreams are much more prevalent in men than in women, and it would have been much harder to factor both genders' data into the analysis of sex dreams, he used only male participants for this study. The real purpose of this article and study was to see if pornography, sexual fantasy, erotic dreams, and masturbation had any significant effect on sexual behaviors awake or asleep. To do so, Yu had to put together some kind of method to test a group of men on, in order to find out the answer to his question at hand. Yu did find some earlier studies that were done by Schredl and King. These studies suggested that the sexual content of dreams did in some way

Wednesday, May 13, 2020

Using the French Expletive Connard in Conversation

The French noun connard (pronounced kuh-nar) is an informal term commonly used as an expletive. Loosely translated, it means idiot or jerk, although most people understand it to mean something more obscene. As with all slang, it's important to understand what you're saying before you go using it in everyday conversation. You and your friends may toss around expletives like connard knowing that you're joking. But you probably wouldn't want to use such language in a formal situation or in front of strangers.

Translation and Usage

A more direct translation of connard would be a--hole or any number of variations on the f-word. A French person searching for a synonym might choose imbécile or crétin. There is also a feminine version: une connarde / une connasse, for "cow."

Examples of Usage

Here are a few examples for context. To be clear, we are not recommending using this term. But it will be useful to understand it, because it can be overheard on the streets of any French city or town.

C'est un vrai connard ! - He's a real jerk!
Tu es le connard de l'autre nuit. Casse-toi ! - You're the [expletive] from the other night. Get away!
Et je suppose que le sale connard veut quelque chose en échange. - And I assume the dirty [expletive] wants something in return.
Écoute, tu devenais un incroyable connard. - Listen, you were becoming/being an unbelievable jerk.
Babe Ruth était un connard, mais le baseball reste génial. - Babe Ruth was [expletive], but baseball's still beautiful.
Tu peux pas me parler, espèce de connard. - You can't talk to me, you son of an [expletive].
C'est pas toi qui poses la question, connard. - You're not asking the question, [expletive].
Vandalisme, arme blanche : t'en prends pour six mois, connard. - Vandalism, deadly weapon. You get six months in lockup, [expletive].
Ouais, ben, soûl ou sobre, t'es toujours un connard. - Yeah, well, drunk or sober, you're still an [expletive].

Wednesday, May 6, 2020

Evolution of Microprocessor

American University
CSIS 550: History of Computing
Professor Tim Bergin
Technology Research Paper: Microprocessors
Beatrice A. Muganda (AU ID: 0719604)
May 3, 2001

EVOLUTION OF THE MICROPROCESSOR

INTRODUCTION

The Collegiate Webster dictionary describes a microprocessor as a computer processor contained on an integrated-circuit chip. In the mid-seventies, a microprocessor was defined as a central processing unit (CPU) realized on an LSI (large-scale integration) chip, operating at a clock frequency of 1 to 5 MHz and constituting an 8-bit system (Heffer, 1986). It was a single component having the ability to perform a wide variety of different functions. Because of their relatively low cost and small size, microprocessors permitted the use of digital computers in many areas where the use of the preceding mainframes—and even minicomputers—would not have been practical and affordable (Computer, 1996). Many non-technical people associate microprocessors only with PCs, yet there are thousands of appliances that have a microprocessor embedded in them—telephones, dishwashers, microwaves, clock radios, etc. In these items, the microprocessor acts primarily as a controller and may not be known to the user.

The Breakthrough in Microprocessors

The switching units in the computers of the early 1940s were mechanical relays. These were devices that opened and closed as they did the calculations; such mechanical relays were used in Zuse's machines of the 1930s. By the 1950s, vacuum tubes had taken over. The Atanasoff-Berry Computer (ABC) used vacuum tubes as its switching units rather than relays. The switch from mechanical relays to vacuum tubes was an important technological advance, as vacuum tubes could perform calculations considerably faster and more efficiently than relay machines. However, this technological advance was short-lived, because the tubes could not be made smaller than they were being made and could not be placed close to each other, because they generated heat (Freiberger and Swaine, 1984). Then came the transistor, which was acknowledged as a revolutionary development. In "Fire in the Valley", the authors describe the transistor as a device which was the result of a series of developments in the applications of physics. The transistor changed the computer from a giant electronic brain to a commodity like a TV set. This innovation is credited to three scientists: John Bardeen, Walter Brattain, and William Shockley. As a result of the technological breakthrough of transistors, the introduction of the minicomputers of the 1960s and the personal computer revolution of the 1970s was made possible. However, researchers did not stop at transistors. They wanted a device that could perform more complex tasks—a device that could integrate a number of transistors into a more complex circuit. Hence the terminology, integrated circuits or ICs. Because physically they were tiny chips of silicon, they came to be referred to as chips. Initially, the demand for ICs came from the military and aerospace industries, which were great users of computers and the only industries that could afford computers (Freiberger and Swaine, 1984). Later, Marcian "Ted" Hoff, an engineer at Intel, developed a sophisticated chip.
This chip could extract data from its memory and interpret the data as an instruction. The term that evolved to describe such a device was "microprocessor"; the term first came into use at Intel in 1972 (Noyce, 1981). A microprocessor was nothing more than an extension of the arithmetic and logic IC chips, incorporating more functions into one chip (Freiberger and Swaine, 1984). Today, the term still refers to an LSI single-chip processor capable of carrying out many of the basic operations of a digital computer. In fact, the microprocessors of the late eighties and early nineties are full-scale 32-bit data and 32-bit address systems, operating at clock cycles of 25 to 50 MHz (Heffer, 1986).

What led to the development of microprocessors?

As stated above, microprocessors essentially evolved from mechanical relays to integrated circuits. It is important to illustrate here what aspects of the computing industry led to the development of microprocessors.

(1) Digital computer technology. In the History of Computing class, we studied, throughout the semester, how the computer industry learned to make large, complex digital computers capable of processing more data, and also how to build and use smaller, less expensive computers. Digital computer technology had been growing steadily since the late 1940s.

(2) Semiconductors. Like digital computer technology, semiconductors had also been growing steadily since the invention of the transistor in the late 1940s. The 1960s saw the integrated circuit develop from just a few transistors to many complicated functions, all on the same chip.

(3) The calculator industry. It appears as if this industry grew overnight during the 1970s, from the simplest of four-function calculators to very complex programmable scientific and financial machines. From all this, one idea became obvious: if there was an inexpensive digital computer, there would be no need to keep designing different, specialized integrated circuits. The inexpensive digital computer could simply be reprogrammed to perform whatever was the latest brainstorm, and there would be the new product (Freiberger and Swaine, 1984).

The development of microprocessors can be dated to when, in the early 1970s, digital computers and integrated circuits reached the required levels of capability. However, the early microprocessor did not meet all the goals: it was too expensive for many applications, especially those in the consumer market, and it could not hold enough information to perform many of the tasks being handled by the minicomputers of that time.

How a microprocessor works

According to Krutz (1980), a microprocessor executes a collection of machine instructions that tell the processor what to do. Based on the instructions, a microprocessor does three basic things:

- Using its ALU (arithmetic/logic unit), a microprocessor can perform mathematical operations like addition, subtraction, multiplication and division. Modern microprocessors contain complete floating-point processors that can perform extremely sophisticated operations on large floating-point numbers.
- A microprocessor can move data from one memory location to another.
- A microprocessor can make decisions and jump to a new set of instructions based on those decisions.

There may be very sophisticated things that a microprocessor does, but those are its three basic activities.
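As a rough illustration of these three activities working together, here is a minimal sketch of a software-modelled processor. The three-instruction machine, its opcodes and its encoding are invented for this example; no real chip works exactly this way.

```c
#include <stdint.h>
#include <stdio.h>

/* A made-up 3-instruction machine, for illustration only. */
enum { OP_ADD, OP_MOV, OP_JNZ, OP_HALT };

int main(void) {
    /* A tiny program: count reg0 down from 3 to 0, copying it
       into reg2 on every pass, then halt. */
    uint8_t mem[16] = {
        OP_ADD, 0, 1,   /* addr 0: reg0 += reg1 (ALU arithmetic)  */
        OP_MOV, 2, 0,   /* addr 3: reg2  = reg0 (moving data)     */
        OP_JNZ, 0, 0,   /* addr 6: if reg0 != 0, jump to addr 0   */
        OP_HALT         /* addr 9: stop                           */
    };
    uint8_t reg[4] = {3, 255, 0, 0};  /* reg1 = 255 acts as -1 mod 256 */
    uint8_t pc = 0;                   /* program counter               */

    for (;;) {
        uint8_t op = mem[pc];                       /* fetch + decode */
        if (op == OP_ADD)      { reg[mem[pc+1]] += reg[mem[pc+2]]; pc += 3; }
        else if (op == OP_MOV) { reg[mem[pc+1]]  = reg[mem[pc+2]]; pc += 3; }
        else if (op == OP_JNZ) { pc = reg[mem[pc+1]] ? mem[pc+2] : pc + 3; }
        else                   { break; }           /* OP_HALT        */
    }
    printf("halted: reg0=%u reg2=%u\n", reg[0], reg[2]);
    return 0;
}
```

Each pass through the loop is one fetch-decode-execute cycle, which is exactly the drill described next.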
Put simply, it fetches instructions from memory, interprets (decodes) them, and then executes whatever functions the instructions direct. For example, if the microprocessor is capable of 256 different operations, there must be 256 different instruction words. When fetched, each instruction word is interpreted differently from any of the other 255. Each type of microprocessor has a unique instruction set (Short, 1987).

Architecture of a microprocessor

This is about as simple as a microprocessor gets. It has the following characteristics:

- an address bus (that may be 8, 16 or 32 bits wide) that sends an address to memory;
- a data bus (that may be 8, 16 or 32 bits wide) that can send data to memory or receive data from memory;
- RD (read) and WR (write) lines to tell the memory whether it wants to get or set the addressed location;
- a clock line that lets a clock pulse sequence the processor; and
- a reset line that resets the program counter to zero (or whatever) and restarts execution.

A typical microprocessor, therefore, consists of: logical components, which enable it to function as a programmable logic processor; the program counter, stack, and instruction register, which provide for the management of a program; the ALU, which provides for the manipulation of data; and a decoder timing and control unit, which specifies and coordinates the operation of the other components. The connection of the microprocessor to other units—memory and I/O devices—is done with the address, data, and control buses.

Generations of microprocessors

Microprocessors can be categorized into five generations: first, second, third, fourth, and fifth. Their characteristics are described below.

First generation. The microprocessors that were introduced in 1971 to 1972 were referred to as first-generation systems. First-generation microprocessors processed their instructions serially—they fetched the instruction, decoded it, then executed it. When an instruction was completed, the microprocessor updated the instruction pointer and fetched the next instruction, performing this sequential drill for each instruction in turn.

Second generation. By the mid-1970s (specifically 1973), enough transistors were available on the IC to usher in the second generation of microprocessor sophistication: 16-bit arithmetic and pipelined instruction processing. Motorola's MC68000 microprocessor, introduced in 1979, is an example. Another example is Intel's 8080. This generation is defined by overlapped fetch, decode, and execute steps (Computer, 1996). As the first instruction is processed in the execution unit, the second instruction is decoded and the third instruction is fetched. The distinction between the first- and second-generation devices was primarily the use of newer semiconductor technology to fabricate the chips. This new technology resulted in a five-fold increase in instruction execution speed and higher chip densities.

Third generation. The third generation, introduced in 1978, was represented by Intel's 8086 and the Zilog Z8000, which were 16-bit processors with minicomputer-like performance. The third generation came about as IC transistor counts approached 250,000. Motorola's MC68020, for example, incorporated an on-chip cache for the first time, and the depth of the pipeline increased to five or more stages.
This generation of microprocessors was different from the previous ones in that all major workstation manufacturers began developing their own RISC-based microprocessor architectures (Computer, 1996).

Fourth generation. As the workstation companies converted from commercial microprocessors to in-house designs, microprocessors entered their fourth generation, with designs surpassing a million transistors. Leading-edge microprocessors such as Intel's 80960CA and Motorola's 88100 could issue and retire more than one instruction per clock cycle (Computer, 1996).

Fifth generation. Microprocessors in their fifth generation employed decoupled superscalar processing, and their designs soon surpassed 10 million transistors. In this generation, PCs are a low-margin, high-volume business dominated by a single microprocessor (Computer, 1996).

Companies associated with microprocessors

Overall, Intel Corporation dominated the microprocessor arena, even though other companies like Texas Instruments and Motorola also introduced microprocessors. Listed below are the microprocessors that each company created.

(A) Intel

As indicated previously, Intel Corporation dominated microprocessor technology and is generally acknowledged as the company that introduced the microprocessor successfully into the market. Its first microprocessor was the 4004, in 1971. The 4004 took the integrated circuit one step further by locating all the components of a computer (CPU, memory, and input and output controls) on a minuscule chip. It evolved from a development effort for a calculator chip set. Previously, an IC had to be manufactured to fit a special purpose; now only one microprocessor needed to be manufactured, which could then be programmed to meet any number of demands. The 4004 microprocessor was the central component in a four-chip set, called the 4004 Family: the 4001, a 2,048-bit ROM; the 4002, a 320-bit RAM; and the 4003, a 10-bit I/O shift register. The 4004 had 46 instructions, using only 2,300 transistors in a 16-pin DIP. It ran at a clock rate of 740 kHz (eight clock cycles per CPU cycle of 10.8 microseconds)—the original goal was 1 MHz, to allow it to compute BCD arithmetic as fast (per digit) as a 1960s-era IBM 1620 (Computer, 1996).

Following in 1972 was the 4040, an enhanced version of the 4004 with an additional 14 instructions, 8K program space, and interrupt abilities (including shadows of the first 8 registers). In the same year, the 8008 was introduced. It had a 14-bit PC. The 8008 was intended as a terminal controller and was quite similar to the 4040. The 8008 increased the 4004's word length from four to eight bits and doubled the volume of information that could be processed (Heath, 1991). In April 1974, the 8080, the successor to the 8008, was introduced. It was the first device with the speed and power to make the microprocessor an important tool for the designer. It quickly became accepted as the standard 8-bit machine. It was the first Intel microprocessor announced before it was actually available: it represented such an improvement over existing designs that the company wanted to give customers adequate lead time to design the part into new products. The use of the 8080 in personal computers and small business computers was initiated in 1975 by MITS's Altair microcomputer. A kit selling for $395 enabled many individuals to have computers in their own homes (Computer, 1996). Following closely, in 1976, was the 8048, the first 8-bit single-chip microcomputer.
It was also designed as a microcontroller rather than a microprocessor—low cost and small size was the main goal. For this reason, data was stored on-chip, while program code was external. The 8048 was eventually replaced by the very popular but bizarre 8051 and 8052 (available with on-chip program ROMs). While the 8048 used 1-byte instructions, the 8051 had a more flexible 2-byte instruction set, eight 8-bit registers plus an accumulator A. Data space was 128 bytes and could be accessed directly or indirectly by a register, plus another 128 bytes above that in the 8052, which could only be accessed indirectly (usually for a stack) (Computer, 1996). In 1978, Intel introduced its high-performance, 16-bit MOS processor—the 8086. This microprocessor offered power, speed, and features far beyond the second-generation machines of the mid-70s. It is said that the personal computer revolution did not really start until the 8088 processor was created. This chip became the most ubiquitous in the computer industry when IBM chose it for its first PC (Freiberger and Swaine, 1984). In 1982, the 80286 (also known as the 286) was next, and it was the first Intel processor that could run all the software written for its predecessor, the 8088. Many novices were introduced to desktop computing with a "286 machine", and it became the dominant chip of its time. It contained 130,000 transistors. In 1985, the first multitasking chip, the 386 (80386), was created. This multitasking ability allowed Windows to do more than one function at a time. This 32-bit microprocessor was designed for applications requiring high CPU performance. In addition to providing access to the 32-bit world, the 80386 addressed two other important issues: it provided system-level support to systems designers, and it was object-code compatible with the entire family of 8086 microprocessors (Computer, 1996). The 80386 was made up of six functional units: (i) the execution unit, (ii) the segment unit, (iii) the page unit, (iv) the decode unit, (v) the bus unit and (vi) the prefetch unit. The 80386 had 34 registers, divided into categories such as general-purpose registers, debug registers, and test registers. It had 275,000 transistors (Noyce, 1981). The 486 (80486) generation of chips really advanced the point-and-click revolution. It was also the first chip to offer a built-in math coprocessor, which gave the central processor the ability to do complex math calculations. The 486 had more than a million transistors. In 1993, after Intel lost a bid to trademark the name 586, it coined the name Pentium for its next generation of chips to protect its brand from being copied by other companies, and there began the Pentium series—Pentium Classic, Pentium II, III and, currently, 4.

(B) Motorola

The MC68000 was the first 32-bit microprocessor introduced by Motorola, in the early 1980s. This was followed by higher levels of functionality on the microprocessor chip in the MC68000 series. For example, the MC68020, introduced later, had three times as many transistors, was about three times as big, and was significantly faster. The Motorola 68000 was one of the second-generation systems, developed in 1973 and known for its graphics capabilities. The Motorola 88000 (originally named the 78000) is a 32-bit processor, one of the first load-store CPUs based on a Harvard architecture (Noyce, 1981).

(C) Digital Equipment Corporation (DEC)
(D) Texas Instruments (TI) A precursor to these microprocessors was the 16-bit Texas Instruments 1900 microprocessor which was introduced in 1976. The Texas Instruments TMS370 is similar t o the 8051, another of TI’s creations. The only difference between the two was the addition of a B accumulator and some 16-bit support. Microprocessors Today Technology has been changing at a rapid pace. Everyday a new product is made to make life a little easier. The computer plays a major role in the lives of most people. It allows a person to do practically anything. The Internet enables the user to gain more knowledge at a much faster pace compared to researching through books. The portion of the computer that allows it to do more work than a simple computer is the microprocessor. Microprocessor has brought electronics into a new era and caused component manufacturers and end-users to rethink the role of the computer. What was once a giant machine attended by specialists in a room of its own is now a tiny device conveniently transparent to users of automobile, games, instruments, office equipment, and a large array of other products. – 15 – From their humble beginnings 25 years ago, microprocessors have proliferated into an astounding range of chips, powering devices ranging from telephones to supercomputers (PC Magazine, 1996). Today, microprocessors for personal computers get widespread attention—and have enabled Intel to become the world’s largest semiconductor maker. In addition, embedded microprocessors are at the heart of a diverse range of devices that have become staples of affluent consumers worldwide. The impact of the microprocessor, however, goes far deeper than new and improved products. It is altering the structure of our society by changing how we gather and use information, how we communicate with one another, and how and where we work. Computer users want fast memory in their PCs, but most do not want to pay a premium for it. Manufacturing of microprocessors Economical manufacturing of microprocessors requires mass production. Microprocessors are constructed by depositing and removing thin layers of conducting, insulating, and semiconducting materials in hundreds of separate steps. Nearly every layer must be patterned accurately into the shape of transistors and other electronic elements. Usually this is done by photolithography, which projects the pattern of the electronic circuit onto a coating that changes when exposed to light. Because these patterns are smaller than the shortest wavelength of visible light, short wavelength ultraviolet radiation must be used. Microprocessor features 16 – are so small and precise that a single speck of dust can destroy the microprocessor. Microprocessors are made in filtered clean rooms where the air may be a million times cleaner than in a typical home (PC World, 2000)). Performance of microprocessors The number of transistors available has a huge effect on the performance of a processor. As seen earlier, a typical instruction in a processor like an 8088 took 1 5 clock cycles to execute. Because of the design of the multiplier, it took approximately 80 cycles just to do one 16-bit multiplication on the 8088. With more transistors, much more powerful multipliers capable of single-cycle speeds become possible ( ). More transistors also allow a technology called pipelining. In a pipelined architecture, instruction execution overlaps. 
So even though it might take 5 clock cycles to execute each instruction, there can be 5 instructions in various stages of execution simultaneously. That way it looks like one instruction completes every clock cycle (PC World, 2001). Many modern processors have multiple instruction decoders, each with its own pipeline. This allows multiple instruction streams, which means more than one instruction can complete during each clock cycle. This technique can be quite complex to implement, so it takes lots of transistors. The trend in processor design has been toward full 32-bit ALUs with fast floating-point processors built in and pipelined execution with multiple instruction streams. There has also been a tendency toward special instructions (like the MMX instructions) that make certain operations particularly efficient, along with the addition of hardware virtual memory support and L1 caching on the processor chip. All of these trends push up the transistor count, leading to the multi-million-transistor powerhouses available today. These processors can execute about one billion instructions per second (PC World, 2000).

With all the different types of Pentium microprocessors, what is the difference? Three basic characteristics stand out:

- Instruction set: the set of instructions that the microprocessor can execute.
- Bandwidth: the number of bits processed in a single instruction.
- Clock speed: given in megahertz (MHz), the clock speed determines how many instructions per second the processor can execute.

In addition to bandwidth and clock speed, microprocessors are classified as being either RISC (reduced instruction set computer) or CISC (complex instruction set computer).

Other uses of microprocessors

There are many uses for microprocessors in the world today. Most appliances found around the house are operated by microprocessors. Most modern factories are fully automated, which means that most jobs are done by a computer. Automobiles, trains, subways, planes, and even taxi services require the use of many microprocessors. In short, there are microprocessors everywhere you go. Another common place to find microprocessors is a car, especially a sports car. There are numerous uses for a microprocessor in cars. First of all, it controls the warning LED signs. Whenever there is a problem, low oil, for example, it has detectors that tell it that the oil is below a certain amount, and it then starts blinking the LED until the problem is fixed. Another use is in the suspension system: a processor controls the amount of pressure applied to keep the car level. During turns, a processor slows down the wheels on the inner side of the curve and speeds them up on the outside to keep the speed constant and make a smooth turn. An interesting story appeared in the New York Times dated April 16 that goes to show there is no limit to what microprocessors can do and that researchers and scientists are not stopping at the current uses of microprocessors. The next time the milk is low in the refrigerator, the grocery store may deliver a new gallon before it is entirely gone. Masahiro Sone, who lives in Raleigh, N.C., has won a patent for a refrigerator with an inventory processing system that keeps track of what is inside and what is about to run out and can ring up the grocery store to order more (NY Times, 2001).
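The warning-light and smart-refrigerator examples above share one controller pattern: read a sensor, compare against a threshold, act. The sketch below shows that pattern in C; every name and number in it (the oil-level reading, the LED routine, and the threshold) is invented for illustration, standing in for whatever the real hardware would expose.

```c
#include <stdbool.h>
#include <stdio.h>

#define OIL_LEVEL_MIN 20   /* hypothetical threshold, in arbitrary units */

/* Stand-ins for real hardware access; actual firmware would read an
   analog-to-digital converter and drive an output pin instead. */
static int oil_level = 25;                 /* simulated sensor value */
static int read_oil_level(void) { return oil_level; }
static void set_warning_led(bool on) { printf("LED %s\n", on ? "ON" : "off"); }

int main(void) {
    /* Simulate the oil level dropping while the firmware polls it.
       While the level is below the threshold, the LED toggles each
       tick, i.e. it blinks. */
    for (int tick = 0; tick < 6; tick++) {
        bool low = read_oil_level() < OIL_LEVEL_MIN;
        set_warning_led(low && tick % 2 == 0);
        oil_level -= 2;                    /* the "engine" losing oil */
    }
    return 0;
}
```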
Where is the industry of microprocessors going?

Almost immediately after their introduction, microprocessors became the heart of the personal computer. Since then, the improvements have come at an amazing pace. The 4004 ran at 108 kHz—that's kilohertz, not megahertz—and processed only 4 bits of data at a time. Today's microprocessors, and the computers that run on them, are thousands of times faster. Effectively, they have come pretty close to fulfilling Moore's Law (named after Intel co-founder Gordon Moore), which states that the number of transistors on a chip will double every 18 months or so. Performance has increased at nearly the same rate (PC Magazine, 1998).

Can the pace continue? Well, nothing can increase forever. But according to Gerry Parker, Intel's executive vice president in charge of manufacturing, "we are far from the end of the line in terms of microprocessor performance. In fact, we're constantly seeing new advances in technology, one example being new forms of lithography that let designers position electronic components closer and closer together on their chips. Processors are created now using a 0.35-micron process. But next year we'll see processors created at 0.25 microns, with 0.18 and 0.13 microns to be introduced in the years to come." (PC Magazine, 1998)

However, it's not just improvements in lithography and density that can boost performance. Designers can create microprocessors with more layers of metal tying together the transistors and other circuit elements. The more layers, the more compact the design. But these ultracompact microprocessors are also harder to manufacture and validate. New chip designs take up less space, resulting in more chips per wafer. The original Pentium (60/66 MHz) was 294 square millimeters, then it was 164 square millimeters (75/90/100 MHz), and now it's 91 square millimeters (133- to 200-MHz versions) (PC Magazine, 1998). When will all this end? Interestingly, it may not be the natural limits of technology that eventually refute Moore's Law. Instead, it's more likely to be the cost of each successive generation, as microprocessor development is a hugely capital-intensive business. Currently, a fabrication plant with the capacity to create about 40,000 wafers a month costs some $2 billion, and the rapid pace of innovation means equipment can become obsolete in just a few years. Still, there are ways of cutting some costs, such as converting from today's 8-inch silicon wafers to larger, 300-mm (roughly 12-inch) wafers, which can produce 2.3 times as many chips per wafer as those now in use. Moving to 300-mm wafers will cost Intel about $500 million in initial capital. Still, nothing lasts forever. As Parker notes, "the PC industry is built on the assumption that we can get more and more out of the PC with each generation, keep costs in check, and continue adding more value. We will run out of money before we run out of technology. When we can't hold costs down anymore, then it will be a different business" (PC Magazine, 1998).

At the beginning of last year, the buzz was about the PlayStation 2 and the Emotion Engine processor that would run it. Developed by Sony and Toshiba, the high-tech processor, experts predicted, would offer unprecedented gaming power and, more importantly, could provide the processing power for the PlayStation 2 to challenge cheap PCs as the entry-level device of choice for home access to the Web.
At the beginning of last year, the buzz was about PlayStation 2 and the Emotion Engine processor that would run it. Developed by Sony and Toshiba, the high-tech processor was predicted by experts to offer unprecedented gaming power and, more importantly, to provide the processing power for the PlayStation 2 to challenge cheap PCs as the entry-level device of choice for home access to the Web. PlayStation 2 is equipped with the 295-MHz MIPS-based Emotion Engine, Sony's own CPU designed with Toshiba Corp., a 147-MHz graphics processor that renders 75 million pixels per second, a DVD player, an IEEE 1394 serial connection, and two USB ports. Sony will use DVD discs for game titles and gives consumers the option of using the product for gaming, DVD movie playing, and eventually Web surfing (PC World, 2000).

Soon, instead of catching up on the news via radio or a newspaper on the way to work, commuters may be watching it on a handheld computer or cell phone. In early January this year, Toshiba America Electronic Components announced its TC35273XB chip. The chip has 12 Mb of integrated memory and an encoder and decoder for MPEG-4, an audio-video compression standard. According to Toshiba, the integrated memory is what sets this chip apart from others: with integrated memory the chip consumes less power, making it a good fit for portable gadgets. The chip is designed specifically to address battery life, which can be very short with portable devices. It has a RISC processor at its core, running at a clock speed of 70 MHz (PC World, 2000). Toshiba anticipates that samples of this chip will be released to manufacturers in the second quarter, with mass production to follow in the third quarter. Shortly after that, new handheld computers and cell phones using the chip and offering streaming media are expected (CNET news).

CNET news reported that in February this year, IBM started a program to use the Internet to speed custom-chip design, bolstering its unit that makes semiconductors for other companies. IBM, one of the biggest makers of application-specific chips, would set up a system in which chip designs are placed in a secure environment on the Web, where a customer's design team and IBM engineers can collaborate on the blueprints and make changes in real time. Designing custom chips, which are used to provide unique features that standard processors don't offer, requires time-consuming exchanges of details between the clients who provide a basic framework and the IBM employees who do the back-end work. Using the Internet will speed the process and make plans more accurate. Since IBM's customers ask for better turnaround time and greater satisfaction, IBM figures this is one way to deliver both. As a pilot, the service was to be offered to a select set of customers initially, and later extended to customers who design so-called system-on-a-chip devices that combine several functions on one chip (CNET news).

A new microprocessor unveiled in February 2000 by Japan's NEC offers high-capacity performance while consuming only small amounts of power, making it ideal for use in mobile devices. This prototype could serve as the model for future mobile processors. The MP98 processor contains four microprocessors on the same chip that work together in such a way that they can be switched on and off depending on the job at hand. For example, a single processor can handle easy jobs, such as data entry through a keypad, while more can be brought online as the task demands, with all four working on tasks such as processing video. This gives designers of portable devices the best of both worlds: low power consumption and high capacity (PC World, 2000).
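The article does not describe how NEC's scheduler actually decides when to wake a core, but the general idea is easy to sketch. In the toy Python model below, the thresholds, the per-core power figure, and the task names are all invented for illustration; only the "scale active cores with workload" principle comes from the text.

    NUM_CORES = 4
    ACTIVE_MILLIWATTS = 250   # hypothetical per-core draw, not an NEC figure

    def cores_needed(load):
        # Map a workload estimate in [0.0, 1.0] to an active-core count (1..4).
        return max(1, min(NUM_CORES, round(load * NUM_CORES)))

    for task, load in [("keypad data entry", 0.1),
                       ("MP3 playback", 0.4),
                       ("video processing", 1.0)]:
        n = cores_needed(load)
        print(f"{task}: {n} core(s) active, ~{n * ACTIVE_MILLIWATTS} mW")

Light tasks keep a single core powered, while a demanding task like video brings all four online, which is exactly the trade-off the paragraph above describes.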
However, it should be noted that the idea of putting several processors together on a single chip is not new, as both IBM and Sun Microsystems have developed similar devices. The difference is that the MP98 is the first working example of a "fine-grained" device that offers better performance. Commercial products based on this technology are likely to be seen around 2003 (PC World, 2000).

PC World also reported that, last September, a Japanese dentist received U.S. and Japanese patents for a method of planting a microchip into a false tooth. The one-chip microprocessor embedded in a plate denture can be detected using a radio transmitter-receiver, allowing its owner to be identified. This is useful in senior citizens' homes, where all dentures are usually collected from their owners after meals, washed together, and returned. In such a case, it is important to identify all the dentures and give them back to their correct owners without any mistake (PC World, 2000).

In March this year, Advanced Micro Devices (AMD) launched its 1.3-GHz Athlon processor. Tests on this processor indicated that its speed surpassed Intel's 1.5-GHz Pentium 4. The Athlon processor has a 266-MHz front-side bus that works with systems that use 266-MHz memory. The price starts from $2,988 (PC World, 2001).

Intel's Pentium 4, which was launched in late 2000, is designed to provide blazing speed, especially in handling multimedia content. Dubbed the Intel NetBurst micro-architecture, it is designed to speed up applications that send data in bursts, such as streaming media, MP3 playback, and video compression. Even before the dust had settled on NetBurst, Intel released its much-awaited 1.7-GHz Pentium 4 processor on Monday, April 23, said to be the company's highest-performance microprocessor for desktops, currently priced at $325 in 1,000-unit quantities. The vice president and general manager of Intel was quoted as saying, "the Pentium 4 processor is destined to become the center of the digital world. Whether encoding video and MP3 files, doing financial analysis, or experiencing the latest internet technologies—the Pentium 4 processor is designed to meet the needs of all users" (PC World, 2001).

Over thirty years ago, Gordon Moore, co-founder of Intel, observed that the number of transistors that can be placed on a silicon chip would double every two years, and Intel maintains that this has held true since the release of its first processor, the 4004, in 1971. The competition between Intel and AMD to produce the fastest and smallest processor continues. In fact, Intel predicts that PC chips will climb to more than 10 GHz from today's 1-GHz standard by the year 2011. However, researchers are paying increasing attention to software, because new generations of software, especially computing-intensive user interfaces, will call for processors with expanded capabilities and performance.

Tuesday, May 5, 2020

It seems that there is an ever-increasing trend in Essay

It seems that there is an ever-increasing trend in our society: big corporations are becoming more and more influential in our lives. As they gain more and more muscle in our government, they also invade our schools and many other facets of our lives. Perhaps the most disturbing area of potential influence, however, is corporate control of the media. Can the American media uphold its values of a free press under pressure from big corporations? Can they continue to present the absolute truth? The simple answer, in my opinion, is no. The movie The Insider provides us with an excellent case to back that point of view.

Perhaps one of the biggest stories of this decade has been the tobacco industry. We saw them stand before Congress and tell the world that cigarettes were not addictive. The industry was able to "lawyer" its way out of trouble time and time again. They essentially used legal maneuvers, and certainly money, to keep the truth from the American people. Finally, we saw all that come to an end.

When Jeff Wigand decided it was time to tell the truth, he put everything he valued at risk. He stood to lose his family, any chance at a job, and quite possibly his life. He knew all these things and still he went on, because he thought he could make a difference. He knew that his testimony would never be heard in a court of law, so where could he turn? The answer: the fourth and fifth estates, the press and television.

Every night millions of Americans sit down and watch the nightly news or read the paper. We know that we will be told all the day's news, that we will be educated about what is happening in the world around us. We also know that we will be updated on issues that we care about as individuals and as a society. Another delivery mechanism for information is television magazine shows like 60 Minutes. People know that when Mike Wallace talks to them, they should listen. They can also look at his reputation and know that he is telling the truth. Wigand put faith in that fact.

Wigand agreed to do an interview with 60 Minutes because he knew that people would listen. He knew that the absolute truth would finally be out in the open, and that it would come from a source that people would believe. He risked everything because he had faith in the media and journalists. What happened next is, quite frankly, disgraceful.

When the tobacco industry, specifically the company BW, learned that CBS intended to air the interview, they began to lean on the CBS corporate office. They threatened lawsuits that could quite possibly have ended with BW owning CBS. There were other factors as well. Westinghouse was about to purchase CBS, which meant that corporate managers stood to make lots of money. A lawsuit with BW could easily have made Westinghouse pull out of the deal, and people would lose money. Essentially, the whole situation came down to money.

CBS News decided not to air the story. It is quite obvious that they did not make that decision based upon any journalistic issues. They were being leaned on by the corporate office, which was looking at dollar signs. CBS News was setting aside the truth for money, something it never should have even considered doing. Thank goodness that Lowell Bergman was there to stop the lunacy. He correctly pointed out that the CBS corporate office had no right to tell CBS News what stories it could and could not air. The truth is the truth, no matter who it damages.
Bergman embarked on a crusade to see that the whole story was aired. Eventually CBS did air the entire interview, but they only did so after receiving sharp criticism in The New York Times. The Washington Post also showed that the smear campaign CBS was using as justification for not airing the story was nothing more than trumped-up charges. CBS was left looking quite nasty, and decided to show the interview. The whole point is that business has no right to decide what is news. They have no right to come and stop a