spell IT


Disclaimer

To be published in this magazine, articles must be submitted in electronic format and approved by a member of the editorial team; from that moment on, the author automatically assumes responsibility for the originality of the article's content.

Issue 3 - school year 2011 - 2012

Index
  1. The best recommendation for pro-ductivity
    eng. Constantin Avramescu
  2. The size of the Universe
    prof. Carmen Avramescu
  3. A Different Week
    prof. Carmen Avramescu
  4. Tree selection sort
    prof. Neli Secita
  5. Tables in HTML
    prof. Neli Secita
  6. ETHERNET. Definition & History
    prof. Carmen Avramescu
  7. A study on introducing IT tools in the assessment of students with special needs
    prof. Irimia Crisalinda Nona
  8. Origins of the Internet
    prof. Stefan Ferent
  9. School Reform in the Information Age
    prof. Stefan Ferent
  10. Computer Evolution
    prof. Stefan Ferent
  11. Light and darkness - contrasting physical and metaphysical realities?
    professor, engineer Teodora Palaghia
  12. SORTING ALGORITHMS USING THE DIVIDE ET IMPERA (DIVIDE AND CONQUER) METHOD - a study
    prof. Neli Secita

The best recommendation for pro-ductivity

eng. Constantin Avramescu

Let's be e-fficient!

Every guide to working more efficiently tells us to remove or turn off as many sources of distraction as possible.

        When working at a computer, this is not always as easy as it first seems.

        We are told to separate actual work from the entertainment we can access during breaks (Messenger, Facebook, Twitter, etc.). One of the most frequently recommended methods, and one backed by studies, is to use two or more monitors connected to the same computer. But for many of us it would be easier to adopt this method at home than to ask the boss for a second monitor and a suitable video card, wouldn't it?

        A cheap alternative to the above is to use several virtual desktops (some Linux distributions have shipped with multi-desktop enabled by default for about ten years now). Here I recommend the free application Dexpot; it also has a portable version you can carry with you on a USB stick.

Since we can no longer live without the internet, browsers have become some of the most important programs on our computers / phones / tablets / watches, glasses, wallets, or whatever inter-connected thing appears next.

But the best recommendation I have found recently is to use two (or more) browsers simultaneously. In one of them, stay logged in to social networks and IM accounts (webmessenger.yahoo.com, meebo.com, etc.), play online games, read breaking or even celebrity news, receive RSS feeds, chat, and so on; in the second one, ONLY WORK. The pair of browsers will, of course, differ from case to case and with taste and habit, but let me offer a few recommendations anyway:

Google Chrome - if your work is mostly office-type (lots of email on gmail.com, documents in Google Docs, events and tasks synchronized across several devices)

Safari - for research and documentation work on the internet

Firefox - for web design, SEO and, thanks to its multitude of plugins, for virtually any activity, provided it is well customized for that activity.

Opera - I rarely use it, but I am amazed every time by how quickly it loads some pages.

Internet Explorer (7, 8, ...) - I recommend it only for ANAF + e-licitatii ;)

        There are other browsers "specialized" in certain areas, but this article already seems too long to be read online; as for printing it, I really do not recommend that.

        Three more reasons to use two or more browsers:

- do not log in to your e-banking accounts from the same browser you use to click on unsafe links

- do not stay signed in to your Google account while browsing the internet to search for and collect information on a given topic (we need not be paranoid about what Google knows about us, but it is clear that Google "serves" results and ads tailored to each of us so that they are more easily clicked)

- when a well-known site does not load, or loads slowly, try it in a second browser as well.

 

Happy working!

 

Source: http://www.helpware.ro/blog/2012/04/01/cea-mai-buna-recomandare-in-materie-de-pro-ductivitate

The size of the Universe

prof. Carmen Avramescu

recommended links

I warmly recommend this animation, which presents, in an intuitive and attractive way, the size of the universe we live in, both at the macro and at the micro scale relative to us.

 

http://apod.nasa.gov/apod/image/1203/scaleofuniverse_huang.swf?bordercolor=white

 

A Different Week

prof. Carmen Avramescu

m.socrative.com

Socrative is a smart student response system that lets teachers enrich their lessons with a series of educational exercises and games delivered via smartphones, laptops or tablets.

Fences

Fences is a Windows application that organizes your desktop icons into sections you can name and move as you wish. It is extremely handy for keeping your desktop clean and well organized, and on top of that it looks great too!

Open Source

http://opensourcewindows.org/

http://portableapps.com/

http://www.makeuseof.com/tag/7-free-open-source-programs-replace-commercial-windows-software/

http://downloadpedia.org/Open_Source_Windows

http://osliving.com/

Tree selection sort

prof. Neli Secita

Another sorting method: tree selection sort

The principles of tree selection sort are easy to understand if we think of the knockout rounds of a tennis tournament.

o-alta-metoda-de-sortare-prin-selectie-arborescenta Size 348.1 kB (application/pdf)
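A minimal sketch of the tournament idea described above (this is my own illustration, not code from the attached PDF): pairs of values play "matches" (comparisons), the overall winner of the bracket is the minimum, and after extracting it only the matches along its path to the root need to be replayed.

```python
import math

def tournament_sort(items):
    """Tree (tournament) selection sort: a knockout bracket of comparisons."""
    n = len(items)
    if n == 0:
        return []
    # Round the number of leaves up to a power of two; pad with +infinity
    # so the padding can never win a "match".
    size = 1
    while size < n:
        size *= 2
    leaves = list(items) + [math.inf] * (size - n)
    # tree[i] stores the index (into `leaves`) of the winner of subtree i.
    # Node i has children 2*i and 2*i+1; leaf slots start at index `size`.
    tree = [0] * (2 * size)
    for i in range(size):
        tree[size + i] = i
    for i in range(size - 1, 0, -1):          # play the initial matches
        l, r = tree[2 * i], tree[2 * i + 1]
        tree[i] = l if leaves[l] <= leaves[r] else r

    result = []
    for _ in range(n):
        w = tree[1]                # the bracket's champion = current minimum
        result.append(leaves[w])
        leaves[w] = math.inf       # eliminate the extracted winner
        node = (size + w) // 2     # replay only the matches on its path
        while node >= 1:
            l, r = tree[2 * node], tree[2 * node + 1]
            tree[node] = l if leaves[l] <= leaves[r] else r
            node //= 2
    return result
```

Each extraction replays only about log2(n) matches, so the whole sort costs O(n log n) comparisons, compared with O(n^2) for plain selection sort.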

Tables in HTML

prof. Neli Secita

ETHERNET. Definition & History

prof. Carmen Avramescu

Definition

 

Most people associate Ethernet with a simple cable that has phone-like connectors at its ends. But is that really so? Ethernet is much more than that! Ethernet is the foundation of global Internet access and, without a doubt, the most widely used connectivity technology in the world.

"Ethernet is among the first technologies to have an impact on the daily life of people across the globe," states the IEEE website.

Almost every reference to a "network", a "LAN", a "LAN connection" or a "network card" implies Ethernet. Defined by the IEEE as the 802.3 standard, the Ethernet access method is used to connect computers in a company or home network, as well as to connect a single computer to a cable or DSL modem for Internet access.

We can also view Ethernet as the most popular family of protocols and cabling schemes for local area networks (LANs), used to connect computers, printers, modems and so on, or as the global standard for wiring multiple computers into a network.

Ethernet is therefore the most widely used local area network (LAN) technology.

Ethernet is a frame-based networking technology for local area networks (LANs). It defines the cabling and signaling for the physical layer, and the frame formats and protocols for the media access control (MAC) / data link layer of the OSI model. Ethernet is standardized for the most part as IEEE 802.3. It has been the most widespread LAN technology from the 1990s to the present and has largely replaced all other LAN standards, such as Token Ring, FDDI and ATM.

There are many reasons for Ethernet's success. First, Ethernet was the first widely deployed high-speed LAN technology. Because it was deployed early, network administrators became thoroughly familiar with it and were reluctant to switch to other LAN technologies when those appeared. Second, Token Ring, FDDI and ATM are more complex and more expensive than Ethernet, which discouraged administrators further. Third, the most compelling reason to switch to another LAN technology (such as FDDI or ATM) was usually that technology's higher transfer rate; yet Ethernet always fought back, producing new versions that ran at equal or higher rates. Switched Ethernet, introduced in the early 1990s, increased its effective data rates even further. Finally, Ethernet's very popularity kept it cheap: Ethernet hardware (network cards in particular) became a commodity found on almost every personal computer. The low cost is also due to the fact that Ethernet's multiple access protocol, CSMA/CD, is completely decentralized.

Systems that communicate over Ethernet split a stream of data into smaller pieces called frames. Each frame contains the source and destination addresses and an error-checking code, so that damaged frames can be detected and retransmitted. In terms of the OSI model, Ethernet provides services up to and including the data link layer.
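To make the frame layout concrete, here is a small Python sketch (the function name and the sample frame are illustrative, not part of the standard text) that unpacks the 14-byte header of an Ethernet II frame; the 4-byte CRC-32 error check mentioned above sits at the very end of the frame:

```python
import struct

def parse_ethernet_header(frame: bytes) -> dict:
    # An Ethernet II frame begins with a 14-byte header:
    #   6 bytes destination MAC, 6 bytes source MAC, 2 bytes EtherType,
    # followed by the payload; the 4-byte CRC-32 frame check sequence
    # (the error-checking code) closes the frame.
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])

    def mac(b: bytes) -> str:
        return ":".join(f"{octet:02x}" for octet in b)

    return {"dst": mac(dst), "src": mac(src), "ethertype": hex(ethertype)}

# A made-up broadcast frame carrying an IPv4 payload (EtherType 0x0800):
sample = bytes.fromhex("ffffffffffff" "001122334455" "0800") + b"payload..."
```

Calling `parse_ethernet_header(sample)` would report the broadcast destination `ff:ff:ff:ff:ff:ff`, the (invented) source `00:11:22:33:44:55`, and EtherType `0x800`.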

History

Ethernet was originally developed as one of the pioneering projects at Xerox PARC.

Robert Metcalfe was a member of the research staff at Xerox's Palo Alto Research Center (PARC). He was asked to build a networking system to interconnect the Xerox Alto computers at PARC, workstations with a graphical user interface. Xerox wanted this network because, at the same time, it was building the world's first laser printers and wanted all of PARC's computers to be able to print to those few printers.

Metcalfe faced two challenges: the network had to be fast enough to drive the very fast laser printers, and it had to connect hundreds of computers within the same building. Never before had there been so many computers in one building; at the time, nobody had more than one, two or perhaps three computers in operation.

The press has often claimed that Ethernet was invented on May 22, 1973, when Robert Metcalfe wrote a memo to his bosses at PARC about Ethernet's potential. Metcalfe maintains that Ethernet was in fact invented over a period of several years. In 1976, Metcalfe and David Boggs (Metcalfe's assistant) published a paper entitled "Ethernet: Distributed Packet Switching for Local Computer Networks".

Metcalfe named his first experimental network the Alto Aloha Network. In 1973 he changed the name to "Ethernet", to make clear that the system could support any computer, not just the Alto, and to emphasize that his new network mechanisms had evolved well beyond the Aloha system (a computer network developed at the University of Hawaii in the early 1970s, in which data packets were transmitted over radio waves). Metcalfe replaced radio transmission with transmission over a thick coaxial cable. He anticipated that other transmission media could also be used in the future. "So we didn't call it CoaxNet. We called it Ethernet, because the ether (a reference to the luminiferous aether through which 19th-century physicists believed light traveled) could be coax, twisted pair, radio, optical cable, powerline, or whatever you want," said Metcalfe.

The famous drawing below is Metcalfe's original sketch of his Ethernet. It included an interface cable connecting the Ethernet adapter (interface) to an external transceiver.

A study on introducing IT tools in the assessment of students with special needs

prof. Irimia Crisalinda Nona

overview

The realities and dynamics of social, economic and cultural life have forced broad changes in the design and implementation of educational policies and strategies. The goals of modernizing and refining teaching methodology point toward strengthening the active, participatory character of learning methods and toward applying methods with a pronounced formative character that make use of the new instructional technologies: methods that involve the child directly in the learning process, stimulating creativity, interest in the new and the development of thinking, and thereby contributing significantly to developing the child's full potential. Methods are the teaching instruments with which the teacher leads those in the classroom toward actively assimilating new knowledge and forms of behavior, while stimulating and developing their cognitive and intellectual abilities. In a more modern reading, a method is a path the teacher follows in order to help students find, on their own, the way to rediscover truths for themselves. One such method is project-based teaching.

studiu Size 404.2 kB (application/pdf)

Origins of the Internet

prof. Stefan Ferent

article in English

The first recorded description of the social interactions that could be enabled through networking was a series of memos written by J.C.R. Licklider of MIT in August 1962 discussing his "Galactic Network" concept. He envisioned a globally interconnected set of computers through which everyone could quickly access data and programs from any site. In spirit, the concept was very much like the Internet of today. Licklider was the first head of the computer research program at DARPA, starting in October 1962. While at DARPA he convinced his successors at DARPA, Ivan Sutherland, Bob Taylor, and MIT researcher Lawrence G. Roberts, of the importance of this networking concept.
Leonard Kleinrock at MIT published the first paper on packet switching theory in July 1961 and the first book on the subject in 1964. Kleinrock convinced Roberts of the theoretical feasibility of communications using packets rather than circuits, which was a major step along the path towards computer networking. The other key step was to make the computers talk together. To explore this, in 1965 working with Thomas Merrill, Roberts connected the TX-2 computer in Mass. to the Q-32 in California with a low speed dial-up telephone line creating the first (however small) wide-area computer network ever built. The result of this experiment was the realization that the time-shared computers could work well together, running programs and retrieving data as necessary on the remote machine, but that the circuit switched telephone system was totally inadequate for the job. Kleinrock's conviction of the need for packet switching was confirmed.
In late 1966 Roberts went to DARPA to develop the computer network concept and quickly put together his plan for the "ARPANET", publishing it in 1967. At the conference where he presented the paper, there was also a paper on a packet network concept from the UK by Donald Davies and Roger Scantlebury of NPL. Scantlebury told Roberts about the NPL work as well as that of Paul Baran and others at RAND. The RAND group had written a paper on packet switching networks for secure voice in the military in 1964. It happened that the work at MIT (1961-1967), at RAND (1962-1965), and at NPL (1964-1967) had all proceeded in parallel without any of the researchers knowing about the other work. The word "packet" was adopted from the work at NPL and the proposed line speed to be used in the ARPANET design was upgraded from 2.4 kbps to 50 kbps.
In August 1968, after Roberts and the DARPA funded community had refined the overall structure and specifications for the ARPANET, an RFQ was released by DARPA for the development of one of the key components, the packet switches called Interface Message Processors (IMP's). The RFQ was won in December 1968 by a group headed by Frank Heart at Bolt Beranek and Newman (BBN). As the BBN team worked on the IMP's with Bob Kahn playing a major role in the overall ARPANET architectural design, the network topology and economics were designed and optimized by Roberts working with Howard Frank and his team at Network Analysis Corporation, and the network measurement system was prepared by Kleinrock's team at UCLA.
Due to Kleinrock's early development of packet switching theory and his focus on analysis, design and measurement, his Network Measurement Center at UCLA was selected to be the first node on the ARPANET. All this came together in September 1969 when BBN installed the first IMP at UCLA and the first host computer was connected. Doug Engelbart's project on "Augmentation of Human Intellect" (which included NLS, an early hypertext system) at Stanford Research Institute (SRI) provided a second node. SRI supported the Network Information Center, led by Elizabeth (Jake) Feinler and including functions such as maintaining tables of host name to address mapping as well as a directory of the RFC's.
One month later, when SRI was connected to the ARPANET, the first host-to-host message was sent from Kleinrock's laboratory to SRI. Two more nodes were added at UC Santa Barbara and University of Utah. These last two nodes incorporated application visualization projects, with Glen Culler and Burton Fried at UCSB investigating methods for display of mathematical functions using storage displays to deal with the problem of refresh over the net, and Robert Taylor and Ivan Sutherland at Utah investigating methods of 3-D representations over the net. Thus, by the end of 1969, four host computers were connected together into the initial ARPANET, and the budding Internet was off the ground. Even at this early stage, it should be noted that the networking research incorporated both work on the underlying network and work on how to utilize the network. This tradition continues to this day.
Computers were added quickly to the ARPANET during the following years, and work proceeded on completing a functionally complete Host-to-Host protocol and other network software. In December 1970 the Network Working Group (NWG) working under S. Crocker finished the initial ARPANET Host-to-Host protocol, called the Network Control Protocol (NCP). As the ARPANET sites completed implementing NCP during the period 1971-1972, the network users finally could begin to develop applications.
In October 1972, Kahn organized a large, very successful demonstration of the ARPANET at the International Computer Communication Conference (ICCC). This was the first public demonstration of this new network technology to the public. It was also in 1972 that the initial "hot" application, electronic mail, was introduced. In March Ray Tomlinson at BBN wrote the basic email message send and read software, motivated by the need of the ARPANET developers for an easy coordination mechanism. In July, Roberts expanded its utility by writing the first email utility program to list, selectively read, file, forward, and respond to messages. From there email took off as the largest network application for over a decade. This was a harbinger of the kind of activity we see on the World Wide Web today, namely, the enormous growth of all kinds of "people-to-people" traffic.

School Reform in the Information Age

prof. Stefan Ferent

Technology and Schooling

    Technology has always been an important part of schooling in America, but until recently the technology employed was rather simple and changed slowly. No one reading this article can remember when there were no textbooks, but the kind of textbooks we have today are largely products of the 20th century. Nor did teachers always have their primary tools – the blackboard and chalk. Slate blackboards did not appear in urban schools until the 1830s.
    When I was a young boy, one of the rituals at the start of the school year was a trip to the local department store to purchase school supplies: a “Big Chief” tablet, pencils, rubber erasers, pens with removable points (they became dull quickly), and a bottle of ink. Sometimes a pencil box would be added so that I could keep track of my personal supplies. Parents and students today go through similar shopping rituals each year. The technology has changed somewhat (ball-point pens have replaced ink and straight pens, pencil boxes have given way to backpacks), but it is essentially the same.
    There have been many attempts to change the technology of schooling. They have each appeared with great fanfare and expressions of optimism by advocates. In the 1920s, radio was expected to have a major impact on schools; in the 1930s, it was to be film; in the 1950s, television; and in the 1960s, teaching machines. The one piece of new technology from those bygone years that truly found a place was the overhead projector. Introduced in the 1940s by the military, it gradually found its way into the schools. The overhead projector is easy to use and relatively inexpensive, it permits the teacher to prepare notes in advance of class and to project them onto the screen for all to see, and it can be used without darkening the room or turning one’s back to the students. In many ways it is the perfect technology for supporting the kind of instruction that takes place in most classrooms today.
    More advanced technology has hit the schools at about the same time as have ideas for school restructuring and findings from the cognitive sciences. According to Karen Sheingold, “The successful transformation of student learning and accomplishment in the next decade requires effectively bringing together three agendas – an emerging consensus about learning and teaching, well-integrated uses of technology, and restructuring. Each agenda alone presents possibilities for educational redesign of a very powerful sort. Yet none has realized or is likely to realize its potential in the absence of the other two.” I agree.
    Skeptics will argue that we are merely going through another cycle of reform. School reforms come almost every decade; the schools absorb as many of the new ideas as they want and reject the rest. The result is that schools change very little where it truly counts – in the classroom. But the synergy of school restructuring, new forms of learning and teaching, and new technology will make the difference this time.
    The forces driving the Information Age seem irresistible. It is impossible both to participate fully in the culture and yet resist its defining features. Thus, if the schools are an “immovable object” (and I don’t believe they are), they are beginning to meet the “irresistible force” – Information Age technology.
    The analogy I carry in my head is that of a volcano erupting in Hawaii, spewing forth ash and lava. We have all seen pictures of such eruptions and what follows. The lava slowly oozes its way down the mountain toward the sea. No device or structure raised by human beings can block it. It either consumes all obstacles in fire or rolls over them. Finally, the lava reaches the sea – nature’s immovable object. Throughout the process there is a lot of noise, smoke, and steam that can distract one’s attention from the fundamental process that is taking place: the transformation of the landscape. In the most dramatic cases, entirely new islands appear. A volcanic eruption changes the environment in unpredictable ways; it is also irresistible.
    Information Age technology is like that volcano. It is changing the landscape of American culture in ways we either take for granted or scarcely notice. There are holdouts. Many of us see no need for placing telephones in our cars or buying mobile telephones. Some believe that television is a corrupting influence and refuse to have a set in their homes. I know such people; I am largely sympathetic to their views. But most people who think television can be corrosive buy one anyway and try to control its use.
    I cannot predict how schools will accommodate themselves to the force of computers and other electronic technologies. Some schools will move more quickly than others; some teachers will not change at all. The process may be slow enough that many teachers will be able to retire before they are forced to change. Some will quit teaching, and it is likely that some will remain anachronisms in a greatly altered school environment – antiques of a sort, surrounded by modernity but refusing even to use the telephones in their classrooms.
    But schools will change! I don’t know whether teachers will use the new technologies in the ways constructivists anticipate; other reformers have urged teachers to adopt similar progressive ideas in the past with mostly negative results. Perhaps technology will support constructivist approaches and make learner-centered instruction a practice as well as a theory this time. I don’t know whether schools will have site-based management or some other kind of organizational structure. Other theories of learning and school organization will certainly appear. The exact shape of future schools is unclear, but of this I am certain: schools will be unable to resist the new technology. The new technology will be used in schools because it appeals to students and may enhance learning and because the schools can offer no reasonable defense for rejecting it.
    The use of the new technologies will have a profound effect on schools. The very relationship between students and teachers will be challenged because the technologies enable learners to gain control of their own learning. In the past, schools have been places where people in authority decided what would be taught (and possibly learned), at what age, and in what sequence. They also decided what would not be taught – what would not be approved knowledge. The new technologies provide students access to information that was once under the control of teachers.
    Years ago, as a high school teacher, I received a note from a colleague who was teaching a course in American history for the first time. He had given students reading assignments from one set of books while he turned to other books as sources for his lectures. The note said, “The game is up. The students know where I am getting my information.” That is happening everywhere today, and the game is truly up. No teacher can compete with the power and the capability of the new technology as a presenter of information. If teachers and schools try to sustain that role, they will be whipped. On the other hand, no teachers will be replaced by a machine unless they attempt to do only what the machine can do better.
    It may be that the technology will be used most extensively first by privately financed schools, such as Sylvan Learning Systems, Kaplan Educational Centers, or the schools of the Edison Project. Privately financed schools that successfully demonstrate the value of technology may provide the incentive to persuade public institutions of the instructional value of technology. Perhaps public schools that employ the new technologies successfully in restructured environments will begin as magnet schools or even charter schools; if they succeed, then the use of technology may spread to the remainder of the schools in a district. Possibly the technological challenge to public education will come from home schooling, when parents discover that through technology they not only retain the current advantage of home schooling but also gain access to the academic resources of the public schools and of the world.
    The genie is out of the bottle. It is no longer necessary to learn about the American War of Independence by sitting in Mrs. Smith’s classroom and hearing her version of it. There are more powerful and efficient ways to learn about the Revolutionary War, and they are all potentially under the control of the learner. Either schools will come to terms with this fact, or schools will be ignored.
    It has never been easy for schools to change, and it is not going to be easy now. The current reform effort has been compared to changing a tire on a car that is continuing to speed down the highway. The job is actually much harder than that, because it is not repair but transformation that is required. It is more akin to changing a car into an airplane while continuing to drive the car. We are asking schools to become something different, without a clear picture of what the new institution should look like, even as we continue to satisfy the public that the old purposes of schooling are being served as well as or better than in the past.
Availability and Use of Technology in Schools Today
    No one knows for certain what kind of technology exists in schools, how it is used, how much it is used, whether what exists is actually available to teachers, and whether what exists is broken, worn-out, or still in unopened boxes. It is hard enough to maintain an up-to-date inventory within a given school district without trying to do the same for the nation. Various individuals and organizations have conducted surveys on technology use, and these provide some clues as to the situation generally.
    Computers. We know that the number of computers in schools has grown enormously since 1983. At that time it was estimated that there were fewer than 50,000 computers in the nation’s schools; by 1994 the estimate was revised to 5.5 million. In 1981 only about 18% of schools had one or more computers for instructional use; by 1994 this figure had risen to 98%. There is hardly a school in America today without at least one computer.
    These figures tell us very little about student access to computers, however. In 1985 the median number of computers in K-6 elementary schools that used computers was three; that number rose to about 18 in 1989. In high schools for the same two years the numbers were 16 and 39 respectively. By 1994 the ratio of students to computers across all grades was 14 to 1. Thus, while there has been rapid growth in the number of computers in each school, the opportunity for a typical student to have access to a computer is still limited. For example, as late as 1989 a student might have had access to a computer for one hour per week – about 4% of instructional time.
    A second issue concerns the location of computers and how they are used. The most common pattern in schools is to cluster 20 or so machines in a single laboratory and then to schedule classes for time in the lab once a week. A decade ago computers were used mainly to teach programming, to teach about computers (computer literacy), and to run drill-and-practice exercises. More recently, computers have been used for enrichment, as work tools, and – less frequently – for purposes of computer literacy. However, computers in elementary schools continue to be used heavily to teach basic skills, and this pattern is growing in high schools. Federal funds for at-risk children have been a major source of school funding for computers, so it is hardly surprising that schools rely on them primarily for teaching basic skills and for remedial instruction. The use of computers to support instruction in the academic areas or to allow students independent exploration is sharply limited. Indeed, many American students have more access to a computer at home than at school.
    Video. Video use in schools seems to be growing and taking different forms. Instructional television, in which a program is broadcast to schools at scheduled times during the day from a state-operated or district-run studio, continues to exist, but it is not as significant as in the past. Many of these broadcasts were developed nationally through a consortium led by the Agency for Instructional Technology. The programs were designed to fit the school curriculum as determined by the state departments of education that were the most prominent consortium members.
    As a result of federal financing through the Star Schools program, many schools are able to use courses delivered nationwide by satellite and originating from a single source at a predetermined time. These programs typically feature courses that are difficult for small schools to offer on their own, e.g., courses in German or Japanese or advanced courses in mathematics and the sciences. Rural schools in particular have taken advantage of these offerings; about one-third of all rural schools have the capability of receiving satellite broadcasts.
    The Corporation for Public Broadcasting is developing new programming for schools, and the Learning Channel and the Discovery Channel both provide programs that offer useful information for schools.
    As a result of this proliferation of educational programming, the VCR has become a nearly ubiquitous piece of school technology. Virtually every school in the United States has at least one, and many teachers routinely collect tapes to use with their classes. Because it is more flexible and user-friendly, videotape has taken the place of film for instruction.
    CD-ROM and videodiscs offer other ways for schools to employ video. The use of these media, while still limited, is growing rapidly. According to Quality Education Data, Inc., 26% of all school districts had videodisc technology in 1994, as compared to 18% in 1992-93.
    Results. It would be wonderful if we could point to specific data that would demonstrate conclusively that the use of one technology or approach produced better results than the use of some other technology or approach. Alas, the problem is not so simple.
    First, the existence of a particular technology does not prescribe the way in which it will be used. Yet how a technology is actually used is critically important. One English teacher might use computers mainly for drill on grammar and spelling, while another English teacher might allow students to use the computers for word processing.
    Much of the evaluation research on media use is based on a specific intervention and focuses on short-term results. It seeks to determine, for example, whether the students receiving computer-assisted instruction (CAI) perform better than do those in a control group. In studies of this kind, the experimental group nearly always wins, but seldom does the investigator study the two groups a year or two later to find out if the gain has survived. Studies of short-term results, though interesting, are of marginal value to policy makers.
    What we need are studies of an altogether different order. When students and teachers are immersed in technology over time, will we detect changes in how students learn and how teachers teach? While it may be important to see some gain on a particular test, those who are trying to reform schools have larger goals in mind. Before we spend billions of dollars to equip every student with a computer at home and one at school and before we spend millions to equip teachers and to provide them with the necessary training, we need to know whether such a colossal investment of public funds makes sense.

    The Future of Technology in the Schools

    Thus far I have focused on the technology available to schools today. What about the future? We are only at the threshold of the Information Age. Tools we now treat as technical marvels will seem primitive in five years. Commodore PETs, IBM PCjrs, and the first Apple machines are throwaway items today. We can predict with certainty that technology will become faster, cheaper, more powerful, and easier to use. We can also predict that new devices that we can scarcely imagine today will be on the market before the end of this decade. Schools that expect to invest in a single computer system and then forget about technology purchases for several years will be surprised and disappointed. Schools must make decisions regarding additions and/or upgrades to their technology every year, in line with their own strategic plans.
    Without going into detail regarding specific pieces of hardware, I can say with confidence that schools should expect more integration, interaction, and intelligence from future technology. In their early days in school, computers and video were regarded as separate entities, and it was assumed they would stay that way. In fact, we can expect a continuing integration of these technologies. Voice, data, and images will be brought together into one package. One current example of this process is desktop video. In a single, relatively inexpensive unit, one has telephone (voice), computer (data storage and manipulation), and video (sending and receiving moving images) capabilities. Those who use the machine can talk to people at a distance, exchange documents, work collaboratively, and even see their collaborators on screen.
    Technology will also become more interactive. In the field of distance learning, rather than rely strictly on one-way video and two-way audio communication, teachers and students will see one another simultaneously, thereby making distance learning more like face-to-face classroom interaction. Computer-based instruction will also be designed to respond to learners’ interests and abilities, giving them greater control over what they need to learn and the pace at which they learn it. And computer searches, which can now be bewildering to the casual user, will become easier and more responsive to what a user needs. Greater interactivity will make instructional programs even more powerful than they are today.
 

Computer Evolution

prof. Stefan Ferent

The Fifth Generation and the PC

Fifth Generation (Present and Beyond)
Defining the fifth generation of computers is somewhat difficult because the field is in its infancy. The most famous example of a fifth generation computer is the fictional HAL9000 from Arthur C. Clarke's novel, 2001: A Space Odyssey. HAL performed all of the functions currently envisioned for real-life fifth generation computers. With artificial intelligence, HAL could reason well enough to hold conversations with its human operators, use visual input, and learn from its own experiences. (Unfortunately, HAL was a little too human and had a psychotic breakdown, commandeering a spaceship and killing most humans on board.)
Though the wayward HAL9000 may be far from the reach of real-life computer designers, many of its functions are not. Using recent engineering advances, computers are able to accept spoken word instructions (voice recognition) and imitate human reasoning. The ability to translate a foreign language is also moderately possible with fifth generation computers. This feat seemed a simple objective at first, but appeared much more difficult when programmers realized that human understanding relies as much on context and meaning as it does on the simple translation of words.
Many advances in the science of computer design and technology are coming together to enable the creation of fifth-generation computers. One such engineering advance is parallel processing, which replaces von Neumann's single central processing unit design with a system harnessing the power of many CPUs working as one. Another is superconductor technology, which allows the flow of electricity with little or no resistance, greatly improving the speed of information flow. Computers today have some attributes of fifth generation computers. For example, expert systems assist doctors in making diagnoses by applying the problem-solving steps a doctor might use in assessing a patient's needs. It will take several more years of development before expert systems are in widespread use.
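The contrast between a single von Neumann-style CPU and parallel processing can be sketched with Python's standard multiprocessing module. The workload function and inputs below are invented for illustration; the point is only that the same tasks can be handled one at a time or spread across several processors:

```python
from multiprocessing import Pool

def work(n):
    """A stand-in for an expensive computation on one item."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    tasks = [10_000] * 8

    # Von Neumann style: one CPU processes every task in sequence.
    serial = [work(n) for n in tasks]

    # Parallel style: a pool of worker processes shares the same tasks.
    with Pool(processes=2) as pool:
        parallel = pool.map(work, tasks)

    # Both approaches compute identical results; the pool simply
    # spreads the work across several CPUs.
    assert serial == parallel
```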
Personal Computers History and Development
The personal computer (PC) has revolutionized business and personal activities and even the way people talk and think; however, its development has been less of a revolution than an evolution and convergence of three critical elements - thought, hardware, and software. Although the PC traces its lineage to the mainframe and minicomputers of the 1950s and 1960s, the conventional thought that was prevalent during the first thirty years of the computer age saw no value in a small computer that could be used by individuals.
A PC is a microcomputer, so named because it is smaller than a minicomputer, which in turn is smaller than a mainframe computer. While early mainframes and their peripheral devices often took up the floor space of a house, minicomputers are about the size of a refrigerator and stove. The microcomputer, whose modern development traces back to the early 1970s, fits on a desk.
From the start, the creation of the computer was centered around the concept that a single unit would be used to perform complex calculations with greater speed and accuracy than humans could achieve.
 

 
The PC is Born
In 1975, Rubik's Cube was put on store shelves and proved to many that the human brain was incapable of complex problem solving. But a ray of hope also appeared; the first PC was introduced. Micro Instrumentation and Telemetry Systems, Inc. (MITS) sold a kit for the MITS Altair 8800 that enabled computer hobbyists to assemble their own computers. It had no monitor, no keyboard, no printer, and couldn't store data, but the demand for it, like Rubik's Cube, was overwhelming.
The Altair proved that a PC was both possible and popular, but only with those people who would spend hours in their basements with soldering irons and wire strippers. The Altair, which looked like a control panel for a sprinkler system, didn't last, but it helped launch one of the largest companies in the computer world and gave a couple of young software programmers a start. In 1975, Bill Gates and Paul Allen wrote a version of BASIC for the Altair and started a company called Microsoft Corporation.
In 1976, another computer kit was sold to hobbyists - the Apple I. Stephen Wozniak sold his Volkswagen and Steve Jobs sold his programmable calculator to get enough money to start Apple. In 1977, they introduced the Apple II, a pre-assembled PC with a color monitor, sound, and graphics. It was popular, but everyone knew that a serious computer didn't need any of this. The kits were just a hobby and the Apple II was seen as a toy. Even the Apple name wasn't a serious, corporate sounding name like IBM, Digital Equipment Corporation, or Control Data.
But 1977 also brought competition. The Zilog Z-80 microprocessor, introduced in 1976, was used in the Tandy Radio Shack TRS-80, affectionately called the "Trash 80." Apple, Commodore, and Tandy dominated the PC marketplace. The Apple II had 16K bytes of RAM and 16K bytes of ROM; Commodore Business Machines' Personal Electronic Transactor (PET) included 4K RAM and 14K ROM; and the TRS-80 had 4K RAM and 4K ROM.
Around the same time, the CP/M (Control Program for Microcomputers) operating system, developed by Gary Kildall and his company, Digital Research, came into wide use. From its introduction until 1980, CP/M was used in most PCs, but even that did not guarantee that a program or document could be written on one machine and read on another, because each manufacturer used a different floppy disk format.
Apple introduced the floppy disk drive in 1978, allowing Apple II users to store data on something other than the cumbersome and unreliable tape cassettes that had been used up to that point. But despite the popularity of the three PCs, non-computer people still saw little reason to buy an expensive calculator when there were other ways to do the same things. In 1979, that all changed.
When VisiCalc was introduced for the Apple II, non-computer people suddenly saw a reason to buy a computer. VisiCalc, a spreadsheet program created by Dan Bricklin and Bob Frankston, allowed people to change one number in a budget and watch the effect it had on the entire budget. It was something new and valuable that could only be done with a computer. For thousands of people, the toy, the computer few could find a use for, had been transformed into a device that could actually do something worthwhile.
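VisiCalc's key idea — change one cell and every dependent figure updates — can be sketched in a few lines of Python. The budget labels and formulas here are invented for illustration, and this toy evaluates formulas in definition order rather than building a real dependency graph as a spreadsheet does:

```python
# A tiny "spreadsheet": plain numbers plus formulas over other cells.
formulas = {
    "expenses": lambda c: c["rent"] + c["food"],
    "savings":  lambda c: c["salary"] - c["expenses"],
}

def recalc(cells):
    """Re-evaluate every formula, as a spreadsheet does after each edit.
    Formulas are applied in definition order, so dependencies come first."""
    for name, f in formulas.items():
        cells[name] = f(cells)
    return cells

budget = {"rent": 650, "food": 200, "salary": 1200}
recalc(budget)
print(budget["savings"])   # 350

budget["rent"] = 700       # change one number...
recalc(budget)
print(budget["savings"])   # ...and the whole budget shifts: 300
```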
Microprocessors and high-tech gadgets were gradually worming their way into people's lives. In 1978, Sony introduced the Beta format video cassette recorder, and a year later the VHS video recorder and the Sony Walkman. And to remind everyone of how far we had to go, Star Trek: The Motion Picture came to theaters in 1979.
The Sinclair ZX-80 PC, which hit the market in 1980, used the same Z-80 chip as the Tandy TRS-80. The ZX-80 had 1K RAM and 4K ROM. Developed by British entrepreneur Clive Sinclair, the ZX-80 meant that people could enter the computer revolution for under $200. Its small size and price attracted people who had never thought about owning a PC.
The Commodore VIC-20, also introduced in 1980, had color display capability and would eventually become the first PC to sell more than one million units. Even with all of the success the early PC manufacturers had in the late 1970s and early 1980s, the advances in microprocessor speeds, and the creation of software, the PC was still not seen as a serious business tool. Unknown to everyone in the computer industry, however, a huge oak tree was about to drop an acorn that would fall close to the tree and change everything.
   
Out of the Box and Obsolete
For consumers, the late 1980s were a time of frustration. No sooner had they learned to run their new PCs and Macs than a new, better, larger, faster model was on the shelf. New versions of software, printers, and modems made it impossible to have the latest of anything.
In 1990, Intel's 386 and Motorola's 68030 microprocessors were at the top; then in 1991 Intel brought out the 20 MHz i486SX chip and Motorola introduced the 68040. Less than a year later Intel introduced the 50 MHz 486 chip and Tandy brought out its $400 CD-ROM drive for PCs. Then, just to make everyone wonder what was going on, in 1991 Apple and IBM agreed to share technology, integrating the Mac into IBM's systems and using the IBM PowerPC chip.
In 1992, Apple brought out the Apple PowerBook, a laptop that made everyone wonder just how small a full-function computer could get. A year later everyone knew the answer when Apple introduced the Newton Personal Digital Assistant (PDA). The Newton was supposed to be able to recognize hand-written notes and Apple sold 50,000 of them in 10 weeks.
In 1993, Intel introduced the 60 MHz Pentium chip, the next generation of chips. The Pentium, however, had a nasty mathematical bug, and its acceptance was slowed. Apple discontinued the workhorse of its fleet, the Apple II, which, despite the mind-boggling changes in the industry, had lasted 17 years.
Not only were hardware and software becoming obsolete; people were also getting caught up in their own obsolescence. For years, employers had included operating system and software names in their advertising for clerical and secretarial positions. As companies used more temporary workers and ran both IBM clones and Macintoshes in their operations, proficiency with only one slammed the door on employment opportunities.
Many people enrolled in classes to learn the latest software or update their computer skills. A good, well-rounded employee needed to know desktop publishing, two or more word processing programs, at least one spreadsheet program, and a graphics package. They had to be able to access the company local area network (LAN), send and receive E-mail using high-speed modems, and solve problems with hardware and software to maximize their output. Microprocessor-driven telephones, cellular phones, and pagers added to the complexity of the job, and repetitive motion syndrome from using keyboards hour after hour created an army of people wearing wrist braces.
Many people left a job where their day was spent working at a computer terminal or PC and went home to enjoy the quiet, relaxing camaraderie they found in Internet chat rooms, on the World Wide Web, or in their favorite newspapers and electronic magazines (ezines).
From its inception in 1975, the PC has become a focal point of business, education, and home life. The microprocessor, an amazing technology when it had 2,300 transistors on a single chip, is now even more amazing when it has over 3 billion transistors on an even smaller chip. In 1982, when Time magazine made the computer its "Man of the Year," the PC was still in its infancy. "Big Iron" still dominated the high-tech environment and having a personal computer was a luxury.
The creation and success of the PC would not have been possible without the elimination of the concept that a computer was a large, centralized data processor and number cruncher. Today the PC is a communication channel more than it is a computational tool. Millions of people work in their "electronic cottages," either operating their own business from home or telecommuting to work. Fittingly, one of the first Intel 4004 microprocessors ever made is said to be leading the world to the outer edges of time and space: according to a popular story, one of the small chips was installed in the Pioneer spacecraft in 1972 and continues to operate over 5 billion miles from Earth.

Light and Darkness - Contrasting Physical and Metaphysical Realities?

professor, engineer Teodora Palaghia

article

Which came first: the egg or the chicken? By analogy with that question, I would ask: which came first, light or darkness? Which of the two is more important, what are they, how are they explained, and what is their meaning? In their pure state, are they absolute in character, or do they alternate and unfold gradually along a scale of shades, or are both cardinal elements, equally essential to the physical and metaphysical balance of life? Why does human thinking subjectively hold light superior to darkness? Are they perfect opposites? Are they physical, metaphysical, and mystical realities for the evolution of life? More in the attachment.

article, size 187.8 kB (application/pdf)

SORTING ALGORITHMS USING THE DIVIDE ET IMPERA METHOD - a study

prof. Neli Secita

study

study, size 375.0 kB (application/pdf)
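The full study is in the attached PDF; as a quick taste of the divide et impera (divide and conquer) approach it covers, here is a merge sort sketch in Python. This is an illustrative example, not code taken from the study itself:

```python
def merge_sort(a):
    """Sort a list by the divide-et-impera pattern: split, solve halves, combine."""
    if len(a) <= 1:                 # a trivial subproblem is already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])      # divide and conquer each half recursively
    right = merge_sort(a[mid:])
    # combine: merge the two sorted halves into one sorted list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]
```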
