Digital substation. Overview of world development trends. Modern trends in the development of radio transmitting equipment
The first electronic computers appeared a little over 50 years ago. During this time, microelectronics, computer technology, and the entire informatics industry have become one of the main components of world scientific and technological progress. The influence of computer science on all spheres of human activity continues to expand. Computers are now used not only to perform complex calculations but also to manage production processes and in education, healthcare, ecology, and other fields. This is explained by the fact that computers are capable of processing any kind of information: numerical, textual, tabular, graphic, audio, and video.
The first electronic computer, ENIAC, was built in 1946 as part of a research project funded by the US Department of Defense. A year earlier, J. von Neumann had published an article outlining the basic principles of computer construction. The project was based on a calculator model developed by J. Atanasoff, an American of Bulgarian origin who was engaged in large-scale calculations. Such prominent scientists as C. Shannon, N. Wiener, J. von Neumann, and others took an active part in the project. From that moment the era of computer technology began. Domestic computer technology began to develop with a delay of 10-15 years.
The mathematical foundations of automatic calculation had already been developed by this time (G. Leibniz, G. Boole, A. Turing, and others), but the emergence of computers became possible only thanks to the development of electronic technology. Repeated attempts to create automatic computing devices of various kinds (from the simplest abacus to mechanical and electromechanical calculators) had not led to reliable and cost-effective machines.
The advent of electronic circuits made it possible to build electronic computers.
An electronic computer, or computer, is a complex of hardware and software designed to automate the preparation and solution of user tasks (Fig. 1).
The user is understood as the person in whose interests the data processing is carried out; customers of computing work, programmers, and operators can all act as users. As a rule, the time spent preparing tasks is many times greater than the time spent solving them.
Computers are universal technical means of automating computational work; that is, they are able to solve any problem involving the transformation of information. However, preparing problems for solution on a computer has been and remains a rather laborious process, in many cases requiring users to have special knowledge and skills.
To reduce the complexity of preparing problems for solution, to use individual hardware and software components and the computer as a whole more efficiently, and to facilitate operation, each computer has a special set of software tools. Typically, hardware and software are interconnected and combined into a single structure.
The structure is a collection of elements and their relationships. Depending on the context, the structures of technical, software, hardware-software and information means are distinguished.
Part of the software provides interaction between users and computers and is a kind of "intermediary" between them. It is called the operating system and is the core of computer software.
By software we mean a complex of regularly used software tools designed to provide the services users need in their work.
The software of individual computers and computing systems (CS) can vary greatly in the composition of the programs used, which is determined by the class of computing equipment, its modes of use, the content of the users' computing work, etc. The development of software for modern computers and computing systems is largely evolutionary and empirical in nature, but patterns in its construction can be distinguished.
Consider the main milestones and trends in the development of computers, their hardware and software (Table 1).
Table 1
Automation of the preparation and solution of problems on a computer
In the general case, the process of preparing and solving problems on a computer provides for the mandatory implementation of the following sequence of steps:
1) statement of the problem and its mathematical formulation;
2) choice of method and development of a solution algorithm;
3) programming (writing out the algorithm) in some algorithmic language;
4) planning and organization of the computing process - the order and sequence of the use of computer and computer resources;
5) the formation of a "machine program", that is, a program that will be directly executed by the computer;
6) the actual solution of the problem - the execution of calculations according to the finished program.
With the development of computing technology, the automation of these stages has proceeded from the bottom up, starting with the last stages.
In the development of electronic computing technology, four generations of computers can be distinguished, differing in element base, functional and logical organization, design and technological implementation, software, technical and operational characteristics, and the degree of user access to the machines. The change of generations was accompanied by a change in the main technical, operational, and economic indicators of computers, first of all speed, memory capacity, reliability, and cost. At the same time, one of the main development trends has been and remains the desire to reduce the complexity of preparing programs for the tasks to be solved, to ease the interaction of operators with machines, and to increase the efficiency of their use. This was and is dictated by the constant growth in the complexity and laboriousness of the tasks entrusted to computers in various fields of application.
The possibilities for improving the technical performance of computers largely depend on the elements used to build their electronic circuits. Therefore, when considering the stages of computer development, each generation is characterized first of all by its element base.
The main active element of first-generation computers was the vacuum tube; the remaining components of the electronic equipment were ordinary resistors, capacitors, and transformers. From the mid-1950s, elements specially designed for building random-access memory began to be used: ferrite cores with a rectangular hysteresis loop. At first, standard telegraph equipment (teletypes, tape punches, transmitters, punched-card counting equipment) was used for input and output; later, electromechanical storage devices on magnetic tapes, drums, and disks, as well as high-speed printers, were specially developed.
Computers of this generation were of considerable size and consumed a lot of power. Their speed ranged from several hundred to several thousand operations per second, memory capacity was several thousand machine words, and reliability amounted to only several hours of operation between failures.
In these computers, only the sixth stage was automated, since there was practically no software. The user had to carry out all five previous stages manually, down to obtaining machine codes for the programs. The labor-intensive and routine nature of this work was the source of a large number of errors in the prepared tasks. Therefore, in computers of the following generations, first individual elements and then entire systems appeared that facilitate the process of preparing problems for solution.
In second-generation machines (early 1960s), vacuum tubes were replaced by transistors. Computers gained greater speed, RAM capacity, and reliability; all the main characteristics improved by 1-2 orders of magnitude, while size, weight, and power consumption were significantly reduced. A great achievement was the use of printed wiring. The reliability of electromechanical input-output devices increased, and their share in the equipment grew. Second-generation machines came to have greater computational and logical capabilities.
A feature of second-generation machines is their differentiation in application. Computers appeared to solve scientific, technical and economic problems, to control production processes and various objects (control machines).
Along with the technical improvement of computers, methods and techniques for programming calculations developed, the highest step of which was the emergence of programming automation systems that greatly facilitate the work of mathematician-programmers.
Algorithmic languages, which greatly simplify the process of preparing problems for solution, have received great development and application. With the advent of algorithmic languages, the staff of programmers was sharply reduced, since programming in these languages became within the power of the users themselves.
The widespread use of algorithmic languages (autocodes, Algol, Fortran, etc.) and of the corresponding translators, which automatically generate machine programs from their description in an algorithmic language, led to the creation of libraries of standard programs. These made it possible to build machine programs from blocks, drawing on the accumulated experience of programmers. The new software tools at this point were not yet combined into separate packages under common control. Note that the time boundaries of all these innovations are quite blurred: their origins can usually be found in the depths of computers of previous generations.
The third generation of computers (late 1960s to early 1970s) is characterized by the widespread use of integrated circuits. An integrated circuit is a complete logical and functional unit corresponding to a rather complex transistor circuit. The use of integrated circuits made it possible to further improve the technical and operational characteristics of machines. Computer technology came to include a wide range of devices for building various data processing systems oriented toward different applications, covering a wide range of performance, which was also facilitated by the widespread use of multilayer printed wiring.
In computers of the third generation, the set of various electromechanical devices for input and output of information has significantly expanded. The development of these devices is evolutionary: their performance improves much more slowly than that of electronic equipment.
A distinctive feature of the software of this generation is the emergence of a pronounced software structure and the development of its core: operating systems responsible for organizing and managing the computing process. It was here that the concept of a "computer" increasingly gave way to the concept of a "computing system", which better reflected the growing complexity of both the hardware and the software. The cost of software began to rise and now far exceeds the cost of hardware (Fig. 2).
Fig. 2. Dynamics of changes in the cost of hardware and software
The operating system (OS) plans the sequence of distribution and use of computing system resources, and also ensures their coordinated work. Resources are usually understood as those means that are used for calculations: computer time of individual processors or computers included in the system; the amount of RAM and external memory; individual devices, information arrays; program libraries; separate programs for both general and special applications, etc. Interestingly, the most common OS functions in terms of handling emergency situations (program protection from mutual interference, interrupt and priority systems, time service, interface with communication channels, etc.) were fully or partially implemented in hardware. At the same time, more complex modes of operation were implemented: shared access to resources, multiprogram modes. Some of these solutions became a kind of standard and began to be used everywhere in computers of various classes.
Third-generation machines significantly expanded the possibilities for direct access by subscribers located at various distances, including considerable ones (tens and hundreds of kilometers). Convenient communication between subscriber and machine is achieved through a developed network of subscriber terminals connected to the computer by communication channels, together with the corresponding software.
For example, in the time sharing mode, many subscribers are given the opportunity of simultaneous, direct and operational access to a computer. Due to the large difference in the inertia of a person and a machine, each of the simultaneously working subscribers gets the impression that he alone has been given machine time.
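The time-sharing idea can be sketched with a simple round-robin model (a hypothetical illustration, not the algorithm of any particular historical OS; the job names and quantum are invented):

```python
from collections import deque

def round_robin(jobs, quantum):
    """Simulate time-sharing: each job is (name, remaining_ms).
    Every job repeatedly receives one quantum of CPU time; because the
    quantum is far shorter than human reaction time, each subscriber
    perceives the machine as dedicated to them alone."""
    queue = deque(jobs)
    timeline = []           # (start_time_ms, job) pairs: who held the CPU when
    clock = 0
    while queue:
        name, remaining = queue.popleft()
        slice_ms = min(quantum, remaining)
        timeline.append((clock, name))
        clock += slice_ms
        if remaining > slice_ms:
            queue.append((name, remaining - slice_ms))
    return timeline

# Three subscribers sharing one CPU with a 100 ms quantum:
schedule = round_robin([("A", 250), ("B", 100), ("C", 180)], quantum=100)
```

The interleaved timeline shows each subscriber getting the processor in turn, which at human timescales is indistinguishable from exclusive use.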
Here, the tendency toward the unification of computers and the creation of machines forming a single system is even more pronounced. A striking example of this trend is the domestic program for the creation and development of the Unified System of Electronic Computers (ES EVM).
The ES EVM was a family (series) of software-compatible machines built on a single element base and a single design and technological basis, with a single structure, unified system software, and a single unified set of external devices.
Industrial production of the first ES EVM models began in 1972; their creation drew on all contemporary achievements in electronic computing, computer technology and design, and the construction of software systems. Combining the knowledge and production capacities of the participating countries made it possible to solve a complex scientific and technical problem in a rather short time. The ES EVM was a continuously developing system in which the technical and operational characteristics of the machines were improved and the peripheral equipment was refined and its range expanded.
For machines of the fourth generation (80s), the use of large integrated circuits (LSI) is typical. A high degree of integration contributed to an increase in the density of the electronic equipment layout, the complexity of its functions, an increase in reliability and speed, and a decrease in cost. This, in turn, had a significant impact on the logical structure of the computer and its software. The connection between the structure of the machine and its software, especially the operating system, has become closer.
In the fourth generation, with the advent of microprocessors in the United States (1971), a new class of computers arose: microcomputers, which were succeeded by personal computers (PCs, early 1980s). In this class of computers, along with LSI, very-large-scale integrated circuits (VLSI), first 32-bit and then 64-bit, began to be used.
The advent of the PC is the brightest event in the field of computer technology, until recently the most dynamically developing sector of the industry. With their introduction, the solution of the problems of informatization of society was put on a real basis.
The main purpose of using a PC is the formalization of professional knowledge. First of all, the routine part of the work (collection, accumulation, storage, and processing of data), which takes up more than 75% of an applied specialist's working time, is automated. The use of PCs has made specialists' work more creative, interesting, and effective. PCs are now used everywhere, in all areas of human activity. New areas of application have also changed the nature of computational work: engineering and technical calculations now make up no more than 9-15%, while PCs are used to a greater extent to automate sales, procurement, inventory management, and production, and to perform financial and economic calculations, office work, gaming tasks, etc.
The use of a PC made it possible to use new information technologies and create distributed data processing systems. The highest stage of distributed data processing systems are computer (computer) networks of various levels - from local to global.
In computers of this generation, the complication of hardware and software structures continues (a hierarchy of resource management, an increase in the number of resources). A noticeable increase in the level of "intelligence" of systems built on them should also be pointed out. The software of these machines creates a "friendly" environment for communication between human and computer: on the one hand it controls the information processing, and on the other it provides the services the user needs, reducing the burden of routine work and allowing more attention to be paid to creative tasks.
Similar tendencies will persist in computers of the next generations. According to researchers, the machines of the next century will have built-in "artificial intelligence", allowing users to address machines (systems) in natural language, enter and process texts, documents, and illustrations, create knowledge processing systems, etc. All this leads to the need for more complex computer hardware and computing systems built on it, as well as for complex multi-tier hierarchical software for data processing systems.
In the past century, many discoveries and inventions were made that played a revolutionary role in the development of modern civilization: the creation and development of means of communication, especially wireless; the invention of cinema; the emergence and development of aviation and space technology (modern aircraft are not comparable to the first airplanes in their technical and design characteristics).
But the most dramatic progress has been in the field of computing. About 50 years ago, the first computers weighed about 30 tons, occupied an area of about 200 m², and their computation times were measured in hours or days. Now a computer can be placed on a silicon crystal of about 5 mm², calculation times are measured in microseconds, and computers cost little.
Moreover, unlike the first computers, which were programmed in machine codes and could mainly perform only cumbersome mathematical calculations, modern computers can prove theorems, translate texts, and render moving objects.
The appearance of the first machine for performing the four arithmetic operations dates to the beginning of the 17th century: in 1623, W. Schickard invented a mechanical machine for adding, subtracting, and partially multiplying and dividing, but the desktop adding machine (1642) of the French scientist B. Pascal became more famous. In 1671, Leibniz invented the so-called Leibniz stepped wheel, which made it possible to perform all four arithmetic operations.
In the 19th century, the need to perform calculations associated with processing the results of astronomical observations and compiling mathematical tables became acute. Therefore, in 1823 the English mathematician Charles Babbage began to develop an automatic difference engine powered by a steam engine.
The machine was supposed to calculate the values of polynomials and print the results on a negative for photographic printing, but the technical means of the time did not allow this idea to be fully realized; moreover, Babbage himself became interested in designing a more powerful calculating machine. Babbage's new calculating machine was called the Analytical Engine.
In 1834, he outlined its basic principles; program control by punched cards was borrowed from the loom of the Frenchman Jacquard.
The Analytical Engine was one of the first programmable automatic computers with sequential control. It had an arithmetic unit and a memory.
The patron of the project was Countess Ada Augusta Lovelace, the first female programmer. The Ada programming language is named after her.
At the end of the 19th century, Hollerith developed a punched-card machine capable of automatically classifying and tabulating data. It was used in the 1890 census in America. The program was read from a punched card using electric contact brushes, and electromechanical relays served as digital counters.
In 1896, Hollerith founded the company that became the forerunner of IBM.
After Babbage's death, there was no noticeable progress for a long time.
The calculation speed of mechanical and electromechanical machines was limited, so in the 1930s the development of electronic computers began, based on three-electrode vacuum tubes (triodes), invented in 1906 by Lee de Forest.
The first general-purpose computer, ENIAC, was developed at the University of Pennsylvania (1940-1946) for compiling numerical tables used to calculate the flight paths of projectiles. It contained 18 thousand vacuum tubes, consumed 140 kW, used the decimal number system, and was programmed manually using switches.
Modern trends in the development of computer technology.
Currently, the world is undergoing a transition from an industrial society to an information society. While the main content of an industrial society was the production and consumption of material goods, the driving force of the information society is the creation and consumption of information resources of various types and purposes. At the same time, economic and social results are determined not so much by the availability of material and energy resources as by the scale and pace of the informatization of society and the widespread use of information technologies in all spheres of human activity.
Regardless of the differences and features of information processes in various areas of public life, they are characterized by the presence of three common components:
identity (uniformity) of the main means of production (means of computing technology and informatics);
identity of the "raw material" (the initial data to be analyzed and processed);
identity of the manufactured product ("processed" information).
The key role in the information infrastructure belongs to telecommunication systems, as well as to computing systems and their networks. These areas concentrate the latest means of computing technology, informatics, and communications, as well as the most advanced information technologies.
In the history of computing technology to date (which began in the 1940s), four generations of computers can be distinguished, differing in element base, functional and logical organization, design and technological implementation, software, technical and operational characteristics, and modes of use.
The change of generations was accompanied by a change in the main technical, operational, and economic indicators of computers, above all speed, memory capacity, reliability, and cost.
At the same time, this was accompanied by a trend toward improving software and increasing the efficiency of its use and of access to it.
Currently, work is underway on fifth-generation computers, which bring the creation of artificial intelligence closer to reality.
Classification of computing equipment
To date, millions of computers of various types, classes, and levels have been produced in the world, and more are constantly being created.
Electronic computing equipment is usually divided into analog and digital.
In analog computing machines (AVMs), information is represented by the corresponding values of continuous physical quantities (analogues): current, voltage, rotation angle, etc.
AVMs provide acceptable performance but only moderate calculation accuracy, on the order of 10⁻² to 10⁻³.
AVMs have a rather limited distribution and are used mainly in research institutes and design organizations for developing and refining prototype equipment; that is, AVMs belong to the field of specialized computers.
Digital computers, in which information is represented using digital (binary) codes, are much more widely used.
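The difference between the two representations can be illustrated by quantization: an n-bit binary code limits the relative error to about 2⁻ⁿ, which quickly surpasses the 10⁻² to 10⁻³ accuracy typical of analog machines. A minimal sketch (illustrative values; the function name is ours):

```python
def quantize(value, full_scale, bits):
    """Uniformly quantize an analog value (e.g. a voltage) into an
    n-bit binary code, reconstruct it, and return (code, relative error
    with respect to full scale)."""
    levels = 2 ** bits
    step = full_scale / levels
    code = min(int(value / step), levels - 1)    # clamp at full scale
    reconstructed = (code + 0.5) * step          # mid-point reconstruction
    return code, abs(reconstructed - value) / full_scale

# A 3.3 V signal on a 10 V scale: 8 bits already beat typical
# analog-computer accuracy of 1e-2..1e-3; 16 bits give ~1e-6.
results = {bits: quantize(3.3, 10.0, bits) for bits in (8, 16)}
```

Raising the word length simply adds bits, whereas improving an analog machine requires physically better components.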
The rapid pace of development and change of digital computer models makes it difficult to use any of their standard classification.
Academician Glushkov noted that there are three global spheres that require the use of qualitatively different types of computers:
1. The traditional use of computers for automating computations.
2. The use of computers in various control systems (since the 1960s, the sphere involving the largest share of the computer fleet).
Machines of this profile must meet the following requirements:
be cheaper than large centralized computers;
be more reliable, especially when working directly in the control loop;
have greater flexibility and adaptability to working conditions;
be architecturally transparent, i.e., the structure and functions of the computer should be understandable to the general user.
3. The use of computers to solve problems of artificial intelligence.
The computer market offers a wide range of classes and models. For example, IBM, which produces approximately 80% of the world's machine fleet, mainly produces four classes of computers:
Mainframe computers: multi-user machines with centralized information processing and various forms of remote access. According to IBM experts, about 50% of the total volume of data in the world's information systems should be stored on large machines. Their new generation is intended for use in networks as large servers.
The development of computers of this class is of great importance for the Russian Federation, since a huge groundwork remains from the ES EVM program, which borrowed the IBM 360/370 architecture. It was therefore decided to continue developing this direction, and in 1993 an agreement was signed with IBM under which the Russian Federation received the right to produce 23 of the latest models (IBM analogues) with performance from 1.5 to 167 million operations per second.
RS/6000 machines, which have high performance and are intended for building workstations, working with graphics, UNIX servers, and cluster complexes for scientific research.
Medium computers, intended primarily for work in financial structures (business computers). Special attention in them is paid to data preservation and security, as well as program compatibility. These machines are used as LAN servers.
Computers based on Intel microprocessors.
Computing systems using parallel processing.
The following classification of computing equipment by speed can also be used:
supercomputers, for solving complex computational problems and serving the largest information data banks;
mainframes, for departmental, territorial, and regional computing centers;
medium computers, for process control systems (APCS) and production management systems, and also as servers managing distributed information processing;
personal and professional computers, on the basis of which automated workstations (AWSs) for specialists of various profiles are built;
embedded microprocessors (microcomputers), for automated control of individual devices and mechanisms.
The Russian Federation's estimated needs are:
supercomputers: ~100-200 units;
large computers: ~1000 units;
medium computers: ~10⁴-10⁵ units.
Modern trends in the development of radio transmitting equipment
Radio transmitting devices (RPDUs) are used in telecommunications, television and radio broadcasting, radar, and radio navigation. The rapid development of microelectronics, analog and digital circuitry, and microprocessor and computer technology has a significant impact on radio transmitting technology, both sharply expanding its functionality and improving its performance. This is achieved through new principles for constructing the block diagrams of transmitters and through the circuit implementation of their individual units, which employ digital methods for generating, processing, and converting oscillations and signals of different frequencies and power levels.
Radio transmitters that use digital methods for generating, processing and converting oscillations and signals will be referred to as digital radio transmitting devices (TsRPdU).
Let us consider the modern requirements for RPDUs that pose problems which cannot in principle be solved by analog circuitry and therefore necessitate the use of digital technologies in RPDUs.
In the field of telecommunications and broadcasting, the following main and continuously growing requirements for information transmission systems, of which RPDUs are elements, can be distinguished:
ensuring noise immunity on overloaded radio airwaves;
increasing channel throughput;
economical use of the frequency resource in multi-channel communication;
improved signal quality and electromagnetic compatibility.
The desire to meet these requirements leads to the emergence of new communication and broadcasting standards, among which GSM, DECT, SmarTrunk II, TETRA, DRM, and others are already well known.
The main direction in the development of communication systems is the provision of multiple access, in which the frequency resource is shared and used simultaneously by several subscribers. Multiple access technologies include TDMA, FDMA, CDMA, and combinations thereof. At the same time, the requirements for the quality of communication are increasing: noise immunity, the amount of information transmitted, the security of information and user identification, etc. This leads to the need for complex types of modulation, information coding, continuous and fast tuning of the operating frequency, synchronization of the operating cycles of the transmitter, receiver, and base station, as well as high frequency stability and high accuracy of amplitude and phase modulation at operating frequencies measured in gigahertz. As for broadcasting systems, the main requirement there is improving signal quality on the subscriber's side, which again leads to an increase in the amount of transmitted information through the transition to digital broadcasting standards. The time stability of the parameters of such radio transmitters (frequency, modulation) is also extremely important. It is obvious that analog circuitry cannot cope with such tasks, and signal generation in transmitters must be carried out by digital methods.
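As an illustration of multiple access, a static TDMA slot allocation can be sketched as follows (a simplified model with invented subscriber names; real standards such as GSM add guard times, bursts, and dynamic assignment):

```python
def tdma_slot_schedule(subscribers, frame_slots, frames):
    """Sketch of TDMA multiple access: the shared carrier is divided in
    time into frames of `frame_slots` slots, and each subscriber
    transmits only in its assigned slot of every frame (static
    allocation assumed here)."""
    schedule = []
    for frame in range(frames):
        for slot in range(frame_slots):
            user = subscribers[slot % len(subscribers)]
            schedule.append((frame, slot, user))
    return schedule

# Three subscribers sharing a 3-slot frame: each gets exactly one
# slot per frame, i.e. one third of the channel capacity.
plan = tdma_slot_schedule(["MS1", "MS2", "MS3"], frame_slots=3, frames=2)
```

FDMA would instead split the band into sub-carriers, and CDMA would separate subscribers by orthogonal codes on the same carrier; combinations of the three are common.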
Modern radio transmitting technology cannot be imagined without built-in software tools: control of the operating modes of the cascades, self-diagnostics, auto-calibration, automatic regulation, and protection against emergencies, including automatic redundancy. Such functions in transmitters are carried out by specialized microcontrollers, which sometimes also perform the digital formation of the transmitted signals. Remote control of operating modes from a remote computer through a special digital interface is often used. Any modern transmitter or transceiver provides a certain level of service for the user, including digital control of the transmitter (for example, from a keyboard) and indication of operating modes in graphical and textual form on a display screen. Clearly, one cannot do without microprocessor control systems, which determine the transmitter's most important parameters.
Producing transmitters of this level of complexity would be economically unprofitable with an analog design. It is digital circuitry, which can replace entire blocks of conventional transmitters, that makes it possible to significantly reduce the overall dimensions of transmitters (recall cell phones) and to achieve repeatability of parameters, high manufacturability, and simplicity of manufacture and adjustment.
Obviously, the emergence and development of digital radio transmitting devices was an inevitable and necessary stage in the history of radio engineering and telecommunications, allowing to solve many urgent problems that are inaccessible to analog circuitry.
As an example, consider the HARRIS PLATINUM Z digital broadcast transmitter (Fig. 1.1), which has the following main features (information from www.pirs.ru):
A) A fully digital HARRIS DIGIT™ FM exciter with a built-in DSP stereo generator. As the world's first all-digital FM exciter, the HARRIS DIGIT™ accepts AES/EBU audio in digital form and generates a fully digitally modulated RF carrier, resulting in less noise and distortion than any other FM transmitter (16-bit digital AF quality).
B) A quick-start system ensures that full power in all respects is reached within 5 seconds of switching on.
C) A microprocessor-based controller provides full control, diagnostics and display. It includes built-in logic and commands for switching between the main and standby HARRIS DIGIT™ exciters and the power preamplifier (PPA).
D) A broadband design eliminates the need for tuning across the 87-108 MHz range (with the N+1 option). Frequency changes can be made manually with switches in less than 5 minutes, or in less than 0.5 seconds with an optional external controller.
Fig. 1.1
Another example of a digital radio transmitter is the Bluetooth wireless data transmission device (information from www.webmarket.ru), which is discussed in more detail in paragraph 3.1 (Fig. 1.2 and Table 1.1).
Fig. 1.2
Table 1.1. Bluetooth Brief Specifications
So, let's highlight the main areas of application of digital technologies for generating and processing signals in radio transmitting devices.
1. Generation and conversion of analog and digital low-frequency information signals, including interfacing a computer with a radio transmitter (group signals, encoding, analog-to-digital and digital-to-analog conversion).
2. Digital methods of modulation of RF signals.
3. Frequency synthesis and frequency control.
4. Digital translation of signal spectra.
5. Digital methods for amplifying the power of RF signals.
6. Digital systems for automatic regulation and control of transmitters, indication and control.
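As an illustration of item 3, frequency synthesis in modern transmitters is commonly done by direct digital synthesis (DDS): a phase accumulator stepped by a tuning word addresses a sine lookup table. The accumulator width, table size and clock below are illustrative choices, not parameters of any specific chip:

```python
import numpy as np

def dds(tuning_word, acc_bits, clock_hz, n_samples):
    """Direct digital synthesis sketch. Output frequency:
        f_out = tuning_word * clock_hz / 2**acc_bits
    """
    table = np.sin(2 * np.pi * np.arange(1024) / 1024)   # 1024-entry LUT
    acc = np.cumsum(np.full(n_samples, tuning_word)) % (1 << acc_bits)
    index = (acc >> (acc_bits - 10)) & 1023              # top 10 bits
    return table[index]

# 32-bit accumulator, 100 MHz clock: tuning word for a ~10.7 MHz output
word = round(10.7e6 * (1 << 32) / 100e6)
samples = dds(word, 32, 100e6, 1000)
print(round(word * 100e6 / (1 << 32)))  # 10700000
```

The frequency resolution is clock_hz / 2**acc_bits (here about 0.023 Hz), which is why DDS gives both the fine tuning step and the fast, phase-continuous retuning mentioned above.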
The following sections provide more detailed information about each of these digital applications in radio transmitters.
A.Yu. Sechin (CJSC "Rakurs")
The development of digital photogrammetry is determined primarily by the level of technological development. The speed of modern computers makes it possible to quickly solve tasks that once required a significant investment of time. Remote sensing sensors are being improved; new digital cameras, instruments and devices appear, and the characteristics of existing ones improve. The possible number of images in blocks for joint adjustment is growing. The requirements for the output products of digital photogrammetric stations (DFS) are growing as well: more and more often users request not only traditional orthophotomaps and vector data for GIS but also full-fledged three-dimensional models as a result of remote sensing data processing. In the author's opinion, a modern DFS should be universal, accepting all kinds of data from various devices for processing and providing a wide range of output products for cartography, GIS and 3D modeling systems. An important characteristic of a DFS is prompt support of new types of sensors, primarily spaceborne ones.
In recent years, the desire to use digital aerial cameras, which make it possible to obtain digital images directly in flight, instead of film cameras has become clearly noticeable. The stages of developing and scanning film will soon be a thing of the past. In aerial photography, both conventional frame systems are used (for example, the DMC by Intergraph Corp. (USA) or the UltraCamX by Vexcel Imaging (USA), part of Microsoft Corporation) and sensors based on CCD arrays (for example, the ADS-40 by Leica Geosystems, Switzerland), whose frame geometry and mathematical model are unusual for photogrammetrists. Modern digital cameras have a large color depth (more than 8 bits per channel), and the number of simultaneously recorded channels is increasing, with infrared (near and far) channels added to the traditional red, green and blue. Great color depth makes it possible to distinguish details that were previously imperceptible (for example, in shadows). A modern DFS must support an arbitrary number of channels with any color depth at input, at output, and during image processing. When working with satellite sensor data, the DFS should be able to process images both by generalized methods (when the sensor model is absent or known only roughly) and taking the accompanying metadata into account; if a rigorous model is available, it should be used for accurate processing.
Fig. 1. Modern digital cameras
Photogrammetric processing of images implies the highest possible, subpixel, measurement accuracy. Therefore, raster data entering the DFS should not undergo processing that reduces its accuracy; only a minimal set of raster preprocessing algorithms is acceptable, for example, pansharpening. The output raster data (orthophotos) can be subjected to various post-processing techniques to improve their visual properties. The presence in the DFS of post-processing modules that preserve the georeferencing of images is an undoubted advantage of a photogrammetric system.
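Pansharpening mentioned above can be illustrated with the classical Brovey transform, where each multispectral band is scaled by the ratio of the panchromatic intensity to the mean band intensity. This is a generic textbook formulation, not the algorithm of any particular DFS:

```python
import numpy as np

def brovey_pansharpen(pan, r, g, b, eps=1e-6):
    """Brovey-transform pansharpening sketch. Inputs are float arrays
    already resampled to the panchromatic resolution."""
    intensity = (r + g + b) / 3.0
    ratio = pan / (intensity + eps)      # eps guards against division by 0
    return r * ratio, g * ratio, b * ratio

pan = np.full((2, 2), 0.9)
r = np.full((2, 2), 0.6); g = np.full((2, 2), 0.3); b = np.full((2, 2), 0.3)
rs, gs, bs = brovey_pansharpen(pan, r, g, b)
print(float(rs[0, 0]))  # ≈ 1.35 (0.6 scaled by 0.9/0.4), before clipping
```

Note that this kind of transform changes radiometry, which is exactly why the text restricts such preprocessing to a minimum for measurement-grade data.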
In aerial photography from an aircraft, in addition to digital cameras, integrated GPS/IMU navigation systems are increasingly used, which measure the exterior orientation elements of images in flight, as well as laser scanners, which produce a terrain model without stereo processing of images. The accuracy of such devices is constantly increasing. Currently, with a GPS/IMU system on board and terrain data obtained by laser scanning, orthophotomaps can be built with an accuracy of 2xGSD and better without traditional adjustment of aerial photographs and relief construction by photogrammetric methods (GSD, Ground Sample Distance, is the pixel size on the ground; it characterizes the imaging parameters of a digital camera and is the analog of survey scale for film cameras).
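For a nadir-pointing frame camera, GSD follows from simple similar-triangle geometry. The numbers below are illustrative, not taken from any specific survey:

```python
def gsd_m(pixel_pitch_um, focal_mm, altitude_m):
    """Ground sample distance for a nadir frame camera:
    GSD = pixel pitch * flying height / focal length."""
    return (pixel_pitch_um * 1e-6) * altitude_m / (focal_mm * 1e-3)

# Illustrative values: 6 um pixels, 120 mm lens, 3000 m flying height
gsd = gsd_m(6, 120, 3000)
print(gsd, 2 * gsd)   # ≈ 0.15 m GSD, so 2xGSD accuracy is ≈ 0.3 m
```

So under the 2xGSD figure quoted above, such a flight would yield orthophotomaps accurate to roughly 0.3 m on the ground.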
If, in order to achieve maximum accuracy when processing a block of images, adjustment is required, modern DFSs increasingly use methods of automatic tie-point measurement, whose results, as a rule, still require subsequent control by the operator. In the near future we can expect more reliable algorithms for automatic point placement and rejection during adjustment that do not require human intervention.
While the methods for constructing digital elevation models in new generations of DFS are automated and require from the operator only the simplest filtering operations and, occasionally, the drawing of additional orographic lines, the vectorization of buildings, roads, parcels, etc. is still performed manually. Work on automating it has been going on for a long time, and the author hopes that in the coming years reliable systems will appear to ease this hard work.
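The "simplest filtering operations" on a DEM can be as basic as a median filter that suppresses isolated matching blunders. A minimal sketch (a 3x3 median, edges left untouched, all values illustrative):

```python
import numpy as np

def median_filter3(dem):
    """Replace each interior DEM cell with the median of its 3x3
    neighbourhood; a crude but typical spike-removal step."""
    out = dem.copy()
    for i in range(1, dem.shape[0] - 1):
        for j in range(1, dem.shape[1] - 1):
            out[i, j] = np.median(dem[i-1:i+2, j-1:j+2])
    return out

dem = np.full((5, 5), 100.0)
dem[2, 2] = 250.0                  # a single spike, e.g. a matching blunder
print(median_filter3(dem)[2, 2])   # 100.0 -- the spike is removed
```

Production systems use more selective filters (slope-based, morphological), but the operator-facing idea is the same: automatic surface, light manual cleanup.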
From the computational point of view, the most time-consuming process in a DFS is the construction of orthomosaics. For large blocks (several thousand images), the time required for orthophoto transformation on one computer can be tens or hundreds of hours. With the development of multiprocessor computer systems and fast local area networks, orthophoto transformation can be distributed over the computers of a local network and over the processors (cores) of each computer. Good scalability and the possibility of parallel processing of significant amounts of data in a local network are hallmarks of a modern DFS. As the size of processed blocks and data volumes grows, the role of centralized data storage servers increases. Perhaps in the near future there will be systems with distributed storage of images and related information, providing automatic optimal allocation of storage resources.
Of course, a modern DFS should "understand" a wide range of raster, vector and other data in different formats. At the same time, the output results of photogrammetric processing should be available in formats accepted by various GIS and cartographic systems. Recently, there has been a trend toward the use and visualization of three-dimensional data obtained with the help of a DFS, primarily for urban areas. Such data are of interest to municipal services, telecommunications companies, departments of the Ministry of Emergency Situations, the military and developers of navigation systems; in the future they can be used to build realistic three-dimensional models of cities.
Note that 3D models are also needed in a DFS to build so-called "true" orthophotos, which, despite their high labor intensity and computational complexity, are becoming more widespread.
An important characteristic of a DFS is support for modern stereo visualization hardware. The first photogrammetric stations used optical-mechanical devices (special monitor attachments) or anaglyph glasses for stereo observation. Later, systems appeared that displayed the two images line-by-line (interlaced) on the monitor and required special polarized glasses. These systems are sometimes still used, although they are characterized by low accuracy, a narrowed field of view and poor image quality. As a rule, anaglyph and interlaced methods cause increased eye fatigue in operators, and, in our opinion, they are suitable only for demonstrating the capabilities of a DFS and for initial training with the software system. Modern methods of stereo output rely on professional video cards that support stereo mode in hardware and the OpenGL programming interface (API). In this case, various stereo devices can be used: special monitors based on two LCD screens and polarizing glass, or stereo projectors. Support for new hardware stereo-output solutions then requires no adaptation of the DFS.
Traditionally, special handwheels were used on analytical instruments to move the stereo marker. Operators who are just mastering a DFS find control with such handwheels inconvenient and prefer multi-button "mouse"-type manipulators specially designed for working in stereo mode. For efficient operation, it is desirable that a DFS support both handwheels and special manipulators.
Somewhat apart from traditional photogrammetric systems stand systems for processing radar images. With the appearance on the market of high-resolution spaceborne sensors (TerraSAR-X, COSMO-SkyMed, RADARSAT-2), their role has increased significantly. These systems, called radargrammetric systems, make it possible to build digital terrain models with height accuracy within a few meters, to create orthoimages (including with terrain models obtained from the radar images themselves), and to produce high-precision maps of earth-surface displacements (with millimeter accuracy under interferometric processing).
Summing up, a modern DFS should "understand" the maximum possible number of raster, vector and other data formats, provide a high level of automation and productivity, and support modern computer technologies. Modules for pre- and post-processing of images and tools for working with 3D models obtained by photogrammetric methods should become an integral part of such systems.
Serious progress has been made in modern radio receiving technology owing to the intensive introduction of digital microcircuitry. Available microcircuits make it possible to develop receivers with higher sensitivity, better image-channel selectivity, and lower frequency and nonlinear distortion, and they also allow a number of problems to be solved in new ways. In particular, signal microprocessors provide optimal reception quality under interference, auto-search control, electronic memory for dozens of radio stations, program switching, and timer operation that turns the receiver on and off according to a given schedule. Both direct digital tuning and scan tuning are used.
For remote control of receivers within a room, ultrasonic and infrared communication links are used. Control signals from the remote control go to an encoder, which generates a pulse sequence; this sequence drives a light-emitting diode, where pulse-code modulation (PCM) of the infrared radiation is carried out. The modulated radiation reaches the receiver (a phototransistor), then an amplifier and decoder, and finally the control device.
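The encoder stage can be sketched as pulse-distance encoding of a command byte into IR on/off timings. The timing scheme below is an NEC-style illustrative assumption, not the protocol of any particular remote:

```python
def encode_pulses(command_byte, unit_us=560):
    """Sketch of pulse-distance encoding for an IR link: every bit
    starts with a fixed 'mark' (LED on); a '0' bit is followed by a
    short 'space' (LED off), a '1' bit by a long one. Timings are
    illustrative, loosely NEC-like."""
    timings = []                                  # (state, duration_us)
    for i in range(8):                            # LSB first
        bit = (command_byte >> i) & 1
        timings.append(("mark", unit_us))
        timings.append(("space", 3 * unit_us if bit else unit_us))
    return timings

pulses = encode_pulses(0b10110010)
print(len(pulses))    # 8 bits -> 16 (mark, space) entries
```

The decoder in the receiver simply measures the space durations after each mark to recover the bits.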
The use of digital broadcasting systems promises undoubted advantages. The digital sound transmission system has long been used in satellite communications and satellite broadcasting channels, and is also used for digital sound recording of musical compositions.
Digital broadcasting provides undistorted sound reproduction: a reproduced frequency band of 5-20,000 Hz, noise and nonlinear distortion products more than 90 dB below the signal, almost complete absence of external interference, and support for stereo broadcasting. The disadvantage of digital broadcasting is the wide frequency band, about 8 MHz, occupied by one radio station, which dictates the carrier frequency ranges used for digital broadcasting. Digital broadcasting also makes it easy to implement information output on a display, repeat mode, message storage, and so on.
A simplified block diagram of a modern digital receiver is shown in Fig. 7.20. In this scheme, the amplifying path (AP) is built from analog elements and performs preliminary frequency filtering of the received signal, its amplification, and frequency conversion.
Fig. 7.20
The ADC converts the analog signal into a digital code, which is fed to the digital receiver proper. The latter is a signal processor (SP) that digitally processes the received signal according to a given algorithm. Such an algorithm includes searching for a signal over the range, additional frequency conversion, filtering, detection, etc. If the signal is needed in analog form, a DAC is placed at the receiver output. The receiver is tuned across channels by a frequency synthesizer (FS).
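The SP stage of such a receiver can be sketched in a few lines: digitally mix the ADC samples down by a local-oscillator frequency, low-pass filter, and detect the envelope. The filter length, cutoff and test frequencies below are illustrative choices, not parameters from the figure:

```python
import numpy as np

def digital_receive(samples, fs, f_lo, taps=101, cutoff=0.05):
    """Digital downconversion sketch: mixer, windowed-sinc low-pass
    filter, envelope detector. 'cutoff' is in cycles per sample."""
    t = np.arange(samples.size) / fs
    baseband = samples * np.exp(-2j * np.pi * f_lo * t)   # digital mixer
    h = np.sinc(2 * cutoff * (np.arange(taps) - (taps - 1) / 2))
    h *= np.hamming(taps); h /= h.sum()                   # unity DC gain
    filtered = np.convolve(baseband, h, mode="same")
    return np.abs(filtered)                               # envelope

fs = 48000.0
t = np.arange(4800) / fs
rf = np.cos(2 * np.pi * 12000 * t)     # unmodulated 12 kHz test "carrier"
env = digital_receive(rf, fs, 12000)
print(round(float(env[2400]), 2))      # 0.5 (half the carrier amplitude)
```

Frequency conversion, filtering and detection here are just arithmetic on samples, so retuning the receiver amounts to changing `f_lo` and the filter coefficients in software.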
Increasing attention is now being paid to voice control and voice notification in household radio equipment. Operator commands are confirmed by a synthesized human voice. The control signal is digitized and fed into the control microprocessor.
Voice recognition systems will become part of receivers that carry out the commands of a specific person. After executing a command, the microprocessor generates a response signal, which goes to a human-speech synthesizer, and the loudspeaker reproduces the response.