Digital transformation: complete freedom of self-expression

Colleagues, do you really understand what the digital economy is and how it differs from what it was ten or twenty years ago? I thought I did, until I found the time to analyze what is written about it across the expanses of the sovereign Internet, in professional and not-so-professional media. It turned out that the main IT product on the market is "digital transformation". It is actively sold wholesale and retail, that is, in parts, to taste: "big data" for some, "business intelligence" for others, "artificial intelligence" for others still.



Judging by the sales volumes, a great deal is being written. I analyzed it and came away with questions:



  • They write a lot, but what is the professional level of these texts? Does it satisfy you?
  • There are many new words, but who actually understands them?
  • And is there, in general, any progress or not? Where is the result?
  • If not, then "who is to blame and what is to be done"?
  • Where does the state stand in all this, and what exactly does it regulate?
  • There are many GOSTs, but who uses them? Who writes them? And what use are they, really?
  • Isn't it time, if not to define the basic concepts precisely, then at least to come to a consensus on them?


I even know how: "practice is the criterion of truth."



Foreword



While I am in search of what comes next, there is time to look back at the past 30 years of work in the IT industry and assess what we, that is, our generation, have gained in this time and what remains to be gained. Then COVID-19 turned up and made it possible not only to read, but also to reflect on where we are going and how. It turned out that, unexpectedly as always, a new era had crept up: digital transformation. But what is it? On the market, packaged programs have been replaced by new products, technologies with exciting names: "big data", "business intelligence", "artificial intelligence" and so on. There are accordingly plenty of publications, but their meaning is hidden behind prosaic advertising or general, florid phrases about the progress of mankind.



Meanwhile, I have noted with regret that over this period the professional level of texts on IT topics has declined, and declined significantly! I am not talking about advertising and forum comments; that is a separate conversation. But even on the websites of reputable IT firms you can read, shall we say, so as not to offend anyone, ambiguous messages. And if on the literary side this is still tolerable (let us make allowances for the fact that the "physicists" long ago won a complete victory over the "lyricists"), then when it comes to understanding the new key IT concepts the situation is becoming unacceptable.



The eternal Russian question: "who is to blame, and what is to be done?"



Of course, freedom of expression on the Internet is an achievement of our time, and I am using it right now to express my thoughts. But how limitless can it be? In the technological field it is at least constrained by the relevant terminology, which either denotes concepts precisely or expresses a public consensus. Without this, the development of technology is impossible. And I want to keep building.



Colleagues, I propose that here on Habr (and where else, if not on Habr?) we purposefully discuss the problem of new IT terminology. Let us remember that "in the beginning was the word", and try to agree on a common understanding of at least the basic concepts of the era of digital transformation. So:



  1. "In the beginning was the word", a word about concepts.
  2. About state regulation.
  3. About standards.
  4. A summary, to be continued.


"In the beginning was the word", a word about concepts



Concepts are not just words for conversation. As the philosophy course has it, a concept is a form of thought that reflects the most general, essential and necessary properties, attributes and qualities of real things and phenomena. Concepts are the material underlying any thought process: the judgments and inferences that allow us to fully reveal the most important aspects, connections and patterns of reality. Order in concepts gives us order in reasoning, and vice versa: disorder in concepts means disorder in reasoning. In narrow disciplines such as IT, concepts are terms. And if specialist authors understand the basic terminology differently, it is hard to expect agreement on the issues under discussion, for example in the process of developing information systems.



The last thirty years have been the most turbulent period in the development of information systems, covering three stages of their evolution: automated data banks, automation of business processes and, finally, digital transformation. And if everything has long been clear with data banks, and things are more or less clear with business processes, then around the concept of "digital transformation" there are still many differing interpretations and uncertainties, sometimes on the verge of not understanding what is going on, and sometimes beyond it. The new era (and I believe "digital transformation" is precisely an era) generates new concepts with unsettled and ambiguous meanings, such as "big data", "business intelligence" (BI), "intelligent systems" (IS) or "artificial intelligence" (AI), among others.



When evaluating the project documents of information systems, the first thing I look for is a section of the "basic concepts" type, in which the important terms are defined. By the content of this section I judge the professionalism of the authors and the substance of the project. In my own projects, at least, I always put "Terms and definitions" as the first section and use it as an introduction to the subject of the project. For that reason its contents are arranged not alphabetically but logically. In complex projects this section has to be divided into subsections. This applies above all to the terms of reference, which must be understandable both to domain specialists and to IT specialists. Then any reader, whether customer or contractor, will know from the very beginning what the project is about and how to understand its content. For me, therefore, the current terminological confusion is an unpleasant and annoying circumstance, especially when the customer has been carried away by new trends and, before the meeting, has googled, or, more patriotically, yandexed, something.



Quite recently, at the dawn of my information-technology youth, the problem was solved simply: under the auspices of an authoritative institution, for example the USSR Academy of Sciences, a dictionary was published that defined the meanings of terms quite clearly. Everyone used it, from the student bench onward, for decades. Modern technologies allow almost any Internet user to participate in real time in creating information content on almost any topic, including information technology. The intellectual level of this content, to put it mildly, leaves much to be desired. Serious specialized online publications either keep qualified correspondents (CNews, ComNews, TAdviser) or try to introduce some restrictions and review of texts (as our Habr does), but this obviously does not always work, if only because everyone has the opportunity to start a blog on the side, for example on Yandex Zen, and trumpet their opinion to the whole universe. And I am not even talking about forums on websites; that is a separate song.



Media journalists add fuel to the fire. They delight or frighten the layman with artificial intelligence and robots, at best in the style of Kir Bulychev or Stanislav Lem, and more usually with a moldy set of quasi-literary, quasi-technological and quasi-scientific clichés. But there is no point in blaming the amateurs, because the specialists themselves cannot always accurately formulate and explain the concepts in question. The problem is that since the end of the USSR, "programming" has gone to the masses: by a decree of the Central Committee of the CPSU in 1985, the government took up the "second literacy" of the population. Since then everyone in our country has been programming, from schoolchildren to academicians. And vendors of development tools advertise how easily and simply it can be done with their products.



That is one side of it. On the other hand, the IT community is now discussing in all seriousness whether a programmer needs a higher education at all. It has reached the point where developers have stopped designing properly. Project documentation is either absent or written just for show, to attach to the acceptance certificate: "We are programmers, not writers!" This lowers the well-known "level of social responsibility" of programmers, and as an excuse they cite the newfangled Agile method with Scrum: "working software over comprehensive documentation", they say.



Of course, a declining "level of social responsibility" is a global trend in the development of creative personalities, and not only in IT but also, for example, in politics, where the principle of hype and likes often dominates. But in technical projects this trend is alarming, even simply unacceptable... Imagine a project to develop an airliner, or a spacecraft for a flight to Mars, without "comprehensive documentation"... Impossible to imagine! Yet in politics, and in IT, it turns out to be entirely possible.



The trend is alarming, but the education of IT professionals and the design of information systems are topics for a separate conversation. I hope that everything will work out and that time will put everything in its place. The only question is when.



All scientific and technological revolutions have been accompanied by information riots. Imagine the agony of a Homo erectus genius, our ancestor, a million years ago, trying to explain the invention of the stone hand axe to his fellow tribesmen. The spread of that new technology then took several hundred thousand years, and evidently, to manage it, the same genius had to significantly enlarge his brain (he simply had no other computer) and invent human speech, that is, to develop a revolutionary information technology. Otherwise one could prove oneself right only by massacre, which is duly recorded on the remains of our ancestors' skulls.



A million years passed unnoticed, and cybernetics and even digital transformation appeared. Now Homo sapiens, that is, you and I, with his enlarged brain no longer splits anyone's skull (although that still happens), but saws instead, exploiting the same semantic confusion: sawing up the colossal budgets that corporations and states allocate to clarify the meaning of words officials do not understand. I liked one comment on an excellent post by Eugene Kaspersky on this topic, "The artificial intelligence bubble that will bury cybersecurity":



Konata Izumi: "Hype, dummies, sawing and so on are a normal evolutionary process of an industry's formation. The sums involved are fantastic. But the predator in the form of the police does not sleep either, and in the end only those who do something useful and do not cheat stay alive. It is impossible to order up the development of an industry without fraudsters from the very start."



In general one can agree, but when you read about the exorbitant sums required to develop something, you want to take part, and the entrance is fenced off by bureaucratic, corrupt and even political barriers. And if you do not participate, then let those who do at least inform the public what, in fact, our money is being spent on. It is "ours" because even if Sberbank spends it, it is still our money, whatever the shareholders may think. And while we are agonizing here over what this "artificial intelligence bubble" is, Sberbank has already explained to the state how to spend 120 billion rubles by 2024 on the federal project "Artificial Intelligence", which is to become the seventh project of the national program "Digital Economy". But that was in 2019. This year the project was re-estimated at 244 billion rubles. Technological progress is evident!



Well, Sberbank needs this money at the very least to do something about its telephone voice assistant. I remember my shock at my first encounter with the power of that intellect. Unforgettable! But I do not know how much money must be spent so that this artificial intelligence (AI) stops driving the bank's depositors mad. And not only do I not know, I cannot even imagine how many lines of code it would take to cost billions of rubles. Yet the cost of development is a value that can be calculated, and fairly accurately; I know this well from my own experience.
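
As an aside, here is the kind of back-of-the-envelope arithmetic I have in mind. This is not my own costing method, and certainly not Sberbank's; it is only a minimal Python sketch of the textbook Basic COCOMO model, where effort grows with code size and cost follows from effort. The COCOMO coefficients are the published ones, while the monthly developer rate and the 1,000 KLOC example are purely illustrative assumptions.

    # A minimal sketch of the Basic COCOMO model (Boehm, 1981), shown only to
    # illustrate that development cost can be estimated from code size.
    # Not the author's or Sberbank's method; the monthly rate is an assumption.

    # Basic COCOMO: effort = a * KLOC**b (person-months),
    #               schedule = c * effort**d (calendar months)
    COCOMO_MODES = {
        "organic":       (2.4, 1.05, 2.5, 0.38),
        "semi-detached": (3.0, 1.12, 2.5, 0.35),
        "embedded":      (3.6, 1.20, 2.5, 0.32),
    }

    def estimate(kloc, mode="semi-detached", monthly_rate_rub=300_000):
        """Rough effort, schedule and cost for `kloc` thousand lines of code.
        `monthly_rate_rub` is an assumed loaded cost of one developer-month."""
        a, b, c, d = COCOMO_MODES[mode]
        effort_pm = a * kloc ** b          # person-months
        schedule_m = c * effort_pm ** d    # calendar months
        cost_rub = effort_pm * monthly_rate_rub
        return {"person_months": round(effort_pm, 1),
                "months": round(schedule_m, 1),
                "cost_rub": round(cost_rub)}

    if __name__ == "__main__":
        # Hypothetical example: a 1,000 KLOC (million-line) system.
        print(estimate(kloc=1000))

Even a crude model like this turns "billions of rubles" into a concrete question about code size and team cost, which is exactly the question I cannot answer for a voice assistant.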



Of course, you do not have to look for a black cat in the Russian dark room; you can look for it in the world's dark room. Spending on AI development alone is growing worldwide by leaps and bounds. According to the analysis of the world IT market carried out by the Ministry of Telecom and Mass Communications in its roadmap for the "end-to-end" digital technology "Neurotechnologies and Artificial Intelligence", in 2019 this spending amounted to $29.2 billion (1,985.6 billion rubles), and by 2024 it will reach $137.2 billion (better not to convert that into rubles; it is "big data"). I think IT people will have plenty to do, and that is good. What is bad is that you can only get at this money by being included in the list of chosen firms. Just do not lament about the sawing: the only ones who make noise about it are those who do not take part in it but dream of doing so.



Meanwhile, "artificial intelligence" is making its way en masse into the regulatory documents of regional and federal institutions at every level. Run a query on any website of regulatory documents and you will get hundreds and thousands of links to documents from different regions of the Russian Federation, especially on the "Digital Educational Environment" project. Yet descriptions of actual results from the use of AI are somehow scarce, and no one is boasting about them.



This is how the semantic problem of IT becomes financial and political.



About state regulation



Naturally, the leading states are trying to regulate such ambiguous processes in this rapidly developing sector of the economy. Against this background, the leadership of the Russian Federation looks good. At least, all the documents on state innovations in the IT sphere that I have had occasion to analyze are professionally made and could serve as an example for private initiative. Moreover, many IT projects have already been implemented and personally evoke in me, if not admiration, then certainly positive emotions. One example of IT innovation implemented in practice is the digitalization of the Federal Tax Service (FTS), carried out under the leadership of Mikhail Mishustin. A powerful technological base and an "adaptive platform" of tax administration have been created, working in real time exclusively with digital data sources and digital identities of taxpayers. I recommend reading more about this on the TAdviser portal or on the website of the Ministry of Telecom and Mass Communications.



Right now, the Government of the Russian Federation and various state structures are undergoing global automation (nowadays it is fashionable to say "digitalization") of the entire system of public administration within the framework of the national program "Digital Economy of the Russian Federation". The program runs until the end of 2024 and includes six federal projects:



  1. Digital environment regulation
  2. Information security
  3. Information infrastructure
  4. Digital technologies
  5. Human resources for the digital economy
  6. Digital government


What is interesting for our topic is that the authors of these projects' documents are very careful in their use of the latest terminology. If large volumes of data are meant (and the FTS systems really do process huge streams of real-time data from an enormous number of users), then they write "large volumes of data", not some special dedicated technologies. "Intelligence" likewise occurs only in its prosaic sense, for example as "intellectual property" or "intellectual activity". "Big data" and "artificial intelligence" appear only as promising technologies of some future. For now, all these projects have ambitious, but quite ordinary for IT projects, goals, objectives and ways of achieving them.



Of course, the development of the promising IT that only the lazy do not talk about has also come under government regulation. For example, by decree of our President No. 490 of October 10, 2019, the "National Strategy for the Development of Artificial Intelligence for the Period until 2030" was approved. The authors of the strategy are great optimists, one might even say futurists! To forecast the development of computer technology ten years ahead, you must be very self-confident individuals, or else count, like Khoja Nasreddin, on someone dying (that is, retiring) in the meantime: "either the emir, or the donkey, or me... And then go and figure out which of the three of us knew theology better." Theology here meaning artificial intelligence.



Nevertheless, the document is professionally made. I recommend it to everyone familiar with this topic. It contains my favorite clause, "basic concepts", and a definition of the very concept of "artificial intelligence". But "digital transformation" itself and its other elements mentioned above are not mentioned at all. The strategy specifies priority directions, goals and main tasks of AI development, which are no different from those in any other area of information technology. So there is still no clarity about what "artificial intelligence" is.



By the same decree, the Government was instructed to develop and approve, by December 15, 2019, yet another federal project, "Artificial Intelligence", within the framework of the national program "Digital Economy of the Russian Federation". But apart from reports in the media, I have found nothing about it. It looks like something went wrong!



Legislators are also contributing to the development of IT. In Russia, Federal Law No. 149-FZ of 27.07.2006 "On Information, Information Technologies and Information Protection" is in force. But it is more about "information" than about "information technologies". Its section "Article 2. Basic concepts used in this Federal Law" is rather stingy and, from my point of view, outdated even for 2006. And even as amended on 03.04.2020, the law knows nothing about the digitalization of the country.



According to the ComNews portal, there was a draft law from the Ministry of Telecom and Mass Communications on regulating the "big data" market. It dealt with amendments to the Law on Information, Information Technologies and Information Protection, introducing new rules for handling big data. The bill was criticized, and according to the TAdviser portal the Ministry of Telecom and Mass Communications announced its withdrawal on June 15, 2020. True, I found no information about this on the Ministry's own website. The main criticism was that the term "big data" was worded too broadly, so that it covered any publicly available information, whose dissemination and processing would then be regulated by this law.



As a citizen of the Russian Federation, I can still accept that the President opens funding for certain developments by decree, but when the state wants to regulate our handling of that same "big", or even "small", data, it is somewhat unnerving. "Data" is, after all, a technical concept, unlike "information", and the state has no business regulating technical problems.



In general, I assess positively the work of our executive branch in developing IT. Various documents appear regularly, and bodies for the development and review of IT projects have been formed from representatives of the state and big IT business, but that is exactly where democracy ends. How to participate in this process if you are not among the chosen is unclear. Applications for budget subsidies of tens and hundreds of millions of rubles are distributed on a competitive basis. But who actually saw the announcement that a competition had opened and that applications could be submitted? And they must be submitted in time to the Ministry of Telecom and Mass Communications of Russia, within six days, on paper, by mail. The clock starts from the date the notice is published.



Anyone interested can "catch mice" here: https://digital.gov.ru/ru/documents/



About standards



References to standards are mandatory in project documentation; at least that is what I was taught, and so it used to be. It would be nice to consult the same standards when writing articles. But there is one "but": when I ran a search (07/09/2020) for "information technology" on the website of FSUE Standartinform (http://www.gostinfo.ru/), the system returned a list of 215 items! So far I have not managed to make sense of the system behind these documents. They were produced at different times by different organizations, and even in different CIS countries, apparently without any coordination. As a result, they often duplicate or repeat one another. Each GOST is a document of tens or hundreds of pages of small print. In content, many of them resemble textbooks rather than reference books. Accordingly, the same terms diverge from one document to another. Why was all this piled up? A riddle.



For example, in what I consider our main standard, GOST 33707-2016 (ISO/IEC 2382:2015) "Information technology (IT). Dictionary", 548 pages long, it is written that the standard is intended so that, "to avoid misunderstandings and facilitate the exchange of information, it is necessary to determine the correct interpretation of concepts and the conditions of their use." Four parts of this document are devoted to artificial intelligence, while "big data" and "business intelligence" are not mentioned at all. Nor, for that matter, is "digital transformation" itself.



On the subject of artificial intelligence, GOST standards for its use in various sectors of the economy keep multiplying. And this is only the beginning; what lies ahead is frightening to imagine. Take, for example, GOST R 43.0.8-2017 "Information support of equipment and operator activity. Artificially intellectualized human-information interaction. General provisions". I recommend it as bedtime reading. I am quite sure there are things I do not know, and in some areas I am a complete layman, but I still have thirty years of work in the IT industry behind me. Yet when you study this standard, you feel like a complete nonentity. It seems this GOST was written by aliens in some Russian-like language, and it cannot be translated into human Russian without artificial intelligence!



This year, 2020, a draft standard for "big data" has been developed: GOST R ISO/IEC 20546-2019 "Information technology. Big data. Overview and vocabulary". Its authors are the National Center for Digital Economy of Moscow State University. This standard is supposed to become the basis for other national standards in the field. It has not yet been approved. We shall see what comes of it.



In general, standards are supposed to establish terms and definitions that become mandatory for use in all types of documentation and literature in the relevant scientific and technical field. In practice, for now, this does not happen: there is no generally accepted interpretation, no "avoiding misunderstandings and facilitating the exchange of information".



Nevertheless, the standards must be used. And no one forbids us IT specialists from finding a common language. Let us find it, and the standards will get better.



A summary, to be continued



So, colleagues, we can state that we are living at the beginning of a new IT era, the era of digital transformation. Though why only an IT era? The transformation has touched every sphere of our life, and this is not fantasy but fact, if only because jokes on the subject have appeared. Google or Yandex, say, "jokes about digital transformation" and you will get some amusement. Not much, but it is there. True, it is mostly the pessimists who joke. The optimists have no time for jokes: they have long been selling this transformation on every corner, under various sauces or simply in a pretty wrapper. The pessimists call them "digital transformers", led by a "transformer boss". I will not go deeper, because it has all been written already, for example in Iskra and Capital.



Both camps are united by one thing: complete freedom of self-expression! And it rests on each side interpreting the same concepts differently.



I am not the first on Habr (and not only here) to ask these questions. Sometimes the problem is even posed quite bluntly, for example: "Big data: big opportunities or big deception", "The big data illusion" or "Stop calling everything AI". But these are sporadic publications, drowned in texts like "Everything you need to know about AI in a few minutes".



Not convincing! Neither side convinces me. I have a practical question: "Why?" And I cannot find the answer. Why is data "big", and how does it differ from "small" data? Why business intelligence programs? It turns out, after all, that I have spent thirty years automating the analysis of data on enterprise activity and did not even know that I was doing "business intelligence". Or is it something else?



Or maybe everything is simple: the consumer wants it! And wants it passionately, because mountains of gold are promised. Is that not why the "investor" carries his money to Forex, although it is known that his probability of a positive result is less than 1%? He passionately wants results quickly and effortlessly! Why climb Everest? It turns out this, too, is on our topic: "Digitalization is like climbing Everest: cool, but pointless." The author of those words climbs Everests himself (it turns out everything is simple: pay, and climb wherever you like). The mortality rate of the undertaking is 2 out of 10. The Sherpa is asked: "Well, you do understand that people go there and die," and the Sherpa replies: "Yes, two people out of ten will probably die, but I am a professional, and two will die, not eight or ten." "With digital transformation, unfortunately, the situation is exactly the same. If you have no experience, the number of positive examples of successful digital transformation is vanishingly small."



With "artificial intelligence" it is even more complicated. It is the dream of humanity. But is what we are now being offered really intelligence? Could it be some kind of play on words?



Or maybe everything is simple. Here is how A. Wasserman explained it:



“As far as I can judge, for about half a century we have been observing a fairly clear pattern: as soon as we manage to solve some problem attributed to the field of artificial intelligence, it is immediately struck off from that field. This was the case with optical text recognition, this was the case with machine translation, and most likely it will be the same with any other similar task. As soon as ways of solving it are found, it will no longer be considered 'intellectual'.”



Or maybe, again, everything is simple. The thing is that in my projects automating enterprise management (both industrial and commercial) I have never once needed "big data technologies", "artificial intelligence" or "business intelligence". All the problems were solved by the "old" proven methods: the data bank and business process automation. Maybe we really do not need to reinvent the wheel yet, or am I missing something?



Still, practice is the criterion of truth! In the overwhelming majority of texts, whether by optimists or pessimists, this very practice is absent. I therefore propose to consider three questions:



1) Are these new technologies genuinely new ways of solving practical problems, or are they marketing technologies for raising the stakes in the struggle for colossal budgets?



The way to answer this question is to answer another one:



2) Are there practical problems that can be solved only with the help of these technologies?



Or the opposite question:



3) Or can these problems be solved faster or cheaper with "traditional" technologies?



Having posed these questions, I suggest starting with "big data" (or "Big Data"). Why with it? I will not even try to explain. It is just that on one of my recent projects this term, used without any real justification, made me thoroughly sick of it. Perhaps it is precisely "big data" that distinguishes the "new era" from the previous ones. We will see.



From big data it seems logical to move on to "business intelligence" (Business Intelligence, or BI; habr.com/ru/post/515976) and then to "artificial intelligence".



Finally, let's look at digital transformation. Is there a difference between the old "automator" and the new "transformer"?



For dessert I will leave my favorite topic: system design. Is there a difference between an "automated system" and an "information system"? Or is "automation" perhaps no longer relevant?



One article per technology, provided, of course, that it turns out to be interesting.



Join us.



Yuri Dushin


