• TECHNOLOGICAL BASIS OF “INDUSTRY 4.0”

    A survey on deep learning in big data analytics

    Industry 4.0, Vol. 5 (2020), Issue 2, pg(s) 68-71

    Over the last few years, deep learning has begun to play an important role in big data analytics solutions. Deep learning is one of the most active research fields in the machine learning community and has achieved unprecedented results in fields such as computer vision, natural language processing and speech recognition. Its ability to extract high-level, complex abstractions and data representations, especially from large volumes of unsupervised data, makes it a valuable tool for big data analytics. In this paper, we review the deep learning architectures that can be used for big data processing. Next, we focus on the analysis and discussion of the challenges of deep learning for big data analytics and their possible solutions. Finally, several open issues and research trends are outlined.
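
    As an illustration of the unsupervised representation learning the survey refers to, the following is a minimal sketch (not taken from the paper) of a single-hidden-layer autoencoder in NumPy; the data, layer sizes and learning rate are illustrative assumptions.

        # Minimal autoencoder sketch: learns a compact representation of
        # unlabelled data by reconstructing its own input (illustrative only).
        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 20))          # unlabelled sample data (assumed)
        n_in, n_hidden = X.shape[1], 5           # compress 20 features into 5

        W1 = rng.normal(scale=0.1, size=(n_in, n_hidden)); b1 = np.zeros(n_hidden)
        W2 = rng.normal(scale=0.1, size=(n_hidden, n_in)); b2 = np.zeros(n_in)

        lr = 0.01
        for epoch in range(200):
            H = np.tanh(X @ W1 + b1)             # encoder: high-level abstraction
            X_hat = H @ W2 + b2                  # decoder: reconstruction
            err = X_hat - X
            # gradients of the mean squared reconstruction error
            grad_W2 = H.T @ err / len(X);        grad_b2 = err.mean(axis=0)
            dH = (err @ W2.T) * (1 - H ** 2)
            grad_W1 = X.T @ dH / len(X);         grad_b1 = dH.mean(axis=0)
            W1 -= lr * grad_W1; b1 -= lr * grad_b1
            W2 -= lr * grad_W2; b2 -= lr * grad_b2

        features = np.tanh(X @ W1 + b1)          # learned features for analytics
        print("compressed shape:", features.shape)

    The learned features can then feed downstream analytics (clustering, classification) in place of the raw high-dimensional records.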

  • BUSINESS & “INDUSTRY 4.0”

    CHANGES IN THE APPLICATION OF METHODS AND TECHNIQUES IN THE IMPLEMENTATION OF MANAGERIAL FUNCTIONS IN THE CONTEXT OF THE IMPACT OF THE FOURTH INDUSTRIAL REVOLUTION

    Industry 4.0, Vol. 4 (2019), Issue 6, pg(s) 306-308

    The tools used in carrying out individual managerial functions are constantly evolving in response to changes in the external environment. In current conditions, the most significant factor affecting the business environment is Industry 4.0, a phenomenon associated mainly with automation, digitalization and the Internet of Things. The main aim is to identify, at the theoretical level and based on a review of the available scientific literature, changes in the application of methods and techniques in carrying out individual managerial functions under the impact of the Fourth Industrial Revolution. The results of the theoretical research indicate that Industry 4.0 is a new topic in the context of management, and research does not yet cover its influence on all of the individual managerial functions, which are planning, organizing, controlling, human resources management and leadership. Most of the scientific work focuses on planning, organizing and controlling. The functions of human resources management and leadership, on the other hand, are not addressed with respect to the effects of the Fourth Industrial Revolution on the tools and methods used for their implementation.

  • Secure big data and IoT with implementation of blockchain

    Security & Future, Vol. 2 (2018), Issue 4, pg(s) 183-185

    BlockChain is a distributed database of records, or public ledger, of all time-stamped transactions stored on every computer in a peer-to-peer network. It allows a secure and transparent transfer of digital goods, including money and intellectual property. Bitcoin, a decentralized digital cryptocurrency, is the first application of BlockChain. The second application is an agreement called a Smart contract, which enables exchanging value or assets between two owners based on a set of conditions included in the contract.

    In this paper, we analyze the possibilities for applying BlockChain to Big Data and IoT. Implementing BlockChain in Big Data confirms that data is accurate and secure and makes data sharing simpler. Industries such as financial services, government and healthcare need to combine BlockChain and Big Data because they maintain repositories full of important data that must be stored and shared. Implementing BlockChain technology provides security for this data and ensures its integrity when shared. BlockChain technology is also seen as a way to secure the Internet of Things (IoT). Applying BlockChain in IoT enables IoT devices to participate in BlockChain transactions and creates new styles of digital interaction. This technology will provide a simple infrastructure for devices to transfer data or money directly and securely using a Smart contract.
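
    To illustrate the integrity property the abstract relies on, below is a minimal sketch (not the paper's implementation) of a hash-linked chain of records in pure Python; the record fields and the tampering scenario are assumptions made for the example.

        # Each block stores the hash of its predecessor, so any change to stored
        # data breaks the chain and is detected on verification (illustrative only).
        import hashlib, json, time

        def block_hash(block):
            return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

        def new_block(data, prev_hash):
            return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

        def verify(chain):
            # every block must reference the hash of the block before it
            return all(curr["prev_hash"] == block_hash(prev)
                       for prev, curr in zip(chain, chain[1:]))

        chain = [new_block({"device": "sensor-1", "reading": 21.5}, "0" * 64)]
        chain.append(new_block({"device": "sensor-2", "reading": 19.8}, block_hash(chain[-1])))
        chain.append(new_block({"device": "sensor-1", "reading": 22.1}, block_hash(chain[-1])))

        print("chain valid:", verify(chain))      # True
        chain[1]["data"]["reading"] = 99.9        # simulate tampering with shared data
        print("chain valid:", verify(chain))      # False: integrity violation detected

    A real deployment would add consensus among the peer-to-peer nodes and Smart contract logic; the sketch only shows why tampering with shared Big Data or IoT records becomes detectable.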

  • SOCIETY & “INDUSTRY 4.0”

    SMART CITIES – DEPENDENCE OF INTELLIGENT TRANSPORTATION SYSTEMS ON CLOUD COMPUTING AND TECHNOLOGIES

    Industry 4.0, Vol. 3 (2018), Issue 3, pg(s) 152-156

    With the development of Cloud technologies, we finally have the tools and solutions needed to start planning and executing efficient urban transportation. The paper presents concepts and ideas for Smart City intelligent transportation and traffic, namely agent-based traffic management systems and vehicular Cloud computing. It discusses their main characteristics and architecture and provides examples where such technologies are already implemented. Lastly, it outlines some challenges that arise from the application of Cloud computing and the transformation of the city into a Smart City.
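
    As a rough illustration of the agent-based idea (not an architecture from the paper), the sketch below shows an intersection agent splitting a signal cycle in proportion to queue lengths reported by sensors; the cycle length, bounds and queue values are assumptions.

        # Toy intersection agent: allocates green time between two approaches
        # based on observed queue lengths (illustrative assumptions throughout).
        import random

        class IntersectionAgent:
            def __init__(self, name, cycle=60, min_green=10):
                self.name, self.cycle, self.min_green = name, cycle, min_green

            def decide_green_time(self, queue_ns, queue_ew):
                total = queue_ns + queue_ew
                share_ns = queue_ns / total if total else 0.5
                green_ns = round(self.cycle * share_ns)
                green_ns = max(self.min_green, min(self.cycle - self.min_green, green_ns))
                return {"north_south": green_ns, "east_west": self.cycle - green_ns}

        random.seed(1)
        agent = IntersectionAgent("A1")
        for minute in range(3):
            queues = (random.randint(0, 20), random.randint(0, 20))  # simulated sensor feed
            print(minute, queues, agent.decide_green_time(*queues))

    In a vehicular Cloud setting, many such agents would exchange state through the Cloud and coordinate across intersections rather than deciding in isolation.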

  • TECHNOLOGICAL BASIS OF “INDUSTRY 4.0”

    BIG DATA AGGREGATION ALGORITHM FOR STORING OBSOLETE DATA

    Industry 4.0, Vol. 3 (2018), Issue 1, pg(s) 20-22

    Many contemporary IoT systems produce data at a large scale. As new portions of data arrive in data storage (a database, etc.), all the previously stored data become obsolete. Most of this obsolete data is excessive and is needed only to observe general trends or anomalies. This research offers a data aggregation algorithm that minimizes the amount of stored obsolete data according to defined business rules. Several modifications of the algorithm are discussed to fit different kinds of business requirements. A comparison of two data-merging methods within the algorithm, quantization and clustering, is also made.
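
    A minimal sketch of the quantization variant (not the authors' code) is shown below: records older than a business-defined age are grouped into coarse time buckets and replaced with per-bucket aggregates, so general trends remain visible while storage shrinks. The age threshold and bucket size are assumed business rules.

        # Quantization-style aggregation of obsolete IoT readings (illustrative only).
        from statistics import mean

        def aggregate_obsolete(readings, now, obsolete_after=3600, bucket_size=600):
            """readings: list of (timestamp, value). Recent rows are kept verbatim;
            older rows are quantized into fixed-width time buckets."""
            fresh = [(t, v) for t, v in readings if now - t <= obsolete_after]
            old   = [(t, v) for t, v in readings if now - t > obsolete_after]

            buckets = {}
            for t, v in old:
                buckets.setdefault(t // bucket_size, []).append(v)

            # one row per bucket: start time, mean, min, max and original count
            aggregated = [(b * bucket_size, mean(vs), min(vs), max(vs), len(vs))
                          for b, vs in sorted(buckets.items())]
            return aggregated, fresh

        # 240 one-minute readings collapse to 18 bucket rows plus 60 fresh rows
        readings = [(60 * i, 20.0 + (i % 7)) for i in range(240)]
        aggregated, fresh = aggregate_obsolete(readings, now=240 * 60)
        print(len(readings), "->", len(aggregated) + len(fresh), "stored rows")

    A clustering-based merge would instead group old records by similarity of their values rather than by fixed time windows; the paper compares the two approaches.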

  • BUSINESS & “INDUSTRY 4.0”

    SUGGESTED INDICATORS TO MEASURE THE IMPACT OF INDUSTRY 4.0 ON TOTAL QUALITY MANAGEMENT

    Industry 4.0, Vol. 2 (2017), Issue 6, pg(s) 298-301

    The development of “Smart Factories”, characterized by the arrival of the “Internet of Things”, “Cyber-Physical Systems”, “Cloud Computing”, “Big Data”, etc., has become widespread in industrialized economies. Several studies have highlighted the impact of utilizing such technologies (so-called Industry 4.0) on industry, i.e. the enhancement of product quality, manufacturing processes and customer satisfaction. However, very few researchers have focused on determining the impact of Industry 4.0 on enhancing the practice of Total Quality Management (TQM). This paper identified the set of qualitative and quantitative measures that can be used to determine the impact of implementing Industry 4.0 technologies at any industrial firm from a TQM perspective. The paper explored the TQM principles, identified the qualitative and quantitative measures to be assessed, and suggested data-gathering sources and analysis techniques; hence, further research will be able to determine the quantitative impact of Industry 4.0 on TQM.

  • SCIENCE

    MODEL FOR A SMART AND SAFE CITY

    Science. Business. Society., Vol. 1 (2016), Issue 3, pg(s) 6-10

    The idea of smart cities is timely, considering that urbanisation is inevitable. While the smart city as a concept has gained popularity over the past few years, its definition remains vague, as multiple aspects, including governance, public transport and traffic, waste management, entertainment and safety, among others, need to be considered.

    Europe is among the most urbanised regions on the globe. It is estimated that by 2020 around 80% of Europeans will be living in urban areas, and in several countries the proportion will be 90% or more. As we continue to gravitate towards urban hubs, we need smart cities: places where networks and services are made more efficient through the use of digital and telecommunication technologies for the benefit of inhabitants, businesses and the environment. The EU is trying to ensure that smart solutions for cities can be explored, implemented and replicated.

    The present paper addresses safety as one of the aspects of the smart city, and specific models of a smart and safe city are discussed.

  • TECHNOLOGIES

    A MAPREDUCE SOLUTION FOR HANDLING LARGE DATA EFFICIENTLY

    Machines. Technologies. Materials., Vol. 10 (2016), Issue 12, pg(s) 20-23

    Big data is large-volume, heterogeneous, distributed data. In big data applications, where data collection grows continuously, it is expensive to manage, capture, extract and process data using existing software tools. With the increasing size of data in data warehouses, it is also expensive to perform data analysis. In recent years, numerous computation- and data-intensive scientific data analyses have been established. To perform large-scale data mining analyses that meet the scalability and performance requirements of big data, several efficient parallel and concurrent algorithms have been applied. For data processing, big data processing frameworks rely on cluster computers and the parallel execution framework provided by MapReduce. MapReduce is a parallel programming model and an associated implementation for processing and generating large data sets. In this paper, we examine MapReduce, present a MapReduce solution for handling large data efficiently, discuss its advantages and disadvantages, and show how it can be integrated with other technologies.
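
    To make the model concrete, here is a minimal single-machine sketch of the map, shuffle and reduce phases in Python, using the standard word-count example; an actual deployment would run the same logic on a cluster framework such as Hadoop, and the input documents are invented for the example.

        # Word count with explicit map / shuffle / reduce phases (illustrative only).
        from collections import defaultdict

        def map_phase(record):
            # map: emit (key, value) pairs for one input record
            for word in record.split():
                yield word.lower(), 1

        def shuffle(pairs):
            # shuffle: group all emitted values by key (done by the framework)
            grouped = defaultdict(list)
            for key, value in pairs:
                grouped[key].append(value)
            return grouped

        def reduce_phase(key, values):
            # reduce: combine all values for one key into the final result
            return key, sum(values)

        documents = ["big data is large volume data",
                     "MapReduce processes large data sets"]

        mapped  = [pair for doc in documents for pair in map_phase(doc)]
        reduced = dict(reduce_phase(k, vs) for k, vs in shuffle(mapped).items())
        print(reduced["data"])   # 3

    Because map calls are independent and reduce works per key, the same job parallelizes across a cluster, which is what makes MapReduce suitable for the data volumes described above.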