Search results for: enterprise automation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 977

707 A Concept for Flexible Battery Cell Manufacturing from Low to Medium Volumes

Authors: Tim Giesen, Raphael Adamietz, Pablo Mayer, Philipp Stiefel, Patrick Alle, Dirk Schlenker

Abstract:

The competitiveness and success of new electrical energy storage devices such as battery cells depend significantly on a short time-to-market. Producers who decide to supply new battery cells to the market need to adapt their manufacturing easily to early customers’ needs in terms of cell size, materials, delivery time, and quantity. In the initial stage, the required output rates neither justify a fully automated manufacturing line nor permit supplying handmade battery cells. Until now, there has been no solution for manufacturing battery cells in low to medium volumes in a reproducible way. Therefore, a concept for the flexible assembly of battery cells, flexible in terms of cell format and output quantity, was developed at the Fraunhofer Institute for Manufacturing Engineering and Automation. Based on clustered processes, the modular system platform can be modified, enlarged, or retrofitted in a short time frame according to the ordered product. The paper presents the analysis of the production steps of a conventional battery cell assembly line. Process solutions were found using I/O analysis, functional structures, and morphological boxes. The identified elementary functions were then clustered by functional coherence into automation solutions, generating the individual process clusters. The result presented in this paper enables different cell products to be manufactured on the same production system using seven process clusters. The paper shows the solution for batch-wise flexible battery cell production using advanced process control. Furthermore, the tests performed and the benefits of using the process clusters as cyber-physical systems for an integrated production and value chain are discussed. The solution lowers the hurdles for SMEs seeking to launch innovative cell products on the global market.
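The reconfigurable-line idea above can be sketched in code: a product order selects and parameterizes an ordered set of process clusters. This is a hypothetical illustration; the cluster names below are invented and are not the seven clusters identified in the study.

```python
# Hypothetical sketch: a modular assembly line as an ordered set of process
# clusters, reconfigured per product order. Cluster names are illustrative.
from dataclasses import dataclass, field

@dataclass
class ProcessCluster:
    name: str
    params: dict = field(default_factory=dict)

    def run(self, cell):
        # Record which cluster processed the cell, with its configuration.
        cell.setdefault("history", []).append((self.name, dict(self.params)))
        return cell

def build_line(order):
    """Assemble a line from clusters chosen for one product variant."""
    clusters = {
        "electrode_preparation": ProcessCluster("electrode_preparation"),
        "stacking": ProcessCluster("stacking", {"format": order["format"]}),
        "electrolyte_filling": ProcessCluster("electrolyte_filling"),
        "sealing": ProcessCluster("sealing"),
    }
    return [clusters[name] for name in order["sequence"]]

order = {"format": "pouch", "sequence": ["electrode_preparation", "stacking",
                                         "electrolyte_filling", "sealing"]}
cell = {}
for cluster in build_line(order):
    cell = cluster.run(cell)
```

Changing the order's `sequence` or `format` yields a different line configuration without touching the cluster implementations, which is the essence of the modular platform described.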

Keywords: automation, battery production, carrier, advanced process control, cyber-physical system

Procedia PDF Downloads 303
706 Economics of Fish-Plantain Integrated Farm Enterprise in Southern Nigeria

Authors: S. O. Obasa, J. A. Soaga, O. I. Afolabi, N. A. Bamidele, O. E. Babalola

Abstract:

Attempts to improve the income of the rural population are a welcome development in Nigeria. Integrated fish-crop farming has been suggested as a means of raising farm income, reducing wastage, and mitigating the risk component in production through the complementarity gain. A feeding trial was carried out to investigate the replacement of maize with fermented unripe plantain (Musa paradisiaca) peel meal in the diet of Nile tilapia, Oreochromis niloticus. The economics of the integrated enterprise was assessed using budgetary analysis techniques. The analysis incorporated the material and labour costs as well as the returns from the sale of mature fish and plantain. A total of 60 fingerlings of Nile tilapia (1.70±0.1 g) were stocked at 10 per plastic tank. Two iso-nitrogenous diets containing 35% crude protein, in which maize meal was replaced by fermented unripe plantain peel meal at 0% (FUP0/control diet) and 100% (FUP100), were formulated and prepared. The fingerlings were fed at 5% body weight per day for 56 days. The lowest feed conversion ratio, 1.39 in fish fed diet FUP100, was not significantly different (P > 0.05) from the highest, 1.42 in fish fed the control diet. The highest percentage profit, 88.85% in fish fed diet FUP100, was significantly higher than the 66.68% in fish fed diet FUP0, while the profit index of 1.89 in fish fed diet FUP100 was significantly different from the 1.67 in fish fed diet FUP0. Therefore, fermented unripe plantain peel meal can completely replace maize in the diet of O. niloticus fingerlings. The profitability assessment shows that the net income from the integration was ₦463,000 per hectare, an additional ₦87,750.00 representing a 12.2% increase over separate production.

Keywords: fish-crop, income, Nile tilapia, waste management

Procedia PDF Downloads 465
705 Human Resource Information System: Role in HRM Practices and Organizational Performance

Authors: Ejaz Ali M. Phil

Abstract:

Enterprise Resource Planning (ERP) systems play a vital role in the effective management of business functions in large and complex organizations. The Human Resource Information System (HRIS) is a core module of ERP, providing concrete solutions for implementing Human Resource Management (HRM) practices in an innovative and efficient manner. Over the last decade, there has been a considerable increase in studies of HRIS. Nevertheless, previous studies have largely failed to examine the moderating role of HRIS in the performance of HRM practices that may affect firm performance. The current study examines the impact of HRM practices (training, performance appraisal) on perceived organizational performance, with HRIS as a moderator where the system is in place. The study is based on the Resource-Based View (RBV) and Ability-Motivation-Opportunity (AMO) theories, which hold that strengthening human capital enables an organization to achieve and sustain a competitive advantage, leading to improved organizational performance. Data were collected through a structured questionnaire based on adopted instruments after establishing reliability and validity. Structural equation modeling (SEM) was used to assess model fit, test the hypotheses, and establish the validity of the instruments through Confirmatory Factor Analysis (CFA). A total of 220 employees of 25 corporate-sector firms were sampled through a non-probability sampling technique. Path analysis revealed that HRM practices and HRIS have a significant positive impact on organizational performance. The results further showed that HRIS moderated the relationships between training, performance appraisal, and organizational performance. The interpretation of the findings, the limitations, and the theoretical and managerial implications are discussed.
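The moderation hypothesis above can be illustrated numerically: if HRIS moderates the training-performance link, the slope of performance on training differs between firms with low and high HRIS use. This is a simplified split-group sketch, not the authors' SEM; all data are invented.

```python
# Illustrative moderation check (not the study's SEM): compare the slope of
# performance on training in low- vs high-HRIS groups. Data are fabricated.
def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# (training score, performance score) pairs for two groups of firms
low_hris = [(1, 2.1), (2, 2.4), (3, 2.6), (4, 2.9)]
high_hris = [(1, 2.0), (2, 3.0), (3, 4.1), (4, 5.0)]

slope_low = slope(*zip(*low_hris))
slope_high = slope(*zip(*high_hris))
moderated = slope_high > slope_low  # steeper slope under high HRIS use
```

In a full analysis this difference would be tested via an interaction term (training × HRIS) rather than an eyeball comparison of slopes.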

Keywords: enterprise resource planning, human resource, information system, human capital

Procedia PDF Downloads 365
704 Automated Manual Handling Risk Assessments: Practitioner Experienced Determinants of Automated Risk Analysis and Reporting Being a Benefit or Distraction

Authors: S. Cowley, M. Lawrance, D. Bick, R. McCord

Abstract:

Technology that automates manual handling (musculoskeletal disorder, or MSD) risk assessments is increasingly available to ergonomists, engineers, and generalist health and safety practitioners alike. The risk assessment process is generally based on wearable motion sensors that capture information about worker movements for real-time or post-hoc analysis. Traditionally, MSD risk assessment is undertaken with the assistance of a checklist, such as that from the SafeWork Australia code of practice, with the expert assessor observing the task and ideally engaging the worker in a discussion about the detail. Automation enables the non-expert to complete assessments and does not always require the assessor to be present. This clearly has cost and time benefits for the practitioner, but is it an improvement on assessment by a human? Human risk assessments draw on the knowledge and expertise of the assessor but, like all risk assessments, are highly subjective. The complexity of the checklists and models used in the process can be off-putting and can sometimes lead to the assessment becoming the focus and the end rather than a means to an end; the focus on risk control is lost. Automated risk assessment handles the complexity of the assessment for the assessor and delivers a simple risk score that enables decision-making regarding risk control. Being machine-based, it is objective and will deliver the same result each time it assesses an identical task. However, the WHS professional needs to know whether this emergent technology asks the right questions and delivers the right answers, and whether it improves the risk assessment process and its results or simply distances the professional from the task and the worker. They need clarity as to whether automation of manual task risk analysis and reporting leads to risk control or away from engagement with the worker. Critically, they need evidence as to whether automation in this area of hazard management leads to better risk control or just a bigger collection of assessments. Examining the practitioner-experienced determinants of automated manual task risk analysis and reporting being a benefit or a distraction will build an understanding of emergent risk assessment technology, its use, and the factors to consider when deciding whether to adopt and apply these technologies.
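The "simple risk score" such tools deliver can be sketched as follows. This is an illustrative stand-in, not any assessed product's algorithm: it reduces a stream of trunk-flexion angles from a wearable sensor to one number, and the thresholds and weights are hypothetical.

```python
# Illustrative sketch (not a real product's method): reduce a stream of
# trunk-flexion angles to a 0-100 risk score. Thresholds are hypothetical.
def risk_score(angles_deg, moderate=20.0, high=60.0):
    """Score from the share of samples in each posture band."""
    if not angles_deg:
        return 0.0
    n = len(angles_deg)
    share_moderate = sum(moderate <= a < high for a in angles_deg) / n
    share_high = sum(a >= high for a in angles_deg) / n
    # Weight sustained high flexion more heavily than moderate flexion.
    return round(100 * (0.4 * share_moderate + 1.0 * share_high), 1)

samples = [5, 12, 25, 30, 65, 70, 8, 40]  # degrees, fabricated example data
score = risk_score(samples)
```

The sketch also makes the paper's concern concrete: everything the checklist-plus-conversation approach captures about context and the worker is collapsed into the choice of bands and weights.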

Keywords: automated, manual-handling, risk-assessment, machine-based

Procedia PDF Downloads 92
703 Human Digital Twin for Personal Conversation Automation Using Supervised Machine Learning Approaches

Authors: Aya Salama

Abstract:

The Digital Twin is an emerging research topic that has attracted researchers over the last decade. It is used in many fields, such as smart manufacturing and smart healthcare, because it saves time and money. It is usually related to other technologies such as Data Mining, Artificial Intelligence, and Machine Learning. The Human Digital Twin (HDT), in particular, is still a novel idea whose feasibility remains to be proven. HDT extends the idea of the Digital Twin to human beings, who, unlike inanimate physical entities, are living beings. The goal of this research was to create a Human Digital Twin responsible for automating human replies in real time by simulating human behavior. To this end, clustering, supervised classification, topic extraction, and sentiment analysis were studied in this paper. The feasibility of the HDT for generating personal replies in social messaging applications was demonstrated in this work. The overall accuracy of the proposed approach was 63%, a promising result that can open the way for researchers to expand the idea of HDT. This was achieved by using Random Forest for clustering the question database and matching new questions, and k-nearest neighbor for sentiment analysis.

Keywords: human digital twin, sentiment analysis, topic extraction, supervised machine learning, unsupervised machine learning, classification, clustering

Procedia PDF Downloads 62
702 Formation of the Investment Portfolio of Intangible Assets with a Wide Pairwise Comparison Matrix Application

Authors: Gulnara Galeeva

Abstract:

The Analytic Hierarchy Process is widely used in economic and financial studies, including the formation of investment portfolios. This study examines a generalized method of obtaining a priority vector for the case where separate pairwise comparisons of expert opinion are presented as a set of several equal evaluations on a ratio scale. The author argues that this method solves an important and current problem in decision-making theory: excluding the vagueness and ambiguity of expert opinion. The study describes the author's wide pairwise comparison matrix and considers its application to forming an efficient investment portfolio of intangible assets for a small business enterprise with limited funding. The proposed method was validated on the practical example of a functioning dental clinic. The result of the study confirms that the wide pairwise comparison matrix can be used as a simple and reliable method for forming an enterprise's investment policy. Moreover, the method based on the wide pairwise comparison matrix was compared with the classical Analytic Hierarchy Process, and the results of the comparative analysis confirm its correctness. The wide pairwise comparison matrix also permits wide use of statistical methods of experimental data processing for obtaining the priority vector. The new method is accessible to non-specialist users and gives approximately the same accuracy as the classical hierarchy process. Financial directors of small and medium-sized enterprises thus gain the ability to solve their companies' investment problems without resorting to the services of analytical agencies specializing in such studies.
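For readers unfamiliar with AHP, the basic step the paper generalizes is deriving a priority vector from a pairwise comparison matrix. The sketch below uses the row geometric-mean approximation on an ordinary (single-judgment) matrix; the paper's wide matrix generalizes each entry to a set of ratio-scale evaluations. The example judgments are invented.

```python
# Minimal AHP sketch: priority vector from a pairwise comparison matrix via
# the row geometric-mean approximation. Judgments below are illustrative.
import math

def priorities(M):
    """Normalized row geometric means of a positive reciprocal matrix."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Three intangible assets compared pairwise on Saaty's 1-9 ratio scale.
M = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w = priorities(M)  # e.g. budget shares for the three assets
```

The classical AHP instead takes the principal eigenvector of `M`; for consistent matrices the two coincide, which is why the geometric mean is a common simple approximation.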

Keywords: analytic hierarchy process, decision processes, investment portfolio, intangible assets

Procedia PDF Downloads 237
701 Promoting Personhood and Citizenship Amongst Individuals with Learning Disabilities: An Occupational Therapy Approach

Authors: Rebecca Haythorne

Abstract:

Background: Agendas continuously emphasise the need to increase work-based training and opportunities for individuals with learning disabilities. However, research and statistics suggest that there are still significant stigma and stereotypes about what these individuals can contribute to, or gain from, being part of the working environment. Method: To tackle some of these prejudices, an Occupational Therapy based intervention was developed for learning disability service users working at a social enterprise farm. The intervention aimed to increase positive public perception of individual capabilities and to encourage individuals with learning disabilities to take ownership of, and be proud of, their individual personhood and citizenship. This was achieved by using components of the Model of Human Occupation to tailor the intervention to individual values, skills, and working contributions. The final project involved making creative wall art for public viewing, focusing on 'who works there and what they do'. This was accompanied by a visitor information guide, allowing individuals to tell visitors about themselves, the work they do, and why it is meaningful to them. Outcomes: The intervention has helped to increase the mental well-being and confidence of learning disability service users: “people will know I work here now” and “I now have something to show my family about the work I do at the farm”. The intervention has also increased positive public perception and community awareness: “you can really see the effort that’s gone into doing this” and “it’s a really visual experience to see people you don’t expect to see doing this type of work”. Resources left behind have further supported individuals to take ownership in creating more wall art to be sold at the farm shop. Conclusion: The intervention has helped to improve the mental well-being of both service users and staff and to improve community awareness. As a result, the farm has decided to roll out the intervention to other areas of the social enterprise and is considering more Occupational Therapy involvement in the future.

Keywords: citizenship, intervention, occupational therapy, personhood

Procedia PDF Downloads 431
700 Adding Business Value in Enterprise Applications through Quality Matrices Using Agile

Authors: Afshan Saad, Muhammad Saad, Shah Muhammad Emaduddin

Abstract:

Nowadays the business environment is so fast-paced that continuous improvement has become a major factor in the survival of an enterprise. This holds for structural engineering and even more so in the fast-paced world of information technology and software engineering. Agile methodologies such as Scrum include a dedicated step in the process that targets the improvement of the development process and of software products. Crucial to process improvement is gaining data that permits you to assess the state of the process and its products. From that status data, you can plan actions for improvement and also evaluate the success of those actions. This study builds a model that measures the software quality of the development process. Software quality depends on the functional and structural quality of the software products; in addition, the quality of the development process is also important for improving software quality. Functional quality covers adherence to user requirements, while structural quality addresses the structure of the software product's source code with respect to its maintainability. Process quality relates to the consistency and predictability of the development process. The software quality model is applied in a business setting by gathering the data for the software metrics in the model. To evaluate the software quality model, we analyze the data and present it to the people involved in the agile software development process. The results from the application and the user feedback suggest that the model enables a reasonable assessment of software quality and that it can be used to support the continuous improvement of the development process and software products.
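A quality model of the kind described combines normalized functional, structural, and process metrics into one score. The sketch below is hypothetical: the metric names and weights are invented for illustration, not taken from the study's model.

```python
# Hypothetical sketch of a software quality model: normalized functional,
# structural, and process metrics combined into a weighted score.
def quality_score(metrics, weights=None):
    """Weighted sum of normalized (0-1) metric values."""
    weights = weights or {"functional": 0.4, "structural": 0.3, "process": 0.3}
    return round(sum(metrics[k] * w for k, w in weights.items()), 3)

sprint = {
    "functional": 0.9,   # e.g. share of acceptance tests passing
    "structural": 0.7,   # e.g. normalized maintainability index
    "process": 0.8,      # e.g. stability of sprint velocity
}
score = quality_score(sprint)
```

Tracking such a score sprint over sprint is what makes it usable in the Scrum improvement step the abstract mentions: the trend, not the absolute value, drives the retrospective.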

Keywords: Agile SDLC Tools, Agile Software development, business value, enterprise applications, IBM, IBM Rational Team Concert, RTC, software quality, software metrics

Procedia PDF Downloads 142
699 Another Justice: Litigation Masters in Chinese Legal Story

Authors: Lung-Lung Hu

Abstract:

Ronald Dworkin offered a legal theory of the 'chain enterprise', in which all the judges in legal history together create a 'law' aimed at a specific purpose. These judges are like the co-writers of a chain story, who not only create freely but are also constrained by the story written by the judges before them. The law created by traditional Chinese judges is a different case: compared with the judges Dworkin describes, they had relatively less latitude to pass sentence at their own discretion, because the statutes of traditional Chinese law were designed from the outset as a penal code that leaves little room for judicial discretion. Furthermore, because the law represented the authority of the government, i.e., the emperor, any misjudgment or misuse deviating from the law was considered a challenge to the supreme power. However, unlike judges as defenders of the law, Chinese litigation masters who wanted to win legal cases had to be offenders, challenging verdicts that did not favor their own or their clients' interests. Moreover, as an illegal, unauthorized profession, litigation masters did not belong to any legal system and were therefore relatively freer to 'create' the law. In his articles questioning Ronald Dworkin's and Owen Fiss's ideas about law, Stanley Fish construes that, since law is made of language, law is open to interpretations that cannot be constrained by any rules or any particular legal purposes. Fish's idea can also be applied to the analysis of the stories of Chinese litigation masters in traditional Chinese literature. These litigation masters' legal opinions in the so-called chain enterprise are like an unexpected episode that tries to revise the fixed story told by the law. Although they were welcomed neither by officials nor by society, their existence is a phenomenon representing another version of justice, different from the official one, and can be seen as a de-structural power against the government. Hence, this paper analyses the language and strategies applied by Chinese litigation masters in Chinese legal stories to see how they refuted settled legal judgments and challenged the official standard of justice.

Keywords: Chinese legal stories, interdisciplinary, litigation master, post-structuralism

Procedia PDF Downloads 358
698 DesignChain: Automated Design of Products Featuring a Large Number of Variants

Authors: Lars Rödel, Jonas Krebs, Gregor Müller

Abstract:

The growing price pressure due to the increasing number of global suppliers, the growing individualization of products, and ever-shorter delivery times are upcoming challenges for industry. In this context, mass personalization stands for the individualized production of customer products in batch size 1 at the price of standardized products. The possibilities of digitalization and automation of technical order processing give companies the opportunity to significantly reduce their cost of complexity and their lead times and thus enhance their competitiveness. Many companies already use a range of CAx tools and configuration solutions today. Often, however, the expert knowledge of employees is hidden in "knowledge silos" and is rarely networked across processes. DesignChain describes the automated digital process from the recording of individual customer requirements, through design and technical preparation, to production. Configurators offer the possibility of mapping variant-rich products within the DesignChain. This transformation of customer requirements into product features makes it possible to generate even complex CAD models, such as those for large-scale plants, in a rule-based manner. With the aid of an automated CAx chain, production-relevant documents are then transferred digitally to production. This process, which can be fully automated, ensures that variants are always generated on the basis of current version statuses.
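The rule-based transformation of customer requirements into product features can be sketched as a small configurator. This is an illustration only: the rules, parameter names, and thresholds below are invented and are not taken from the DesignChain implementation.

```python
# Hedged sketch of a configurator rule set: recorded customer requirements
# are translated into parameters of a variant-rich CAD model.
# All rules and names are illustrative.
def configure(requirements):
    params = {"tank_volume_l": requirements["capacity_l"]}
    # Rule: wall thickness scales with capacity, with a minimum gauge.
    params["wall_mm"] = max(3.0, requirements["capacity_l"] / 1000.0)
    # Rule: above 5000 l, a reinforcing ring is added to the CAD model.
    params["reinforcing_ring"] = requirements["capacity_l"] > 5000
    return params

variant_a = configure({"capacity_l": 2000})
variant_b = configure({"capacity_l": 8000})
```

In a full DesignChain, the dictionary returned here would drive a parametric CAD template, and downstream rules would generate the production-relevant documents from the same parameter set.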

Keywords: automation, design, CAD, CAx

Procedia PDF Downloads 47
697 Revolutionizing Gaming Setup Design: Utilizing Generative and Iterative Methods to Prop and Environment Design, Transforming the Landscape of Game Development Through Automation and Innovation

Authors: Rashmi Malik, Videep Mishra

Abstract:

Generative design has become a transformative, efficient approach to producing multiple iterations of any design project. The conventional way of modeling game elements is very time-consuming and requires skilled artists. Traditionally, a 3D modeling tool such as 3ds Max or Blender is used to create the game library, which takes considerable time to model. This study focuses on using generative design tools to increase efficiency in game development at the prop and environment generation stage. This involves procedural level generation and customized, regulated, or randomized asset generation. The paper presents a system design approach using generative tools such as Grasshopper (visual scripting) and other scripting tools to automate the modeling of the game library. A single script can generate multiple products, creating a system that lets designers and artists customize props and environments. The main goal is to measure the efficacy of the automated system in creating a wide variety of game elements, reducing the need for manual content creation, and integrating it into the workflow of AAA and indie games.
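The core idea, one script producing many regulated or randomized variants, can be shown with seeded procedural generation. The "crate" parameters below are invented for illustration; in Grasshopper the same logic would be expressed as a visual-scripting graph.

```python
# Sketch of seeded procedural asset generation: one script, many regulated
# variants. The crate parameters are invented for illustration.
import random

def generate_crates(count, seed=42):
    rng = random.Random(seed)  # a fixed seed makes variants reproducible
    crates = []
    for _ in range(count):
        crates.append({
            "width": round(rng.uniform(0.8, 1.6), 2),
            "height": round(rng.uniform(0.8, 1.6), 2),
            "weathering": rng.choice(["clean", "worn", "damaged"]),
        })
    return crates

props = generate_crates(5)
```

Seeding is what makes the generation "regulated" rather than merely random: the same seed always reproduces the same prop set, so a level can be regenerated identically while a new seed yields a fresh variation.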

Keywords: iterative game design, generative design, gaming asset automation, generative game design

Procedia PDF Downloads 40
696 The Role of Robotization in Reshoring: An Overview of the Implications on International Trade

Authors: Thinh Huu Nguyen, Shahab Sharfaei, Jindřich Soukup

Abstract:

In the pursuit of reducing production costs, offshoring has been a major trend throughout global value chains for many decades. However, with the rise of advanced technologies, new opportunities to automate production are changing the motivation of multinational firms to go offshore. Instead, many firms are working to relocate their offshored activities from developing economies back to their home countries. This phenomenon, known as reshoring, has recently garnered much attention, as it has become clear that automation in advanced countries might have major implications not only for their own economies but also, through international trade, for the economies of low-income countries, including their labor market outcomes and comparative advantages. Thus, while using robots to substitute for human labor may lower the relative cost of producing at home, it has the potential to decrease employment in, and demand for exports from, developing economies through reshoring. In this paper, we review the recent literature to provide a further understanding of the relationship between robotization and the reshoring of production. Moreover, we analyze the impact of robot adoption on international trade in both developed and emerging markets. Finally, we identify research gaps and provide avenues for future research in international economics. This study is part of a project funded by the Internal Grant Agency (IGA) of the Faculty of Business Administration, Prague University of Economics and Business.

Keywords: automation, robotization, reshoring, international trade

Procedia PDF Downloads 74
695 Automated Irrigation System with Programmable Logic Controller and Photovoltaic Energy

Authors: J. P. Reges, L. C. S. Mazza, E. J. Braga, J. A. Bessa, A. R. Alexandria

Abstract:

This paper proposes the development of control and automation for an irrigation system serving a sunflower crop at the Teaching, Research and Extension Unit (UEPE) on the Apodi Plateau in Limoeiro do Norte. Sunflower is grown for the oil extracted from its seeds and for animal feed, and it is widely used in human food; its nutritional potential is high, making foods produced from it rich and healthy. The focus of the research is to make the sunflower crop's irrigation system autonomous through programmable logic control, powered by an alternative energy source: photovoltaic solar power. An automated irrigation system becomes attractive when it provides convenience and enables new forms of management for irrigated cropping systems. Automation not only improves irrigation quality but also brings substantial improvements to small-scale production. In addition to providing the necessary and sufficient features of water management in irrigation systems, the system (PLC + actuators + renewable energy) makes it possible to manage the quantity of water required by each crop while introducing alternative energy sources. The automated system replaces the practice of previous years, an entirely manual irrigation process prone to water wastage.
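The control logic such a PLC would run can be illustrated with a moisture set-point and hysteresis, so the pump does not switch rapidly around a single threshold. The set-points and sensor readings below are hypothetical, not taken from the study.

```python
# Illustrative stand-in for the PLC logic: a soil-moisture set-point with
# hysteresis. Set-points and readings are hypothetical.
def pump_command(moisture_pct, pump_on, low=30.0, high=45.0):
    """Turn the pump on below `low`, off above `high`, else hold state."""
    if moisture_pct < low:
        return True
    if moisture_pct > high:
        return False
    return pump_on

state = False
log = []
for reading in [50, 40, 28, 35, 46, 44]:  # one reading per scan cycle
    state = pump_command(reading, state)
    log.append(state)
```

On a real PLC this would be one rung per condition in ladder logic, scanned cyclically; the hysteresis band plays the same role as a latch between the low and high comparators.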

Keywords: automation, control, sunflower, irrigation, programming, renewable energy

Procedia PDF Downloads 381
694 Fractional, Component and Morphological Composition of Ambient Air Dust in the Areas of Mining Industry

Authors: S.V. Kleyn, S.Yu. Zagorodnov, A.A. Kokoulina

Abstract:

Technogenic emissions from mining and processing complexes are characterized by a high content of chemical components and solid dust particles. However, each industrial enterprise and its surrounding area have features that require refinement and parameterization. Numerous studies have shown the negative health impact of fine dust PM10 and PM2.5, as well as the possibility of dust particles absorbing toxic components, including heavy metals. The aim of the study was the quantitative assessment of the fractional and particle size composition of ambient air dust in the area affected by a primary magnesium production complex. We also describe the morphological features of the dust particles. Study methods: To identify the dust emission sources, the production process was analyzed. The particulate composition of the emissions was measured using a Microtrac S3500 laser particle analyzer (covering a particle size range of 20 nm to 2000 µm). Particle morphology and component composition were established by electron microscopy, using a high-resolution scanning microscope (magnification 5× to 300,000×) with an S3400N 'HITACHI' X-ray fluorescence device. The chemical composition was identified by X-ray analysis of the samples using an XRD-700 'Shimadzu' X-ray diffractometer. The level of dust pollution was determined using model calculations of the dispersion of emissions in the atmosphere, verified by instrumental studies. Results: The results demonstrated that the dust emissions of different technical processes are heterogeneous and their fractional structure is complex. The percentage of particle sizes up to and including 2.5 micrometres ranged from 0.00 to 56.70%; particle sizes up to and including 10 microns, 0.00-85.60%; particle sizes greater than 10 microns, 14.40-100.00%. During microscopy, nanoscale particles were detected. The dust particles studied have round, irregular, cubic, and composite shapes. The composition of the dust includes magnesium, sodium, potassium, calcium, iron, and chlorine. Based on the results obtained, model calculations of dust emission dispersion were performed and the distribution areas of fine dust PM10 and PM2.5 were established. It was found that emissions of the fine fractions PM10 and PM2.5 are dispersed over large distances, beyond the boundary of the enterprise's industrial site. The population living near the enterprise is exposed to the risk of diseases associated with dust exposure. The data were transferred to the economic entity to support decisions on measures to minimize the risks. Exposure and health risk indicators are used to provide individual health and preventive care to citizens living in the area of the facility's negative impact.
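The fractional percentages reported above come from summing a measured size distribution below each cut-off. The sketch below shows that computation on a fabricated distribution; the bins and shares are invented for illustration.

```python
# Sketch of the fractional analysis: sum the volume shares of size bins at or
# below the PM2.5 and PM10 cut-offs. The distribution is fabricated.
def fraction_below(bins, cutoff_um):
    """Percent of particles in bins whose upper edge is <= cutoff_um."""
    return round(sum(share for size, share in bins if size <= cutoff_um), 2)

# (upper bin edge in µm, percent of particles in bin)
distribution = [(1.0, 12.0), (2.5, 18.0), (5.0, 25.0), (10.0, 20.0),
                (50.0, 15.0), (200.0, 10.0)]
pm2_5 = fraction_below(distribution, 2.5)
pm10 = fraction_below(distribution, 10.0)
```

Applied per emission source, this yields exactly the kind of per-process PM2.5/PM10 percentage ranges the study reports.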

Keywords: dust emissions, exposure assessment, PM10, PM2.5

Procedia PDF Downloads 234
693 Real-Time Big-Data Warehouse a Next-Generation Enterprise Data Warehouse and Analysis Framework

Authors: Abbas Raza Ali

Abstract:

Big Data technology is gradually becoming a dire need of large enterprises. These enterprises generate massively large amounts of offline and streaming data, in both structured and unstructured formats, on a daily basis. It is a challenging task to effectively extract useful insights from such large-scale datasets; indeed, managing more than a few months of transactional data history sometimes becomes a technology constraint. This paper presents a framework to efficiently manage massively large and complex datasets. The framework has been tested at a communication service provider producing massively large, complex streaming data in binary format. The communication industry is required by regulators to retain its subscribers' call records, where every action of a subscriber generates a record. Managing and analyzing transactional data also allows service providers to better understand their customers' behavior; for example, deep packet inspection requires transactional internet usage data to explain subscribers' internet usage behaviour. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated per subscriber. The framework addresses these challenges by leveraging Big Data technology, which optimally manages complex datasets and allows deep analysis of them. The framework has been applied to offload the service provider's existing Intelligent Network Mediation and relational data warehouse onto Big Data technology. The service provider has a subscriber base of more than 50 million, with yearly growth of 7-10%. The end-to-end process takes no more than 10 minutes and involves binary-to-ASCII decoding of call detail records, stitching of all the interrogations of a call (transformation), and aggregation of all the call records of a subscriber.
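The two transformation steps named at the end, stitching a call's interrogation records into one record and then aggregating calls per subscriber, can be sketched as follows. The record fields are invented for illustration; a production pipeline would run this distributed, after the binary-to-ASCII decoding step.

```python
# Hedged sketch of the pipeline's transformations: stitch call fragments by
# call ID, then aggregate per subscriber. Record fields are invented.
from collections import defaultdict

records = [  # (call_id, subscriber, duration_fragment_s)
    ("c1", "sub_a", 20), ("c1", "sub_a", 40),
    ("c2", "sub_a", 15),
    ("c3", "sub_b", 60), ("c3", "sub_b", 5),
]

# Stitch: merge all interrogation fragments of the same call.
calls = defaultdict(lambda: {"subscriber": None, "duration": 0})
for call_id, subscriber, dur in records:
    calls[call_id]["subscriber"] = subscriber
    calls[call_id]["duration"] += dur

# Aggregate: per-subscriber call counts and total seconds.
totals = defaultdict(lambda: {"calls": 0, "seconds": 0})
for call in calls.values():
    totals[call["subscriber"]]["calls"] += 1
    totals[call["subscriber"]]["seconds"] += call["duration"]
```

Both steps are key-grouped reductions, which is why they parallelize naturally on a Big Data stack: stitching shuffles on call ID, aggregation on subscriber ID.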

Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation

Procedia PDF Downloads 147
692 A New Model for Production Forecasting in ERP

Authors: S. F. Wong, W. I. Ho, B. Lin, Q. Huang

Abstract:

ERP systems are used in many enterprises for management. The accuracy of the production forecasting module is vital to the decision making of the enterprise, and it directly affects profit. Enhancing the accuracy of the production forecasting module can therefore also increase efficiency and profitability. To deal with large amounts of data, a suitable, reliable, and accurate statistical model is necessary. LSSVM and the Grey System are the two main models studied in this paper, and a case study is used to demonstrate how effective the combined model is for forecasting.
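As an illustration of the Grey System side of such a combination, a standard GM(1,1) grey forecaster can be sketched in plain Python. The LSSVM component and the paper's specific combination scheme are not reproduced here, and the input series is an invented geometric example.

```python
import math

def gm11_forecast(x0, steps=1):
    """Standard GM(1,1) grey forecaster: accumulate the series, fit the
    whitening equation x0[k] = -a*z1[k] + b by least squares, then forecast
    from the time-response function and de-accumulate."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]               # accumulated series
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    # Two-parameter least squares via the normal equations
    m = n - 1
    szz = sum(z * z for z in z1)
    sz = sum(z1)
    szy = sum(z * y for z, y in zip(z1, x0[1:]))
    sy = sum(x0[1:])
    det = m * szz - sz * sz
    a = -(m * szy - sz * sy) / det
    b = (szz * sy - sz * szy) / det
    # Time-response function, then inverse accumulation for the forecasts
    xhat1 = lambda k: (x0[0] - b / a) * math.exp(-a * k) + b / a
    return [xhat1(n + s) - xhat1(n + s - 1) for s in range(steps)]

forecast = gm11_forecast([1.0, 1.1, 1.21, 1.331])  # 10% geometric growth series
```

For a near-exponential series like this, the one-step forecast lands close to the true next value (1.4641), which is why grey models suit the small-sample production data that ERP modules often face.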

Keywords: ERP, grey system, LSSVM, production forecasting

Procedia PDF Downloads 418
691 Substation Automation, Digitization, Cyber Risk and Chain Risk Management Reliability

Authors: Serzhan Ashirov, Dana Nour, Rafat Rob, Khaled Alotaibi

Abstract:

There has been fast growth in the introduction and use of communications, information, monitoring, and sensing technologies. The new technologies are making their way into Industrial Control Systems as embedded components of products, software applications, and IT services, or are commissioned to enable the integration and automation of increasingly global supply chains. As a result, the lines that separated the physical, digital, and cyber worlds have diminished due to the vast implementation of new, disruptive digital technologies. The variety and increased use of these technologies introduce many cybersecurity risks affecting the cyber-resilience of the supply chain, both in terms of the product or service delivered to a customer and the members of the supply chain operation. The US Department of Energy considers the supply chain in the IR4 (fourth industrial revolution) space to be the weakest link in cybersecurity. IR4 brought the digitization of field devices, followed by digitalization, and eventually moved through the digital transformation space with little care for the newly introduced cybersecurity risks. This paper examines the best methodologies for securing electrical substations from cybersecurity attacks arising from supply chain risks and from digitization efforts. SCADA systems are the most vulnerable part of the power system infrastructure, due to digitization and to weaknesses and vulnerabilities in supply chain security. The paper discusses in detail how to create a secure supply chain methodology, secure substations, and mitigate the risks due to digitization.

Keywords: cybersecurity, supply chain methodology, secure substation, digitization

Procedia PDF Downloads 38
690 A Modernist Project: An Analysis on Dupont’s Translations of Faulkner’s Works

Authors: Edilei Reis, Jose Carlos Felix

Abstract:

This paper explores Wladir Dupont's translations of William Faulkner's novels into Brazilian Portuguese in order to understand how his translation project has addressed the modernist traits of the novelist's fiction, particularly the ambivalence of language, multiple and fragmented points of view, and syntax. Wladir Dupont (1939-2014) was a prolific Brazilian journalist who drew on his experience as an international correspondent living abroad (in the USA and Mexico) to become an acclaimed translator later in life. He received a Jabuti Award (Brazil's most prestigious literary award) for his translation of 'La otra voz' (1994) by the Mexican poet, critic, and translator Octavio Paz, a writer to whom he devoted the first years of his career as a translator. As Dupont pointed out in interviews, the struggle to overcome linguistic and cultural obstacles in translating texts from Spanish to Portuguese was paramount in cementing his engagement in the long-term project of translating Faulkner's fiction into Brazilian Portuguese. His first enterprise was the translation of Faulkner's Snopes trilogy: The Hamlet (1940) and The Town (1957), the first two novels, were published in 1997 as O povoado and A cidade; in 1999 the last novel, The Mansion (1959), was published as A mansão. In 2001, Dupont tackled what is considered one of the author's most challenging novels, owing to its use of multiple points of view, As I Lay Dying (1930). In 2003, The Reivers (1962) was published under the title Os invictos. His enterprise finished in 2012 with the publication of an anthology of Faulkner's thriller short stories, Knight's Gambit (1949), as Lance mortal. Hence, this paper considers Dupont's trajectory as a translator, paying special attention to the way in which his identity as such is constituted through the process of translating Faulkner's works.

Keywords: literary translation, translator’s identity, William Faulkner, Wladir DuPont

Procedia PDF Downloads 210
689 Lockit: A Logic Locking Automation Software

Authors: Nemanja Kajtez, Yue Zhan, Basel Halak

Abstract:

The significant rise in the cost of manufacturing nanoscale integrated circuits (ICs) has led the majority of IC design companies to outsource the fabrication of their products to other companies, often located in different countries. The multinational nature of the hardware supply chain has led to a host of security threats, including IP piracy, IC overproduction, and Trojan insertion. To combat these, researchers have proposed logic locking techniques to protect the intellectual property of a design and increase the difficulty of malicious modification of its functionality. However, the adoption of logic locking approaches has been rather slow, due to the lack of integration with the IC production process and the limited efficacy of existing algorithms. This work automates the logic locking process through software, developed in Python, that performs the locking on a gate-level netlist and can be integrated with existing digital synthesis tools. Analysis of the latest logic locking algorithms demonstrated that the SFLL-HD algorithm is one of the most secure and versatile in trading off levels of protection against different types of attacks, and it was thus selected for implementation. The presented tool can also be expanded to incorporate the latest locking mechanisms to keep up with the fast-paced development in this field. The paper also presents a case study to demonstrate the functionality of the tool and how it can be used to explore the design space and compare different locking solutions. The source code of this tool is available freely from (https://www.researchgate.net/publication/353195333_Source_Code_for_The_Lockit_Tool).

Keywords: design automation, hardware security, IP piracy, logic locking

Procedia PDF Downloads 149
688 Method for Assessing Potential in Distribution Logistics

Authors: B. Groß, P. Fronia, P. Nyhuis

Abstract:

In addition to production, which is already frequently optimized, improving distribution logistics also opens up tremendous potential for increasing an enterprise's competitiveness. Here too, though, numerous interactions need to be taken into account; enterprises thus need to be able to identify and weigh different potentials for economically efficient optimizations. In order to assess these potentials, enterprises require a suitable method. This paper first briefly presents the need for this research before introducing the procedure that will be used to develop an appropriate method, one that not only considers interactions but is also quick and easy to implement.

Keywords: distribution logistics, evaluation of potential, methods, model

Procedia PDF Downloads 474
687 Financial Performance Model of Local Economic Enterprises in Matalam, Cotabato

Authors: Kristel Faye Tandog

Abstract:

The State-Owned Enterprise (SOE), also called a Public Enterprise (PE), plays a vital role in a country's social and economic development. Following this idea, this study focused on the factor structures of the financial performance of the Local Economic Enterprises (LEEs) in Matalam, Cotabato, namely the food court, market, slaughterhouse, and terminal. It aimed to determine the profile of the LEEs in terms of organizational structure, manner of creation, years in operation, source of initial operating requirements, annual operating budget, geographical location, and size or description of the facility. The study also examined the LEEs' financial ratios over a five-year period, from calendar year 2009 to 2013. Primary data were collected through a survey questionnaire administered to 468 respondents, and secondary data were sourced from government archives and the financial documents of the LGU concerned. Twelve dominant factors were identified, namely: 'management', 'enforcement of laws', 'strategic location', 'existence of non-formal competitors', 'proper maintenance', 'pricing', 'customer service', 'collection process', 'rentals and services', 'efficient use of resources', 'staffing', and 'timeliness and accuracy'. On the other hand, the financial performance of the LEEs of Matalam, Cotabato, as measured by financial ratios, needs improvement; refinement of the following ratio families is necessary: cash flow indicator, activity, profitability, and growth. The cash flow indicator ratios showed difficulty in covering debts in successive years. Likewise, the activity ratios showed that the LEEs had not been effective in putting their investments to work. Moreover, the profitability ratios revealed that they had operated at minimum capacity and had incurred net losses, and thus had weak profit performance. Furthermore, the growth ratios showed a declining growth trend, particularly in net income.
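One representative ratio from each of the four families the study flags can be sketched as below. The figures are hypothetical placeholders, not the Matalam LEE accounts, and the specific ratios chosen within each family are illustrative assumptions.

```python
def financial_ratios(cash_flow_ops, total_debt, revenue, total_assets,
                     net_income, prev_net_income):
    """One example ratio per family: cash flow indicator, activity,
    profitability, and growth."""
    return {
        "cash_flow_to_debt": cash_flow_ops / total_debt,  # debt coverage
        "asset_turnover": revenue / total_assets,         # activity
        "net_profit_margin": net_income / revenue,        # profitability
        "net_income_growth": (net_income - prev_net_income) / abs(prev_net_income),
    }

# Hypothetical figures shaped like the abstract's findings: weak debt
# coverage, net losses, and a declining net-income trend.
r = financial_ratios(cash_flow_ops=120_000, total_debt=600_000,
                     revenue=900_000, total_assets=1_500_000,
                     net_income=-45_000, prev_net_income=-30_000)
```

Here a cash-flow-to-debt ratio of 0.2 means operating cash covers only a fifth of outstanding debt, and the negative, worsening net income yields both a negative margin and negative growth, mirroring the weak performance the study reports.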

Keywords: factor structures, financial performance, financial ratios, state owned enterprises

Procedia PDF Downloads 228
686 Fueling Efficient Reporting and Decision-Making in Public Health with Large Data Automation in Remote Areas, Neno, Malawi

Authors: Wiseman Emmanuel Nkhomah, Chiyembekezo Kachimanga, Julia Huggins, Fabien Munyaneza

Abstract:

Background: In 2020, Partners In Health – Malawi introduced an operational research initiative called the Primary Health Care (PHC) Surveys, which seeks to assess the progress of care delivery in the district. The study consists of five long surveys, namely facility assessment, general patient, provider, sick child, and antenatal care (ANC), primarily conducted in four health facilities in Neno district: Neno district hospital, Dambe health centre, Chifunga, and Matope. These annual surveys are usually conducted from January, with the target of presenting the final report by June. Once data are collected and analyzed, a series of reviews takes place before the final report is reached. Initially, the manual process took over 9 months to produce the final report, and initial findings showed that only about 76.9% of the data matched when cross-checked against paper-based sources. Purpose: The aim of this approach is to move away from manually pulling the data, redoing the analysis, and reporting, a process associated not only with delays and reporting inconsistencies but also with poor data quality if not done carefully. This automation approach was meant to utilize features of new technologies to create visualizations, reports, and dashboards in Power BI that draw directly from the data source, CommCare, so that a single click of the 'refresh' button updates all visualizations, reports, and dashboards at once. Methodology: We transformed the paper-based questionnaires into electronic forms using the CommCare mobile application. We then connected CommCare directly to Power BI using an Application Programming Interface (API) connection as the data pipeline, which made it possible to create visualizations, reports, and dashboards in Power BI. In contrast to manually collecting data in paper-based questionnaires, entering them into ordinary spreadsheets, and conducting the analysis afresh every time a report is prepared, the team utilized CommCare and Microsoft Power BI. We used validations and logic in CommCare to capture data with fewer errors. We used Power BI features to host the reports online, publishing them to the cloud, and switched from sharing ordinary report files to sharing a link with recipients, giving them the freedom to dig deeper into the findings within the Power BI dashboards and to export them to any format of their choice. Results: This data automation approach reduced the research timeline from the initial 9 months to 5, and improved the consistency of the data findings from the original 76.9% to 98.9%. This brought confidence in drawing conclusions from the findings that help in decision-making, and opened opportunities for further research. Conclusion: These results suggest that automating the research data process has the potential to reduce the overall amount of time spent and to improve data quality. On this basis, data automation should be seriously considered when conducting operational research, for both efficiency and decision-making.
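The API side of such a pipeline can be sketched as a paginated pull from CommCare. The endpoint shape and the ApiKey authorization header follow CommCare's published HTTP API, but the exact URL, version, and auth format should be treated as assumptions here; the paging logic is demonstrated offline with canned pages.

```python
import json
import urllib.request

# Assumed CommCare case-listing endpoint; {domain} is the project space.
CASE_API = "https://www.commcarehq.org/a/{domain}/api/v0.5/case/"

def fetch_page(url, username, api_key):
    """Fetch one page of the paginated feed (requires network access)."""
    req = urllib.request.Request(
        url, headers={"Authorization": "ApiKey %s:%s" % (username, api_key)})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def all_cases(first_url, get_page):
    """Follow the paginated feed until 'next' is exhausted. The page fetcher
    is injected so the paging logic can be exercised without a network."""
    cases, url = [], first_url
    while url:
        page = get_page(url)
        cases.extend(page["objects"])
        url = page["meta"].get("next")
    return cases

# Offline demonstration: two canned pages stand in for the live API.
demo_pages = {
    "page1": {"objects": [{"case_id": "a"}, {"case_id": "b"}],
              "meta": {"next": "page2"}},
    "page2": {"objects": [{"case_id": "c"}], "meta": {"next": None}},
}
cases = all_cases("page1", demo_pages.__getitem__)
```

In the deployment described, Power BI consumes the feed directly (CommCare also exposes OData feeds built for this), so a script like this is only needed when intermediate processing is wanted between the two.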

Keywords: reporting, decision-making, Power BI, CommCare, data automation, visualizations, dashboards

Procedia PDF Downloads 80
685 Material Handling Equipment Selection Using Fuzzy AHP Approach

Authors: Priyanka Verma, Vijaya Dixit, Rishabh Bajpai

Abstract:

This research paper is aimed at selecting the appropriate material handling equipment from among the given choices so that the level of automation in material handling can be enhanced. The work is a practical case study of material handling systems in a consumer electronic appliances manufacturing organization. The choices of material handling equipment among which the decision has to be made are Automated Guided Vehicles (AGVs), Autonomous Mobile Robots (AMRs), Overhead Conveyors (OCs), and Battery Operated Trucks/Vehicles (BOTs). A certain level of automation needs to be attained in order to reduce human intervention in the organization, and this degree of automation can be achieved with the material handling equipment mentioned above. The main motive for selecting this equipment for study was the corporate financial strategy of investment and the return obtained on that investment within a stipulated time frame. Since low-cost automation of material handling had to be achieved, equipment was considered whose investment per unit is less than 20 lakh rupees (INR) and whose recovery period is less than five years. The fuzzy analytic hierarchy process (FAHP) is applied here for selecting equipment, where the four choices are evaluated against four major criteria and 13 sub-criteria and are prioritized on the basis of the weights obtained. The FAHP used here makes use of triangular fuzzy numbers (TFNs); it improves on the inability of traditional AHP to deal with the subjectiveness and impreciseness of the pairwise comparison process. The range of values for general rating purposes for all decision-making parameters is kept between 0 and 1 on the basis of expert opinions captured on the shop floor; these experts were familiar with the operating environment and shop floor activity control. Instead of generating an exact value, the FAHP generates ranges of values to accommodate the uncertainty in the decision-making process. The four major criteria selected for evaluating the available material handling equipment are materials, technical capabilities, cost, and other features. The thirteen sub-criteria listed under these four major criteria are weighing capacity, load per hour, material compatibility, capital cost, operating cost and maintenance cost, speed, distance moved, space required, frequency of trips, control required, safety, and reliability issues. The key finding is that, among the four major criteria, cost emerged as the most important and is one of the key aspects on which material handling equipment selection is based. On further evaluating the available equipment for each sub-criterion, AGVs scored the highest weight in most of the sub-criteria. The complete analysis shows that the AGV is the material handling equipment best suited to all the decision criteria selected in the FAHP, and it is therefore beneficial for the organization to carry out automated material handling in the facility using AGVs.
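The core FAHP computation can be sketched with Buckley's geometric-mean method on triangular fuzzy numbers (l, m, u). The two-criterion pairwise matrix below is illustrative only, not the study's four-criteria, thirteen-sub-criteria expert judgments.

```python
def geometric_mean(tfns):
    """Component-wise geometric mean of a row of triangular fuzzy numbers."""
    n = len(tfns)
    prod = [1.0, 1.0, 1.0]
    for l, m, u in tfns:
        prod = [prod[0] * l, prod[1] * m, prod[2] * u]
    return tuple(p ** (1.0 / n) for p in prod)

def fuzzy_weights(matrix):
    """matrix[i][j] is the TFN comparing criterion i against criterion j."""
    row_means = [geometric_mean(row) for row in matrix]
    total = [sum(r[k] for r in row_means) for k in range(3)]
    # Fuzzy division: (l, m, u) / (L, M, U) = (l/U, m/M, u/L)
    return [(l / total[2], m / total[1], u / total[0]) for l, m, u in row_means]

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number gives the crisp weight."""
    return sum(tfn) / 3.0

one = (1.0, 1.0, 1.0)
mod_more = (2.0, 3.0, 4.0)        # "moderately more important"
mod_less = (1 / 4, 1 / 3, 1 / 2)  # its reciprocal
matrix = [[one, mod_more], [mod_less, one]]
crisp = [defuzzify(w) for w in fuzzy_weights(matrix)]
```

Note that the intermediate weights remain ranges (fuzzy numbers) rather than exact values, which is precisely how FAHP accommodates the impreciseness of expert pairwise judgments before a final defuzzification.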

Keywords: fuzzy analytic hierarchy process (FAHP), material handling equipment, subjectiveness, triangular fuzzy number (TFN)

Procedia PDF Downloads 409
684 Factors Affecting Employee Decision Making in an AI Environment

Authors: Yogesh C. Sharma, A. Seetharaman

Abstract:

The decision-making process in humans is a complicated system influenced by a variety of intrinsic and extrinsic factors, and human decisions have a ripple effect on subsequent decisions. In this study, the scope of human decision making is limited to employees. In an organisation, a person makes a variety of decisions from the time they are hired to the time they retire. The goal of this research is to identify the various elements that influence decision-making. In addition, the environment in which a decision is made is a significant aspect of the decision-making process. Employees in today's workplace use artificial intelligence (AI) systems for automation and decision augmentation, and the impact of AI systems on the decision-making process is examined in this study. The research is based on a systematic literature review; from gaps in the literature, limitations and the scope of future research have been identified, and on these findings a research framework has been designed to identify the various factors affecting employee decision making. Employee decision making is influenced by technological advancement, data-driven culture, human trust, decision automation-augmentation, and workplace motivation. Hybrid human-AI systems require the development of new skill sets and organisational design. Employee psychological safety and supportive leadership influence overall job satisfaction.

Keywords: employee decision making, artificial intelligence (AI) environment, human trust, technology innovation, psychological safety

Procedia PDF Downloads 77
683 General Architecture for Automation of Machine Learning Practices

Authors: U. Borasi, Amit Kr. Jain, Rakesh, Piyush Jain

Abstract:

A typical machine learning workflow comprises data collection, data preparation, model training, model evaluation, and deployment. Training data need to be gathered and organised, which often entails collecting a sizable dataset and cleaning it to remove or correct inaccurate or missing information. Once acquired, the data must be pre-processed for use in the machine learning model, with actions such as scaling or normalising the data, handling outliers, selecting appropriate features, and reducing dimensionality. The pre-processed data are then used to train a model with some machine learning algorithm. After the model has been trained, it is assessed by computing metrics such as accuracy, precision, and recall on a test dataset. Every time a new model is built, both data pre-processing and model training, two crucial steps in the machine learning (ML) workflow, must be carried out. Various machine learning algorithms can be employed with every single approach to data pre-processing, generating a large set of combinations to choose from. For example, a different algorithm can be used for every method of handling missing values (dropping records, replacing with the mean, etc.), for every scaling technique, and for every combination of selected features. As a result, in order to obtain the optimum outcome, these tasks are frequently repeated in different combinations. This paper suggests a simple architecture for organizing this large combination set of pre-processing steps and algorithms into an automated workflow, which simplifies the task of exploring all the possibilities.
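The operator-pool-plus-scheduler idea can be sketched in miniature: enumerate every (pre-processor, model) configuration from a pool and rank them by score. The one-dimensional data, the missing-value handlers, and the toy scorer below are stand-ins for real feature matrices, pre-processing operators, and trained models; they are not the paper's architecture itself.

```python
from itertools import product

# Operator pool: interchangeable pre-processing steps and learners.
PREPROCESSORS = {
    "drop_missing": lambda xs: [x for x in xs if x is not None],
    "mean_impute": lambda xs: [
        x if x is not None
        else sum(v for v in xs if v is not None) / sum(v is not None for v in xs)
        for x in xs],
}
MODELS = {
    "mean_model": lambda xs: sum(xs) / len(xs),
    "median_model": lambda xs: sorted(xs)[len(xs) // 2],
}

def run_all(data, score):
    """Scheduler: enumerate every (preprocessor, model) configuration,
    score each one, and return a best-first leaderboard."""
    results = []
    for p_name, m_name in product(PREPROCESSORS, MODELS):
        prediction = MODELS[m_name](PREPROCESSORS[p_name](data))
        results.append((score(prediction), p_name, m_name))
    return sorted(results, reverse=True)

data = [1.0, None, 3.0, 8.0]
leaderboard = run_all(data, score=lambda pred: -abs(pred - 3.0))
best = leaderboard[0]
```

Adding a new missing-value handler or algorithm to the pool automatically multiplies the configurations tried, which is exactly the combinatorial growth the proposed architecture is designed to manage.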

Keywords: machine learning, automation, AutoML, architecture, operator pool, configuration, scheduler

Procedia PDF Downloads 25
682 Distributed Automation System Based Remote Monitoring of Power Quality Disturbance on LV Network

Authors: Emmanuel D. Buedi, K. O. Boateng, Griffith S. Klogo

Abstract:

Electrical distribution networks are prone to power quality disturbances originating from the complexity of the distribution network, the mode of distribution (overhead or underground), and the types of loads used by customers. Data on the types of disturbances present and their frequency of occurrence are needed for economic evaluation and hence for finding a solution to the problem. Utility companies have resorted to using secondary power quality devices such as smart meters to help gather the required data. Even though this approach is easier to adopt, data gathered from these devices may not serve the required purpose, since the installation of these devices in the electrical network usually does not conform to available power quality monitor (PQM) placement methods. This paper presents the design of a PQM that is capable of integrating into an existing distributed automation system (DAS) infrastructure to take advantage of available placement methodologies. The monitoring component of the design is implemented and installed to monitor an existing LV network. Data from the monitor are analyzed and presented. A portion of the LV network of the Electricity Company of Ghana is modeled in MATLAB-Simulink and analyzed under various earth fault conditions. The results presented show the ability of the PQM to detect and analyze PQ disturbances such as voltage sag and overvoltage. By adopting a placement methodology and installing these nodes, utilities are assured of accurate and reliable information with respect to the quality of power delivered to consumers.
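The detection rule a PQM node applies can be sketched as per-cycle RMS classification against magnitude thresholds of the kind defined in IEEE Std 1159 (sag roughly 0.1 to 0.9 pu, overvoltage above about 1.1 pu). The thresholds, the event-grouping logic, and the sample series below are illustrative assumptions, not the paper's implementation.

```python
def classify_rms(rms_pu):
    """Classify one cycle's RMS value (in per-unit of nominal voltage)."""
    if rms_pu < 0.1:
        return "interruption"
    if rms_pu < 0.9:
        return "sag"
    if rms_pu > 1.1:
        return "overvoltage"
    return "normal"

def detect_events(rms_series):
    """Group consecutive abnormal cycles into (type, start_cycle, n_cycles)."""
    events, prev = [], "normal"
    for i, v in enumerate(rms_series):
        label = classify_rms(v)
        if label != "normal":
            if label == prev:
                t, start, n = events[-1]
                events[-1] = (t, start, n + 1)  # extend the running event
            else:
                events.append((label, i, 1))    # open a new event
        prev = label
    return events

# Per-cycle RMS trace: a three-cycle sag, then a two-cycle overvoltage.
series = [1.0, 1.0, 0.6, 0.55, 0.62, 1.0, 1.18, 1.2, 1.0]
events = detect_events(series)
```

Counting such events per node over time yields exactly the frequency-of-occurrence statistics the abstract says are needed for economic evaluation.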

Keywords: power quality, remote monitoring, distributed automation system, economic evaluation, LV network

Procedia PDF Downloads 322
681 Privatising Higher Education: Imparting Quality in Academics

Authors: Manish Khanna

Abstract:

Higher education seeks to preserve, transmit, and advance knowledge. It is one of the most important instruments of change and progress. The observation of the Kothari Commission (1964-66) is true even today: 'The destiny of India is now being shaped in her classrooms. This, we believe, is no mere rhetoric. In a world based on science and technology, it is education that determines the level of prosperity, welfare, and security of the people. On the quality and number of persons coming out of our schools and colleges will depend our success in the great enterprise of national reconstruction.'

Keywords: higher education, quality in academics, Kothari commission, privatising higher education

Procedia PDF Downloads 440
680 The Impact of Project Management Approaches in Enhancing Entrepreneurial Growth: A Study Using the Theory of Planned Behaviour as a Lens to Understand

Authors: Akunna Agunwah, Kevin Gallimore, Kathryn Kinnmond

Abstract:

Entrepreneurship and project management are widely associated and both seen as vehicles for economic growth, but they are studied separately. A few authors have considered the interconnectivity between these two fields, but relatively little empirical data currently exist in the literature. The purpose of the present empirical study is to explore whether successful entrepreneurs utilise project management approaches in enhancing enterprise growth, by examining the working practices and experiences of entrepreneurs through the lens of the Theory of Planned Behaviour (TPB). In order to understand those experiences, ten successful entrepreneurs in various business sectors in the North West of England were interviewed using a face-to-face semi-structured interview method. The audio-recorded data were transcribed and analysed using a deductive thematic technique (qualitative approach). The themes were viewed through the lens of the Theory of Planned Behaviour to identify the three intentional antecedents (attitude, subjective norms, and perceived behavioural control) and to understand how they relate to the project management approaches (planning, execution, and monitoring). The findings are twofold. First, evidence of the three intentional antecedents that make up the Theory of Planned Behaviour was present. Second, the analysis of the project management themes (planning, execution, and monitoring) through the lens of the theory shows evidence of the three intentional antecedents. More than one intentional antecedent was found within a particular project management theme, which indicates that entrepreneurs utilise these approaches without categorising them into definite themes; rather, they utilise these intentional antecedents as processes to enhance business growth. In conclusion, the work presented here shows a way of understanding the interconnectivity between entrepreneurship and project management for enhancing enterprise growth by examining the working practices and experiences of successful entrepreneurs in the North West of England.

Keywords: business growth, entrepreneurship, project management approaches, theory of planned behaviour

Procedia PDF Downloads 175
679 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services

Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme

Abstract:

Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.
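The staged pipeline the abstract describes (ingestion, table parsing, entity detection, schema-based record extraction) can be sketched as composable functions. Each stage below is a deliberately trivial stand-in for the ML component it names; the document text, ticker vocabulary, and schema are invented for illustration.

```python
def ingest(doc):
    """Ingestion: raw document text into a working state."""
    return {"raw": doc, "lines": doc.splitlines()}

def parse_tables(state):
    """Stand-in table parser: treat pipe-separated lines as table rows
    (a real system uses computer-vision table detection and structure
    recognition on PDFs)."""
    state["rows"] = [line.split("|") for line in state["lines"] if "|" in line]
    return state

def extract_entities(state):
    """Stand-in entity detector: match tokens against a closed vocabulary
    (a real system uses NLP entity detection and disambiguation)."""
    known = {"IBM", "AAPL"}
    state["entities"] = [tok for line in state["lines"]
                         for tok in line.split() if tok in known]
    return state

def extract_records(state, schema=("ticker", "price")):
    """Schema-based record extraction from the parsed table rows."""
    return [dict(zip(schema, (row[0].strip(), float(row[1]))))
            for row in state["rows"]]

def pipeline(doc):
    state = extract_entities(parse_tables(ingest(doc)))
    return {"entities": state["entities"], "records": extract_records(state)}

result = pipeline("Quarterly filing for IBM\nIBM | 142.5\nAAPL | 187.0")
```

Keeping the stages as separate, swappable steps is what lets each intermediary be phrased and improved as its own machine learning problem, with user feedback incorporated stage by stage.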

Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing

Procedia PDF Downloads 86
678 Artificial Intelligence in the Design of a Retaining Structure

Authors: Kelvin Lo

Abstract:

Numerical modelling in geotechnical engineering is now common but sophisticated: many advanced input settings and considerable computational effort are required to optimize a design and reduce construction cost. Optimizing a design usually requires huge numerical models, and if the optimization is conducted manually, human errors can have potentially dangerous consequences, while the time spent on input and on extracting data from the output is significant. This paper presents an automation process applied to the numerical modelling (Plaxis 2D) of a trench excavation supported by a secant-pile retaining structure for a top-down tunnel project. Python code is adopted to control the process, and numerical modelling is conducted automatically at every 20 m chainage along the 200 m tunnel, with the maximum retained height occurring at the middle chainage. The Python code continuously changes the geological stratum and excavation depth under groundwater flow conditions in each 20 m section. It automatically conducts trial and error to determine the pile length and the use of props required to achieve the required factor of safety and target displacement. Once the bending moment of the pile exceeds its capacity, the pile size is increased. When the pile embedment reaches the default maximum length, the prop system is turned on. The results show that the approach saves time, increases efficiency, lowers design costs, and replaces manual labour, minimizing error.
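The trial-and-error loop described can be sketched as follows. The `solve` stub, the threshold values, and the step sizes are all invented for illustration; the actual workflow drives Plaxis 2D through its scripting interface rather than a closed-form response.

```python
# Illustrative design targets; the project's real criteria will differ.
REQUIRED_FOS = 1.4            # required factor of safety
MAX_DISPLACEMENT_MM = 25.0    # target wall displacement
MAX_PILE_LENGTH_M = 30.0      # default maximum pile embedment

def solve(pile_length, use_props):
    """Stub standing in for one Plaxis 2D run: returns (factor of safety,
    wall displacement in mm). In this toy response a longer pile and the
    prop system both improve behaviour."""
    fos = 0.9 + 0.025 * pile_length + (0.2 if use_props else 0.0)
    disp = 60.0 - 1.5 * pile_length - (15.0 if use_props else 0.0)
    return fos, max(disp, 1.0)

def design_section(start_length=10.0, step=2.0):
    """Trial and error for one 20 m section: lengthen the pile first and,
    once embedment hits the maximum, turn on the prop system."""
    length, use_props = start_length, False
    while True:
        fos, disp = solve(length, use_props)
        if fos >= REQUIRED_FOS and disp <= MAX_DISPLACEMENT_MM:
            return {"pile_length": length, "props": use_props}
        if length + step <= MAX_PILE_LENGTH_M:
            length += step        # try a longer pile
        elif not use_props:
            use_props = True      # embedment exhausted: add props
        else:
            raise RuntimeError("no feasible design in the search space")

design = design_section()
```

Running such a loop at each 20 m chainage, with the stub replaced by real Plaxis calls (updating stratum, excavation depth, and groundwater per section), reproduces the automated design search the paper describes.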

Keywords: automation, numerical modelling, Python, retaining structures

Procedia PDF Downloads 22