Search results for: FELA software
3772 AI for Efficient Geothermal Exploration and Utilization
Authors: Velimir Monty Vesselinov, Trais Kliplhuis, Hope Jasperson
Abstract:
Artificial intelligence (AI) is a powerful tool in the geothermal energy sector, aiding in both exploration and utilization. Identifying promising geothermal sites can be challenging due to limited surface indicators and the need for expensive drilling to confirm subsurface resources. Geothermal reservoirs can be located deep underground and exhibit complex geological structures, making traditional exploration methods time-consuming and imprecise. AI algorithms can analyze vast datasets of geological, geophysical, and remote sensing data, including satellite imagery, seismic surveys, geochemistry, and geology. Machine learning algorithms can identify subtle patterns and relationships within these data, potentially revealing hidden geothermal potential in areas previously overlooked. To address these challenges, a Science-Informed Machine Learning (SIML) technology has been developed. SIML methods differ from traditional ML techniques. In both cases, the ML models are trained to predict the spatial distribution of an output (e.g., pressure, temperature, heat flux) based on a series of inputs (e.g., permeability, porosity). Traditional ML relies on deep and wide neural networks (NNs) whose neurons apply simple algebraic mappings to represent complex processes. In contrast, SIML neurons incorporate complex mappings (including constitutive relationships and physics/chemistry models). This results in ML models that have a physical meaning and satisfy physical laws and constraints. The prototype of the developed software, called GeoTGO, is accessible through the cloud. The prototype demonstrates how different data sources can be made available for processing, executes demonstrative SIML analyses, and presents the results in tabular and graphic form.
Keywords: science-informed machine learning, artificial intelligence, exploration, utilization, hidden geothermal
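The contrast the abstract draws between algebraic neurons and science-informed neurons can be sketched in a toy example. This is purely illustrative and not the GeoTGO implementation: the "science-informed" neuron below embeds Fourier's law of heat conduction (q = -k dT/dx) as its mapping, so its output is a physical quantity by construction; all function names and values are assumptions.

```python
def generic_neuron(inputs, weights, bias):
    """Traditional ML neuron: a simple algebraic mapping (weighted sum + ReLU)."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return max(0.0, s)

def science_informed_neuron(k, temps, dx):
    """SIML-style neuron: its mapping is a constitutive relationship
    (Fourier's law, q = -k dT/dx), so the output is a heat flux that
    satisfies the underlying physics by construction."""
    fluxes = []
    for i in range(len(temps) - 1):
        dT_dx = (temps[i + 1] - temps[i]) / dx   # finite-difference gradient
        fluxes.append(-k * dT_dx)                # q = -k dT/dx
    return fluxes
```

For a temperature profile falling 10 degrees per 10 m with conductivity 2, the science-informed neuron returns a physically meaningful flux of 2 per segment, whereas the generic neuron's output has no units or physical interpretation.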
Procedia PDF Downloads 53
3771 BFDD-S: Big Data Framework to Detect and Mitigate DDoS Attack in SDN Network
Authors: Amirreza Fazely Hamedani, Muzzamil Aziz, Philipp Wieder, Ramin Yahyapour
Abstract:
Software-defined networking has in recent years come to the attention of many network designers as a successor to traditional networking. Unlike traditional networks, where the control and data planes reside together within a single device in the network infrastructure, such as a switch or router, the two planes are kept separate in software-defined networks (SDNs). All critical decisions about packet routing are made on the network controller, and the data-plane devices forward packets based on these decisions. This type of network is vulnerable to DDoS attacks, which degrade the overall functioning and performance of the network by continuously injecting fake flows into it. This places a substantial burden on the controller side and ultimately leads to the inaccessibility of the controller and the loss of network service for legitimate users. Thus, protecting this novel network architecture against denial-of-service attacks is essential. In the world of cybersecurity, attacks and new threats emerge every day, so it is essential to have tools capable of managing and analyzing all this new information to detect possible attacks in real time. These tools should provide a comprehensive solution to automatically detect, predict, and prevent abnormalities in the network. Big data encompasses a wide range of studies, but it mainly refers to the massive amounts of structured and unstructured data that organizations handle on a regular basis. It concerns not only the volume of the data but also how data-driven information can be used to enhance decision-making processes, security, and the overall efficiency of a business. This paper presents an intelligent big data framework as a solution to handle the illegitimate traffic burden placed on the SDN network by numerous DDoS attacks.
The framework entails an efficient defence and monitoring mechanism against DDoS attacks by employing state-of-the-art machine learning techniques.
Keywords: apache spark, apache kafka, big data, DDoS attack, machine learning, SDN network
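The detection idea behind such a framework can be illustrated with a toy example. The paper's system uses Spark, Kafka, and ML models on live SDN traffic; the sketch below is only a stand-in showing the core principle of learning a baseline of legitimate flow rates and flagging flows that deviate from it (all names and the z-score rule are assumptions, not the paper's method).

```python
from statistics import mean, stdev

def fit_baseline(rates):
    """Learn mean and standard deviation of packets-per-second
    from a sample of legitimate flows."""
    return mean(rates), stdev(rates)

def is_ddos_flow(rate, baseline, k=3.0):
    """Flag a flow whose packet rate exceeds mean + k*std
    (a simple z-score anomaly rule)."""
    mu, sigma = baseline
    return rate > mu + k * sigma
```

In a real deployment, the features would be richer (flow duration, entropy of source addresses, SYN/ACK ratios) and the model would be trained and served distributed over Spark, but the flag-the-outlier logic is the same.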
Procedia PDF Downloads 169
3770 Hardware Implementation for the Contact Force Reconstruction in Tactile Sensor Arrays
Authors: María-Luisa Pinto-Salamanca, Wilson-Javier Pérez-Holguín
Abstract:
Reconstruction of contact forces is a fundamental technique for analyzing the properties of a touched object and is essential for regulating the grip force in slip control loops. It is based on processing the distribution, intensity, and direction of the forces captured by the sensors. Efficient hardware alternatives are now used more frequently in different fields of application, allowing the implementation of computationally complex algorithms, as is the case with tactile signal processing. The use of hardware for smart tactile sensing systems is a research area that promises to improve the processing time and portability requirements of applications such as artificial skin and robotics, among others. The literature review shows that hardware implementations are present today in almost all stages of smart tactile detection systems except the force reconstruction process, a stage in which they have been applied less. This work presents a hardware implementation of a model-driven method reported in the literature for the contact force reconstruction of flat and rigid tactile sensor arrays from normal stress data. Based on the analysis of a software implementation of that model, this implementation proposes the parallelization of tasks that facilitate the execution of matrix operations and of a two-dimensional optimization function to obtain a force vector for each taxel in the array. This work seeks to take advantage of the parallel hardware characteristics of Field Programmable Gate Arrays (FPGAs) and the possibility of applying appropriate algorithm parallelization techniques, using as a guide the rules of generalization, efficiency, and scalability in the tactile decoding process, and considering low latency, low power consumption, and real-time execution as the main design parameters.
The results show a maximum estimation error of 32% in the tangential forces and 22% in the normal forces with respect to simulation by the Finite Element Modeling (FEM) technique of Hertzian and non-Hertzian contact events, over sensor arrays of 10×10 taxels of different sizes. The hardware implementation was carried out on a Xilinx® MPSoC XCZU9EG-2FFVB1156 platform that allows the reconstruction of force vectors following a scalable approach, from information captured by tactile sensor arrays composed of up to 48×48 taxels that use various transduction technologies. The proposed implementation demonstrates a reduction in estimation time of x / 180 compared to software implementations. Despite the relatively high estimation errors, the information this implementation provides on the tangential and normal tractions and the triaxial reconstruction of forces makes it possible to adequately reconstruct the tactile properties of the touched object, which are similar to those obtained in the software implementation and in the two FEM simulations taken as reference. Although the errors could be reduced further, the proposed implementation is useful for decoding contact forces in portable tactile sensing systems, thus helping to expand electronic skin applications in robotic and biomedical contexts.
Keywords: contact forces reconstruction, forces estimation, tactile sensor array, hardware implementation
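The core of reconstructing a contact force from taxel data can be sketched in a few lines. This is a hedged illustration, not the paper's FPGA design or its optimization function: it only shows the basic step of integrating per-taxel normal stress over the taxel area to recover a resultant normal force and its point of application (all names and values are assumptions).

```python
def reconstruct_normal_force(stress_map, taxel_area):
    """stress_map: 2D list of per-taxel normal stresses (Pa);
    taxel_area: area of one taxel (m^2).
    Returns (Fz, cx, cy): resultant normal force and the taxel-grid
    coordinates of its centroid (point of application)."""
    Fz = 0.0
    mx = my = 0.0
    for y, row in enumerate(stress_map):
        for x, sigma in enumerate(row):
            f = sigma * taxel_area   # force contributed by this taxel
            Fz += f
            mx += f * x
            my += f * y
    if Fz == 0.0:
        return 0.0, 0.0, 0.0
    return Fz, mx / Fz, my / Fz
```

The inner loops are independent per taxel, which is exactly the kind of structure that maps well onto the parallel FPGA fabric the paper targets; tangential components require the model-driven optimization step that is beyond this sketch.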
Procedia PDF Downloads 195
3769 Random Variation of Treated Volumes in Fractionated 2D Image Based HDR Brachytherapy for Cervical Cancer
Authors: R. Tudugala, B. M. A. I. Balasooriya, W. M. Ediri Arachchi, R. W. M. W. K. Rathnayake, T. D. Premaratna
Abstract:
Brachytherapy involves placing a source of radiation near the cancer site and gives a promising prognosis for cervical cancer treatment. The purpose of this study was to evaluate the random variation of treated volumes between fractions in 2D image based fractionated high dose rate brachytherapy for cervical cancer at the National Cancer Institute Maharagama, Sri Lanka. Dose plans were analyzed for 150 cervical cancer patients with orthogonal radiograph (2D) based brachytherapy. ICRU treated volumes were modeled by translating the applicators with the help of Multisource HDR plus software. The difference in treated volumes with respect to the applicator geometry was analyzed using SPSS 18 software, to derive patient-population-based estimates of delivered treated volumes relative to ideally treated volumes. Packing was evaluated according to bladder dose, rectum dose, and geometry of the dose distribution by three consultant radiation oncologists. The difference in treated volumes depends on the type of applicator used in fractionated brachytherapy. The mean Difference of Treated Volume (DTV) was -0.48 cm3 for the evenly activated tandem length (ET) group and 11.85 cm3 for the unevenly activated tandem length (UET) group. The range of the DTV was 35.80 cm3 for the ET group and 104.80 cm3 for the UET group. A one-sample T test was performed to compare the DTV with the ideal treatment volume difference (0.00 cm3); the P value was 0.732 for the ET group and 0.00 for the UET group. Moreover, an independent two-sample T test was performed to compare the ET and UET groups, and the calculated P value was 0.005. Packing was evaluated under three categories: 59.38% used a convenient packing technique, 33.33% a fairly convenient packing technique, and 7.29% a not convenient packing technique in their fractionated brachytherapy treatments.
The random variation of treated volume in the ET group is much lower than in the UET group, and there is a significant difference (p<0.05) between the ET and UET groups, which affects the dose distribution of the treatment. Furthermore, it can be concluded that nearly 92.71% of patients' packing used an acceptable packing technique at NCIM, Sri Lanka.
Keywords: brachytherapy, cervical cancer, high dose rate, tandem, treated volumes
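The one-sample T test used in the study compares the mean DTV against the ideal value of 0.00 cm3. As a minimal illustration of that statistic (the study itself used SPSS 18; the sample values below are invented, not the study's data):

```python
from math import sqrt
from statistics import mean, stdev

def one_sample_t(sample, mu0=0.0):
    """One-sample t statistic: t = (x_bar - mu0) / (s / sqrt(n)),
    testing whether the sample mean differs from mu0."""
    n = len(sample)
    return (mean(sample) - mu0) / (stdev(sample) / sqrt(n))
```

A t value near zero (as the ET group's p = 0.732 implies) means the delivered volumes scatter symmetrically around the ideal; a large t (the UET group's p = 0.00) means a systematic deviation.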
Procedia PDF Downloads 201
3768 The Digital Transformation of Life Insurance Sales in Iran With the Emergence of Personal Financial Planning Robots; Opportunities and Challenges
Authors: Pedram Saadati, Zahra Nazari
Abstract:
Anticipating and identifying the future opportunities and challenges that industry players face with the emergence of new personal financial planning knowledge and technologies, and providing practical solutions, is one of the goals of this research. For this purpose, a futures-research tool based on the opinions of the main players in the insurance industry was used. The research method comprised four stages: 1) a survey of the specialist life insurance salesforce to identify the variables; 2) ranking of the variables by selected experts through a researcher-made questionnaire; 3) a panel of experts aimed at understanding the mutual effects of the variables; and 4) statistical analysis of the cross-impact matrix in MICMAC software. The integrated analysis of the variables influencing the future was carried out with the structural analysis method, one of the efficient and innovative methods of futures research. A list of opportunities and challenges was identified through a survey of best-selling life insurance representatives selected by snowball sampling. To prioritize and identify the most important issues, all the issues raised were sent, through a researcher-made questionnaire, to experts selected theoretically. The respondents scored the importance of 36 variables so that the opportunity and challenge variables could be prioritized. Eight of the variables identified in the first stage were removed by the selected experts, leaving 28 variables for examination in the third stage. To facilitate the examination, these were divided into six categories: organization and management (11 variables), marketing and sales (7), social and cultural (6), technological (2), rebranding (1), and insurance (1).
The reliability of the researcher-made questionnaire was confirmed by a Cronbach's alpha of 0.96. In the third stage, a panel of five insurance industry experts was formed, and the consensus of their opinions about the influence of the factors on each other and the ranking of the variables was entered into the matrix. The matrix captured the interrelationships of the 28 variables, which were investigated using the structural analysis method. Analysis of the matrix data with MICMAC software indicates that "correct training in the use of the software", "the weakness of insurance companies' technology in personalizing products", "the approach of equipping the customer", and "honesty in declaring that the customer does not need insurance" are the most important influencing challenges, while "the salesforce-equipping approach, product personalization based on customer needs assessment, the customer's pleasant experience of being advised by consulting robots, business improvement of the insurance company due to the use of these tools, increased efficiency of the issuance process, and optimal customer purchase" were identified as the most important influencing opportunities.
Keywords: personal financial planning, wealth management, advisor robots, life insurance, digital transformation
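The MICMAC-style structural analysis the study runs on its 28x28 cross-impact matrix ranks variables by direct influence (row sums) and dependence (column sums). A toy sketch of that first step, with an invented 3x3 matrix rather than the study's data:

```python
def influence_dependence(matrix):
    """matrix[i][j]: rated influence of variable i on variable j
    (e.g., a 0-3 scale from the expert panel).
    Returns (influence, dependence): per-variable row and column sums,
    the coordinates of MICMAC's influence/dependence plane."""
    n = len(matrix)
    influence = [sum(row) for row in matrix]
    dependence = [sum(row[j] for row in matrix) for j in range(n)]
    return influence, dependence
```

Variables with high influence and low dependence are the "driver" challenges and opportunities the abstract highlights; the full MICMAC method also iterates matrix powers to capture indirect influence, which this sketch omits.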
Procedia PDF Downloads 46
3767 Clinicians’ Experiences with IT Systems in a UK District General Hospital: A Qualitative Analysis
Authors: Sunny Deo, Eve Barnes, Peter Arnold-Smith
Abstract:
Introduction: Healthcare technology is a rapidly expanding field, with enthusiasts suggesting a revolution in the quality and efficiency of healthcare delivery based on better e-healthcare, including the move to paperless healthcare. The role and use of computers and programmes in healthcare have been increasing over the past 50 years. Despite this, there is no standardised method of assessing the quality of the hardware and software utilised by frontline healthcare workers. Methods and subjects: Based on standard Patient-Reported Outcome Measures, a questionnaire was devised with the aim of providing quantitative and qualitative data on clinicians' perspectives of their hospital's Information Technology (IT). The survey was distributed via the institution's intranet to all contracted doctors, and the survey's qualitative results were analysed. Qualitative opinions were grouped as positive, neutral, or negative and further sub-grouped into speed/usability, software/hardware, integration, IT staffing, clinical risk, and wellbeing. Analysis was undertaken by doctor seniority and by specialty. Results: There were 196 responses, 51% from senior doctors (consultant grades) and the rest from junior grades, with the largest group of respondents (52%) coming from medicine specialties. Differences in the proportions of principal groups and sub-groups were noted by seniority and specialty. Negative themes were by far the most common opinion type, occurring in almost two-thirds of responses (63%), while positive comments occurred in fewer than 1 in 10 (8%). Conclusions: This survey confirms strongly negative attitudes to the current state of electronic documentation and IT in a large single-centre cohort of hospital-based frontline physicians after two decades of so-called progress toward a paperless healthcare system.
Greater use of the survey would provide further insights and potentially optimise the focus of development and delivery, improving the quality and effectiveness of IT for clinicians and their patients.
Keywords: information technology, electronic patient records, digitisation, paperless healthcare
Procedia PDF Downloads 92
3766 Numerical Simulation of Production of Microspheres from Polymer Emulsion in Microfluidic Device toward Using in Drug Delivery Systems
Authors: Nizar Jawad Hadi, Sajad Abd Alabbas
Abstract:
Because of their ability to encapsulate and release drugs in a controlled manner, microspheres fabricated from polymer emulsions using microfluidic devices have shown promise for drug delivery applications. In this study, the effects of velocity, density, viscosity, surface tension, and channel diameter on microsphere generation were investigated using Ansys Fluent software. The software was supplied with the physical properties of the polymer emulsion, such as density, viscosity, and surface tension; simulations were then performed to predict fluid flow and microsphere production and to improve the design of drug delivery applications based on changes in these parameters. The effects of the capillary and Weber numbers were also studied. The results showed that the size of the microspheres can be controlled by adjusting the flow velocity and the diameter of the channel. Narrower channel widths and higher flow rates produced smaller microspheres, which could improve drug delivery efficiency, and lower interfacial surface tension also yielded smaller microspheres. The viscosity and density of the polymer emulsion significantly affected microsphere size, with higher viscosities and densities producing smaller microspheres. The loading and drug release properties of the microspheres created with the microfluidic technique were also predicted. The results showed that the microspheres can efficiently encapsulate drugs and release them in a controlled manner over a period of time, owing to the high surface-area-to-volume ratio of the microspheres, which allows efficient drug diffusion. The ability to tune the manufacturing process using factors such as velocity, density, viscosity, channel diameter, and surface tension offers a potential opportunity to design drug delivery systems with greater efficiency and fewer side effects.
Keywords: polymer emulsion, microspheres, numerical simulation, microfluidic device
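The two dimensionless groups the study varies are standard and easy to state explicitly. A minimal sketch of their definitions follows; the numerical values in the test are illustrative, not taken from the paper:

```python
def weber_number(rho, v, d, sigma):
    """We = rho * v^2 * d / sigma: ratio of inertial forces to
    surface tension (rho: density, v: velocity, d: channel diameter,
    sigma: interfacial tension)."""
    return rho * v * v * d / sigma

def capillary_number(mu, v, sigma):
    """Ca = mu * v / sigma: ratio of viscous forces to surface tension
    (mu: dynamic viscosity)."""
    return mu * v / sigma
```

In droplet microfluidics, low Ca and We favour the dripping regime that produces uniform microspheres, which is consistent with the abstract's finding that velocity, viscosity, and surface tension jointly set the droplet size.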
Procedia PDF Downloads 64
3765 Dynamic Web-Based 2D Medical Image Visualization and Processing Software
Authors: Abdelhalim. N. Mohammed, Mohammed. Y. Esmail
Abstract:
In recent decades, medical imaging has been dominated by the use of costly film media for the review and archival of medical investigations; however, owing to developments in network technologies and the common acceptance of the Digital Imaging and Communications in Medicine (DICOM) standard, another approach based on the World Wide Web has emerged. Web technologies have been used successfully in telemedicine applications, and here the combination of web technologies and DICOM was used to design a web-based, open-source DICOM viewer. The web server allows query and retrieval of images, and the images are viewed and manipulated inside a web browser without the need to preinstall any software. The dynamic page for medical image visualization and processing was created using JavaScript and HTML5. The XAMPP Apache server was used to create a local web server for testing and deployment of the dynamic site. The web-based viewer was connected to multiple devices through a local area network (LAN) to distribute the images inside healthcare facilities. The system offers several advantages over ordinary picture archiving and communication systems (PACS): it is easy to install, maintain, and platform-independent, it allows images to be displayed and manipulated efficiently, and it is user-friendly and easy to integrate with existing systems that already make use of web technologies. A wavelet-based image compression technique was applied, in which the 2-D discrete wavelet transform decomposes the image; the wavelet coefficients are then thresholded and transmitted with entropy encoding to decrease transmission time and storage cost. Compression performance was estimated using image quality metrics such as mean square error (MSE), peak signal-to-noise ratio (PSNR), and compression ratio (CR), which reached 83.86% when the 'coif3' wavelet filter was used.
Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN
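The three quality metrics quoted in the abstract (MSE, PSNR, CR) are simple to define. A minimal sketch, assuming 8-bit images flattened to lists of pixel values and CR reported as the percentage of storage saved (the paper does not spell out its CR convention, so that is an assumption here):

```python
from math import log10

def mse(original, reconstructed):
    """Mean square error between two equal-length pixel lists."""
    return sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)

def psnr(original, reconstructed, max_val=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    m = mse(original, reconstructed)
    return float("inf") if m == 0 else 10.0 * log10(max_val ** 2 / m)

def compression_ratio(original_bytes, compressed_bytes):
    """CR as percentage of storage saved: (1 - compressed/original) * 100."""
    return (1.0 - compressed_bytes / original_bytes) * 100.0
```

Under this convention, the reported 83.86% CR would mean the 'coif3'-compressed stream occupies about 16% of the original size.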
Procedia PDF Downloads 160
3764 A Simple User Administration View of Computing Clusters
Authors: Valeria M. Bastos, Myrian A. Costa, Matheus Ambrozio, Nelson F. F. Ebecken
Abstract:
In this paper, a very simple and effective user administration view of computing cluster systems is implemented in order to provide friendly configuration and monitoring of distributed application executions. The user view, the administrator view, and an internal control module create an illusionary management environment for better system usability. The architecture, properties, performance, and comparison with other cluster management software are briefly discussed.
Keywords: big data, computing clusters, administration view, user view
Procedia PDF Downloads 331
3763 A Research on Determining the Viability of a Job Board Website for Refugees in Kenya
Authors: Prince Mugoya, Collins Oduor Ondiek, Patrick Kanyi Wamuyu
Abstract:
The Refugee Job Board Website is a web-based application that provides a platform for organizations to post jobs specifically for refugees. Organizations upload job opportunities, and refugees can view them on the website. The website also allows refugees to input their skills and qualifications. The methodology used to develop this system is the waterfall (traditional) methodology. Software development tools include Brackets, used to code the website, and phpMyAdmin, used to manage the database in which all the data is stored.
Keywords: information technology, refugee, skills, utilization, economy, jobs
Procedia PDF Downloads 165
3762 Precise CNC Machine for Multi-Tasking
Authors: Haroon Jan Khan, Xian-Feng Xu, Syed Nasir Shah, Anooshay Niazi
Abstract:
CNC machines are not only used on a large scale but have also become a prominent necessity among households and smaller businesses. Printed circuit boards manufactured by the chemical process are not only risky and unsafe but also expensive and time-consuming to produce. A 3-axis precise CNC machine has been developed that not only fabricates PCBs but has also been used for multiple tasks simply by changing the materials and tools used, making it versatile. The advanced CNC machine takes data from CAM software. The TB-6560 controller is used in the CNC machine to adjust movement along the X, Y, and Z axes. The advanced machine performs automatic drilling, engraving, and cutting efficiently.
Keywords: CNC, G-code, CAD, CAM, Proteus, FLATCAM, Easel
Procedia PDF Downloads 160
3761 Numerical Modelling of Prestressed Geogrid Reinforced Soil System
Authors: Soukat Kumar Das
Abstract:
Rapid industrialization and population growth have resulted in a scarcity of suitable ground conditions, driving the need for ground improvement by reinforcement with geosynthetics, with the minimum possible settlement and the maximum possible safety. Prestressing the geosynthetics offers an economical yet safe method of achieving this goal. The commercially available software PLAXIS 3D has made the analysis of prestressed geosynthetics simpler, with practical simulations of the ground. Attempts have been made to analyse the effect of prestressing geosynthetics and the effect of footing interference on unreinforced (UR), geogrid-reinforced (GR), and prestressed geogrid-reinforced (PGR) soil, in terms of load-bearing capacity and settlement characteristics, using numerical analysis in PLAXIS 3D. The results of the numerical analysis have been validated against and compared with those given in the referred paper and found to be in very good agreement with the actual field values, with very small variation. The GR soil was found to improve the bearing pressure by 240%, whereas the PGR soil improves it by almost 500% for 1 mm settlement; in fact, the PGR soil enhanced the bearing pressure of the GR soil by almost 200%. The settlement reduction was also very significant: for a bearing pressure of 100 kPa, the settlement reduction of the PGR soil was about 88% with respect to the UR soil, and up to 67% with respect to the GR soil. The prestressing force resulted in an enhanced reinforcement mechanism and hence an increased bearing pressure. The deformation at the geogrid layer was found to be 13.62 mm for the GR soil, whereas it decreased to a mere 3.5 mm for the PGR soil, which clearly demonstrates the effect of prestressing on the geogrid layer.
The improvement factor, conventionally known as the Bearing Capacity Ratio, which depicts the improvement of the PGR soil with respect to the UR and GR soils and of the GR soil with respect to the UR soil at different settlements, was found to vary in the range of 1.66-2.40 for the GR soil and between 3.58 and 5.12 for the PGR soil, both with respect to the UR soil. The effect of prestressing was also observed in the case of two interfering square footings. The centre-to-centre distance between the two footings (SFD) was taken as B, 1.5B, 2B, 2.5B, and 3B, where B is the width of the footing. It was found that for the UR soil the improvement of the bearing pressure extended up to 1.5B, after which it remained almost the same, but for the GR soil the zone of influence rose to 2B, and for the PGR soil it went up to 2.5B. Thus, the zone of interference for the PGR soil increased by 67% over the unreinforced (UR) soil and by almost 25% over the GR soil.
Keywords: bearing, geogrid, prestressed, reinforced
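The Bearing Capacity Ratio quoted above is a simple quotient, which a short sketch makes explicit (the values in the test are illustrative round numbers, not the paper's PLAXIS outputs):

```python
def bearing_capacity_ratio(q_reinforced, q_unreinforced):
    """BCR: bearing pressure of reinforced soil divided by that of
    unreinforced soil at the same settlement level."""
    return q_reinforced / q_unreinforced
```

For example, a PGR bearing pressure of 512 kPa against a UR pressure of 100 kPa at the same settlement gives a BCR of 5.12, the upper end of the range the study reports.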
Procedia PDF Downloads 402
3760 Transforming Data Science Curriculum Through Design Thinking
Authors: Samar Swaid
Abstract:
Today, corporations are moving toward the adoption of design-thinking techniques to develop products and services, putting the consumer at the heart of the development process. One of the leading companies in design thinking, IDEO (Innovation, Design, Engineering Organization), defines design thinking as an approach to problem-solving that relies on a set of multi-layered skills, processes, and mindsets that help people generate novel solutions to problems. Design thinking may result in new ideas, narratives, objects, or systems. It is about redesigning systems, organizations, infrastructures, processes, and solutions in an innovative fashion based on users' feedback. Tim Brown, president and CEO of IDEO, sees design thinking as a human-centered approach that draws from the designer's toolkit to integrate people's needs, innovative technologies, and business requirements. The application of design thinking has proved to be the road to developing innovative applications, interactive systems, scientific software, and healthcare applications, and even to rethinking business operations, as in the case of Airbnb. Recently, there has been a movement to apply design thinking to machine learning and artificial intelligence to ensure creating the "wow" effect on consumers. The Association for Computing Machinery task force on data science programs states that "Data scientists should be able to implement and understand algorithms for data collection and analysis. They should understand the time and space considerations of algorithms. They should follow good design principles developing software, understanding the importance of those principles for testability and maintainability." However, this definition hides the user behind the machine who works on data preparation, algorithm selection, and model interpretation.
Thus, the data science program includes design thinking to ensure meeting user demands, generating more usable machine learning tools, and developing ways of framing computational thinking. Here, we describe the fundamentals of design thinking and teaching modules for data science programs.
Keywords: data science, design thinking, AI, curriculum, transformation
Procedia PDF Downloads 81
3759 Using Computer Simulations to Prepare Teachers
Authors: Roberta Gentry
Abstract:
The presentation will begin with a brief literature review of the use of computer simulation in teacher education programs, and this information will be summarized. Additionally, based on the literature review, the advantages and disadvantages of using computer simulation in higher education will be shared. Finally, a study in which computer simulation software was used with 50 initial-licensure teacher candidates in both an introductory course and a behavior management course will be shared. Candidates reflected on their experiences with using computer simulation, and the instructor of the course will also share lessons learned.
Keywords: simulations, teacher education, teacher preparation, educational research
Procedia PDF Downloads 650
3758 Integration of Agile Philosophy and Scrum Framework to Missile System Design Processes
Authors: Misra Ayse Adsiz, Selim Selvi
Abstract:
In today's world, technology is racing against time. In order to catch up with the world's leading companies and adapt quickly to change, it is necessary to speed up processes and keep pace with the rate of technological change. Missile system design processes handled with classical methods fall behind in this race, because customer requirements are not clear and demands change again and again during the design process. Therefore, a methodology suitable for missile system design dynamics was investigated, and the processes capable of keeping up with the era were examined. When commonly used design processes are analyzed, it is seen that none of them is dynamic enough for today's conditions, so a hybrid design process was established. After a detailed review of the existing processes, it was decided to focus on the Scrum framework and agile philosophy. Scrum is a process framework focused on developing software and handling change management with rapid methods; agile philosophy, in turn, is intended to respond quickly to change. In this study, the aim is to integrate the Scrum framework and agile philosophy, the most appropriate approaches for rapid production and adaptation to change, into the missile system design process. With this approach, the design team involved in the system design process stays in communication with the customer and takes an iterative approach to change management. These methods, currently used in the software industry, have been integrated into the product design process. A team is created for the system design process, and the Scrum roles are realized with the customer included: a Scrum team consists of the product owner, the development team, and the Scrum master. Scrum events, which are short, purposeful, and time-limited, are organized to serve coordination rather than long meetings.
Instead of the classic system design methods used in product development studies, a missile design is carried out with this blended method. With the help of this design approach, it becomes easier to anticipate changing customer demands, produce quick solutions to those demands, and combat uncertainties in the product development process. With feedback from the customer, who is included in the process, the work moves toward marketing, design, and financial optimization.
Keywords: agile, design, missile, scrum
Procedia PDF Downloads 168
3757 Measurement and Modelling of HIV Epidemic among High Risk Groups and Migrants in Two Districts of Maharashtra, India: An Application of Forecasting Software-Spectrum
Authors: Sukhvinder Kaur, Ashok Agarwal
Abstract:
Background: For the first time, in 2009, India was able to generate estimates of HIV incidence (the number of new HIV infections per year). Analysis of epidemic projections revealed that the number of new annual HIV infections in India had declined by more than 50% during the last decade (GOI Ministry of Health and Family Welfare, 2010). The National AIDS Control Organisation (NACO) then planned to scale up its efforts in generating projections through epidemiological analysis and modelling, drawing on recently available sources of evidence such as HIV Sentinel Surveillance (HSS), India Census data, and other critical data sets. NACO generated the recent round of HIV estimates (2012) with the globally recommended tool, the Spectrum software, and produced estimates for adult HIV prevalence, annual new infections, the number of people living with HIV, AIDS-related deaths, and treatment needs. The state-level prevalence and incidence projections were used to project the consequences of the epidemic in Spectrum. Building on the state-level HIV estimates generated by NACO, the USAID-funded PIPPSE project, under the leadership of NACO, undertook estimations and projections at the district level using the same Spectrum software. In 2011, adult HIV prevalence in Maharashtra, one of the high-prevalence states, was 0.42%, above the national average of 0.27%. Considering the heterogeneity of the HIV epidemic between districts, two districts of Maharashtra, Thane and Mumbai, were selected to estimate and project the number of people living with HIV/AIDS (PLHIV), HIV prevalence among adults, and annual new HIV infections up to 2017.
Methodology: Inputs into Spectrum included demographic data from the Census of India since 1980 and the Sample Registration System; programmatic data on 'Alive and on ART (adults and children)', 'Mother-Baby pairs under PPTCT', and 'High Risk Group (HRG) size mapping estimates'; and surveillance data from various rounds of HSS, the National Family Health Survey-III, the Integrated Biological and Behavioural Assessment, and the Behavioural Sentinel Surveillance. Major Findings: Assuming current programmatic interventions in these districts, an estimated decrease of 12% in Thane and 31% in Mumbai in new infections among HRGs and migrants is projected from 2011 to 2017. Conclusions: The project also validated the decrease in new HIV infections among female sex workers (FSWs), one of the high-risk groups, using programme cohort data from 2012 to 2016. Though there is a decrease in HIV prevalence and new infections in Thane and Mumbai, further decline is possible if appropriate programme responses, strategies, and interventions are devised for specific target groups based on this evidence. Moreover, the evidence needs to be validated by other estimation/modelling techniques, and evidence can be generated for other districts of the state, where HIV prevalence is high and reliable data sources are available, to understand the epidemic within the local context.
Keywords: HIV sentinel surveillance, high risk groups, projections, new infections
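As a rough illustration of how such projected declines translate into year-by-year figures, the sketch below interpolates annual new-infection counts under a constant annual rate of decline. The function name and the base count of 1,000 are hypothetical, and the actual Spectrum software works from far richer demographic and epidemiological inputs.

```python
def project_new_infections(base_count, total_decline, years):
    """Interpolate annual new-infection counts given a total fractional
    decline over a projection horizon (hypothetical helper, not part of
    the Spectrum software itself)."""
    # Assume the decline accrues at a constant annual rate r such that
    # (1 - r) ** years == 1 - total_decline.
    annual_rate = 1 - (1 - total_decline) ** (1 / years)
    return [base_count * (1 - annual_rate) ** y for y in range(years + 1)]

# Mumbai-style scenario from the abstract: 31% total decline, 2011-2017.
series = project_new_infections(1000, 0.31, 6)
```

The same helper applied with a 12% decline would give the Thane-style trajectory.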
Procedia PDF Downloads 211
3756 Numerical Simulation of Precast Concrete Panels for Airfield Pavement
Authors: Josef Novák, Alena Kohoutková, Vladimír Křístek, Jan Vodička
Abstract:
Numerical analysis software packages are among the main tools for simulating the real behavior of various concrete structures and elements. In comparison with experimental tests, they offer an affordable way to study the mechanical behavior of structures under various conditions. This contribution deals with a precast element of an innovative airfield pavement system which is being developed within an ongoing scientific project. The proposed system consists of a two-layer surface course of precast concrete panels positioned on a two-layer base of fiber-reinforced concrete with recycled aggregate. As the panels are supposed to be installed directly on the hardened base course, imperfections at the interface between the base course and the surface course are expected. Considering such circumstances, three different behavior patterns could be established and considered when designing the precast element. The enormous cost of full-scale experiments makes it necessary to simulate the behavior of the element in numerical analysis software using the finite element method. The simulation was conducted on a nonlinear model in order to obtain results that could fully substitute for experimental results. First, several loading schemes were considered with the aim of identifying the critical one, which was then used for the simulation. The main objective of the simulation was to optimize the reinforcement of the element subjected to quasi-static loading from airplanes. Several parameters were considered when running the simulation, namely geometrical imperfections, manufacturing imperfections, the stress state in the reinforcement, the stress state in the concrete, and crack width. The numerical simulation revealed that the precast element should be heavily reinforced to fulfill all the assumed demands. The main cause of the high amount of reinforcement is the size of the imperfections which could occur in the real structure.
Improving manufacturing quality, installing the precast panels on a fresh base course, or using a bedding layer underneath the surface course are the main ways to reduce the size of the imperfections and consequently lower the consumption of reinforcement.
Keywords: nonlinear analysis, numerical simulation, precast concrete, pavement
Procedia PDF Downloads 256
3755 The Relationship between Personal, Psycho-Social and Occupational Risk Factors with Low Back Pain Severity in Industrial Workers
Authors: Omid Giahi, Ebrahim Darvishi, Mahdi Akbarzadeh
Abstract:
Introduction: Occupational low back pain (LBP) is one of the most prevalent work-related musculoskeletal disorders, and many risk factors are involved in it. The present study focuses on the relation between personal, psycho-social, and occupational risk factors and LBP severity in industrial workers. Materials and Methods: This research was a case-control study conducted in Kurdistan province. 100 workers (mean age ± SD of 39.9 ± 10.45) with LBP were selected as the case group, and 100 workers (mean age ± SD of 37.2 ± 8.5) without LBP were assigned to the control group. All participants were selected from various industrial units and had similar occupational conditions. The required data, including demographic information (BMI, smoking, alcohol, and family history), occupational factors (posture, mental workload (MWL), force, vibration, and repetition), and psychosocial factors (stress, occupational satisfaction, and security), were collected via consultation with occupational medicine specialists, interviews, the related questionnaires, the NASA-TLX software, and the REBA worksheet. The chi-square test, logistic regression, and structural equation modeling (SEM) were used to analyze the data, with IBM SPSS Statistics 24 and Mplus 6 used for the analysis. Results: 114 (77%) of the individuals were male and 86 (23%) were female. The mean career length of the case group and the control group was 10.90 ± 5.92 and 9.22 ± 4.24 years, respectively. The statistical analysis revealed a significant correlation between posture, smoking, stress, satisfaction, and MWL and occupational LBP. The odds ratios (95% confidence intervals) derived from a logistic regression model were 2.7 (1.27-2.24), 2.5 (2.26-5.17), and 3.22 (2.47-3.24) for stress, MWL, and posture, respectively.
The SEM analysis also revealed a significant correlation of the personal, psycho-social, and occupational factors with LBP. Conclusion: All three broad categories of risk factors simultaneously increase the risk of occupational LBP in the workplace, but posture, stress, and MWL play a major role in LBP severity. Therefore, prevention strategies for persons in jobs with a high risk of LBP are required to decrease the risk of occupational LBP.
Keywords: industrial workers, occupational low back pain, occupational risk factors, psychosocial factors
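For readers unfamiliar with how such odds ratios arise, the sketch below recovers them from logistic-regression coefficients via OR = exp(β); the coefficient values here are back-derived from the reported ratios purely for illustration and are not the study's fitted model.

```python
import math

# Hypothetical coefficients, back-derived so that exp(beta) reproduces
# the odds ratios reported in the abstract.
coefficients = {
    "stress": math.log(2.7),
    "mental_workload": math.log(2.5),
    "posture": math.log(3.22),
}

def odds_ratio(beta):
    # In logistic regression, a one-unit increase in a predictor
    # multiplies the odds of the outcome (here, LBP) by exp(beta).
    return math.exp(beta)

ratios = {name: round(odds_ratio(b), 2) for name, b in coefficients.items()}
```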
Procedia PDF Downloads 258
3754 Use of Coconut Shell as a Replacement of Normal Aggregates in Rigid Pavements
Authors: Prakash Parasivamurthy, Vivek Rama Das, Ravikant Talluri, Veena Jawali
Abstract:
India ranks third in the production of coconut, after Indonesia and the Philippines. About 92% of the total production in the country comes from four southern states: Kerala (45.22%), Tamil Nadu (26.56%), Karnataka (10.85%), and Andhra Pradesh (8.93%). Other states, such as Goa, Maharashtra, Odisha, West Bengal, and those in the northeast (Tripura and Assam), account for the remaining 8.44%. The use of coconut shell as coarse aggregate in concrete has never been a usual practice in the industry, particularly in areas where lightweight concrete is required for non-load-bearing walls, non-structural floors, and strip footings. The high cost of conventional building materials is a major factor affecting construction delivery in India. In India, where abundant agricultural and industrial wastes are discharged, these wastes can be used as potential replacement materials in the construction industry. This has a double advantage: a reduction in the cost of construction materials and a means of disposing of wastes. Therefore, an attempt has been made in this study to utilize coconut shell (CS) as coarse aggregate in rigid pavement. The study began with the characterization of materials through basic material testing. The cast moulds were cured, and tests were conducted on the hardened concrete. The procedure continued with the determination of fck (characteristic strength), E (modulus of elasticity), and µ (Poisson's ratio) from the test results obtained. For the analytical studies, the rigid pavement was modeled in the KENPAVE software, finite element software developed specially for road pavements, and in parallel the rigid pavement was designed to Indian standards. The results show that the physical properties of CSAC (coconut shell aggregate concrete) with 10% replacement give better results. The flexural strength of CSAC is found to increase by 4.25% compared to the control concrete.
About 13% reduction in pavement thickness is observed using the optimum coconut shell content.
Keywords: coconut shell, rigid pavement, modulus of elasticity, Poisson ratio
Procedia PDF Downloads 237
3753 Analysis of Barbell Kinematics of Snatch Technique among Women Weightlifters in India
Authors: Manish Kumar Pillai, Madhavi Pathak Pillai, Rajender Lal, Dinesh P. Sharma
Abstract:
India has not yet been able to produce many weightlifters over the past years; Karnam Malleswari is the only woman to win a medal for India at the Olympics. On introspection, there seem to be different reasons, and one probable cause could be the lack of biomechanical analysis for technique improvement. The analysis of motion in sports has gained prime importance for technical improvement: it helps athletes develop a better understanding of their own skills and accelerates the technical learning process. Kinematics is concerned with describing and quantifying both the linear and angular positions of bodies and their time derivatives. The analysis of barbell movement is very important in weightlifting, but women's weightlifting has a shorter history than men's: research on women's weightlifting based on video analysis is scarce, and scientific evidence based on kinematic analysis of Indian weightlifters at the national level is limited. Hence, the present investigation aimed to analyze the barbell kinematics of women weightlifters in India. The study was delimited to the medal winners of the 69-kilogram weight category in the All India Inter-University Competition, with ages ranging between 18 and 28 years. The variables selected for the mechanical analysis of barbell kinematics included barbell trajectory, velocity, acceleration, potential energy, kinetic energy, mechanical energy, and average power output. The performance was captured during the competition by two DV PC-60 digital cameras (Panasonic Co., Ltd.). The two cameras were placed 6 m perpendicular to the plane of motion, 130 cm above the ground, to capture the frontal and lateral views of the lifters simultaneously. The video recordings were analyzed using the Dartfish software, and the barbell kinematics were derived from the information provided by the software.
The findings of the study clearly indicate that the three lifters differ in the selected kinematic variables with respect to their technique across the five phases of the snatch.
Keywords: dartfish, digital camera, kinematic, snatch, weightlifting
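A minimal sketch of how such kinematic variables can be derived from digitized barbell positions is shown below, assuming a hypothetical position series sampled at 50 fps (dt = 0.02 s) and central finite differences; the study's actual computations were carried out within Dartfish.

```python
def derivative(series, dt):
    """Numerical time derivative of a sampled signal: central differences
    for interior samples, one-sided differences at the two ends."""
    n = len(series)
    out = []
    for i in range(n):
        if i == 0:
            out.append((series[1] - series[0]) / dt)
        elif i == n - 1:
            out.append((series[-1] - series[-2]) / dt)
        else:
            out.append((series[i + 1] - series[i - 1]) / (2 * dt))
    return out

def energies(height, velocity, mass, g=9.81):
    """Potential and kinetic energy of the barbell at each sample."""
    pe = [mass * g * h for h in height]
    ke = [0.5 * mass * v * v for v in velocity]
    return pe, ke

dt = 0.02                                   # 50 fps sampling interval
heights = [0.0, 0.01, 0.04, 0.09, 0.16]     # hypothetical bar heights, m
vel = derivative(heights, dt)               # vertical velocity, m/s
acc = derivative(vel, dt)                   # vertical acceleration, m/s^2
pe, ke = energies(heights, vel, mass=69.0)  # 69 kg barbell as an example
```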
Procedia PDF Downloads 136
3752 Neighbor Caring Environment System (NCE) Using Parallel Replication Mechanism
Authors: Ahmad Shukri Mohd Noor, Emma Ahmad Sirajudin, Rabiei Mamat
Abstract:
Pertaining to a particular marine interest, the process of data sampling can take years before a study can be concluded; therefore, the need for a robust backup system for the data is implicit. In recent advancements in marine applications, more functionalities and tools are integrated to assist the work of researchers. It is anticipated that this modality will continue as research scope widens and intensifies, keeping pace with current technologies and lifestyles. The convenience of collecting and sharing information these days also applies to work in marine research. Therefore, marine system designers should be aware that high availability is a necessary attribute of marine repository applications, as is a robust backup system for the data. In this paper, the approach to high availability relates to both hardware and software, but the focus is on software. We consider a NABTIC repository system that is primitively built on a single server and does not have replicated components. First, the system is decomposed into separate modules. The modules are placed on multiple servers to create a distributed system. Redundancy is added by placing copies of the modules on different servers using the Neighbor Caring Environment System (NCE) technique, which utilizes a parallel replication components mechanism. Background monitoring is established to check the servers' heartbeats to confirm their aliveness. At the same time, a critical adaptive threshold is maintained to make sure a failure is detected in a timely manner using Adaptive Fault Detection (AFD). A confirmed failure sets the recovery mode, where a selection process is carried out before a fail-over server is instructed. In effect, the marine repository service continues as the fail-over masks the recent failure. The performance of the new prototype was tested and confirmed to be more highly available.
Furthermore, the downtime is not noticeable, as service is immediately and automatically restored. The marine repository system is thus said to have achieved fault tolerance.
Keywords: availability, fault detection, replication, fault tolerance, marine application
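The heartbeat monitoring and adaptive threshold described above can be sketched as follows; the class name, window size, and k-sigma rule are illustrative assumptions, not the paper's actual AFD implementation.

```python
import statistics

class AdaptiveFaultDetector:
    """Sketch of the adaptive fault detection (AFD) idea: a server is
    suspected failed once its heartbeat is overdue by more than a
    threshold adapted to the recent inter-arrival times."""

    def __init__(self, window=10, k=3.0):
        self.window = window      # how many recent intervals to keep
        self.k = k                # tolerance in standard deviations
        self.intervals = []
        self.last_beat = None

    def heartbeat(self, timestamp):
        # Record a heartbeat and its interval since the previous one.
        if self.last_beat is not None:
            self.intervals.append(timestamp - self.last_beat)
            self.intervals = self.intervals[-self.window:]
        self.last_beat = timestamp

    def threshold(self):
        # Adaptive threshold: mean interval plus k standard deviations.
        mean = statistics.mean(self.intervals)
        spread = statistics.pstdev(self.intervals)
        return mean + self.k * spread

    def is_suspected(self, now):
        # Failure suspected once the silence exceeds the threshold.
        return (now - self.last_beat) > self.threshold()

det = AdaptiveFaultDetector()
for t in [0.0, 1.0, 2.0, 3.0, 4.1, 5.0]:   # slightly jittery heartbeats
    det.heartbeat(t)
```

A confirmed suspicion would then trigger the fail-over selection process described in the abstract.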
Procedia PDF Downloads 321
3751 Expression of PGC-1 Alpha Isoforms in Response to Eccentric and Concentric Resistance Training in Healthy Subjects
Authors: Pejman Taghibeikzadehbadr
Abstract:
Background and Aim: PGC-1 alpha is a transcription factor that was first detected in brown adipose tissue. Since its discovery, PGC-1 alpha has been known to facilitate beneficial adaptations such as mitochondrial biogenesis and increased angiogenesis in skeletal muscle following aerobic exercise. Therefore, the purpose of this study was to investigate the expression of PGC-1 alpha isoforms in response to eccentric and concentric resistance training in healthy subjects. Materials and Methods: Ten healthy men were randomly divided into two groups (5 in the eccentric group and 5 in the concentric group). The isokinetic contraction protocols included eccentric and concentric knee extension at maximum effort with an angular velocity of 60 degrees per second. The torques assigned to each subject were chosen to match the workload in both protocols at a rotational speed of 60 degrees per second. Contractions consisted of a maximum of 12 sets of 10 repetitions for the right leg, with a rest of 30 seconds between sets. At the beginning and end of the study, a biopsy of the vastus lateralis muscle was performed, taken in both the distal and proximal directions of the lateral thigh. To evaluate the expression of the PGC1α-1 and PGC1α-4 genes, tissue analysis was performed in each group using the real-time PCR technique. Data were analyzed using the dependent t-test and analysis of covariance, with SPSS 21 and Excel 2013 used for the analysis. Results: The intra-group changes of PGC1α-1 after one session of activity were not significant in the eccentric (p = 0.168) or concentric (p = 0.959) groups, and the inter-group comparison showed no difference between the two groups (p = 0.681). The intra-group changes of PGC1α-4 after one session of activity were significant in the eccentric group (p = 0.012) and the concentric group (p = 0.02), while the inter-group comparison showed no difference between the two groups (p = 0.362).
Conclusion: It seems that the exercise stress was not sufficient to stimulate a significant increase in PGC1α-1, whereas PGC1α-4 did respond within each group. With regard to the adaptation response, the findings differ and need to be addressed in further research.
Keywords: eccentric contraction, concentric contraction, PGC1α-1 and PGC1α-4, human subject
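The intra-group comparisons above rely on the dependent (paired) t-test; a minimal sketch of that computation with hypothetical expression values (not the study's data) is given below, using five subjects per group as in the study design.

```python
import math
import statistics

def paired_t(pre, post):
    """Paired (dependent) t-statistic for a within-group change, as used
    for the intra-group PGC1-alpha comparisons. t = mean(d) / (sd(d)/sqrt(n)),
    where d are the per-subject differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    sd = statistics.stdev(diffs)   # sample standard deviation of differences
    return statistics.mean(diffs) / (sd / math.sqrt(n))

# Hypothetical fold-change values for five subjects, pre and post session.
pre = [1.0, 1.1, 0.9, 1.2, 1.0]
post = [1.8, 2.0, 1.6, 2.1, 1.7]
t_stat = paired_t(pre, post)
```

The resulting statistic would be compared against a t-distribution with n-1 degrees of freedom to obtain the p-values reported above.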
Procedia PDF Downloads 78
3750 Computation of Stress Intensity Factor Using Extended Finite Element Method
Authors: Mahmoudi Noureddine, Bouregba Rachid
Abstract:
In this paper, the stress intensity factors of a slant-cracked plate of AISI 304 stainless steel have been calculated using the extended finite element method (XFEM) and the finite element method (FEM) in the ABAQUS software, and the results were compared with theoretical values.
Keywords: stress intensity factors, extended finite element method, stainless steel, ABAQUS
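The theoretical values referred to above are typically the classical closed-form mixed-mode factors for an inclined through crack in an infinite plate under remote tension; a sketch of that benchmark is given below, with the stress, crack length, and slant angle chosen purely for illustration.

```python
import math

def slant_crack_sif(sigma, a, beta_deg):
    """Closed-form mixed-mode stress intensity factors for a through
    crack of half-length a inclined at angle beta to the crack plane's
    normal loading direction, in an infinite plate under remote uniaxial
    tension sigma:
        K_I  = sigma * sqrt(pi * a) * sin(beta)^2
        K_II = sigma * sqrt(pi * a) * sin(beta) * cos(beta)
    """
    beta = math.radians(beta_deg)
    k = sigma * math.sqrt(math.pi * a)
    k1 = k * math.sin(beta) ** 2               # opening mode
    k2 = k * math.sin(beta) * math.cos(beta)   # in-plane shear mode
    return k1, k2

# Example: 100 MPa remote stress, 10 mm half-crack, 45-degree slant,
# where K_I and K_II coincide at half of sigma * sqrt(pi * a).
k1, k2 = slant_crack_sif(100e6, 0.010, 45.0)
```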
Procedia PDF Downloads 618
3749 A System Architecture for Hand Gesture Control of Robotic Technology: A Case Study Using a Myo™ Arm Band, DJI Spark™ Drone, and a Staubli™ Robotic Manipulator
Authors: Sebastian van Delden, Matthew Anuszkiewicz, Jayse White, Scott Stolarski
Abstract:
Industrial robotic manipulators have been commonplace in the manufacturing world since the early 1960s, and unmanned aerial vehicles (drones) have only begun to realize their full potential in the service industry and the military. The omnipresence of these technologies in their respective fields will only become more potent in coming years. While these technologies have greatly evolved over the years, the typical approach to human interaction with these robots has not. In the industrial robotics realm, a manipulator is typically jogged around using a teach pendant and programmed using a networked computer or the teach pendant itself via a proprietary software development platform. Drones are typically controlled using a two-handed controller equipped with throttles, buttons, and sticks, an app that can be downloaded to one’s mobile device, or a combination of both. This application-oriented work offers a novel approach to human interaction with both unmanned aerial vehicles and industrial robotic manipulators via hand gestures and movements. Two systems have been implemented, both of which use a Myo™ armband to control either a drone (DJI Spark™) or a robotic arm (Stäubli™ TX40). The methodologies developed by this work present a mapping of armband gestures (fist, finger spread, swing hand in, swing hand out, swing arm left/up/down/right, etc.) to either drone or robot arm movements. The findings of this study present the efficacy and limitations (precision and ergonomic) of hand gesture control of two distinct types of robotic technology. All source code associated with this project will be open sourced and placed on GitHub. In conclusion, this study offers a framework that maps hand and arm gestures to drone and robot arm control. 
The system has been implemented using current ubiquitous technologies, and these software artifacts will be open sourced for future researchers or practitioners to use in their work.
Keywords: human robot interaction, drones, gestures, robotics
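A minimal sketch of the gesture-to-command mapping layer described above is shown below; the gesture names follow the Myo pose vocabulary, but the command strings are hypothetical placeholders rather than actual DJI or Stäubli API calls.

```python
# Mapping from armband gestures to abstract robot/drone commands.
# The command strings are illustrative placeholders; a real system
# would dispatch them to the drone or manipulator control interface.
GESTURE_COMMANDS = {
    "fist": "stop",
    "fingers_spread": "takeoff_or_home",
    "wave_in": "move_left",
    "wave_out": "move_right",
    "arm_up": "move_up",
    "arm_down": "move_down",
}

def dispatch(gesture):
    # Unknown or unrecognized gestures are ignored rather than being
    # forwarded to the hardware.
    return GESTURE_COMMANDS.get(gesture, "noop")
```

Keeping the mapping in a single table makes it straightforward to retarget the same gestures to either the drone or the robot arm.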
Procedia PDF Downloads 157
3748 Improving the Uniformity of Electrostatic Meter’s Spatial Sensitivity
Authors: Mohamed Abdalla, Ruixue Cheng, Jianyong Zhang
Abstract:
In pneumatic conveying, solids are mixed with air or gas. In industries such as coal-fired power stations, blast furnaces for iron making, and cement and flour processing, the mass flow rate of solids needs to be monitored or controlled. However, current gas-solids two-phase flow measurement techniques are not as accurate as the flow meters available for single-phase flow. One of the problems multi-phase flow meters face is that flow profiles vary with measurement location and with the conditions of pipe routing, bends, elbows, and other restriction devices in the conveying system, as well as with conveying velocity and concentration. To measure the solids flow rate or concentration with a non-even distribution of solids in gas, a uniform spatial sensitivity is required of a multi-phase flow meter; however, few meters inherently have such a property. The circular electrostatic meter is a popular choice for gas-solids flow measurement owing to its high sensitivity to flow, robust construction, low installation cost, and non-intrusive nature. However, such meters have an inherently non-uniform spatial sensitivity. This paper first analyses the spatial sensitivity of the circular electrostatic meter in general and then, by combining the effect of the sensitivity to a single particle with the sensing volume for a given electrode geometry, reveals for the first time how a circular electrostatic meter responds to a roping flow stream, which is much more complex than is currently believed. The paper presents recent research findings on the spatial sensitivity investigation at Teesside University based on finite element analysis using the Ansys Fluent software, including time and frequency domain characteristics and the effect of electrode geometry. The simulation results are compared to experimental results obtained on a large-scale (14-inch diameter) rig.
The purpose of this research is to pave the way towards a uniform spatial sensitivity for the circular electrostatic sensor by means of compensation, so as to improve the overall accuracy of gas-solids flow measurement.
Keywords: spatial sensitivity, electrostatic sensor, pneumatic conveying, Ansys Fluent software
Procedia PDF Downloads 367
3747 Analysis of Plates with Varying Rigidities Using Finite Element Method
Authors: Karan Modi, Rajesh Kumar, Jyoti Katiyar, Shreya Thusoo
Abstract:
This paper presents the finite element method (FEM) for analyzing the internal responses generated in thin rectangular plates with various edge conditions and rigidity conditions. A comparison has been made between the FEM (ANSYS software) results for the displacements, stresses, and moments generated with and without the consideration of a hole in the plate and for different aspect ratios. Finally, the responses in plain and composite square plates are compared.
Keywords: ANSYS, finite element method, plates, static analysis
Procedia PDF Downloads 453
3746 Classification of Forest Types Using Remote Sensing and Self-Organizing Maps
Authors: Wanderson Goncalves e Goncalves, José Alberto Silva de Sá
Abstract:
Human actions are a threat to the balance and conservation of the Amazon forest; therefore, environmental monitoring services play an important role in the preservation and maintenance of this environment. This study classified forest types using data from a forest inventory provided by the 'Instituto de Desenvolvimento Florestal e da Biodiversidade do Estado do Pará' (IDEFLOR-BIO), located between the municipalities of Santarém, Juruti, and Aveiro, in the state of Pará, Brazil, covering an area of approximately 600,000 hectares, together with Bands 3, 4, and 5 of a TM-Landsat satellite image and self-organizing maps. The information from the satellite images was extracted using QGIS software 2.8.1 Wien and used as a database for training the neural network. The midpoints of each forest inventory sample were linked to the images, and the digital numbers of the pixels were then extracted, composing the database that fed the training and testing of the classifier. The neural network was trained to classify two forest types, Rain Forest of Lowland Emerging Canopy (Dbe) and Rain Forest of Lowland Emerging Canopy plus Open with palm trees (Dbe + Abp), in the Mamuru-Arapiuns glebes of Pará State. The training data set contained 400 examples, 200 for each class (Dbe and Dbe + Abp), and the test data set contained 100 examples, 50 for each class, so the total data set consisted of 500 examples. The classifier was compiled in the Orange Data Mining 2.7 software and evaluated in terms of confusion matrix indicators. The results of the classifier were considered satisfactory, with a global accuracy of 89%, a Kappa coefficient of 78%, and an F1 score of 0.88.
The efficiency of the classifier was also evaluated with the ROC plot (receiver operating characteristic), obtaining results close to ideal, showing it to be a very good classifier and demonstrating the potential of this methodology to support ecosystem services, particularly in anthropogenic areas of the Amazon.
Keywords: artificial neural network, computational intelligence, pattern recognition, unsupervised learning
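The reported indicators can be reproduced from a binary confusion matrix as sketched below; the counts are hypothetical but chosen to match the 50-per-class test set and the reported 89% accuracy, which with balanced classes implies exactly the reported 78% Kappa.

```python
def metrics(tp, fn, fp, tn):
    """Accuracy, Cohen's kappa and F1 score from a binary confusion
    matrix. With balanced classes the chance agreement pe is 0.5, so
    kappa reduces to (accuracy - 0.5) / 0.5."""
    total = tp + fn + fp + tn
    acc = (tp + tn) / total
    # Expected chance agreement for Cohen's kappa.
    p_yes = ((tp + fn) / total) * ((tp + fp) / total)
    p_no = ((fp + tn) / total) * ((fn + tn) / total)
    pe = p_yes + p_no
    kappa = (acc - pe) / (1 - pe)
    f1 = 2 * tp / (2 * tp + fp + fn)
    return acc, kappa, f1

# Hypothetical counts for a 100-example test set, 50 per class,
# chosen to reproduce the 89% accuracy reported in the abstract.
acc, kappa, f1 = metrics(tp=45, fn=5, fp=6, tn=44)
```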
Procedia PDF Downloads 361
3745 Readiness of Iran’s Insurance Industry Salesforce to Accept Changing to Become Islamic Personal Financial Planners
Authors: Pedram Saadati, Zahra Nazari
Abstract:
Today, the role and importance of financial technology businesses in Iran have increased significantly. In Iran, however, there is no Islamic or non-Islamic personal financial planning field of study in the universities or educational centers, the profession of personal financial planning is not defined, and no software has been introduced in this regard for advisors or consumers. The largest sales network for financial services in Iran belongs to the insurance industry, and there is an untapped market for international companies that, by providing training and personal financial advisory software, can reach the 130 thousand representatives in the insurance industry and 28 million families. To the best of the authors' knowledge, and despite the lack of previous internal studies in this field, the present study investigates the level of readiness of the insurance industry salesforce to accept this career and its technology. The statistical population of the research is made up of managers, insurance sales representatives, assistants, and heads of sales departments of insurance companies. An 18-minute video was prepared that introduced and taught the job of Islamic personal financial planning and explained its difference from the non-Islamic model; this video was provided to the respondents. The data collection tool was a researcher-made questionnaire. To investigate the factors affecting technology acceptance and job change, descriptive statistics, the independent t-test, and Pearson correlation were used, and Friedman's test was used to rank the effective factors. The results indicate a very positive perception and attitude among insurance industry professionals towards the usefulness of this job and its technology, and the studied sample confirmed their intention to train in this knowledge.
Based on the research results, the change in the customer's attitude towards the insurance advisor and the possibility of increased income are the main reasons for acceptance, whereas restrictions on using investment opportunities due to Islamic financial services laws and the uncertainty of the position of the central insurance authority in this regard are the most important obstacles.
Keywords: fintech, insurance, personal financial planning, wealth management
Procedia PDF Downloads 49
3744 AMBICOM: An Ambient Computing Middleware Architecture for Heterogeneous Environments
Authors: Ekrem Aksoy, Nihat Adar, Selçuk Canbek
Abstract:
Ambient Computing, or Ambient Intelligence (AmI), is an emerging area in computer science aiming to create intelligently connected environments and the Internet of Things. In this paper, we propose a communication middleware architecture for AmI. This middleware architecture addresses the problems of communication, networking, and abstraction of applications, although there are other aspects (e.g., HCI and security) within the general AmI framework. Within this middleware architecture, application developers can address HCI and security issues through the extensibility features of the platform.
Keywords: AmI, ambient computing, middleware, distributed-systems, software-defined networking
Procedia PDF Downloads 284
3743 Multi-Agent System Based Solution for Operating Agile and Customizable Micro Manufacturing Systems
Authors: Dylan Santos De Pinho, Arnaud Gay De Combes, Matthieu Steuhlet, Claude Jeannerat, Nabil Ouerhani
Abstract:
The Industry 4.0 initiative has been launched to address major challenges related to ever-smaller batch sizes. The end-user need for highly customized products requires highly adaptive production systems in order to keep shop floors at the same efficiency. Most of the classical software solutions that operate the manufacturing processes on a shop floor are based on rigid Manufacturing Execution Systems (MES), which are not capable of adapting the production order on the fly to changing demands and conditions. In this paper, we present a highly modular and flexible solution to orchestrate a set of production systems composed of a micro-milling machine tool, a polishing station, a cleaning station, a part inspection station, and a rough material store. The stations are installed according to a novel matrix configuration in a 3x3 vertical shelf. The cells of the shelf are connected through horizontal and vertical rails on which a set of shuttles circulates to transport the machined parts from one station to another. Our software solution for orchestrating the tasks of each station is based on a multi-agent system: each station and each shuttle is operated by an autonomous agent, and all agents communicate with a central agent that holds all the information about the manufacturing order. The core innovation of this paper lies in the path planning of the shuttles, with two major objectives: 1) reduce the waiting time of the stations and thus the cycle time of the entire part, and 2) reduce disturbances such as the vibration generated by the shuttles, which strongly affects the manufacturing process and thus the quality of the final part.
Simulation results show that the cycle time of the parts is reduced by up to 50% compared with MES-operated linear production lines, while disturbances are systematically avoided for critical stations such as the milling machine tool.
Keywords: multi-agent systems, micro-manufacturing, flexible manufacturing, transfer systems
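A toy version of the shuttle path-planning problem can be sketched as a shortest-path search on the 3x3 shelf grid that avoids cells flagged as vibration-critical; the grid coordinates and the blocking rule below are illustrative simplifications of the paper's actual planner.

```python
from collections import deque

def plan_path(start, goal, blocked, rows=3, cols=3):
    """Breadth-first shortest path for a shuttle on the 3x3 shelf grid,
    treating the rails as edges between adjacent cells. 'blocked' holds
    cells the shuttle must avoid while a vibration-critical station
    (e.g. the milling machine tool) is cutting."""
    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (r, c), path = frontier.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and nxt not in seen and nxt not in blocked):
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None  # no vibration-safe route currently exists

# Route a shuttle from a store cell (0, 0) to an inspection cell (2, 2)
# while the cell hosting the milling machine, (1, 1), is off-limits.
route = plan_path((0, 0), (2, 2), blocked={(1, 1)})
```

In a fuller model, the agents would re-plan whenever a station's state changes, trading route length against station waiting times.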
Procedia PDF Downloads 130