Search results for: CHIC Analysis V 1.1 Software
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30429

29919 Optimising Apparel Digital Production in Industrial Clusters

Authors: Minji Seo

Abstract:

Fashion stakeholders are becoming increasingly aware of technological innovation in manufacturing. In 2020, the COVID-19 pandemic transformed working patterns, such as working remotely rather than commuting. To enable smooth remote working, 3D fashion design software is being adopted as the latest trend in design and production. The majority of fashion designers, however, are still resistant to this change. Previous studies on 3D fashion design software solely highlighted the beneficial and detrimental factors of adopting design innovations; they lacked research on the relationship between resistance factors and the adoption of innovation, and they fell short of exploring the perspectives of the users of these innovations. This paper aims to investigate the key drivers of and barriers to employing 3D fashion design software, as well as to explore the challenges faced by designers. It also touches on governmental support for digital manufacturing in Seoul, South Korea, and London, the United Kingdom. By conceptualising local support, this study aims to provide a new path for industrial clusters to optimise digital apparel manufacturing. The study uses a mixture of quantitative and qualitative approaches. It first reports a survey of 350 fashion designers on the innovation resistance factors of 3D fashion design software and the effectiveness of local support. In-depth interviews with 30 participants provide a better understanding of designers' views of the benefits and obstacles of employing 3D fashion design software. The key findings of this research concern the main barriers to employing 3D fashion design software in fashion production. Cultural characteristics and the interview results are used to interpret the survey results. The quantitative data identify the main resistance factors to adopting design innovations. The dominant obstacles are the cost of the software and its complexity, lack of customer interest in innovation, lack of qualified personnel, and lack of knowledge. The main difference between Seoul and London lies in attitudes towards government support. Compared to the UK's fashion designers, South Korean designers emphasise that government support is highly relevant to employing 3D fashion design software. The top-down versus bottom-up policy implementation approach distinguishes the perception of government support: in contrast to the top-down policy approach in South Korea, British fashion designers, accustomed to bottom-up approaches, are reluctant to receive government support. The findings of this research will contribute to generating solutions for local government and to optimising the use of 3D fashion design software in fashion industrial clusters.

Keywords: digital apparel production, industrial clusters, innovation resistance, 3D fashion design software, manufacturing, innovation, technology, digital manufacturing, innovative fashion design process

Procedia PDF Downloads 102
29918 Detection of Image Blur and Its Restoration for Image Enhancement

Authors: M. V. Chidananda Murthy, M. Z. Kurian, H. S. Guruprasad

Abstract:

Image restoration in the process of communication is one of the emerging fields in image processing. Motion analysis is the simplest way to detect motion in an image, and its applications are widespread in areas such as surveillance, remote sensing, the film industry, and the navigation of autonomous vehicles. A scene may contain multiple moving objects; using motion analysis techniques, the blur caused by the movement of the objects can be reduced by filling in occluded regions and reconstructing transparent objects, thereby removing the motion blurring. This paper presents the design and comparison of various motion detection and enhancement filters. Median filtering, linear image deconvolution, inverse filtering, pseudo-inverse filtering, Wiener filtering, Lucy-Richardson filtering, and blind deconvolution are used to remove the blur. In this work, we have considered different types and different amounts of blur for the analysis. Mean Square Error (MSE) and Peak Signal to Noise Ratio (PSNR) are used to evaluate the performance of the filters. The designed system has been implemented in Matlab and tested on synthetic and real-time images.
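
For reference, the two evaluation metrics can be written out directly. Below is a minimal C++ sketch of both, assuming flat vectors of 8-bit grayscale pixels; the paper's own Matlab implementation is not shown, so the names and layout here are illustrative only.

```cpp
// MSE and PSNR for two equally sized 8-bit grayscale images held as flat
// vectors. Illustrative sketch; not the paper's Matlab implementation.
#include <cmath>
#include <cstdint>
#include <vector>

double mse(const std::vector<std::uint8_t>& ref,
           const std::vector<std::uint8_t>& test) {
    double sum = 0.0;
    for (std::size_t i = 0; i < ref.size(); ++i) {
        const double d = double(ref[i]) - double(test[i]);
        sum += d * d;                        // accumulate squared pixel error
    }
    return sum / double(ref.size());
}

double psnr(const std::vector<std::uint8_t>& ref,
            const std::vector<std::uint8_t>& test) {
    const double peak = 255.0;               // maximum 8-bit pixel value
    // PSNR = 10 * log10(peak^2 / MSE); undefined for identical images (MSE = 0).
    return 10.0 * std::log10(peak * peak / mse(ref, test));
}
```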

Keywords: image enhancement, motion analysis, motion detection, motion estimation

Procedia PDF Downloads 287
29917 Methods Used to Perform Requirements Elicitation for Healthcare Software Development

Authors: Tang Jiacheng, Fang Tianyu, Liu Yicen, Xiang Xingzhou

Abstract:

The proportion of healthcare services is increasing throughout the globe. The convergence of mobile technology is driving new business opportunities, innovations in healthcare service delivery, and the promise of a better life tomorrow for populations with various healthcare needs. One of the most important phases in combining healthcare and mobile applications is to elicit requirements correctly. In this paper, four articles from different research directions, covering four healthcare topics, are analyzed in detail and summarized. We identified the underlying problems in guidance for developing mobile applications that provide healthcare services for older adults, women in menopause, and patients with COVID-19. These case studies cover several elicitation methods: survey, prototyping, focus group interview, and questionnaire. The effectiveness of these methods is analyzed along with their advantages and limitations, which is beneficial for adapting elicitation methods in future software development processes.

Keywords: healthcare, software requirement elicitation, mobile applications, prototyping, focus group interview

Procedia PDF Downloads 148
29916 Vibration Measurements of Single-Lap Cantilevered SPR Beams

Authors: Xiaocong He

Abstract:

Self-pierce riveting (SPR) is a new high-speed mechanical fastening technique suitable for point joining of dissimilar sheet materials, as well as coated and pre-painted sheet materials. Mechanical structures assembled by SPR are expected to possess a high damping capacity. In this study, experimental measurement techniques were proposed for the prediction of the vibration behavior of single-lap cantilevered SPR beams. Dynamic test software and data acquisition hardware were used to measure the dynamic response of the beams: free and forced vibration behavior was measured using the LMS CADA-X experimental modal analysis software and the LMS-DIFA Scadas II data acquisition hardware. The frequency response functions of SPR beams with different numbers of rivets were compared. The main goal of the paper is to provide a basic measurement method for further research on vibration-based non-destructive damage detection in single-lap cantilevered SPR beams.
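
As background (the abstract itself does not spell this out), the frequency response functions compared here are conventionally defined as the ratio of the response spectrum to the excitation spectrum:

```latex
H(\omega) = \frac{X(\omega)}{F(\omega)}
```

where $X(\omega)$ is the Fourier transform of the measured beam response and $F(\omega)$ that of the excitation force; the number of rivets enters through its effect on the stiffness and damping of the joint.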

Keywords: self-piercing riveting, dynamic response, experimental measurement, frequency response functions

Procedia PDF Downloads 429
29915 A Survey on the Status of Test Automation

Authors: Andrei Contan, Richard Torkar

Abstract:

Aim: The process of test automation and its practices in industry need to be better understood, both by the industry itself and by the research community. Method: We conducted a quantitative industry survey by asking IT professionals to answer questions related to test automation. Results: Test automation needs and practices vary greatly between organizations at different stages of the software development life cycle. Conclusions: Most of the findings concern general test automation challenges specific to small- and medium-sized companies developing software applications in the web, desktop, or mobile domains.

Keywords: survey, testing, test automation, status of test automation

Procedia PDF Downloads 657
29914 A Small Graphic Lie: The Photographic Quality of Pierre Bourdieu’s Correspondence Analysis

Authors: Lene Granzau Juel-Jacobsen

Abstract:

The problem of beautification is an obvious concern of photography, which claims reference to reality, but it also lies at the very heart of social theory. As we become accustomed to sophisticated visualizations of statistical data, in pace with the development of software, we should not only be inclined to ask new types of research questions; we also need to confront social theories based on such visualization techniques with new types of questions. Correspondence analysis, GIS analysis, social network analysis, and perceptual maps are current examples of visualization techniques popular within the social sciences and neighboring disciplines. This article discusses correspondence analysis, arguing that the graphic plot of a correspondence analysis is to be interpreted much like a photograph: it refers no more evidently or univocally to reality than a photograph, and it represents social life no more truthfully than a photograph documents it. Pierre Bourdieu’s theoretical corpus, especially his theory of fields, relies heavily on correspondence analysis. While much attention has been directed towards critiquing the somewhat vague conceptualization of habitus, limited focus has been placed on the equally problematic concepts of social space and field. Based on a re-reading of Distinction, the article argues that these concepts rely on ‘a small graphic lie’ very similar to a photograph. Like any other piece of art, as Bourdieu himself recognized, the graphic display is a politically and morally loaded representation technique. However, correspondence analysis does not necessarily serve the purpose he intended; in fact, it tends towards the pitfalls he strove to overcome.

Keywords: data visualization, correspondence analysis, Bourdieu, field, visual representation

Procedia PDF Downloads 68
29913 Implementation of Big Data Concepts Led by the Business Pressures

Authors: Snezana Savoska, Blagoj Ristevski, Violeta Manevska, Zlatko Savoski, Ilija Jolevski

Abstract:

Big data is widely accepted by pharmaceutical companies as a result of business demands created through legal pressure. Pharmaceutical companies face many legal and standards-related demands and have to adapt their procedures to the legislation. To cope with these demands, they have to standardize the usage of their current information technology and use the latest software tools. This paper highlights some important aspects of the experience of big data project implementation in a Macedonian pharmaceutical company. These projects improved the company's business processes with the help of new software tools selected to comply with legal and business demands. The company uses IT as a strategic tool to obtain competitive advantage on the market and to reengineer its processes towards the new Internet economy and quality demands. It is required to manage vast amounts of structured as well as unstructured data. For these reasons, it implements projects for emerging and appropriate software tools that can deal with the big data concepts accepted in the company.

Keywords: big data, unstructured data, SAP ERP, Documentum

Procedia PDF Downloads 271
29912 Comprehensive Analysis and Optimization of Alkaline Water Electrolysis for Green Hydrogen Production: Experimental Validation, Simulation Study, and Cost Analysis

Authors: Umair Ahmed, Muhammad Bin Irfan

Abstract:

This study focuses on the design and optimization of an alkaline water electrolyser for the production of green hydrogen. The aim is to enhance the durability and efficiency of this technology while simultaneously reducing the cost associated with the production of green hydrogen. The experimental results obtained from the alkaline water electrolyser are compared with simulated results from Aspen Plus software, allowing a comprehensive analysis and evaluation. To achieve these goals, several design and operational parameters are investigated. The electrode material, electrolyte concentration, and operating conditions are carefully selected to maximize the efficiency and durability of the electrolyser. Additionally, cost-effective materials and manufacturing techniques are explored to decrease the overall production cost of green hydrogen. The experimental setup includes a carefully designed alkaline water electrolyser in which various performance parameters, such as hydrogen production rate, current density, and voltage, are measured. These experimental results are then compared with simulated data obtained using Aspen Plus. The simulation model is developed from fundamental principles and validated against the experimental data. The comparison between experimental and simulated results provides valuable insight into the performance of the alkaline water electrolyser. It helps to identify the areas where improvements can be made, both in design and in operation, to enhance the durability and efficiency of the system. Furthermore, the simulation results allow a cost analysis providing an estimate of the overall production cost of green hydrogen. This study aims to develop a comprehensive understanding of alkaline water electrolysis technology. The findings of this research can contribute to the development of more efficient and durable electrolyser technology while reducing its associated cost. Ultimately, these advancements can pave the way for a more sustainable and economically viable hydrogen economy.
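
As general background rather than a detail of this paper, the measured hydrogen production rate of an electrolyser is conventionally tied to the cell current by Faraday's law, which offers a simple cross-check between the experimental and the Aspen Plus figures:

```latex
\dot{m}_{\mathrm{H_2}} = \frac{\eta_F \, I \, M_{\mathrm{H_2}}}{z F},
\qquad z = 2, \quad F = 96485~\mathrm{C\,mol^{-1}}
```

where $I$ is the cell current, $M_{\mathrm{H_2}} \approx 2.016~\mathrm{g\,mol^{-1}}$, and $\eta_F$ is the Faradaic efficiency.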

Keywords: sustainable development, green energy, green hydrogen, electrolysis technology

Procedia PDF Downloads 89
29911 Flow Conservation Framework for Monitoring Software Defined Networks

Authors: Jesús Antonio Puente Fernández, Luis Javier Garcia Villalba

Abstract:

New trends in streaming video, such as series and films, place a high demand on network resources. This is a major problem in traditional IP networks due to the rigidity of their architecture. Software Defined Networking (SDN) is a new network architecture concept that is more flexible and simplifies network management with respect to existing architectures. These properties follow from the separation of the control plane (controller) and the data plane (switches). Taking advantage of this separated control, it is easy to deploy a monitoring tool that is independent of device vendors, whereas existing tools depend on the installation of specialized and expensive hardware. In this paper, we propose a framework that optimizes traffic monitoring in SDN networks by decreasing the number of monitoring queries, improving network traffic and reducing overhead. The experiments performed (with and without the optimization) using a video streaming delivery between two hosts demonstrate the feasibility of our monitoring proposal.

Keywords: optimization, monitoring, software defined networking, statistics, query

Procedia PDF Downloads 331
29910 True Single SKU Script: Applying the Automated Test to Set Software Properties in a Global Software Development Environment

Authors: Antonio Brigido, Maria Meireles, Francisco Barros, Gaspar Mota, Fernanda Terra, Lidia Melo, Marcelo Reis, Camilo Souza

Abstract:

As the globalization of the software process advances, companies are increasingly committed to improving software development technologies across multiple locations. On the other hand, working with teams distributed in different locations raises new challenges. In this context, automated processes can help to improve the quality of process execution. This work therefore presents the development of a tool called TSS Script that automates the sample preparation process for carrier requirements validation tests. The objective is to obtain significant gains in execution time and to reduce errors in scenario preparation. To estimate the time gains, executions performed in an automated and in a manual way were timed. In addition, a questionnaire-based survey was conducted to discover new requirements and improvements to include in this automated support. The results show an average saving of 46.67% of the total hours worked on sample preparation. The use of the tool avoids human errors and, for this reason, adds greater quality and speed to the process. Another relevant factor is that the tester can perform other activities in parallel with sample preparation.

Keywords: Android, GSD, automated testing tool, mobile products

Procedia PDF Downloads 317
29909 Simulation of Government Management Model to Increase Financial Productivity System Using Govpilot

Authors: Arezou Javadi

Abstract:

The use of algorithmic models dependent on software calculations, and the simulation of new government management schemes with the help of specialized software, has recently increased the productivity and efficiency of government management systems. This has caused the management approach to change from the old patch-and-fix model, which has low efficiency and limited usefulness, to a more capable and efficient model called the partnership-with-residents model. Using GovPilot™ software, the relationship between the people in a system and the government was examined. The method of two-tailed interaction was the outsourcing of a goal in a system, formed in the order of goals, qualified executive personnel, an optimal executive model and, finally, a summary of additional activities at the different statistical levels. The results showed that the participation of people in a financial implementation system with a statistical potential of P≥5% caused a significant increase in investment and initial capital in the government system, with a maximum of implemented projects in a smart government.

Keywords: machine learning, financial income, statistical potential, GovPilot

Procedia PDF Downloads 88
29907 The Analysis of Brain Response to Auditory Stimuli through EEG Signals’ Non-Linear Analysis

Authors: H. Namazi, H. T. N. Kuan

Abstract:

Brain activity can be measured by acquiring and analyzing the EEG signals of an individual. In fact, the human brain's response to external and internal stimuli is mapped in its EEG signals. Over the years, methods such as the Fourier transform, the wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli. But each of these methods has weak points in the analysis of EEG signals. For instance, the Fourier transform and the wavelet transform are linear signal analysis methods, which makes them poorly suited to the analysis of EEG signals as nonlinear signals. In this research, we analyze the brain's response to auditory stimuli by extracting various measures from EEG signals using software developed by our research group. The measures used are Jeffrey's measure, the fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain's response to auditory stimuli but also provide very good recommendations for clinical purposes.
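
The group's software is not public, but one of the measures it names, the Hurst exponent, is commonly estimated by rescaled-range (R/S) analysis. The C++ sketch below follows that standard recipe; the window sizes and the log-log fit are choices made here, not the authors'.

```cpp
// Rescaled-range (R/S) estimate of the Hurst exponent of a 1-D signal.
// Illustrative sketch of the standard method, not the authors' software.
#include <algorithm>
#include <cmath>
#include <vector>

// Mean R/S statistic over non-overlapping windows of length n.
static double rescaledRange(const std::vector<double>& x, std::size_t n) {
    double total = 0.0;
    std::size_t windows = 0;
    for (std::size_t start = 0; start + n <= x.size(); start += n) {
        double mean = 0.0;
        for (std::size_t i = 0; i < n; ++i) mean += x[start + i];
        mean /= double(n);
        double z = 0.0, zMax = 0.0, zMin = 0.0, var = 0.0;
        for (std::size_t i = 0; i < n; ++i) {
            const double d = x[start + i] - mean;
            var += d * d;
            z += d;                           // cumulative deviation from mean
            zMax = std::max(zMax, z);
            zMin = std::min(zMin, z);
        }
        const double s = std::sqrt(var / double(n));
        if (s > 0.0) { total += (zMax - zMin) / s; ++windows; }
    }
    return windows ? total / double(windows) : 0.0;
}

// Hurst exponent: least-squares slope of log(R/S) versus log(n).
double hurstExponent(const std::vector<double>& x) {
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    int k = 0;
    for (std::size_t n = 8; n <= x.size() / 2; n *= 2) {
        const double rs = rescaledRange(x, n);
        if (rs <= 0.0) continue;
        const double lx = std::log(double(n)), ly = std::log(rs);
        sx += lx; sy += ly; sxx += lx * lx; sxy += lx * ly; ++k;
    }
    if (k < 2) return 0.5;   // series too short; 0.5 = uncorrelated noise
    return (k * sxy - sx * sy) / (k * sxx - sx * sx);
}
```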

Keywords: auditory stimuli, brain response, EEG signal, fractal dimension, Hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 534
29906 Running the Athena Vortex Lattice Code in Java through the Java Native Interface

Authors: Paul Okonkwo, Howard Smith

Abstract:

This paper describes a methodology for integrating the Athena Vortex Lattice aerodynamic software for automated operation in a multivariate optimisation of the blended wing body (BWB) aircraft. The Athena Vortex Lattice (AVL) code, developed at the Massachusetts Institute of Technology, allows the aerodynamic analysis of aircraft using the vortex lattice method. Ordinarily, AVL operation requires a text file containing the aircraft geometry to be loaded into the AVL solver in order to determine the aerodynamic forces and moments. However, automated operation is required to enable integration into a multidisciplinary optimisation framework. Automated AVL operation within the Java design environment nonetheless requires a modification and recompilation of the AVL source code into an executable capable of running on Windows and other platforms without the X11 libraries. This paper describes the procedure for integrating the FORTRAN-written AVL software for automated operation within the multivariate design synthesis optimisation framework for the conceptual design of the BWB aircraft.

Keywords: aerodynamics, automation, optimisation, AVL, JNI

Procedia PDF Downloads 565
29905 Fuzzy Expert Approach for Risk Mitigation on Functional Urban Areas Affected by Anthropogenic Ground Movements

Authors: Agnieszka A. Malinowska, R. Hejmanowski

Abstract:

A number of European cities are strongly affected by ground movements caused by anthropogenic activities or post-anthropogenic metamorphosis, mainly water pumping, current mining operations, the collapse of post-mining underground voids, or mining-induced earthquakes. These activities lead to large- and small-scale ground displacements and ground ruptures. Ground movements occurring in urban areas can considerably affect the stability and safety of structures and infrastructure, and the complexity of the ground deformation phenomenon, in relation to the vulnerability of those structures and infrastructure, leads to considerable constraints in assessing the threat to them. However, increasing access to free software and satellite data could pave the way for developing new methods and strategies for environmental risk mitigation and management. Open source geographic information systems (OS GIS) may support data integration, management, and risk analysis. Recently developed methods, based on fuzzy logic and expert methods, for assessing the risk of damage to buildings and infrastructure could be integrated into OS GIS. Those methods were verified by back analysis, proving their accuracy, and they can further be supported by ground displacement observations: based on freely available data from the European Space Agency and free software, ground deformation can be estimated. The main innovation presented in the paper is the application of open source software (OS GIS) for integrating the developed models and assessing the threat to urban areas. These approaches are reinforced by analysis of ground movement based on free satellite data, which supports the verification of ground movement prediction models and enables the mapping of ground deformation in urbanized areas. The developed models and methods have been implemented in an urban area endangered by underground mining activity. Vulnerability maps supported by satellite ground movement observation would mitigate the hazards of land displacement in urban areas close to mines.

Keywords: fuzzy logic, open source geographic information science (OS GIS), risk assessment on urbanized areas, satellite interferometry (InSAR)

Procedia PDF Downloads 159
29904 Roadway Maintenance Management System

Authors: Chika Catherine Ayogu

Abstract:

Rehabilitation plays an important and integral part in the life cycle of a roadway, and a roadway maintenance management system is a systematic method for inspecting and rating the roadway condition in a given area. The system performs a cost-effective analysis of various maintenance and rehabilitation strategies and, finally, prioritizes and recommends roadway rehabilitation and maintenance to maximize results within a given budget. During the execution of maintenance activities, the system also tracks labour, materials, equipment, and cost for the activities performed. The system implements physical assessment through field inspection and rating of each street segment, which is then entered into a database. The information is analyzed using software that provides recommendations and projects future conditions. The roadway management system derives a deterioration curve for each segment based on the input, then assigns the most cost-effective maintenance strategy based on the conditions, surface type, functional classification, and available budget. This paper investigates the roadway management system and its capability to assist in applying the right treatment to the right roadway at the right time, so that the expected service life of the roadway is extended as long as possible at an acceptable cost.

Keywords: effectiveness, rehabilitation, roadway, software system

Procedia PDF Downloads 150
29903 [Keynote Talk]: The Challenges and Solutions for Developing Mobile Apps in a Small University

Authors: Greg Turner, Bin Lu, Cheer-Sun Yang

Abstract:

As computing technology advances, smartphone applications can assist student learning in a pervasive way. For example, a mobile app for the PA Common Trees, Pests, Pathogens project, used in the field as a reference tool, allows middle school students to learn about trees and associated pests and pathogens without bringing a textbook. Past research has studied the mobile application development life cycle (MADLC), including traditional models such as the waterfall model or the more recent agile methods; other work studies issues related to the software development process. Very little research addresses the development of three heterogeneous mobile systems simultaneously in a small university where the availability of developers is an issue. In this paper, we propose a hybrid of the waterfall model and the agile model, known in practice as the Relay Race Methodology (RRM), to reflect the concept of racing and relaying in scheduling. Based on the development project, we observe that the modeling of the transition between any two phases manifests naturally. Thus, we claim that the RRM model can provide a de facto, rather than a de jure, basis for the core concept of the MADLC. In this paper, the background of the project is introduced first; the challenges are then pointed out, followed by our solutions. Finally, the lessons learned and future work are presented.

Keywords: agile methods, mobile apps, software process model, waterfall model

Procedia PDF Downloads 409
29902 Techno-Economic Optimization and Evaluation of an Integrated Industrial Scale NMC811 Cathode Active Material Manufacturing Process

Authors: Usama Mohamed, Sam Booth, Aliysn J. Nedoma

Abstract:

As part of the transition to electric vehicles, there has been a recent increase in demand for battery manufacturing. Cathodes typically account for approximately 50% of the total lithium-ion battery cell cost and are a pivotal factor in determining the viability of new industrial infrastructure. Cathodes which offer lower costs whilst maintaining or increasing performance, such as nickel-rich layered cathodes, have a significant competitive advantage when scaling up the manufacturing process. This project evaluates the techno-economic value proposition of an integrated industrial-scale cathode active material (CAM) production process, closing the mass and energy balances and optimizing the operating conditions using a sensitivity analysis. This is done by developing a process model of a co-precipitation synthesis route in Aspen Plus software, validated against experimental data. The mechanism chemistry and equilibrium conditions were established based on previous literature and HSC Chemistry software. This is followed by integrating the energy streams, adding waste recovery and treatment processes, and testing the effect of key parameters (temperature, pH, reaction time, etc.) on CAM production yield and emissions. Finally, an economic analysis estimates the fixed and variable costs (including capital expenditure, labor costs, raw materials, etc.) to calculate the cost of CAM ($/kg and $/kWh), the total plant cost ($), and the net present value (NPV). This work sets the foundational blueprint for future research into sustainable industrial-scale processes for CAM manufacturing.

Keywords: cathodes, industrial production, nickel-rich layered cathodes, process modelling, techno-economic analysis

Procedia PDF Downloads 100
29901 Analysis of Sediment Distribution around Karang Sela Coral Reef Using Multibeam Backscatter

Authors: Razak Zakariya, Fazliana Mustajap, Lenny Sharinee Sakai

Abstract:

A sediment map is quite important in the marine environment, as the sediment itself contains a wealth of information that can be used in other research. This study was conducted using a Reson T20 multibeam echo sounder on 15 August 2020 at Karang Sela (a coral reef area) at Pulau Bidong. The study aims to identify the sediment types around the coral reef using bathymetry and backscatter data. Sediment in the study area was collected as ground-truthing data to verify the classification of the seabed, and a dry sieving method with a sieve shaker was used to analyze the sediment samples. PDS 2000 software was used for data acquisition, QPS Qimera version 2.4.5 was used for processing the bathymetry data, and QPS FMGT version 7.10 processed the backscatter data. The backscatter data were then analyzed using the maximum likelihood classification tool in ArcGIS version 10.8. The result identified three types of sediment around the reef: very coarse sand, coarse sand, and medium sand.

Keywords: sediment type, multibeam echo sounder (MBES), backscatter, ArcGIS

Procedia PDF Downloads 86
29900 Reliability of Dry Tissues Sampled from Exhumed Bodies in DNA Analysis

Authors: V. Agostini, S. Gino, S. Inturri, A. Piccinini

Abstract:

In cases of corpse identification or parentage testing performed on an exhumed alleged father, organic samples such as bones and/or bone fragments, teeth, nails, and muscle fragments are usually sought and acquired. The DNA analysis of these cadaveric matrices usually leads to successful identification, but it often happens that the typing results are not satisfactory, with highly degraded, partial, or even uninterpretable genetic profiles. Aggravating the interpretative panorama deriving from the analysis of such 'classical' organic matrices is the long and laborious treatment of the sample, which runs from mechanical fragmentation up to the protracted decalcification phase. These steps greatly increase the chance of sample contamination. In the present work, instead, we report the use of 'unusual' cadaveric matrices, demonstrating that their forensic genetic analysis can lead to better results in less time and with lower reagent costs. We report six cases, the result of on-field experience, in which eye swabs and cartilage were sampled and analyzed, allowing clear single genetic profiles to be obtained, useful for identification purposes. In all cases we used standard DNA tissue extraction protocols (as reported in the manufacturers' user manuals, such as those of QIAGEN or Invitrogen-Thermo Fisher Scientific), thus bypassing the long and difficult phases of mechanical fragmentation and decalcification of bone samples. PCR was carried out using the PowerPlex® Fusion System kit (Promega), and capillary electrophoresis was carried out on an ABI PRISM® 310 Genetic Analyzer (Applied Biosystems®) with GeneMapper ID v3.2.1 (Applied Biosystems®) software. The software Familias (version 3.1.3) was employed for kinship analysis. The genetic results achieved proved to be much better than those from the analysis of bones or nails, both qualitatively and quantitatively and in terms of cost and timing. In this way, using the standard procedure for DNA extraction from tissue, it is possible to obtain, in a shorter time and with maximum efficiency, an excellent genetic profile, which proves to be useful and easily decoded for later paternity tests and/or identification of human remains.

Keywords: DNA, eye swabs and cartilage, identification of human remains, paternity testing

Procedia PDF Downloads 109
29899 Modernization of the Economic Price Adjustment Software

Authors: Roger L. Goodwin

Abstract:

The US Consumer Price Indices (CPIs) measure hundreds of items in the US economy, and many social programs and government benefits are indexed to the CPIs. In the mid-to-late 1990s, a Congressional Advisory Committee carried out extensive research into changes to the CPI. One thing that can be said from that research is that, aside from the existence of alternative estimators for the CPI, any fundamental change to the CPI will affect many government programs. The purpose of this project is to modernize an existing process. This paper shows the development of a small, visual software product that documents the Economic Price Adjustment (EPA) for long-term contracts. The existing workbook does not provide the flexibility to calculate EPAs where the base month and the option month differ, nor does it provide automated error checking. The small, visual software product provides the additional flexibility and error checking. This paper presents the feedback on the project.
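
The abstract does not give the workbook's formula, but a common form of economic price adjustment is the ratio of the option-month index to the base-month index. A hedged C++ sketch of that form, with the kind of error checking described (function and container names are illustrative):

```cpp
// Index-ratio economic price adjustment with basic error checking, assuming
// the common form: adjusted = base price * (option-month CPI / base-month CPI).
// The workbook's actual formula is not given in the abstract.
#include <map>
#include <stdexcept>
#include <string>

double adjustedPrice(double basePrice,
                     const std::map<std::string, double>& cpi,  // "YYYY-MM" -> index
                     const std::string& baseMonth,
                     const std::string& optionMonth) {
    const auto base = cpi.find(baseMonth);
    const auto option = cpi.find(optionMonth);
    if (base == cpi.end() || option == cpi.end())
        throw std::invalid_argument("missing CPI value for requested month");
    if (base->second <= 0.0)
        throw std::invalid_argument("base-month index must be positive");
    return basePrice * (option->second / base->second);
}
```

In this shape the base month and the option month are chosen independently, which is precisely the flexibility the existing workbook lacks.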

Keywords: Consumer Price Index, Economic Price Adjustment, contracts, visualization tools, database, reports, forms, event procedures

Procedia PDF Downloads 317
29898 Closed Loop Traffic Control System Using PLC

Authors: Chinmay Shah

Abstract:

This project concerns the development of a closed-loop traffic light control system using a PLC (Programmable Logic Controller). The project is divided into two parts: hardware and software. The hardware part is a model of a four-way traffic light junction, with three indicator lamps (red, yellow, and green) installed at each lane to represent the traffic light signals. This traffic control model is a replica of actuated traffic control, a closed-loop traffic control system that adjusts the timing of the indicator lamps depending on the traffic volume of a particular lane. To make the system autonomous, three IR sensors are placed in each lane to sense the percentage of traffic present in that lane. The IR sensors and indicator lamps are connected to an LG PLC of the XGB series, which routes every input signal (from the IR sensors) to the software and drives the outputs (indicator lamps). The default timing for the indicator lamps is 30 seconds per lane, but depending on the percentage of traffic present, the green lamp stays on for 10 seconds at roughly 30-35% traffic, 20 seconds at 65-70% traffic, and the full 30 seconds at 100% traffic. The software that operates the LG PLC is the XG5000 programmer, in which the ladder logic diagram controlling the traffic lights is programmed according to the flow chart. At the end of this project, the traffic light system was successfully actuated by the PLC.
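
The control program itself is a ladder diagram in XG5000, but the green-time rule can be paraphrased in a few lines. A C++ sketch, assuming each of a lane's three IR sensors stands for roughly one third of its occupancy (an inference from the percentages above, not a detail stated in the abstract):

```cpp
// Green-lamp duration from the number of IR sensors reporting traffic in a
// lane. Paraphrase of the ladder logic described in the abstract, with the
// sensors-to-occupancy mapping assumed rather than stated.
int greenSeconds(int sensorsTriggered) {
    switch (sensorsTriggered) {
        case 1:  return 10;  // roughly 30-35% of the lane occupied
        case 2:  return 20;  // roughly 65-70% occupied
        case 3:  return 30;  // lane fully occupied
        default: return 30;  // no reading: fall back to the 30-second default
    }
}
```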

Keywords: closed loop, IR sensor, PLC, light control system

Procedia PDF Downloads 571
29897 Scalable Cloud-Based LEO Satellite Constellation Simulator

Authors: Karim Sobh, Khaled El-Ayat, Fady Morcos, Amr El-Kadi

Abstract:

Distributed applications deployed on LEO satellites and ground stations require substantial communication between the different members of a constellation to overcome the earth-coverage barriers imposed by GEOs, and applications running on LEO constellations suffer from the earth line-of-sight blockage effect. They need adequate lab testing before launching into space. We propose a scalable cloud-based network simulation framework to simulate problems created by earth line-of-sight blockage. The framework utilizes cloud IaaS virtual machines to simulate the distributed software of LEO satellites and ground stations. A factorial ANOVA statistical analysis is conducted to measure the simulator's overhead on overall communication performance. The results showed a very low simulator communication overhead. Consequently, the simulation framework is proposed as a candidate for lab-testing LEO constellations with distributed software before space launch.

Keywords: LEO, cloud computing, constellation, satellite, network simulation, netfilter

Procedia PDF Downloads 386
29896 Application of Artificial Neural Network for Prediction of Load-Haul-Dump Machine Performance Characteristics

Authors: J. Balaraju, M. Govinda Raj, C. S. N. Murthy

Abstract:

Every industry constantly looks for ways to enhance its day-to-day production and productivity, which is possible only by maintaining its men and machinery at an adequate level. Prediction of performance characteristics plays an important role in the performance evaluation of equipment. Analytical and statistical approaches take considerably more time to solve complex problems, such as performance estimation, compared with software-based approaches. Keeping this in view, the present study deals with Artificial Neural Network (ANN) modelling of a Load-Haul-Dump (LHD) machine to predict performance characteristics such as reliability, availability, and preventive maintenance (PM). A feed-forward back-propagation ANN trained with the Levenberg-Marquardt (LM) algorithm has been used for the modelling. The performance characteristics were computed using Isograph Reliability Workbench 13.0 software, and these computed values were validated against the predicted output responses of the ANN models. Further, recommendations based on the analysis are given to the industry for the improvement of equipment performance.

Keywords: load-haul-dump, LHD, artificial neural network, ANN, performance, reliability, availability, preventive maintenance

Procedia PDF Downloads 150
29895 Seismic Assessment of Old Existing RC Buildings in Madinah with Masonry Infill Using Ambient Vibration Measurements

Authors: Tarek M. Alguhane, Ayman H. Khalil, Nour M. Fayed, Ayman M. Ismail

Abstract:

Early, pre-code reinforced concrete structures present undetermined resistance to earthquakes. This situation is particularly unacceptable in the case of essential structures, such as healthcare facilities and pilgrims' houses. Among these, an existing old RC building in Madinah is seismically evaluated with and without infill walls, and its dynamic characteristics are compared with values measured in the field using ambient vibration measurements (AVM). After updating the mathematical models for this building with the experimental results, a three-dimensional pushover analysis (nonlinear static analysis) was carried out using SAP2000 software, incorporating inelastic material properties for the concrete, infill, and steel. The purpose of this analysis is to evaluate the expected performance of structural systems by estimating strength and deformation demands in design and comparing these demands to the available capacities at the performance levels of interest. The results are summarized and discussed.

Keywords: seismic assessment, pushover analysis, ambient vibration, modal update

Procedia PDF Downloads 497
29894 The Impact of Supply Chain Relationship Quality on Cooperative Strategy and Visibility

Authors: Jung-Hsuan Hsu

Abstract:

Due to intense competition within industries, companies have increasingly come to recognize the value of partnerships with other companies. In addition, the outsourcing and globalization of the supply chain lead to companies' increasing reliance on external resources. Consequently, the supply chain network becomes complex, which reduces the visibility of the manufacturing process. This study therefore focuses on the impact of supply chain relationship quality (SCRQ) on cooperative strategy and visibility. A questionnaire survey is conducted as the research method, using the organic food industry as the research subject and random sampling as the sampling method. Finally, the data are analyzed with SPSS statistical software and AMOS software to verify the hypotheses. The expected result of this study is an evaluation of the supply chain relationship quality between Taiwan's food manufacturers and their suppliers: whether it has a positive impact on the persistence, frequency, and diversity of cooperative strategy, and whether the dimensions of supply chain relationship quality have a positive effect on visibility.

Keywords: supply chain relationship quality (SCRQ), cooperative strategy, visibility, competition

Procedia PDF Downloads 451
29893 Software Verification of Systematic Resampling for Optimization of Particle Filters

Authors: Osiris Terry, Kenneth Hopkinson, Laura Humphrey

Abstract:

Systematic resampling is the most popular resampling method in particle filters. This paper seeks to further the understanding of systematic resampling by defining a formula made up of variables from the sampling equation and the particle weights. The formula is then verified via SPARK, a software verification language. The verified systematic resampling formula states that the minimum/maximum number of possible samples taken of a particle is equal to the floor/ceiling value of the particle's weight divided by the sampling interval, respectively. This allows for the creation of a randomness spectrum within which each resampling method falls. Methods on the lower end, e.g., systematic resampling, have less randomness and are thus quicker to reach an estimate. Although lower randomness reduces sampling error through a larger bias towards the size of the weight, this bias creates vulnerabilities to noise in the environment, e.g., jamming. In conclusion, this is a first step towards characterizing each resampling method, which will allow target-tracking engineers to pick the best resampling method for their environment instead of choosing the most popular one.
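
For context, a conventional implementation of systematic resampling draws one random offset and then steps through the cumulative weights at evenly spaced points. A C++ sketch assuming normalized weights (the paper's SPARK formulation is not reproduced here):

```cpp
// Systematic resampling for a particle filter. One uniform draw in [0, 1/N)
// fixes N evenly spaced sampling points; each point selects the particle whose
// cumulative weight first exceeds it. Assumes weights sum to 1.
#include <random>
#include <vector>

std::vector<std::size_t> systematicResample(const std::vector<double>& weights,
                                            std::mt19937& rng) {
    const std::size_t n = weights.size();
    std::uniform_real_distribution<double> dist(0.0, 1.0 / double(n));
    const double u0 = dist(rng);              // single random offset
    std::vector<std::size_t> indices(n);
    double cumulative = weights[0];
    std::size_t i = 0;
    for (std::size_t k = 0; k < n; ++k) {
        const double u = u0 + double(k) / double(n);  // evenly spaced points
        while (u > cumulative && i + 1 < n) cumulative += weights[++i];
        indices[k] = i;
    }
    return indices;
}
```

Because the sampling points are spaced exactly 1/N apart, a particle of weight w can be selected only floor(w·N) or ceil(w·N) times, which is the floor/ceiling property the paper verifies.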

Keywords: SPARK, software verification, resampling, systematic resampling, particle filter, tracking

Procedia PDF Downloads 84
29892 Comparing Energy Labelling of Buildings in Spain

Authors: Carolina Aparicio-Fernández, Alejandro Vilar Abad, Mar Cañada Soriano, Jose-Luis Vivancos

Abstract:

The building sector is responsible for 40% of the total energy consumption in the European Union (EU). The implementation of strategies for quantifying and reducing buildings' energy consumption is thus indispensable for reaching the EU's carbon neutrality and energy efficiency goals. Each Member State has transposed the European directives according to its own peculiarities: existing technical legislation, constructive solutions, climatic zones, etc. Therefore, in accordance with the Energy Performance of Buildings Directive, Member States have developed different Energy Performance Certificate schemes, each using a proposed energy simulation software tool for its national or regional area. Energy Performance Certificates provide powerful and comprehensive information for predicting, analyzing, and improving the energy demand of new and existing buildings, and energy simulation software and databases allow a better understanding of the current constructive reality of the European building stock. However, Energy Performance Certificates still face several issues before they can be considered a reliable and global source of information, since the different calculation tools used cannot be connected with one another. In this document, TRNSYS (TRaNsient System Simulation program) software is used to calculate the energy demand of a building, and the result is compared with the energy labelling obtained with the official Spanish software tools. We demonstrate the possibility of using non-official software tools to calculate the Energy Performance Certificate; this approach could thus be used throughout the EU to compare the results in all the cases proposed by the EU Member States. For the simulations, an isolated single-family house with different construction solutions is considered, and results are obtained for every climatic zone of the Spanish Technical Building Code.

Keywords: energy demand, energy performance certificate, EPBD, TRNSYS, buildings

Procedia PDF Downloads 126
29891 A Framework of Virtualized Software Controller for Smart Manufacturing

Authors: Pin Xiu Chen, Shang Liang Chen

Abstract:

A virtualized software controller is developed in this research to replace traditional hardware control units. This virtualized software controller transfers motion interpolation calculations from the motion control units of end devices to edge computing platforms, thereby reducing the end devices' computational load and hardware requirements and making maintenance and updates easier. The study also applies the concept of microservices, dividing the control system into several small functional modules that are then deployed on a cloud data server. This reduces the interdependency among modules and enhances the overall system's flexibility and scalability. Finally, with containerization technology, the system can be deployed and started in a matter of seconds, which is more efficient than traditional virtual machine deployment methods. Furthermore, this virtualized software controller communicates with end control devices via wireless networks, making the placement of production equipment and the redesign of processes more flexible, no longer limited by physical wiring. To handle the large data flow and maintain low-latency transmission, this study integrates 5G technology, fully utilizing its high speed, wide bandwidth, and low latency to achieve rapid and stable remote machine control. An experimental setup is designed to verify the feasibility and test the performance of this framework. This study designs a smart manufacturing site with a 5G communication architecture, serving as a field for experimental data collection and performance testing. The smart manufacturing site includes one robotic arm, three Computer Numerical Control machine tools, several input/output ports, and an edge computing architecture. All machinery information is uploaded to edge computing servers and cloud servers via 5G communication and the Internet of Things framework. After analysis and computation, this information is converted into motion control commands, which are transmitted back to the relevant machinery through 5G communication. The communication time intervals at each stage are calculated using the C++ chrono library to measure the time difference for each command transmission. The relevant test results will be organized and displayed in the full text.
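
The abstract names the C++ chrono library for these interval measurements; the sketch below shows that pattern, with sendCommand() as a hypothetical stand-in for one 5G command transmission rather than part of the authors' platform.

```cpp
// Timing one stage of a command round trip with std::chrono. The transmission
// step is a placeholder; the real one belongs to the authors' 5G/MEC setup.
#include <chrono>
#include <iostream>
#include <thread>

// Hypothetical stand-in for transmitting one motion-control command.
void sendCommand() {
    std::this_thread::sleep_for(std::chrono::milliseconds(2));
}

int main() {
    using clock = std::chrono::steady_clock;
    const auto start = clock::now();
    sendCommand();                            // stage under measurement
    const auto stop = clock::now();
    const auto us =
        std::chrono::duration_cast<std::chrono::microseconds>(stop - start);
    std::cout << "command transmission took " << us.count() << " us\n";
    return 0;
}
```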

Keywords: 5G, MEC, microservices, virtualized software controller, smart manufacturing

Procedia PDF Downloads 82
29890 Evaluating Emission Reduction Due to a Proposed Light Rail Service: A Micro-Level Analysis

Authors: Saeid Eshghi, Neeraj Saxena, Abdulmajeed Alsultan

Abstract:

Carbon dioxide (CO2), alongside other gas emissions in the atmosphere, causes a greenhouse effect, resulting in an increase in the average temperature of the planet. Transportation vehicles are among the main contributors to CO2 emissions, and stationary vehicles with idling motors produce more emissions than mobile ones. Intersections with traffic lights that force vehicles to become stationary for a period of time therefore produce more CO2 pollution than other parts of the road. This paper focuses on analyzing the CO2 produced by the traffic flow at the Anzac Parade Road - Barker Street intersection in Sydney, Australia, before and after the implementation of light rail transport (LRT). The data were gathered during the construction phase of the LRT by counting the vehicles on each path of the intersection for 15 minutes during the evening rush hour (6-7 pm, July 04-31, 2018) and then multiplying by 4 to obtain the hourly vehicle flow. The traffic data were analyzed with the microscopic simulation software VISSIM, processing the traffic flow in three stages: before implementation of the light rail, during the construction phase, and after implementation. Finally, the traffic results were input into the software EnViVer to calculate the amount of CO2 produced during one hour. The results showed that after the implementation of the light rail, CO2 will drop by a minimum of 13%. This finding provides evidence that light rail is a sustainable mode of transport.

Keywords: carbon dioxide, emission modeling, light rail, microscopic model, traffic flow

Procedia PDF Downloads 142