Search results for: automated quantification
753 Catchment Yield Prediction in an Ungauged Basin Using PyTOPKAPI
Authors: B. S. Fatoyinbo, D. Stretch, O. T. Amoo, D. Allopi
Abstract:
This study extends the use of the Drainage Area Regionalization (DAR) method in generating synthetic data and calibrating PyTOPKAPI stream yield for an ungauged basin at a daily time scale. The generation of runoff in determining a river yield is subject to various topographic and spatial meteorological variables, which together form the Catchment Characteristics Model (CCM). Many of the conventional CCM models adapted in Africa have been hampered by a paucity of adequate, relevant, and accurate data for parameterization and validation. The purpose of generating synthetic flow is to test a hydrological model that will not suffer from the impact of very low or very high flows, thus allowing one to check whether the model is structurally sound. The employed physically-based, watershed-scale hydrologic model (PyTOPKAPI) was parameterized with GIS pre-processing parameters and remote-sensing hydro-meteorological variables. Validation against the mean annual runoff ratio shows good graphical agreement between observed and simulated discharge. Nash-Sutcliffe efficiency and coefficient of determination (R²) values of 0.704 and 0.739, respectively, indicate strong model efficiency. Given the current impact of climate variability, water planners now have a tool for flow quantification and sustainable planning purposes.
Keywords: catchment characteristics model, GIS, synthetic data, ungauged basin
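The reported efficiency statistics can be reproduced from paired observed/simulated discharge series. A minimal sketch, using the standard Nash-Sutcliffe and R² definitions (the series below are made up for illustration, not the study's data):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_tot

def r_squared(observed, simulated):
    """Squared Pearson correlation between observed and simulated series."""
    n = len(observed)
    mo, ms = sum(observed) / n, sum(simulated) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(observed, simulated))
    vo = sum((o - mo) ** 2 for o in observed)
    vs = sum((s - ms) ** 2 for s in simulated)
    return cov * cov / (vo * vs)

# Toy series for illustration only
obs = [1.0, 2.0, 3.0, 4.0, 5.0]
sim = [1.1, 1.9, 3.2, 3.8, 5.1]
print(round(nash_sutcliffe(obs, sim), 3))  # → 0.989
print(round(r_squared(obs, sim), 3))       # → 0.989
```

A perfect simulation gives NSE = 1; values above roughly 0.7, as reported here, are conventionally read as good model skill.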
Procedia PDF Downloads 327
752 SVID: Structured Vulnerability Intelligence for Building Deliberated Vulnerable Environment
Authors: Wenqing Fan, Yixuan Cheng, Wei Huang
Abstract:
The diversity and complexity of modern IT systems make it almost impossible for internal teams to find vulnerabilities in all software before the software is officially released. The emergence of threat intelligence and vulnerability reporting policies has greatly reduced the burden on software vendors and organizations to find vulnerabilities. However, to prove the existence of a reported vulnerability, it is necessary but difficult for a security incident response team to build a deliberated vulnerable environment from a vulnerability report with limited and incomplete information. This paper presents a structured, standardized, machine-oriented vulnerability intelligence format that can be used to automate the orchestration of a Deliberated Vulnerable Environment (DVE). This paper highlights the important role of software configuration and proof-of-vulnerability specifications in vulnerability intelligence, and proposes a triad model, called DIR (Dependency Configuration, Installation Configuration, Runtime Configuration), to define software configuration. Finally, this paper also implements a prototype system to demonstrate that the orchestration of a DVE can be automated with the intelligence.
Keywords: DIR triad model, DVE, vulnerability intelligence, vulnerability recurrence
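A machine-oriented intelligence format built around the DIR triad could be sketched as structured records. This is a hypothetical illustration of the idea, not the paper's actual schema; all field names and the example record are invented:

```python
from dataclasses import dataclass, field, asdict

@dataclass
class SoftwareConfiguration:
    """DIR triad: the three configuration facets an orchestrator would
    need to rebuild a deliberated vulnerable environment (DVE)."""
    dependency: dict = field(default_factory=dict)    # e.g. required libraries/versions
    installation: dict = field(default_factory=dict)  # e.g. build flags, install paths
    runtime: dict = field(default_factory=dict)       # e.g. ports, environment variables

@dataclass
class VulnerabilityIntelligence:
    vuln_id: str
    software: str
    version: str
    configuration: SoftwareConfiguration
    proof_spec: str  # machine-checkable evidence the vulnerability is reproducible

record = VulnerabilityIntelligence(
    vuln_id="CVE-0000-0000",  # placeholder identifier
    software="exampled",
    version="1.2.3",
    configuration=SoftwareConfiguration(
        dependency={"libfoo": ">=2.0"},
        installation={"prefix": "/opt/exampled"},
        runtime={"port": 8080},
    ),
    proof_spec="GET /status returns the vulnerable build banner",
)
print(asdict(record)["configuration"]["runtime"]["port"])  # → 8080
```

An orchestrator could serialize such records and replay them to provision the environment automatically.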
Procedia PDF Downloads 121
751 Correlation of Hyperlipidemia with Platelet Parameters in Blood Donors
Authors: S. Nishat Fatima Rizvi, Tulika Chandra, Abbas Ali Mahdi, Devisha Agarwal
Abstract:
Introduction: Blood components are an unexplored area prone to numerous discoveries that influence patient care. Experiments at different levels will further change the present concept of blood banking. Hyperlipidemia is a condition of elevated plasma low-density lipoprotein (LDL) and decreased plasma high-density lipoprotein (HDL). Studies show that platelets play a vital role in the progression of atherosclerosis and thrombosis, a major cause of death worldwide. They are activated by many triggers, such as elevated LDL in the blood, resulting in aggregation and the formation of plaques. Hyperlipidemic platelets are frequently transfused to patients with various disorders. Screening random donor platelets for hyperlipidemia and correlating the condition with other donor criteria, such as a lipid-rich diet, oral contraceptive pill intake, weight, alcohol intake, smoking, a sedentary lifestyle, and a family history of heart disease, will help decide the exclusion criteria for donor selection. This will help make patients safe and the donor deferral criteria more stringent, improving the quality of the blood supply. Technical evaluation and assessment will enable blood bankers to supply safe blood and improve the guidelines for blood safety. Thus, we study the correlation of hyperlipidemic platelets with platelet parameters, weight, and specific donor history. Methodology: This case-control study included blood samples from 100 blood donors; 30 samples were found to be hyperlipidemic and were included as cases, while the rest were taken as controls. Lipid profiles were measured on a fully automated analyzer (Cobas C311, Roche Diagnostics): triglycerides (TRIGL), LDL-cholesterol plus 2nd generation (LDL-C), cholesterol Gen 2 (CHOL 2), and HDL-cholesterol plus 3rd generation (HDL C 3). Platelet parameters were analyzed with the Sysmex KX-21 automated hematology analyzer.
Results: A significant correlation was found with hyperlipidemic levels in single-time donors: 80% of donors had a history of heart disease, 66.66% had a sedentary lifestyle, 83.3% were smokers, 50% were alcohol consumers, and 63.33% had taken a lipid-rich diet. Active physical activity was found among 40% of donors. Donor samples were divided into two groups based on body weight. In group 1 (hyperlipidemic samples), platelet parameters were 75% normal and 25% abnormal at >70 kg weight, while at 50-70 kg weight, 90% were normal and 10% abnormal. In group 2 (non-hyperlipidemic samples), platelet parameters were 95% normal and 5% abnormal at >70 kg weight, while at 50-70 kg weight, 66.66% were normal and 33.33% abnormal. Conclusion: The findings indicate that the hyperlipidemic status of donors may affect platelet parameters and can be distinguished by history: weight, smoking, alcohol intake, sedentary lifestyle, active physical activity, lipid-rich diet, oral contraceptive pill intake, and family history of heart disease. However, further studies on a larger sample size are needed to affirm this finding.
Keywords: blood donors, hyperlipidemia, platelet, weight
Procedia PDF Downloads 314
750 Demographic Dividend Explained by Infrastructure Costs of Population Growth Rate, Distinct from Age Dependency
Authors: Jane N. O'Sullivan
Abstract:
Although it is widely believed that fertility decline has benefitted economic advancement, particularly in East and South-East Asian countries, the causal mechanisms for this stimulus are contested. Since the turn of this century, demographic dividend theory has been increasingly recognised, hypothesising that higher proportions of working-age people can contribute to economic expansion if conditions are met to employ them productively. Population growth rate, as a systemic condition distinct from age composition, has not received similar attention since the 1970s and has lacked a methodology for quantitative assessment. This paper explores conceptual and empirical quantification of the burden of expanding physical capital to accommodate a growing population. In proof-of-concept analyses of Australia and the United Kingdom, actual expenditure on gross fixed capital formation was compiled over four decades and apportioned to maintenance/turnover or to expansion to accommodate population growth, based on the lifespan of capital assets and the population growth rate. In both countries, capital expansion was estimated to cost 6.5-7.0% of GDP per 1% population growth rate. This opportunity cost impedes the improvement of per capita capacity needed to realise the potential of the working-age population. Economic modelling of demographic scenarios has to date omitted this channel of influence; the implications of its inclusion are discussed.
Keywords: age dependency, demographic dividend, infrastructure, population growth rate
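The apportionment described reduces to simple arithmetic: with a capital stock proportional to population and an average asset lifespan L, replacement consumes roughly 1/L of the stock per year, while population growth at rate g requires roughly g times the stock in new assets. A hedged sketch with illustrative numbers (not the paper's actual data or method):

```python
def capital_split(gfcf_share_gdp, asset_lifespan_years, pop_growth_rate):
    """Split gross fixed capital formation (as a share of GDP) into
    maintenance/turnover of the existing stock and expansion for growth.

    Turnover replaces 1/L of the stock per year; growth at rate g needs
    roughly g of the stock in new assets, so the expansion share of
    investment is g / (1/L + g)."""
    turnover_rate = 1.0 / asset_lifespan_years
    expansion_fraction = pop_growth_rate / (turnover_rate + pop_growth_rate)
    expansion = gfcf_share_gdp * expansion_fraction
    return gfcf_share_gdp - expansion, expansion

# Illustrative only: 24% of GDP invested, 50-year asset life, 1% population growth
maintenance, expansion = capital_split(0.24, 50, 0.01)
print(round(expansion * 100, 1))  # → 8.0 (% of GDP), same order as the 6.5-7.0% estimate
```

The point of the sketch is the structure of the calculation, not the numbers; the paper derives its 6.5-7.0% figure from four decades of actual expenditure data.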
Procedia PDF Downloads 143
749 Infilling Strategies for Surrogate Model Based Multi-disciplinary Analysis and Applications to Velocity Prediction Programs
Authors: Malo Pocheau-Lesteven, Olivier Le Maître
Abstract:
Engineering and optimisation of complex systems is often achieved through multi-disciplinary analysis of the system, where each subsystem is modeled and interacts with other subsystems to model the complete system. The coherence of the output of the different subsystems is achieved through the use of compatibility constraints, which enforce the coupling between the different subsystems. Due to the complexity of some subsystems and the computational cost of evaluating their respective models, it is often necessary to build surrogate models of these subsystems to allow repeated evaluation of these subsystems at a relatively low computational cost. In this paper, Gaussian processes are used, as their probabilistic nature is leveraged to evaluate the likelihood of satisfying the compatibility constraints. This paper presents infilling strategies to build accurate surrogate models of the subsystems in areas where they are likely to meet the compatibility constraint. It is shown that these infilling strategies can reduce the computational cost of building surrogate models for a given level of accuracy. An application of these methods to velocity prediction programs used in offshore racing naval architecture further demonstrates their applicability in a real engineering context. Some examples of the application of uncertainty quantification to the field of naval architecture are also presented.
Keywords: infilling strategy, Gaussian process, multi disciplinary analysis, velocity prediction program
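The selection step can be illustrated with a normal posterior: given a Gaussian process prediction of the coupling residual at a candidate point, the chance that the compatibility constraint |residual| ≤ tol holds follows from the normal CDF. A minimal sketch (the numbers are invented; a real implementation would take the mean and standard deviation from the fitted GP):

```python
import math

def constraint_satisfaction_probability(mean, std, tol):
    """Probability that the coupling residual, modeled as N(mean, std^2)
    by the GP posterior, lies within +/- tol of zero."""
    def phi(x):  # standard normal CDF via the error function
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return phi((tol - mean) / std) - phi((-tol - mean) / std)

def infill_candidate(candidates):
    """Pick the candidate whose predicted residual is most likely to
    satisfy the compatibility constraint (a simple infilling criterion)."""
    return max(candidates, key=lambda c: constraint_satisfaction_probability(
        c["mean"], c["std"], c["tol"]))

points = [
    {"x": 0.2, "mean": 0.50, "std": 0.30, "tol": 0.1},
    {"x": 0.7, "mean": 0.05, "std": 0.20, "tol": 0.1},
]
print(infill_candidate(points)["x"])  # → 0.7
```

Infilling the surrogate at such points concentrates expensive model evaluations where the coupled solution is likely to live, which is how the strategy saves computational cost.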
Procedia PDF Downloads 157
748 Automatic Extraction of Arbitrarily Shaped Buildings from VHR Satellite Imagery
Authors: Evans Belly, Imdad Rizvi, M. M. Kadam
Abstract:
Satellite imagery is one of the emerging technologies extensively utilized in various applications, such as detection/extraction of man-made structures, monitoring of sensitive areas, creating graphic maps, etc. The main approach here is the automated detection of buildings from very high resolution (VHR) optical satellite images. Initially, the shadow, building, and non-building regions (roads, vegetation, etc.) are investigated, with the focus on building extraction. Once the landscape is collected, a trimming process is done to eliminate landscapes that may occur due to non-building objects. Finally, the label method is used to extract the building regions. The label method may be altered for efficient building extraction. The images used for the analysis are those extracted from sensors with a resolution of less than 1 meter (VHR). This method provides an efficient way to produce good results. The additional overhead of mid-processing is eliminated, without compromising the quality of the output, to ease the processing steps required and the time consumed.
Keywords: building detection, shadow detection, landscape generation, label, partitioning, very high resolution (VHR) satellite imagery
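The "label method" is, at its core, connected-component labeling of the binary building mask: each group of touching candidate pixels becomes one building region. A minimal pure-Python sketch (the mask below is a toy example, not real imagery):

```python
from collections import deque

def label_regions(mask):
    """4-connected component labeling of a binary mask (1 = candidate
    building pixel); returns a label grid and the region count."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] == 1 and labels[r][c] == 0:
                current += 1                      # start a new region
                queue = deque([(r, c)])
                labels[r][c] = current
                while queue:                      # BFS flood fill
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] == 1 and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current

mask = [
    [1, 1, 0, 0],
    [0, 1, 0, 1],
    [0, 0, 0, 1],
]
labels, n = label_regions(mask)
print(n)  # → 2
```

Trimming then amounts to discarding labeled regions whose size or shape does not match a building, which is the kind of alteration of the label method the abstract alludes to.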
Procedia PDF Downloads 314
747 The Characteristics of Quantity Operation for 2nd and 3rd Grade Mathematics Slow Learners
Authors: Pi-Hsia Hung
Abstract:
The development of mathematical competency has individual benefits as well as benefits to the wider society. Children who begin school behind their peers in their understanding of number, counting, and simple arithmetic are at high risk of staying behind throughout their schooling. The development of effective strategies for improving the educational trajectory of these individuals will be contingent on identifying areas of early quantitative knowledge that influence later mathematics achievement. A computer-based quantity assessment was developed in this study to investigate the characteristics of 2nd and 3rd grade slow learners in quantity. The concept of quantification involves understanding measurements, counts, magnitudes, units, indicators, relative size, and numerical trends and patterns. Fifty-five tasks of quantitative reasoning, such as number sense, mental calculation, estimation, and assessment of the reasonableness of results, are included as quantity problem solving. Thus, quantity is defined in this study as applying knowledge of number and number operations in a wide variety of authentic settings. Around 1,000 students were tested and categorized into four performance levels. Students' quantity ability correlated more strongly with their school mathematics grades than with other subjects. Around 20% of students are below the basic level. The intervention design implications of the preliminary item map constructed are discussed.
Keywords: mathematics assessment, mathematical cognition, quantity, number sense, validity
Procedia PDF Downloads 247
746 Unreliable Production Lines with Simultaneously Unbalanced Operation Time Means, Breakdown, and Repair Rates
Authors: Sabry Shaaban, Tom McNamara, Sarah Hudson
Abstract:
This paper investigates the benefits of deliberately unbalancing both operation time means (MTs) and unreliability (failure and repair rates) for non-automated production lines. The lines were simulated with various line lengths, buffer capacities, degrees of imbalance, and patterns of MT and unreliability imbalance. Data on two performance measures, namely throughput (TR) and average buffer level (ABL), were gathered, analyzed, and compared to a balanced line counterpart. A number of conclusions were drawn with respect to the ranking of configurations, as well as to the relationships among the independent design parameters and the dependent variables. It was found that the best configurations are a balanced line arrangement and a monotone decreasing MT order, coupled with either a decreasing or a bowl unreliability configuration, with the first generally resulting in a reduced TR and the second leading to a lower ABL than those of a balanced line.
Keywords: unreliable production lines, unequal mean operation times, unbalanced failure and repair rates, throughput, average buffer level
Procedia PDF Downloads 486
745 Quantification of Polychlorinated Biphenyls (PCBs) in Soil Samples of Electrical Power Substations from Different Cities in Nigeria
Authors: Omasan Urhie Urhie, Adenipekun C. O, Eke W., Ogwu K., Erinle K. O
Abstract:
Polychlorinated biphenyls (PCBs) are persistent organic pollutants (POPs) that are very toxic; they have the ability to accumulate in soil and in human tissues, resulting in health issues like birth defects, reproductive disorders, and cancer. The air is polluted by PCBs through volatilization and dispersion; they also contaminate soil and sediments and are not easily degraded. Soil samples were collected from a depth of 0-15 cm at three substations (Warri, Ughelli, and Ibadan) of the Power Holding Company of Nigeria (PHCN) where old transformers were dumped. Extraction and cleanup of the soil samples were conducted using Accelerated Solvent Extraction (ASE) with Pressurized Liquid Extraction (PLE). The concentration of PCBs was determined using gas chromatography/mass spectrometry (GC/MS). Mean total PCB concentrations in the soil samples increased in the order Ughelli ˂ Ibadan ˂ Warri: 2.457757 ppm at the Ughelli substation, 4.198926 ppm at the Ibadan substation, and 14.05065 ppm at the Warri substation. In the Warri samples, PCB-167 was the most abundant at about 30% (4.28086 ppm), followed by PCB-157 at about 20% (2.77871 ppm), of the total PCB concentration (14.05065 ppm). Of the total PCBs in the Ughelli and Ibadan samples, PCB-156 was the most abundant at about 44% and 40%, respectively. This study provides a baseline report on the presence of PCBs in the vicinity of abandoned electrical power facilities in different cities in Nigeria.
Keywords: polychlorinated biphenyls, persistent organic pollutants, soil, transformer
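The congener percentages quoted can be recomputed directly from the reported concentrations. A short sketch (the "others" entry lumps the unlisted congeners purely so the total matches the reported 14.05065 ppm):

```python
def congener_shares(congeners_ppm):
    """Percent contribution of each PCB congener to the total concentration."""
    total = sum(congeners_ppm.values())
    return {name: 100.0 * c / total for name, c in congeners_ppm.items()}, total

# Reported Warri values (ppm); remaining congeners grouped for illustration
warri = {"PCB-167": 4.28086, "PCB-157": 2.77871, "others": 6.99108}
shares, total = congener_shares(warri)
print(round(shares["PCB-167"]))  # → 30 (% of total, matching the reported ~30%)
```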
Procedia PDF Downloads 139
744 Rapid, Automated Characterization of Microplastics Using Laser Direct Infrared Imaging and Spectroscopy
Authors: Andreas Kerstan, Darren Robey, Wesam Alvan, David Troiani
Abstract:
Over the last 3.5 years, Quantum Cascade Laser (QCL) technology has become increasingly important in infrared (IR) microscopy. The advantages over Fourier transform infrared (FTIR) are that large areas of a few square centimeters can be measured in minutes and that the light-intensive QCL makes it possible to obtain spectra with excellent S/N, even with just one scan. A firmly established application of the laser direct infrared imaging (LDIR) 8700 is the analysis of microplastics. The presence of microplastics in the environment, drinking water, and food chains is gaining significant public interest. To study their presence, rapid and reliable characterization of microplastic particles is essential. Significant technical hurdles in microplastic analysis stem from the sheer number of particles to be analyzed in each sample. Total particle counts of several thousand are common in environmental samples, while well-treated bottled drinking water may contain relatively few. While visual microscopy has been used extensively, it is prone to operator error and bias and is limited to particles larger than 300 µm. As a result, vibrational spectroscopic techniques such as Raman and FTIR microscopy have become more popular; however, they are time-consuming. There is a demand for rapid and highly automated techniques to measure particle count and size and provide high-quality polymer identification. Analysis directly on the filter that often forms the last stage in sample preparation is highly desirable, as removing a sample preparation step can both improve laboratory efficiency and decrease opportunities for error. Recent advances in infrared micro-spectroscopy combining a QCL with scanning optics have created a new paradigm, LDIR. It offers improved speed of analysis as well as high levels of automation. Its mode of operation, however, requires an IR-reflective background, and this has, to date, limited the ability to perform direct “on-filter” analysis.
This study explores the potential of combining the filter membrane with an infrared-reflective surface. By combining an IR-reflective material or coating on a filter membrane with advanced image analysis and detection algorithms, it is demonstrated that such filters can indeed be used in this way. Vibrational spectroscopic techniques play a vital role in the investigation and understanding of microplastics in the environment and food chain. While vibrational spectroscopy is widely deployed, improvements and novel innovations in these techniques that increase the speed of analysis and ease of use can provide pathways to higher testing rates and, hence, improved understanding of the impacts of microplastics in the environment. Due to its capability to measure large areas in minutes, its speed, degree of automation, and excellent S/N, the LDIR could also be implemented for various other samples, like food adulteration, coatings, laminates, fabrics, textiles, and tissues. This presentation will highlight a few of them and focus on the benefits of the LDIR versus classical techniques.
Keywords: QCL, automation, microplastics, tissues, infrared, speed
Procedia PDF Downloads 66
743 Unauthorized License Verifier and Secure Access to Vehicle
Authors: G. Prakash, L. Mohamed Aasiq, N. Dhivya, M. Jothi Mani, R. Mounika, B. Gomathi
Abstract:
In day-to-day life, many people meet with accidents due to various reasons like overspeeding, overloading of the vehicle, violation of traffic rules, etc. The driving license system is difficult for the government to monitor. To prevent non-licensees, who cause most accidents, from driving, a new system is proposed. The proposed system consists of a smart card capable of storing the license details of a particular person. Vehicles such as cars, bikes, etc. should have a card reader capable of reading the particular license. A person who wishes to drive the vehicle should insert the card (license) into the vehicle and then enter the password on the keypad. If the license data stored in the card matches the database of license holders in the microcontroller, he/she can proceed to ignition after the automated opening of the fuel tank valve; otherwise, the user is restricted from using the vehicle. Moreover, an overload detector in the proposed system verifies and prompts the user to avoid overload before driving. This increases the security of vehicles and also ensures safe driving by preventing accidents.
Keywords: license, verifier, EEPROM, secure, overload detection
Procedia PDF Downloads 242
742 Quantification of Size Segregated Particulate Matter Deposition in Human Respiratory Tract and Health Risk to Residents of Glass City
Authors: Kalpana Rajouriya, Ajay Taneja
Abstract:
The objective of the present study is to investigate the regional and lobar deposition of size-segregated PM in the human respiratory tract. PM in different fractions was monitored using the Grimm portable environmental dust monitor during the winter season in Firozabad, a glass city of India. The PM10 concentration (200.817 µg/m³) was 4.46 and 2.0 times higher than the limits prescribed by the WHO (45 µg/m³) and NAAQS (100 µg/m³). The PM2.5 concentration (83.538 µg/m³) was 5.57 and 1.39 times higher than the WHO (15 µg/m³) and NAAQS (60 µg/m³) limits. Results inferred that PM10 and PM2.5 deposition was highest in the head region (0.3477-0.5622 and 0.366-0.4704), followed by the pulmonary region, especially in 9-21 year old persons. The variation in deposition percentage in our study is mainly due to airway geometry, PM size, and its deposition mechanisms. The coarse fraction, due to its large size, cannot follow the airway path and mostly gets deposited by inertial impaction in the head region and its bifurcations. The results inferred that coarse and fine PM deposition was highest in the 9-year (8.456×10⁻⁴ g, 2.911×10⁻⁴ g) and 3-month (1.496×10⁻⁴ g, 8.593×10⁻⁵ g) age categories. So, the 9-year children and 3-month infants have a high level of health risk.
Keywords: particulate matter, MPPD model, regional deposition, lobar deposition, health risk
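The exceedance factors quoted follow directly from dividing the measured concentrations by the guideline limits (all values in µg/m³); a quick check:

```python
def exceedance_factors(measured, limits):
    """Ratio of a measured PM concentration to each guideline limit (µg/m³)."""
    return {agency: measured / limit for agency, limit in limits.items()}

# Measured winter concentrations and guideline limits from the abstract
pm10 = exceedance_factors(200.817, {"WHO": 45.0, "NAAQS": 100.0})
pm25 = exceedance_factors(83.538, {"WHO": 15.0, "NAAQS": 60.0})
print(round(pm10["WHO"], 2), round(pm10["NAAQS"], 1))   # → 4.46 2.0
print(round(pm25["WHO"], 2), round(pm25["NAAQS"], 2))   # → 5.57 1.39
```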
Procedia PDF Downloads 61
741 The Grammatical Dictionary Compiler: A System for Kartvelian Languages
Authors: Liana Lortkipanidze, Nino Amirezashvili, Nino Javashvili
Abstract:
The purpose of a grammatical dictionary is to provide information on the morphological and syntactic characteristics of the basic word in a dictionary entry. Electronic grammatical dictionaries are used as a tool of automated morphological analysis for text processing. The Georgian Grammatical Dictionary should contain grammatical information for each word: part of speech, type of declension/conjugation, grammatical forms of the word (paradigm), and alternative variants of the basic word/lemma. In this paper, we present a system for compiling the Georgian Grammatical Dictionary automatically. We propose dictionary-based methods for extending grammatical lexicons. The input lexicon contains only a small number of words with identical grammatical features. The extension is based on similarity measures between features of words; more precisely, we add to the extended lexicon words that are similar to those already in the grammatical dictionary. Our dictionaries are corpora-based, and for compilation, we introduce a method for the lemmatization of unknown words, i.e., words for which neither the full form nor the lemma is in the grammatical dictionary.
Keywords: acquisition of lexicon, Georgian grammatical dictionary, lemmatization rules, morphological processor
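One common way to lemmatize out-of-dictionary words is longest-suffix matching against rewrite rules learned from the corpus. A hedged sketch of the idea (the rules shown are English-like toys for illustration, not the paper's actual Georgian paradigms):

```python
def guess_lemma(word, suffix_rules):
    """Guess the lemma of an unknown word from suffix rewrite rules.

    suffix_rules maps an inflectional suffix to the suffix it is rewritten
    to in the lemma; the longest matching suffix wins."""
    for suffix in sorted(suffix_rules, key=len, reverse=True):
        if word.endswith(suffix):
            return word[: len(word) - len(suffix)] + suffix_rules[suffix]
    return word  # no rule applies: treat the form itself as the lemma

# Hypothetical English-like rules purely for illustration
rules = {"ies": "y", "ing": "", "s": ""}
print(guess_lemma("studies", rules))  # → "study"
print(guess_lemma("walking", rules))  # → "walk"
```

In a real system, the rules would be induced from the paradigms already in the grammatical dictionary, and a similarity measure over grammatical features would decide which paradigm an unknown word is assigned to.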
Procedia PDF Downloads 145
740 Transparency of Algorithmic Decision-Making: Limits Posed by Intellectual Property Rights
Authors: Olga Kokoulina
Abstract:
Today, algorithms are assuming a leading role in various areas of decision-making. Prompted by a promise to provide increased economic efficiency and fuel solutions for pressing societal challenges, algorithmic decision-making is often celebrated as an impartial and constructive substitute for human adjudication. But in the face of this implied objectivity and efficiency, the application of algorithms is also marred with mounting concerns about embedded biases, discrimination, and exclusion. In Europe, vigorous debates on the risks and adverse implications of algorithmic decision-making largely revolve around the potential of data protection laws to tackle some of the related issues. For example, one of the often-cited venues to mitigate the impact of potentially unfair decision-making practices is the so-called 'right to explanation'. In essence, the overall right is derived from the provisions of the General Data Protection Regulation (‘GDPR’) ensuring the right of data subjects to access and mandating the obligation of data controllers to provide the relevant information about the existence of automated decision-making and meaningful information about the logic involved. Taking the corresponding rights and obligations in the context of the specific provision on automated decision-making in the GDPR, the debates mainly focus on the efficacy and exact scope of the 'right to explanation'. In essence, the underlying logic of the argued remedy lies in a transparency imperative. Allowing data subjects to acquire as much knowledge as possible about the decision-making process means empowering individuals to take control of their data and take action. In other words, forewarned is forearmed. The related discussions and debates are ongoing, comprehensive, and, often, heated. However, they are also frequently misguided and isolated: embracing data protection law as the ultimate and sole lens is often not sufficient.
Mandating the disclosure of technical specifications of employed algorithms in the name of transparency for, and empowerment of, data subjects potentially encroaches on the interests and rights of IPR holders, i.e., the business entities behind the algorithms. The study aims at pushing the boundaries of the transparency debate beyond the data protection regime. By systematically analysing legal requirements and current judicial practice, it assesses the limits posed to the transparency requirement and right of access by intellectual property law, namely by copyright and trade secrets. It is asserted that trade secrets, in particular, present an often insurmountable obstacle to realising the potential of the transparency requirement. In reaching that conclusion, the study explores the limits of protection afforded by the European Trade Secrets Directive and contrasts them with the scope of the respective rights and obligations related to data access and portability enshrined in the GDPR. As shown, the far-reaching scope of protection under trade secrecy is evidenced both through the assessment of its subject matter and through the exceptions from such protection. As a way forward, the study scrutinises several possible legislative solutions, such as a flexible interpretation of the public interest exception in trade secrets as well as the introduction of a strict liability regime in case of non-transparent decision-making.
Keywords: algorithms, public interest, trade secrets, transparency
Procedia PDF Downloads 124
739 Fully Autonomous Vertical Farm to Increase Crop Production
Authors: Simone Cinquemani, Lorenzo Mantovani, Aleksander Dabek
Abstract:
New technologies in agriculture are opening new challenges and new opportunities. Among these, robotics, vision, and artificial intelligence are certainly the ones that will make possible a significant leap compared to traditional agricultural techniques. In particular, the indoor farming sector will be the one that benefits most from these solutions. Vertical farming is a new field of research where mechanical engineering can bring knowledge and know-how to transform a highly labor-based business into a fully autonomous system. The aim of the research is to develop a multi-purpose, modular, and perfectly integrated platform for crop production in indoor vertical farming. Activities are based both on hardware development, such as automatic tools to perform different activities on soil and plants, and on research to introduce extensive use of monitoring techniques based on machine learning algorithms. This paper presents the preliminary results of a research project on a vertical farm living lab designed to (i) develop and test vertical farming cultivation practices, (ii) introduce a very high degree of mechanization and automation that makes all processes replicable, fully measurable, standardized, and automated, (iii) develop a coordinated control and management environment for autonomous multiplatform or tele-operated robots with the aim of carrying out complex tasks in the presence of environmental and cultivation constraints, and (iv) integrate AI-based algorithms as a decision support system to improve production quality. The coordinated management of multiplatform systems still presents innumerable challenges that require a strongly multidisciplinary approach right from the design, development, and implementation phases.
The methodology is based on (i) the development of models capable of describing the dynamics of the various platforms and their interactions, (ii) the integrated design of mechatronic systems able to respond to the needs of the context and to exploit the strengths highlighted by the models, and (iii) implementation and experimental tests performed to assess the real effectiveness of the systems created and to evaluate any weaknesses so as to proceed with targeted development. To these ends, a fully automated laboratory for growing plants in vertical farming has been developed and tested. The living lab makes extensive use of sensors to determine the overall state of the structure, crops, and systems used. The possibility of having specific measurements for each element involved in the cultivation process makes it possible to evaluate the effects of each variable of interest and allows for the creation of a robust model of the system as a whole. The automation of the laboratory is completed with the use of robots to carry out all the necessary operations, from sowing to handling to harvesting. These systems work synergistically thanks to detailed models developed from the information collected, which deepens the knowledge of these types of crops and guarantees the possibility of tracing every action performed on each single plant. To this end, artificial intelligence algorithms have been developed to allow the synergistic operation of all systems.
Keywords: automation, vertical farming, robot, artificial intelligence, vision, control
Procedia PDF Downloads 39
738 Automation Test Method and HILS Environment Configuration for Hydrogen Storage System Management Unit Verification
Authors: Jaejeogn Kim, Jeongmin Hong, Jungin Lee
Abstract:
The Hydrogen Storage System Management Unit (HMU) is a controller that manages hydrogen charging and storage. It detects hydrogen leaks, monitors tank pressure and temperature, calculates the charging concentration and remaining amount, and controls the opening and closing of the hydrogen tank valve. Since this role is an important part of the vehicle behavior and stability of Fuel Cell Electric Vehicles (FCEV), verifying the HMU controller is essential. To perform verification under various conditions, it is necessary to increase time efficiency through an automated verification environment and to increase the reliability of the controller by applying numerous test cases. To this end, we introduce an HMU controller automation verification method that applies a HILS environment and an automation test program conforming to the ASAM XIL standard.
Keywords: HILS, ASAM, fuel cell electric vehicle, automation test, hydrogen storage system
Procedia PDF Downloads 70
737 Modeling of Building a Conceptual Scheme for Multimodal Freight Transportation Information System
Authors: Gia Surguladze, Nino Topuria, Lily Petriashvili, Giorgi Surguladze
Abstract:
The modeling of the processes for building a multimodal freight transportation support information system is discussed on the basis of modern CASE technologies. The functional efficiency of ports in the eastern part of the Black Sea is analyzed, taking into account their ecological, seasonal, and resource-usage parameters. By resources, we mean the capacities of berths, cranes, and automotive transport, as well as work crews and neighbouring airports. For designing the database of the computer support system for the managerial (logistics) function, the use of an Object-Role Modeling (ORM) tool (NORMA – Natural ORM Architecture) is proposed, after which the Entity Relationship Model (ERM) is generated in an automated process. The software is developed based on process-oriented and service-oriented architectures, in the Visual Studio .NET environment.
Keywords: seaport resources, business-processes, multimodal transportation, CASE technology, object-role model, entity relationship model, SOA
Procedia PDF Downloads 430
736 Potentials of Additive Manufacturing: An Approach to Increase the Flexibility of Production Systems
Authors: A. Luft, S. Bremen, N. Balc
Abstract:
The task of flexibility planning and design, just like factory planning, for example, is to create the long-term systemic framework that constitutes the restriction for short-term operational management. This is a strategic challenge since, due to the decision-defect character of the underlying flexibility problem, multiple types of flexibility need to be considered over the course of various scenarios, production programs, and production system configurations. In this context, an evaluation model has been developed that integrates both conventional and additive resources on a basic task level and allows the quantification of flexibility enhancement in terms of mix and volume flexibility, complexity reduction, and machine capacity. The model helps companies decide, in early decision-making processes, on the potential gains of implementing additive manufacturing technologies at a strategic level. For companies, it is essential to consider both additive and conventional manufacturing beyond pure unit costs. It is necessary to achieve an integrative view of manufacturing that incorporates both additive and conventional manufacturing resources and quantifies their potential with regard to flexibility and manufacturing complexity. This also requires a structured process for strategic production system design that spans the design of various scenarios and allows for multi-dimensional and comparative analysis. A corresponding guideline for the planning of additive resources at a strategic level is laid out in this paper.
Keywords: additive manufacturing, production system design, flexibility enhancement, strategic guideline
Procedia PDF Downloads 124
735 Characterization of Surface Suction Grippers for Continuous-Discontinuous Fiber Reinforced Semi-Finished Parts of an Automated Handling and Preforming Operation
Authors: Jürgen Fleischer, Woramon Pangboonyanon, Dominic Lesage
Abstract:
Non-metallic lightweight materials such as fiber reinforced plastics (FRP) are becoming very significant at present. Prepregs, e.g., sheet moulding compound (SMC) and unidirectional tape (UD-tape), are among the raw materials used to produce FRP. This study deals with the manufacturing steps of handling and preforming of UD-SMC and focuses on the investigation of gripper characteristics regarding gripping forces in the normal and lateral directions, in order to identify suitable operating pressures for a secure gripping operation. A reliable handling and preforming operation results in a higher added value over the overall process chain. As a result, the suitable operating pressures, depending on the travelling direction for each material type, could be shown. Moreover, system boundary conditions regarding the allowable pulling force in the normal and lateral directions during preforming could be measured.
Keywords: continuous-discontinuous fiber reinforced plastics, UD-SMC-prepreg, handling, preforming, prepregs, sheet moulding compounds, surface suction gripper
Procedia PDF Downloads 222
734 Multiresolution Mesh Blending for Surface Detail Reconstruction
Authors: Honorio Salmeron Valdivieso, Andy Keane, David Toal
Abstract:
In the area of mechanical reverse engineering, processes often encounter difficulties capturing small, highly localized surface information. This could be the case if a physical turbine were 3D scanned for lifecycle management or robust design purposes, with interest in eroded areas or scratched coating. The limitation is partly due to insufficient automated frameworks for handling localized surface information during the reverse engineering pipeline. We have developed a tool for blending surface patches with arbitrary irregularities into a base body (e.g. a CAD solid). The approach aims to transfer small surface features while preserving their shape and relative placement by using a multi-resolution scheme and rigid deformations. Automating this process enables the inclusion of outsourced surface information in CAD models, including samples prepared in mesh handling software or raw scan information discarded in the early stages of reverse engineering reconstruction.
Keywords: application lifecycle management, multiresolution deformation, reverse engineering, robust design, surface blending
Procedia PDF Downloads 139
733 Quantification of Glucosinolates in Turnip Greens and Turnip Tops by Near-Infrared Spectroscopy
Authors: S. Obregon-Cano, R. Moreno-Rojas, E. Cartea-Gonzalez, A. De Haro-Bailon
Abstract:
The potential of near-infrared spectroscopy (NIRS) for screening the total glucosinolate (tGSL) content, as well as the aliphatic glucosinolates gluconapin (GNA), progoitrin (PRO), and glucobrassicanapin (GBN), in turnip greens and turnip tops was assessed. This crop is grown for its edible leaves and stems for human consumption. The reference values for glucosinolates, obtained by high performance liquid chromatography on the vegetable samples, were regressed against different spectral transformations by modified partial least-squares (MPLS) regression (calibration set of n=350 samples). The resulting models were satisfactory, with calibration coefficient values from 0.72 (GBN) to 0.98 (tGSL). The predictive ability of the equations obtained was tested using a set of samples (n=70) independent of the calibration set. The determination coefficients and prediction errors (SEP) obtained in the external validation were: GNA=0.94 (SEP=3.49); PRO=0.41 (SEP=1.08); GBN=0.55 (SEP=0.60); tGSL=0.96 (SEP=3.28). These results show that the equations developed for tGSL and GNA are accurate enough for a fast, non-destructive, and reliable screening of these compounds directly from the NIR spectra of the leaves and stems of this species, while the PRO and GBN equations can be used to classify samples as having high, medium, or low contents.
Keywords: brassica rapa, glucosinolates, gluconapin, NIRS, turnip greens
Procedia PDF Downloads 144
732 Identify Users Behavior from Mobile Web Access Logs Using Automated Log Analyzer
Authors: Bharat P. Modi, Jayesh M. Patel
Abstract:
The mobile Internet is a major source of data. As the number of web pages continues to grow, the mobile web provides data miners with just the right ingredients for extracting information. To cater to this growing need, the term Mobile Web mining was coined: it applies data mining techniques to decipher potentially useful information from web data. Web usage mining deals with understanding the behavior of users by making use of mobile web access logs that are generated on the server while a user is accessing the website. A web access log comprises various entries, such as the name of the user, their IP address, the number of bytes transferred, a timestamp, etc. A variety of log analyzer tools exist which help in analyzing things like users' navigational patterns, the parts of the website the users are most interested in, etc. The present paper makes use of one such log analyzer tool, Mobile Web Log Expert, to ascertain the behavior of users who access an astrology website. It also provides a comparative study of a few available log analyzer tools.
Keywords: mobile web access logs, web usage mining, web server, log analyzer
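The field extraction that such a log analyzer performs can be sketched with a small parser for Common Log Format entries; the sample line, host, and path below are illustrative, not drawn from the study's astrology-site logs.

```python
import re

# Common Log Format: host ident user [timestamp] "request" status bytes
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def parse_entry(line):
    """Extract the fields a log analyzer typically aggregates over."""
    m = LOG_PATTERN.match(line)
    if m is None:
        return None
    entry = m.groupdict()
    entry["bytes"] = 0 if entry["bytes"] == "-" else int(entry["bytes"])
    return entry

sample = ('203.0.113.7 - astro_user [10/Oct/2023:13:55:36 +0000] '
          '"GET /horoscope/daily HTTP/1.1" 200 5124')
entry = parse_entry(sample)
print(entry["ip"], entry["path"], entry["status"], entry["bytes"])
```

Aggregating such parsed entries by IP or by path is what yields the navigational patterns and areas of interest the abstract refers to.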
Procedia PDF Downloads 361
731 Assessment of Smart Mechatronics Application in Agriculture
Authors: Sairoel Amertet, Girma Gebresenbet
Abstract:
Smart mechatronics systems in agriculture can be traced back to the mid-1980s, when research into automated fruit harvesting systems began in Japan, Europe, and the United States. Since then, impressive advances have been made in smart mechatronics systems. Because these systems remain a promising area, the purpose of this study was to examine the smart mechatronic systems that have been applied to agriculture so far, taking inspiration from smart mechatronic systems in other sectors. To get an overview of the current state of the art and of the benefits and drawbacks of smart mechatronics systems, various approaches were investigated. Moreover, smart mechatronic modules and the various networks applied in agricultural processing were examined. Finally, we explored how the retrieved data related to each other using one-way analysis of variance. The results showed strongly related keywords across the different journals. With the still very limited use of sophisticated mechatronics in the agricultural industry and, at the same time, low production rates, food security has come under increasing pressure. Therefore, the application of smart mechatronics systems in agricultural sectors should be considered in order to overcome these issues.
Keywords: mechatronics, robotic, robotic system, automation, agriculture mechanism
Procedia PDF Downloads 80
730 Low Cost Real Time Robust Identification of Impulsive Signals
Authors: R. Biondi, G. Dys, G. Ferone, T. Renard, M. Zysman
Abstract:
This paper describes an implementable automated system for impulsive signal detection and recognition. The system uses a digital signal processing device for the detection and identification process, analysing the signals in real time in order to produce a specific output if needed. Detection is achieved by normalizing the inputs and comparing the read signals to a dynamic threshold, thus avoiding detections linked to loud or fluctuating ambient noise. Identification is done through neural network algorithms: as a setup, our system can receive signals to “learn” certain patterns, and through this learning the system can recognize signals faster, giving it flexibility towards new patterns similar to those it knows. Sound is captured through a simple jack input, which could be replaced by an enhanced recording surface such as a wide-area recorder. Furthermore, a communication module can be added to the apparatus to send alerts to another interface if needed.
Keywords: sound detection, impulsive signal, background noise, neural network
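The detection stage described — short-term energy compared against a dynamic threshold that tracks the ambient noise level — can be sketched as below. The window lengths, threshold factor, and the synthetic test signal are illustrative assumptions; the authors' actual DSP parameters are not given in the abstract.

```python
import numpy as np

def detect_impulses(signal, fs, window_s=0.05, k=4.0):
    """Flag samples whose short-term energy exceeds k times a running
    estimate of the background level (the dynamic threshold)."""
    win = max(1, int(window_s * fs))
    energy = np.convolve(signal**2, np.ones(win) / win, mode="same")
    # Background estimate over a much longer window: loud but slowly varying
    # ambient noise raises the threshold instead of firing the detector.
    bg = np.convolve(energy, np.ones(20 * win) / (20 * win), mode="same")
    return energy > k * bg

fs = 8000
noise = 0.1 * np.random.default_rng(1).standard_normal(fs)  # one second of noise
sig = noise.copy()
sig[4000:4040] += 2.0                                       # a 5 ms impulsive event
hits = detect_impulses(sig, fs)
print("event flagged:", hits[4000:4040].any())
```

Scaling the background estimate rather than using a fixed level is what makes the threshold "dynamic" in the sense the abstract describes.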
Procedia PDF Downloads 319
729 Provenance in Scholarly Publications: Introducing the provCite Ontology
Authors: Maria Joseph Israel, Ahmed Amer
Abstract:
Our work aims to broaden the application of provenance technology beyond its traditional domains of scientific workflow management and database systems by offering a general provenance framework to capture richer and extensible metadata in unstructured textual data sources such as literary texts, commentaries, translations, and the digital humanities. Specifically, we demonstrate the feasibility of capturing and representing expressive provenance metadata, including more of the context for citing scholarly works (e.g., the authors' explicit or inferred intentions at the time of developing their research content for publication), while also supporting subsequent augmentation with similar additional metadata (by third parties, be they human or automated). To better capture the nature and types of possible citations, our proposed provenance scheme, metaScribe, extends standard provenance conceptual models to form our proposed provCite ontology. This provides a conceptual framework which can capture and describe more of the functional and rhetorical properties of a citation than can be achieved with any current model.
Keywords: knowledge representation, provenance architecture, ontology, metadata, bibliographic citation, semantic web annotation
Procedia PDF Downloads 117
728 Point-of-Interest Recommender Systems for Location-Based Social Network Services
Authors: Hoyeon Park, Yunhwan Keon, Kyoung-Jae Kim
Abstract:
Location-Based Social Network services (LBSNs) is a new term that combines location-based services and social network services (SNS). Unlike traditional SNS, LBSNs emphasize empirical elements tied to the user's actual physical location. Point-of-Interest (POI) recommendation is the most important function of an LBSN recommendation system, a POI being a popular spot in a given area. In this study, we recommend POIs to users in a specific area through a recommender system based on collaborative filtering. The process is as follows: first, we use different data sets based on Seoul and New York to find interesting results on human behavior. Second, based on the location-based activity information obtained from the personalized LBSNs, we devise a new rating that defines a user's preference for an area. Finally, we develop an automated rating algorithm over the massive raw data using distributed systems to reduce the advertising costs of LBSNs.
Keywords: location-based social network services, point-of-interest, recommender systems, business analytics
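The collaborative filtering step can be sketched on a toy POI rating matrix. The ratings, the cosine-similarity measure, and the neighborhood size are assumptions for illustration, since the abstract does not specify the exact algorithm variant; a production system would run this over a distributed framework as the authors describe.

```python
import numpy as np

# Rows: users, columns: POIs (0 = no activity-derived rating yet).
# These values stand in for the preference scores devised in the study.
ratings = np.array([
    [5.0, 3.0, 0.0, 1.0],
    [4.0, 0.0, 0.0, 1.0],
    [1.0, 1.0, 5.0, 4.0],
    [0.0, 1.0, 4.0, 4.0],
])

def predict(ratings, user, poi, k=2):
    """Predict a missing rating from the k most similar users who rated the POI."""
    mask = ratings[:, poi] > 0
    mask[user] = False
    candidates = np.where(mask)[0]
    # Cosine similarity between the target user and each candidate.
    u = ratings[user]
    sims = np.array([
        ratings[c] @ u / (np.linalg.norm(ratings[c]) * np.linalg.norm(u))
        for c in candidates
    ])
    order = np.argsort(sims)[-k:]          # indices of the k nearest neighbors
    top, w = candidates[order], sims[order]
    return float(w @ ratings[top, poi] / w.sum())

score = predict(ratings, user=1, poi=1)
print(round(score, 2))
```

User 1 inherits a moderately high score for POI 1 because the most similar user (user 0) rated it well; the prediction is a similarity-weighted average over the neighborhood.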
Procedia PDF Downloads 229
727 Quantification of Peptides (linusorbs) in Gluten-free Flaxseed Fortified Bakery Products
Authors: Youn Young Shim, Ji Hye Kim, Jae Youl Cho, Martin JT Reaney
Abstract:
Flaxseed (Linum usitatissimum L.) is gaining popularity in the food industry as a superfood due to its health-promoting properties. Linusorbs (LOs, a.k.a. cyclolinopeptides) are bioactive compounds present in flaxseed that exhibit potential health effects. The study focused on the effects of processing and storage on the stability of flaxseed-derived LOs added to various bakery products. Flaxseed-meal-fortified gluten-free (GF) bakery bread was prepared, and the changes in LOs during the bread-making process (meal, fortified flour, dough, and bread) and during storage (0, 1, 2, and 4 weeks) at different temperatures (−18 °C, 4 °C, and 22−23 °C) were analyzed by high-performance liquid chromatography with diode array detection. The total oxidative LOs and LO1OB2 remained nearly stable in flaxseed meal at storage temperatures of 22−23 °C, −18 °C, and 4 °C for up to four weeks. The processing steps during GF-bread production resulted in the oxidation of LOs. Interestingly, no LOs were detected in the dough sample; however, LOs appeared when the dough was stored at −18 °C for one week, suggesting that freezing destroyed the sticky structure of the dough and resulted in the release of LOs. The final product, flaxseed-meal-fortified bread, could be stored for up to four weeks at −18 °C and 4 °C, and for one week at 22−23 °C. All these results suggest that LOs may change during processing and storage and that flaxseed-flour-fortified bread should be stored at low temperatures to preserve the effective LO components.
Keywords: linum usitatissimum L., flaxseed, linusorb, stability, gluten-free, peptides, cyclolinopeptide
Procedia PDF Downloads 179
726 Microarray Gene Expression Data Dimensionality Reduction Using PCA
Authors: Fuad M. Alkoot
Abstract:
Different experimental technologies, such as microarray sequencing, have been proposed to generate high-resolution genetic data in order to understand the complex dynamic interactions between complex diseases and the biological system components of genes and gene products. However, the generated samples have a very large dimension, reaching into the thousands, hindering all attempts to design a classifier system that can identify diseases based on such data. Additionally, the high overlap between the class distributions makes the task more difficult. The data we experiment with was generated for the identification of autism. It includes 142 samples, which is small compared to the large dimension of the data. Classifier systems trained on this data yield very low classification rates, almost equivalent to guessing. We aim to reduce the data dimension and improve it for classification. Here, we experiment with applying a multistage PCA to the genetic data to reduce its dimensionality. Results show a significant improvement in the classification rates, which increases the possibility of building an automated system for autism detection.
Keywords: PCA, gene expression, dimensionality reduction, classification, autism
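One plausible reading of the multistage PCA described — a coarse reduction followed by a finer one — can be sketched as follows. The data is a random stand-in with the abstract's 142-sample, large-feature shape (the study's autism dataset is not reproduced here), so no classification improvement is claimed by the sketch; it only shows the dimensionality pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in mirroring the small-n / large-p shape in the abstract:
# 142 samples with thousands of expression features.
n_samples, n_genes = 142, 5000
X = rng.normal(size=(n_samples, n_genes))

# Two-stage reduction: a coarse pass, then a finer pass on its output.
reducer = make_pipeline(
    StandardScaler(),       # per-gene standardization before PCA
    PCA(n_components=100),  # stage 1: coarse reduction
    PCA(n_components=20),   # stage 2: final low-dimensional representation
)
X_reduced = reducer.fit_transform(X)

print(X.shape, "->", X_reduced.shape)
```

Note that with only 142 samples the number of retained components in any stage cannot exceed 142, which is exactly the small-n constraint the abstract highlights.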
Procedia PDF Downloads 560
725 Automated Process Quality Monitoring and Diagnostics for Large-Scale Measurement Data
Authors: Hyun-Woo Cho
Abstract:
Continuous monitoring of industrial plants is one of the necessary tasks when it comes to ensuring high-quality final products. In terms of monitoring and diagnosis, it is critical to detect incipient abnormal events in manufacturing processes in order to improve the safety and reliability of the operations involved and to reduce the related losses. In this work, a new multivariate statistical online diagnostic method is presented using a case study. To build the reference models, an empirical discriminant model is constructed from various past operation runs. When a fault is detected on-line, an on-line diagnostic module is initiated. Finally, the status of the current operating conditions is compared with the reference model to make a diagnostic decision. The performance of the presented framework is evaluated using a dataset from complex industrial processes. It has been shown that the proposed diagnostic method outperforms other techniques, especially in terms of incipient detection of any faults that occur.
Keywords: data mining, empirical model, on-line diagnostics, process fault, process monitoring
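The abstract does not name the statistic used for on-line detection; a common choice for this kind of multivariate reference-model monitoring is Hotelling's T², sketched below on synthetic "normal operation" data with an invented incipient fault. This is a standard stand-in for the detection step, not the authors' exact method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Reference model from past normal-operation runs: 500 samples, 4 variables.
normal = rng.normal([10.0, 5.0, 1.0, 50.0], [1.0, 0.5, 0.1, 2.0], (500, 4))
mean = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

n, p = normal.shape
# 99% control limit for a future observation, from the F-distribution scaling
# of the T^2 statistic.
limit = p * (n - 1) * (n + 1) / (n * (n - p)) * stats.f.ppf(0.99, p, n - p)

def t_squared(x):
    """Mahalanobis-type distance of a new sample from the reference model."""
    d = x - mean
    return float(d @ cov_inv @ d)

in_control = np.array([10.2, 5.1, 0.95, 49.0])
faulty = np.array([10.2, 5.1, 1.6, 49.0])  # incipient drift in the third variable
print(t_squared(in_control) < limit, t_squared(faulty) > limit)
```

A sample exceeding the limit triggers the diagnostic module; the subsequent comparison against discriminant models of known fault classes is the part the paper's method addresses.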
Procedia PDF Downloads 401
724 Experimental Quantification and Modeling of Dissolved Gas during Hydrate Crystallization: CO₂ Hydrate Case
Authors: Amokrane Boufares, Elise Provost, Veronique Osswald, Pascal Clain, Anthony Delahaye, Laurence Fournaison, Didier Dalmazzone
Abstract:
Gas hydrates have long been considered problematic for flow assurance in natural gas and oil transportation. On the other hand, they are now seen as promising materials for various future applications (i.e. desalination of seawater, natural gas and hydrogen storage, gas sequestration, gas combustion separation, and cold storage and transport). Nonetheless, a better understanding of the crystallization mechanism of gas hydrates and of their formation kinetics is still needed for better comprehension and control of the process. To that end, measuring the real-time evolution of the dissolved gas concentration in the aqueous phase during hydrate formation is required. In this work, CO₂ hydrates were formed in a stirred reactor equipped with an Attenuated Total Reflection (ATR) probe coupled to a Fourier Transform InfraRed (FTIR) spectroscopy analyzer. A method was first developed to continuously measure the CO₂ concentration in the liquid phase in situ during the solubilization, supersaturation, hydrate crystallization, and dissociation steps. The measured concentration data were then compared with equilibrium concentrations. It was observed that equilibrium is reached almost instantly in the liquid phase due to the fast consumption of dissolved gas by hydrate crystallization. Consequently, it was shown that the hydrate crystallization kinetics are limited by gas transfer at the gas-liquid interface. Finally, we noted that the liquid-hydrate equilibrium during hydrate crystallization is governed by the temperature of the experiment under the tested conditions.
Keywords: gas hydrate, dissolved gas, crystallization, infrared spectroscopy
Procedia PDF Downloads 282