Search results for: real anthropometric database
5769 Recognition of Tifinagh Characters with Missing Parts Using Neural Network
Authors: El Mahdi Barrah, Said Safi, Abdessamad Malaoui
Abstract:
In this paper, we present an algorithm for reconstruction from incomplete 2D scans of Tifinagh characters. The algorithm is based on the correlation between a lost block and its neighbors. The proposed system contains three main parts: pre-processing, feature extraction, and recognition. In the first step, we construct a database of Tifinagh characters. In the second step, we apply a shape analysis algorithm. In the classification part, we use a neural network. The simulation results demonstrate that the proposed method gives good results.
Keywords: Tifinagh character recognition, neural networks, local cost computation, ANN
Procedia PDF Downloads 335
5768 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Current real-estate value estimation, difficult for laymen, is usually performed by specialists. This paper presents an automated estimation process, based on big data and machine-learning technology, that calculates the influence of building conditions on real-estate prices. The present study analyzed actual building sales data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Based on that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for the derivation of machine-learning prototypes. The prediction model is formulated by reference to previous examples; when new examples are applied, it analyses and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
Keywords: apartment complex, big data, life-cycle building value analysis, machine learning
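As a rough illustration of the weighted-value idea in the abstract above, an ordinary least-squares fit ranks building-condition features by coefficient magnitude. The paper itself uses RapidMiner Studio; the features and numbers below are invented for illustration only.

```python
import numpy as np

# Hypothetical building-condition features: [floor area (m2), age (years), floor level]
X = np.array([
    [85, 10, 3],
    [60, 25, 1],
    [110, 5, 12],
    [95, 15, 7],
    [70, 30, 2],
], dtype=float)
y = np.array([9.2, 5.1, 14.0, 10.3, 5.8])  # sale price, hypothetical units

# Fit ordinary least squares with an intercept column; the coefficient
# magnitudes play the role of the "weighted values" that rank factors.
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict prices for the training examples with the fitted weights.
pred = A @ coef
```

A tool like RapidMiner would swap in a stronger learner and a proper train/test split, but the principle of reading factor influence from fitted weights is the same.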
Procedia PDF Downloads 374
5767 Adjustable Aperture with Liquid Crystal for Real-Time Range Sensor
Authors: Yumee Kim, Seung-Guk Hyeon, Kukjin Chun
Abstract:
An adjustable aperture using a liquid crystal is proposed for real-time range detection while simultaneously obtaining images. The adjustable aperture operates as two aperture stops of different sizes, which create two different depth-of-field images. By analyzing these two images, the distance from the camera to the object can be extracted. Initially, at zero voltage, the aperture stop is large. When an input voltage is applied, the aperture stop shrinks through the orientational transition of the liquid crystal molecules in the device. The two aperture-stop diameters are 1.94 mm and 1.06 mm. The proposed device has a low driving voltage of 7.0 V and a fast response time of 6.22 ms. The compact aperture, 6×6×1.1 mm³ in size, is assembled in a conventional camera containing a 1/3" HD image sensor with a focal length of 3.3 mm, suitable for autonomous applications. The measured range was up to 5 m. The adjustable aperture has high stability because it has no mechanically moving parts. This range sensor can be applied to various 3D depth-map applications such as Advanced Driving Assistance Systems (ADAS), drones, and manufacturing machines.
Keywords: adjustable aperture, dual aperture, liquid crystal, ranging and imaging, ADAS, range sensor
Procedia PDF Downloads 381
5766 Bayesian Estimation under Different Loss Functions Using Gamma Prior for the Case of Exponential Distribution
Authors: Md. Rashidul Hasan, Atikur Rahman Baizid
Abstract:
The Bayesian estimation approach is a non-classical estimation technique in statistical inference and is very useful in real-world situations. The aim of this paper is to study the Bayes estimators of the parameter of the exponential distribution under different loss functions and to compare them with each other and with the classical maximum likelihood estimator (MLE). In real life, we always try to minimize the loss, and we also want to incorporate prior information (a prior distribution) about the problem in order to solve it accurately. Here the gamma prior is used as the prior distribution of the exponential parameter when finding the Bayes estimator. In our study, we used different symmetric and asymmetric loss functions: the squared error loss function, the quadratic loss function, the modified linear exponential (MLINEX) loss function, and the non-linear exponential (NLINEX) loss function. Finally, the mean square error (MSE) of each estimator is obtained and presented graphically.
Keywords: Bayes estimator, maximum likelihood estimator (MLE), modified linear exponential (MLINEX) loss function, squared error (SE) loss function, non-linear exponential (NLINEX) loss function
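The squared-error case described above can be sketched numerically: with a Gamma(a, b) prior on the exponential rate, the posterior is Gamma(a + n, b + Σx), and the Bayes estimator under squared-error loss is the posterior mean. The hyperparameters and sample sizes below are illustrative, not the paper's.

```python
import random

def mle_rate(sample):
    # Classical MLE of the exponential rate: n / sum(x).
    return len(sample) / sum(sample)

def bayes_rate_se(sample, a, b):
    # Gamma(a, b) is conjugate for the exponential rate: the posterior is
    # Gamma(a + n, b + sum(x)); under squared-error loss the Bayes
    # estimator is the posterior mean.
    return (a + len(sample)) / (b + sum(sample))

def mse(estimator, true_rate, n, reps=2000, seed=1):
    # Monte Carlo estimate of the mean square error of an estimator.
    rng = random.Random(seed)
    err = 0.0
    for _ in range(reps):
        sample = [rng.expovariate(true_rate) for _ in range(n)]
        err += (estimator(sample) - true_rate) ** 2
    return err / reps

true_rate = 2.0
mse_mle = mse(mle_rate, true_rate, n=10)
mse_bayes = mse(lambda s: bayes_rate_se(s, a=2.0, b=1.0), true_rate, n=10)
```

With a prior whose mean a/b matches the true rate, the shrinkage of the posterior mean typically yields a smaller MSE than the MLE at small n, which is the kind of comparison the paper presents graphically.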
Procedia PDF Downloads 385
5765 Bringing Design Science Research Methodology into Real World Applications
Authors: Maya Jaber
Abstract:
In today's ever-changing world, organizational leaders need to transform their organizations to meet the demands of employees, consumers, local and federal governments, and the global market. Change agents and leaders need a new paradigm of thinking for creative problem solving and innovation in a time of uncertainty. A new framework built on Design Science Research foundations, holistic design thinking methodologies (HTDM), and action research approaches has been developed through Dr. Jaber's research. It combines these philosophies into a three-step process that can be applied in practice to any sustainability, change, or project management effort. This framework was developed to support the pedagogy for implementing her formalized holistic strategy framework, Integral Design Thinking (IDT). Her work focuses on real-world application, streamlining initiatives and embedding their adoption in organizational culture transformation. This paper discusses the foundations of this philosophy and the methods for its use in practice developed in Dr. Jaber's research.
Keywords: design science research, action research, critical thinking, design thinking, organizational transformation, sustainability management, organizational culture change
Procedia PDF Downloads 182
5764 Evaluation of the Role of Circulating Long Non-Coding RNA H19 as a Promising Biomarker in Plasma of Patients with Gastric Cancer
Authors: Doaa Hashad, Amany Elbanna, Abeer Ibrahim, Gihan Khedr
Abstract:
Background: H19 is one of the long non-coding RNAs (lncRNAs) related to the progression of many diseases, including cancers. This work was carried out to study the level of the long non-coding RNA H19 in the plasma of patients with gastric cancer (GC) and to assess its significance in their clinical management. Methods: A total of sixty-two participants were enrolled in the present study. The first group included thirty-two GC patients, while the second group comprised thirty age- and sex-matched healthy volunteers serving as a control group. Plasma samples were used to assess H19 gene expression using the real-time quantitative PCR technique. Results: H19 expression was up-regulated in GC patients, with a positive correlation to TNM cancer stage. Conclusions: Up-regulation of H19 is closely associated with gastric cancer and correlates well with tumor staging. Convenient, efficient quantification of H19 in plasma using real-time PCR supports its role as a potential noninvasive prognostic biomarker in gastric cancer that predicts patient outcome and, most importantly, as a novel target in gastric cancer treatment, with better performance achieved by using both CEA and H19 simultaneously.
Keywords: biomarker, gastric, cancer, LncRNA
Procedia PDF Downloads 319
5763 Optimizing The Residential Design Process Using Automated Technologies
Authors: Martin Georgiev, Milena Nanova, Damyan Damov
Abstract:
Architects, engineers, and developers need to analyse and implement a wide spectrum of data in different formats if they want to produce viable residential developments. Usually, this data comes from a number of different sources and is not well structured. The main objective of this research project is to provide parametric tools, working with real geodesic data, that can generate residential solutions. Various codes, regulations, and design constraints are described by variables and prioritized. In this way, we establish a common workflow for architects, geodesists, and other professionals involved in the building and investment process. This collaborative medium ensures that the generated design variants conform to various requirements, contributing to a more streamlined and informed decision-making process. The quantification of characteristics distinctive of typical residential structures allows a systematic evaluation of the generated variants, focusing on factors crucial to designers, such as daylight simulation, circulation analysis, space utilization, view orientation, etc. Integrating real geodesic data offers a holistic view of the built environment, enhancing the accuracy and relevance of the design solutions. The use of generative algorithms and parametric models offers high productivity and flexibility in the design variants, and can be implemented in more conventional CAD and BIM workflows. Experts from different specialties can join their efforts, sharing a common digital workspace. In conclusion, our research demonstrates that a generative parametric approach based on real geodesic data and collaborative decision-making can be introduced in the early phases of the design process.
This gives the designers powerful tools to explore diverse design possibilities, significantly improving the qualities of the building investment during its entire lifecycle.
Keywords: architectural design, residential buildings, urban development, geodesic data, generative design, parametric models, workflow optimization
Procedia PDF Downloads 55
5762 From Binary Solutions to Real Bio-Oils: A Multi-Step Extraction Story of Phenolic Compounds with Ionic Liquid
Authors: L. Cesari, L. Canabady-Rochelle, F. Mutelet
Abstract:
The thermal conversion of lignin produces bio-oils that contain many compounds with high added value, such as phenolic compounds. In order to extract these compounds efficiently, the possible use of the ionic liquid choline bis(trifluoromethylsulfonyl)imide [Choline][NTf2] was explored. To this end, a multistep approach was implemented. First, binary (phenolic compound and solvent) and ternary (phenolic compound, solvent, and ionic liquid) solutions were investigated. Eight binary systems of phenolic compound and water were studied at atmospheric pressure and quantified using the turbidity method and UV spectroscopy. Ternary systems (phenolic compound, water, and [Choline][NTf2]) were investigated at room temperature and atmospheric pressure. After stirring, the solutions were left to settle, and a sample of each phase was collected. The phases were analyzed using gas chromatography with an internal standard. These results were used to quantify the interaction parameters of thermodynamic models. Then, extractions were performed on synthetic solutions to determine the influence of several operating conditions (temperature, kinetics, amount of [Choline][NTf2]). With this knowledge, it was possible to design and simulate an extraction process composed of one extraction column and one flash. Finally, the extraction efficiency of [Choline][NTf2] was quantified with real bio-oils from lignin pyrolysis. Qualitative and quantitative analyses were performed using gas chromatography coupled with mass spectrometry and flame ionization detection. The experimental measurements show that the extraction of phenolic compounds is efficient at room temperature, is quick, and does not require a large amount of [Choline][NTf2]. Moreover, the simulations of the extraction process demonstrate that the [Choline][NTf2]-based process requires less energy than an organic-solvent one.
Finally, the efficiency of [Choline][NTf2] was confirmed in real situations by the experiments on lignin pyrolysis bio-oils.
Keywords: bio-oils, extraction, lignin, phenolic compounds
Procedia PDF Downloads 110
5761 A Robust Optimization Model for Multi-Objective Closed-Loop Supply Chain
Authors: Mohammad Y. Badiee, Saeed Golestani, Mir Saman Pishvaee
Abstract:
In recent years, consumers and governments have increasingly been pushing companies to design their activities so as to reduce negative environmental impacts, whether by producing renewable products or by adopting threat-free disposal policies. It is therefore important to focus more accurately on optimizing the various aspects of the total supply chain. Modeling a supply chain can be challenging because a large number of factors need to be considered in the model. The use of multi-objective optimization can help overcome this problem, since more information is used when designing the model. Uncertainty is inevitable in the real world. Considering uncertainty in the parameters, in addition to using multiple objectives, gives more flexibility to the decision-making process, since the process can take into account many more constraints and requirements. In this paper, we demonstrate a stochastic scenario-based robust model to cope with uncertainty in a closed-loop multi-objective supply chain. By applying the proposed model to a real-world case, its power in handling data uncertainty is shown.
Keywords: supply chain management, closed-loop supply chain, multi-objective optimization, goal programming, uncertainty, robust optimization
Procedia PDF Downloads 418
5760 Water Detection in Aerial Images Using Fuzzy Sets
Authors: Caio Marcelo Nunes, Anderson da Silva Soares, Gustavo Teodoro Laureano, Clarimar Jose Coelho
Abstract:
This paper presents a methodology for pixel recognition in aerial images using the fuzzy c-means algorithm. This algorithm is an alternative for recognizing areas while accounting for uncertainty and imprecision. Traditional clustering techniques are used in the recognition of multispectral images of the earth's surface. These techniques recognize well-defined borders that can be easily discretized. In the real world, however, there are many areas with uncertainty and imprecision that can be mapped by clustering algorithms that use fuzzy sets. The methodology presented in this work is applied to multispectral images obtained from the Landsat-5/TM satellite. The pixels are clustered using the c-means algorithm. Then, a classification process identifies the types of surface according to the patterns obtained from the spectral response of the image surface. The classes considered are exposed soil, moist soil, vegetation, turbid water, and clean water. The results obtained show that the fuzzy clustering identifies the real type of the earth's surface.
Keywords: aerial images, fuzzy clustering, image processing, pattern recognition
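A toy sketch of the fuzzy c-means step described in the abstract above. This is an illustrative implementation on invented two-band "pixels", not the authors' code; real Landsat pixels would have more bands and many more samples.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns (cluster centers, membership matrix U).

    Each row of U holds a pixel's membership degrees across the c clusters;
    they sum to 1, encoding the uncertainty the abstract refers to."""
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        # Centers are membership-weighted means of the pixels.
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        # Distances from every pixel to every center.
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        # Standard FCM membership update: U ~ d^(-2/(m-1)), normalized.
        U = 1.0 / (d ** (2.0 / (m - 1.0)))
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Toy "pixels": two spectral bands, two well-separated surface types.
X = np.array([[0.10, 0.20], [0.15, 0.25], [0.90, 0.80], [0.85, 0.95]])
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)  # hard classification from the fuzzy memberships
```

For border pixels (e.g. the edge between turbid and clean water), the interesting output is the membership row itself rather than the argmax.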
Procedia PDF Downloads 484
5759 The Power House of Mind: Determination of Action
Authors: Sheetla Prasad
Abstract:
The focus of this article is to determine the mechanism of the mind through a geometrical analysis of the human face. A research paradigm was designed to study the spatial dynamics of the face, and it was found that different face shapes have their own function in determining the action of the mind. The functional ratio (FR) of the face determines the behavioural operation of human beings. The prediction is not based on a formulaic approach; rather, scientific method and mathematical analysis are the root of the prediction of behaviour. Formulae were developed and standardized for the analysis. It was found that the human psyche takes three forms: manipulated, manifested, and real. The functional output of the psyche is determined by the degree of energy flow in the psyche and the energy reserved for the future. The face is the recipient and transmitter of energy, but its distribution and control are carried out by the mind, and the mind directs behaviour. The FR indicates that the face is a power house of energy; the force of behaviour is shaped by its geometrical domain, and actions become possible within the nature of the individual. The impact of this study lies in the promotion of human capital for job-fitness objectives and the minimization of criminalization in society.
Keywords: functional ratio, manipulated psyche, manifested psyche, real psyche
Procedia PDF Downloads 454
5758 Prevalence of the Double Burden of Malnutrition in Women of Childbearing Age in Morocco: Coexistence of Iron Deficiency Anemia and Overweight
Authors: Fall Abdourahmane, Lazrak Meryem, El Hsaini Houda, El Ammari Laila, Gamih Hasnae, Yahyane Abdelhakim, Benjouad Abdelaziz, Aguenaou Hassan, El Kari Khalid
Abstract:
Introduction: The double burden of malnutrition (DBM), characterized by the coexistence of undernutrition and overnutrition, is a significant health challenge, particularly in low- and middle-income countries. In Morocco, 61.3% of women of reproductive age (WRA) are overweight or obese, including 30.4% who are obese, while 34.4% are anaemic and 49.7% have iron deficiency anaemia. Objective: This study aims to determine the prevalence of DBM at the individual level among Moroccan WRA, defined as the coexistence of iron deficiency anaemia and overweight/obesity. Methods: A cross-sectional national survey was conducted among a representative sample of 2090 Moroccan WRA. Data collected included socio-economic parameters, anthropometric measurements, and blood samples. Haemoglobin levels were measured photometrically using HemoCue, while ferritin and CRP were assessed through immunoturbidimetry. Results: The prevalence of overweight/obesity, iron deficiency, anaemia, and iron deficiency anaemia among WRA in Morocco was 60.2%, 30.6%, 34.4%, and 50.0%, respectively. The coexistence of overweight/obesity with anaemia and with iron deficiency was observed in 19.2% and 16.3% of women, respectively. Among overweight/obese women, 32.5% were anaemic, 28.4% were iron deficient, and 47.6% had iron deficiency anaemia. The prevalence of DBM was higher in urban areas than in rural settings. Conclusion: The coexistence of undernutrition and overnutrition among WRA highlights the urgent need for integrated public health interventions addressing both anaemia and obesity simultaneously. Tailored strategies should consider the specific socio-economic and geographical contexts to effectively combat this dual burden.
Keywords: the double burden of malnutrition, iron deficiency anaemia, overweight, obesity
Procedia PDF Downloads 37
5757 Development of a Real-Time Brain-Computer Interface for Interactive Robot Therapy: An Exploration of EEG and EMG Features during Hypnosis
Authors: Maryam Alimardani, Kazuo Hiraki
Abstract:
This study presents a framework for the development of a new generation of therapy robots that can interact with users by monitoring their physiological and mental states. Here, we focused on one of the more controversial methods of therapy, hypnotherapy. Hypnosis has been shown to be useful in the treatment of many clinical conditions, and even for healthy people it can be an effective technique for relaxation or for enhancing memory and concentration. Our aim is to develop a robot that collects information about the user's mental and physical states using electroencephalogram (EEG) and electromyography (EMG) signals and performs cost-effective hypnosis in the comfort of the user's home. The presented framework consists of three main steps: (1) find the EEG correlates of mental state before, during, and after hypnosis and establish a cognitive model for state changes; (2) develop a system that can track the changes in EEG and EMG activity in real time and determine whether the user is ready for suggestion; and (3) implement our system in a humanoid robot that will talk to and conduct hypnosis on users based on their mental states. This paper presents a pilot study on the first stage, the detection of EEG and EMG features during hypnosis.
Keywords: hypnosis, EEG, robotherapy, brain-computer interface (BCI)
Procedia PDF Downloads 257
5756 A Particle Filter-Based Data Assimilation Method for Discrete Event Simulation
Authors: Zhi Zhu, Boquan Zhang, Tian Jing, Jingjing Li, Tao Wang
Abstract:
Data assimilation is a hybrid model- and data-driven method that dynamically fuses new observation data with a numerical model to iteratively approach the real system state. It is widely used in state prediction and parameter inference for continuous systems. Because of the non-linearity and non-Gaussianity of discrete event systems, the traditional Kalman filter, based on linear and Gaussian assumptions, cannot perform data assimilation for such systems, so the particle filter has gradually become a technical approach for discrete event simulation data assimilation. Hence, we propose a particle filter-based discrete event simulation data assimilation method and take an unmanned aerial vehicle (UAV) maintenance service system as a proof of concept for simulation experiments. The experimental results show that the filtered state data is closer to the real state of the system, which verifies the effectiveness of the proposed method. This research can provide a reference framework for the data assimilation process of other complex nonlinear systems, such as discrete-time and agent-based simulations.
Keywords: discrete event simulation, data assimilation, particle filter, model and data-driven
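The propagate/weight/resample cycle of a bootstrap particle filter can be sketched on a toy 1-D model. This is illustrative only: the paper applies the same cycle to a UAV maintenance service simulation, where the state transition comes from the discrete event model rather than a random walk.

```python
import math
import random

def bootstrap_filter(observations, n_particles=500, obs_noise=1.0, seed=0):
    """Minimal bootstrap particle filter on a 1-D random-walk state
    observed with Gaussian noise; returns the posterior-mean estimates."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # 1. Propagate each particle through the (random-walk) state model.
        particles = [p + rng.gauss(0.0, 0.5) for p in particles]
        # 2. Weight particles by the likelihood of the new observation
        #    (Gaussian, up to a constant factor).
        weights = [math.exp(-((y - p) ** 2) / (2 * obs_noise ** 2))
                   for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # 3. State estimate: weighted posterior mean.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # 4. Resample to concentrate particles on likely states.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

obs = [0.2, 0.4, 1.0, 1.5, 2.1]
est = bootstrap_filter(obs)
```

Because neither the transition nor the likelihood needs to be linear or Gaussian, the same loop works when step 1 is replaced by running a discrete event simulator forward, which is the key point of the abstract.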
Procedia PDF Downloads 20
5755 Mobi-DiQ: A Pervasive Sensing System for Delirium Risk Assessment in Intensive Care Unit
Authors: Subhash Nerella, Ziyuan Guan, Azra Bihorac, Parisa Rashidi
Abstract:
Intensive care units (ICUs) provide care to critically ill patients in severe and life-threatening conditions. However, patient monitoring in the ICU is limited by the time and resource constraints imposed on healthcare providers. Many critical care indices such as mobility are still manually assessed, which can be subjective, prone to human error, and lacking in granularity. Other important aspects, such as environmental factors, are not monitored at all. For example, critically ill patients often experience circadian disruptions due to the absence of effective environmental “timekeepers” such as the light/dark cycle and the systemic effect of acute illness on chronobiologic markers. Although the occurrence of delirium is associated with circadian disruption risk factors, these factors are not routinely monitored in the ICU. Hence, there is a critical unmet need to develop systems for precise and real-time assessment through novel enabling technologies. We have developed the mobility and circadian disruption quantification system (Mobi-DiQ) by augmenting biomarker and clinical data with pervasive sensing data to generate cues related to mobility, nightly disruptions, and light and noise exposure. We hypothesize that Mobi-DiQ can provide accurate mobility and circadian cues that correlate with bedside clinical mobility assessments and circadian biomarkers, ultimately important for delirium risk assessment and prevention. The collected multimodal dataset consists of depth images, electromyography (EMG) data, patient extremity movement captured by accelerometers, ambient light levels, sound pressure level (SPL), and indoor air quality measured by volatile organic compounds and the equivalent CO₂ concentration.
For delirium risk assessment, the system recognizes mobility cues (axial body movement features and body key points) and circadian cues, including nightly disruptions, ambient SPL, and light intensity, as well as other environmental factors such as indoor air quality. The Mobi-DiQ system consists of three major components: the pervasive sensing system, a data storage and analysis server, and a data annotation system. For data collection, six local pervasive sensing systems were deployed, each comprising a local computer and sensors. A video recording tool with a graphical user interface (GUI), developed in Python, was used to capture depth image frames for analyzing patient mobility. All sensor data is encrypted and then automatically uploaded to the Mobi-DiQ server through a secured VPN connection. Several data pipelines were developed to automate data transfer, curation, and preparation for annotation and model training. The data curation and post-processing are performed on the server. A custom secure annotation tool with a GUI was developed to annotate depth activity data. The annotation tool is linked to the MongoDB database to record annotations and provide summaries. Docker containers are also utilized to manage services and pipelines running on the server in an isolated manner. The processed clinical data and annotations are used to train and develop real-time pervasive sensing systems to augment clinical decision-making and promote targeted interventions. In the future, we intend to evaluate our system in a clinical implementation trial, as well as to refine and validate it using other data sources, including neurological data obtained through continuous electroencephalography (EEG).
Keywords: deep learning, delirium, healthcare, pervasive sensing
Procedia PDF Downloads 93
5754 Optimizing Logistics for Courier Organizations with Considerations of Congestions and Pickups: A Courier Delivery System in Amman as Case Study
Authors: Nader A. Al Theeb, Zaid Abu Manneh, Ibrahim Al-Qadi
Abstract:
The traveling salesman problem (TSP) is a combinatorial integer optimization problem that asks: what is the optimal route for a vehicle to traverse in order to deliver requests to a given set of customers? It is widely used by package carriers' distribution centers. The main goal of applying the TSP in courier organizations is to minimize the time it takes the courier on each trip to deliver or pick up the shipments during a day. In this article, an optimization model is constructed to create a new TSP variant that optimizes the routing in a courier organization while accounting for congestion in Amman, the capital of Jordan. Real data were collected by different methods and analyzed. Then, CPLEX Concert Technology was used to solve the proposed model for some randomly generated data instances and for the real collected data. In the end, the results showed a great improvement in time compared with the current trip times, and an economic study was conducted afterwards to determine the impact of using such models.
Keywords: travel salesman problem, congestions, pick-up, integer programming, package carriers, service engineering
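For intuition, the plain TSP underlying the proposed variant can be solved by exhaustive search on a tiny instance. The travel times below are invented; the paper's congestion-aware model is an integer program solved with CPLEX, not brute force.

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exhaustive TSP for a small instance: dist[i][j] is the travel time
    between stops, and the tour starts and ends at depot 0."""
    n = len(dist)
    best_tour, best_cost = None, float("inf")
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if cost < best_cost:
            best_tour, best_cost = tour, cost
    return best_tour, best_cost

# Hypothetical symmetric travel times (e.g. minutes) between four stops.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
tour, cost = tsp_brute_force(dist)
```

A congestion-aware variant like the paper's would make `dist[i][j]` depend on the time of day, which is one reason an integer-programming solver is needed beyond toy sizes.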
Procedia PDF Downloads 431
5753 Optical Flow Based System for Cross Traffic Alert
Authors: Giuseppe Spampinato, Salvatore Curti, Ivana Guarneri, Arcangelo Bruna
Abstract:
This document describes an advanced system and methodology for Cross Traffic Alert (CTA), able to detect vehicles that move into the vehicle's driving path from the left or the right side. The camera may be mounted not only on a stationary vehicle, e.g. at a traffic light or at an intersection, but also on one moving slowly, e.g. in a car park. In all of the aforementioned conditions, a driver's short loss of concentration or distraction can easily lead to a serious accident. The proposed system represents a valid support for avoiding these kinds of car crashes. It is an extension of our previous work on a clustering system that only works with fixed cameras. Only a vanishing-point calculation and simple optical flow filtering, to eliminate motion vectors due to the car's relative movement, are performed, letting the system achieve high performance with different scenarios, cameras, and resolutions. The proposed system uses as input only the optical flow, which is implemented in hardware on the proposed platform; since the whole system's processing is fast and consumes little power, it is inserted directly in the camera framework, allowing all the processing to be executed in real time.
Keywords: clustering, cross traffic alert, optical flow, real time, vanishing point
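The ego-motion filtering step described above can be sketched as follows: flow vectors roughly radial with respect to the vanishing point are attributed to the camera's own forward motion and discarded, while the remainder are cross-traffic candidates. This is a hypothetical simplification of the system's actual filter, with invented coordinates.

```python
import math

def filter_ego_motion(flow, vanishing_point, angle_thresh_deg=30.0):
    """Keep only flow vectors that are NOT roughly radial with respect to
    the vanishing point. flow is a list of (x, y, dx, dy) motion vectors."""
    vx0, vy0 = vanishing_point
    kept = []
    for (x, y, dx, dy) in flow:
        rx, ry = x - vx0, y - vy0            # radial direction at (x, y)
        denom = math.hypot(rx, ry) * math.hypot(dx, dy)
        if denom == 0:
            continue
        cosang = (rx * dx + ry * dy) / denom
        ang = math.degrees(math.acos(max(-1.0, min(1.0, cosang))))
        # Radial (expanding) vectors make a small angle with the radial
        # direction; anything else may be a vehicle crossing the path.
        if ang > angle_thresh_deg:
            kept.append((x, y, dx, dy))
    return kept

flow = [
    (200, 150, 5, 4),    # roughly radial from (100, 100): ego motion
    (300, 100, -8, 0),   # lateral, right-to-left: cross-traffic candidate
]
moving = filter_ego_motion(flow, vanishing_point=(100, 100))
```

The surviving vectors would then be clustered, as in the authors' earlier fixed-camera work, to form object-level alerts.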
Procedia PDF Downloads 203
5752 Democracy as a Curve: A Study on How Democratization Impacts Economic Growth
Authors: Henrique Alpalhão
Abstract:
This paper attempts to model the widely studied relationship between a country's economic growth and its level of democracy, with an emphasis on possible non-linearities. We adopt the concept of 'political capital' as a measure of democracy, which is extremely uncommon in the literature and brings considerable advantages in terms of both dynamic considerations and plausibility. While the literature has not reached consensus on this matter, we obtain, via panel Arellano-Bond regression analysis of a database of more than 60 countries over 50 years, significant and robust results indicating that the impact of democratization on economic growth varies according to the stage of democratic development each country is in.
Keywords: democracy, economic growth, political capital, political economy
Procedia PDF Downloads 322
5751 Applying Augmented Reality Technology for an E-Learning System
Authors: Fetoon K. Algarawi, Wejdan A. Alslamah, Ahlam A. Alhabib, Afnan S. Alfehaid, Dina M. Ibrahim
Abstract:
Over the past 20 years, technology has developed rapidly, and no one expected what would come next. Advancements in technology open new opportunities for immersive learning environments, and there is a need to bring education to a level that makes it more effective for the student. Augmented reality (AR) is one of the most popular technologies these days. This paper reports our experience of applying AR technology, using a marker-based approach, in an e-learning system that transmits virtual objects into real-world scenes to explain information in a better way, implemented as a mobile phone application. The application was then tested on students to determine the extent to which it encouraged them to learn and understand the subjects. In this paper, we discuss the beginnings of AR, the fields using AR, how AR is effective in education, the spread of AR these days, and the architecture of our work. The aim of this paper is to show how creating an interactive e-learning system using AR technology encourages students to learn more.
Keywords: augmented reality, e-learning, marker-based, monitor-based
Procedia PDF Downloads 223
5750 Use the Null Space to Create Starting Point for Stochastic Programming
Authors: Ghussoun Al-Jeiroudi
Abstract:
Stochastic programming is a powerful technique used to solve real-life problems whose data is subject to significant uncertainty; such uncertainty is well studied and modeled by stochastic programming. Every day, problems become bigger, and the need for tools that can deal with large-scale problems increases. The interior point method is a perfect tool for solving such problems and is widely employed to solve the programs that arise from stochastic programming. It is an iterative technique, so it requires a starting point, and a well-designed starting point plays an important role in improving the convergence speed. In this paper, we propose a starting point for the interior point method for multistage stochastic programming. Usually, the optimal solution of stage k+1 is used as the starting point for stage k. This point has the advantage of being close to the solution of the current program, but it has a disadvantage: it is not in the feasible region of the current program. We therefore suggest taking this point and modifying it by adding to it a vector in the null space of the matrix of the unchanged constraints, because the solution will change only in the null space of this matrix.
Keywords: interior point methods, stochastic programming, null space, starting points
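The null-space modification described in the abstract above can be sketched with a small example: any vector in the null space of the unchanged-constraint matrix A can be added to the previous stage's solution without disturbing the constraints A x = b it already satisfies. The matrix, right-hand side, and points below are hypothetical.

```python
import numpy as np

# Matrix and right-hand side of the unchanged constraints A x = b.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 3.0]])
b = np.array([4.0, 5.0])

# x_prev: optimal solution of stage k+1; it satisfies the unchanged
# constraints (A @ x_prev == b) but not necessarily the new ones.
x_prev = np.array([1.0, 1.0, 1.0, 1.0])

# Orthonormal basis of null(A) from the SVD: the rows of Vt beyond rank(A).
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
N = Vt[rank:].T                      # columns span the null space of A

# Any step inside the null space leaves A x unchanged, so the modified
# starting point still satisfies the unchanged constraints while moving
# toward the current stage's solution.
x_new = x_prev + N @ np.array([0.5, 0.3])
```

In the actual method, the null-space coefficients (0.5 and 0.3 here, chosen arbitrarily) would be selected so that the modified point also approaches feasibility with respect to the current stage's new constraints.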
Procedia PDF Downloads 4205749 Role of Zinc in Catch-Up Growth of Low-Birth Weight Neonates
Authors: M. A. Abdel-Wahed, Nayera Elmorsi Hassan, Safaa Shafik Imam, Ola G. El-Farghali, Khadija M. Alian
Abstract:
Low birth weight is a challenging public health problem. Aim: to clarify the role of zinc in enhancing catch-up growth of low-birth-weight neonates and to explore a proposed relationship between the effect of zinc on growth and the main growth hormone mediator, IGF-1. Methods: The study is a double-blind, randomized, placebo-controlled trial conducted on low-birth-weight neonates delivered at Ain Shams University Maternity Hospital. It comprised 200 low-birth-weight neonates selected from those admitted to the NICU. Neonates were randomly allocated into one of two groups: group I: low-birth-weight, AGA or SGA, on oral zinc therapy at a dose of 10 mg/day; group II: low-birth-weight, AGA or SGA, on placebo. Anthropometric measurements were taken, including birth weight, length; head, waist, chest, and mid-upper arm circumferences; and triceps and sub-scapular skinfold thicknesses. Results: At the 12-month follow-up visit, mean weight, length; head (HC), waist, chest, and mid-upper arm circumferences; and triceps skinfold thickness, as well as the proportion of infants with values ≥ 10th percentile for weight, length, and HC, were significantly higher among infants of group I compared with those of group II. Oral zinc therapy was associated with 24.88%, 25.98%, and 19.6% higher proportions of values ≥ 10th percentile for weight, length, and HC at the 12-month visit, respectively [NNT = 4, 4, and 5, respectively]. Median IGF-1 levels measured at 6 months were significantly higher in group I than in group II (median (range): 90 (19-130) ng/ml vs. 74 (21-130) ng/ml, p=0.023). Conclusion: Oral zinc therapy in low-birth-weight neonates was associated with significantly more catch-up growth at 12 months and significantly higher serum IGF-1 at 6 months.Keywords: low-birth-weight, zinc, catch-up growth, neonates
Procedia PDF Downloads 4175748 Equivalent Circuit Representation of Lossless and Lossy Power Transmission Systems Including Discrete Sampler
Authors: Yuichi Kida, Takuro Kida
Abstract:
In the new smart society supported by the recent development of 5G and 6G communication systems, the importance of wireless power transmission is increasing. These systems contain discrete sampling systems in the middle of the transmission path, and the equivalent circuit representation of lossless or lossy power transmission through such systems is an important issue in circuit theory. In this paper, for a given weight function, we show that a lossless power transmission system with the given weight is expressed by an equivalent circuit consisting of Kida's optimal signal prediction system followed by a reactance multi-port circuit. Further, it is shown that, when the system is lossy, it has an equivalent circuit in the form of a multi-port positive-real circuit connected behind Kida's optimal signal prediction system. Also, for the convenience of the reader, this paper reviews the equivalent circuit expressions of the reactance multi-port circuit and the positive-real multi-port circuit by Cauer and Ohno, information that is currently being lost even on the Internet.Keywords: signal prediction, pseudo inverse matrix, artificial intelligence, power transmission
Procedia PDF Downloads 1235747 Simulation-Based Optimization Approach for an Electro-Plating Production Process Based on Theory of Constraints and Data Envelopment Analysis
Authors: Mayada Attia Ibrahim
Abstract:
Evaluating and developing the electroplating production process is a key challenge for this type of process, which is influenced by several factors such as process parameters, process costs, and the production environment. Analyzing and optimizing all these factors together requires extensive analytical techniques that are not available in real industrial settings. This paper presents a practice-based framework for evaluating and optimizing some of the crucial factors that affect the costs and production times associated with this type of process: energy costs, material costs, and product flow times. The proposed approach uses Design of Experiments, Discrete-Event Simulation, and the Theory of Constraints, respectively, to identify the most significant factors affecting the production process, to simulate a real production line and recognize the effect of these factors, and to locate possible bottlenecks. Several scenarios are generated as corrective strategies for improving the production line. Following that, the input-oriented CCR data envelopment analysis (DEA) model is used to evaluate and optimize the suggested scenarios.Keywords: electroplating process, simulation, design of experiment, performance optimization, theory of constraints, data envelopment analysis
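As a concrete illustration of the final step, the input-oriented CCR envelopment model scores one decision-making unit (DMU, here a scenario) with a small linear program. The sketch below is not the paper's code; the array conventions (rows are DMUs) and function name are assumptions:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_oriented(X, Y, k):
    """Efficiency of DMU k: minimise theta such that some composite
    of all DMUs uses at most theta * x_k inputs while producing at
    least y_k outputs (constant returns to scale)."""
    n, m = X.shape                       # n DMUs, m inputs
    s = Y.shape[1]                       # s outputs
    c = np.zeros(1 + n)                  # variables: [theta, lambda_1..n]
    c[0] = 1.0                           # minimise theta
    # inputs:  X^T lam - theta * x_k <= 0
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])
    # outputs: -Y^T lam <= -y_k
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[k]]),
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.x[0]
```

A scenario with efficiency 1.0 lies on the frontier; a score below 1.0 indicates the proportion to which its inputs could be scaled down by an efficient peer mix.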
Procedia PDF Downloads 1005746 Modeling of the Flow through an Earth Dam and Geotechnical Slope Analyzes
Authors: Ahmed Ferhati, Arezki Adjrad, Ratiba Mitiche-Kettab, Hakim Djafer Khodja
Abstract:
Porous media are omnipresent around us; they may be natural, such as sand, clay, and rocks, or manufactured, like concrete, cement, and ceramics. The variety of porous environments spans a wide range of materials that can be very different from each other; their common point is to be made up of a solid matrix and a pore space. In our study, we modeled flows in porous media through embankments, as in the case of an earth dam. The computer code used (PLAXIS) offers the possibility of modeling various structures, in particular earthworks, because it handles the pore water pressure due to underground flow and the calculation of plastic deformations. To confirm the results obtained with PLAXIS, the GeoStudio SEEP/W code was used. This work treats the modeling of flows and the mechanical and hydraulic behavior of an earth dam. A general framework that can accommodate the calculation of this kind of structure, coupling soil consolidation and free-surface flow, was defined. In this study, we addressed the modeling of a real earth dam. It was shown, in particular, that it is possible to carry out the calculation of a real dam in full and to obtain encouraging results from the hydraulic and mechanical points of view.Keywords: analyzes, dam, flow, modeling, PLAXIS, seep/w, slope
Procedia PDF Downloads 3105745 Body Mass Components in Young Soccer Players
Authors: Elizabeta Sivevska, Sunchica Petrovska, Vaska Antevska, Lidija Todorovska, Sanja Manchevska, Beti Dejanova, Ivanka Karagjozova, Jasmina Pluncevic Gligoroska
Abstract:
Introduction: Body composition plays an important role in the selection of young soccer players, and it is associated with successful performance. The most commonly used model of body composition divides the body into two compartments: the fat component and the fat-free mass (muscular and bone components). The aims of the study were to determine the body composition parameters of young male soccer players and to show the differences between age groups. Material and methods: A sample of 52 young male soccer players, with an age span from 9 to 14 years, was divided into two groups according to age (group 1 aged 9 to 12 years and group 2 aged 12 to 14 years). Anthropometric measurements were taken according to the method of Mateigka. The following measurements were made: body weight, body height, circumferences (arm, forearm, thigh, and calf), diameters (elbow, knee, wrist, ankle), and skinfold thicknesses (biceps, triceps, thigh, leg, chest, abdomen). The measurements were used in Mateigka's equations. Results: Body mass components were analyzed as absolute values (in kilograms) and as percentages: the muscular component (MM kg and MM%), the bone component (BC kg and BC%), and the body fat (BF kg and BF%). The group up to 12 years showed the following mean values of the analyzed parameters: MM = 21.5 kg; MM% = 46.3%; BC = 8.1 kg; BC% = 19.1%; BF = 6.3 kg; BF% = 15.7%. The second group, aged 12 to 14 years, had the following mean values: MM = 25.6 kg; MM% = 48.2%; BC = 11.4 kg; BC% = 21.6%; BF = 8.5 kg; BF% = 14.7%. Conclusions: The young soccer players aged 12 to 14 years, who are in the pre-pubertal phase of growth and development, had a higher bone component (p<0.05) than the younger players. There is no significant difference in the muscular and fat body components between the two groups of young soccer players.Keywords: body composition, young soccer players, body fat, fat-free mass
Procedia PDF Downloads 4585744 Development of Requirements Analysis Tool for Medical Autonomy in Long-Duration Space Exploration Missions
Authors: Lara Dutil-Fafard, Caroline Rhéaume, Patrick Archambault, Daniel Lafond, Neal W. Pollock
Abstract:
Improving resources for medical autonomy of astronauts in prolonged space missions, such as a Mars mission, requires not only technology development, but also decision-making support systems. The Advanced Crew Medical System - Medical Condition Requirements study, funded by the Canadian Space Agency, aimed to create knowledge content and a scenario-based query capability to support medical autonomy of astronauts. The key objective of this study was to create a prototype tool for identifying medical infrastructure requirements in terms of medical knowledge, skills and materials. A multicriteria decision-making method was used to prioritize the highest risk medical events anticipated in a long-term space mission. Starting with those medical conditions, event sequence diagrams (ESDs) were created in the form of decision trees where the entry point is the diagnosis and the end points are the predicted outcomes (full recovery, partial recovery, or death/severe incapacitation). The ESD formalism was adapted to characterize and compare possible outcomes of medical conditions as a function of available medical knowledge, skills, and supplies in a given mission scenario. An extensive literature review was performed and summarized in a medical condition database. A PostgreSQL relational database was created to allow query-based evaluation of health outcome metrics with different medical infrastructure scenarios. Critical decision points, skill and medical supply requirements, and probable health outcomes were compared across chosen scenarios. The three medical conditions with the highest risk rank were acute coronary syndrome, sepsis, and stroke. Our efforts demonstrate the utility of this approach and provide insight into the effort required to develop appropriate content for the range of medical conditions that may arise.Keywords: decision support system, event-sequence diagram, exploration mission, medical autonomy, scenario-based queries, space medicine
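A minimal way to picture an ESD-style query is a decision tree whose branches are gated by the resources available in a mission scenario. The toy sketch below is purely illustrative; the node layout, resource names, and outcome labels are our assumptions, not the study's database schema:

```python
def resolve_esd(node, available):
    """Walk an event sequence diagram (nested dicts) and return the
    predicted outcome given the set of resources on board."""
    if isinstance(node, str):                  # leaf: an outcome label
        return node
    branch = "with" if set(node["requires"]) <= available else "without"
    return resolve_esd(node[branch], available)

# toy ESD for a hypothetical condition (structure and labels invented)
esd = {
    "requires": {"imaging", "thrombolytics"},
    "with": "full recovery",
    "without": {
        "requires": {"aspirin"},
        "with": "partial recovery",
        "without": "death/severe incapacitation",
    },
}
```

Running the same tree under different resource sets mirrors the study's scenario comparison: the entry point is the diagnosis, and the reachable leaf changes with the medical infrastructure assumed on board.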
Procedia PDF Downloads 1285743 Development of Tools for Multi Vehicles Simulation with Robot Operating System and ArduPilot
Authors: Pierre Kancir, Jean-Philippe Diguet, Marc Sevaux
Abstract:
One of the main difficulties in developing multi-robot systems (MRS) is related to the simulation and testing tools available. Indeed, if the differences between simulations and real robots are too significant, the transition from the simulation to the robot won't be possible without another long development phase and won't permit validation of the simulation. Moreover, testing different algorithmic solutions or modifications of robots requires strong knowledge of current tools and significant development time. Therefore, the availability of tools for MRS, mainly with flying drones, is crucial to enable the industrial emergence of these systems. This research presents the most commonly used tools for MRS simulations and their main shortcomings, and introduces complementary tools to improve the productivity of designers in the development of multi-vehicle solutions, focused on a fast learning curve and a rapid transition from simulations to real usage. The proposed contributions are based on existing open-source tools such as the Gazebo simulator combined with ROS (Robot Operating System) and the open-source multi-platform autopilot ArduPilot, to bring them to a broad audience.Keywords: ROS, ArduPilot, MRS, simulation, drones, Gazebo
Procedia PDF Downloads 2115742 An Analysis System for Integrating High-Throughput Transcript Abundance Data with Metabolic Pathways in Green Algae
Authors: Han-Qin Zheng, Yi-Fan Chiang-Hsieh, Chia-Hung Chien, Wen-Chi Chang
Abstract:
As the most important non-vascular plants, algae have many research applications, including their high species diversity, use as biofuel sources, adsorption of heavy metals and, following processing, health supplements. With the increasing availability of next-generation sequencing (NGS) data for algae genomes and transcriptomes, an integrated resource for retrieving gene expression data and metabolic pathways is essential for functional analysis and systems biology in algae. However, gene expression profiles and biological pathways are displayed separately in current resources, making it impossible to search current databases directly to identify cellular response mechanisms. Therefore, this work develops a novel AlgaePath database to retrieve gene expression profiles efficiently under various conditions in numerous metabolic pathways. AlgaePath, a web-based database, integrates gene information, biological pathways, and NGS datasets in Chlamydomonas reinhardtii and Neodesmus sp. UTEX 2219-4. Users can identify gene expression profiles and pathway information by using five query pages (i.e., Gene Search, Pathway Search, Differentially Expressed Genes (DEGs) Search, Gene Group Analysis, and Co-Expression Analysis). The gene expression data of 45 and 4 samples can be obtained directly on pathway maps in C. reinhardtii and Neodesmus sp. UTEX 2219-4, respectively. Genes that are differentially expressed between two conditions can be identified in Folds Search. Furthermore, the Gene Group Analysis of AlgaePath includes pathway enrichment analysis and can easily compare the gene expression profiles of functionally related genes in a map. Finally, Co-Expression Analysis provides co-expressed transcripts of a target gene. The analysis results provide a valuable reference for designing further experiments and elucidating critical mechanisms from high-throughput data. More than an effective interface for clarifying transcript response mechanisms in different metabolic pathways under various conditions, AlgaePath is also a data-mining system for identifying critical mechanisms based on high-throughput sequencing.Keywords: next-generation sequencing (NGS), algae, transcriptome, metabolic pathway, co-expression
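As an illustration of what a co-expression query computes, transcripts can be ranked by the Pearson correlation of their expression profiles with a target gene's profile. The sketch below is a generic illustration and not AlgaePath's implementation (the function and gene names are invented):

```python
import numpy as np

def coexpressed(expr, genes, target, top_n=2):
    """Rank other genes by Pearson correlation of their expression
    profiles (rows of expr, one column per sample) with the target."""
    idx = genes.index(target)
    corr = np.corrcoef(expr)[idx]          # correlation of target vs. all genes
    order = np.argsort(-corr)              # most positively correlated first
    return [genes[i] for i in order if i != idx][:top_n]
```

In a real system, such a ranking would be precomputed over the 45 (or 4) samples per species and served through the Co-Expression Analysis query page.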
Procedia PDF Downloads 4075741 Management of the Experts in the Research Evaluation System of the University: Based on National Research University Higher School of Economics Example
Authors: Alena Nesterenko, Svetlana Petrikova
Abstract:
Research evaluation is one of the most important elements of the self-regulation and development of researchers, as it is an impartial and independent assessment process. The method of expert evaluations, as a scientific instrument for solving complicated non-formalized problems, is, firstly, a scientifically sound way to conduct an assessment that maximizes the effectiveness of work at every step and, secondly, the usage of quantitative methods for evaluation, assessment of expert opinion, and collective processing of the results. These two features distinguish the method of expert evaluations from the long-known expertise widespread in many areas of knowledge. Different typical problems require different types of expert evaluation methods. Several issues that arise with these methods are the selection of experts, the management of the assessment procedure, the processing of the results, and the remuneration of the experts. To address these issues, an online system was created with the primary purpose of developing a versatile application for many workgroups with matching approaches to scientific work management. The online documentation assessment and statistics system allows: - To realize within one platform the independent activities of different workgroups (e.g., expert officers, managers). - To establish different workspaces for the corresponding workgroups, where custom user databases can be created according to particular needs. - To form the required output documents for each workgroup. - To configure information gathering for each workgroup (forms of assessment, tests, inventories). - To create and operate personal databases of remote users. - To set up automatic notification through e-mail. The next stage is the development of quantitative and qualitative criteria to form a database of experts.
The inventory was made so that the experts may submit not only their personal data, place of work, and scientific degree but also keywords corresponding to their expertise, academic interests, ORCID, Researcher ID, SPIN-code RSCI, Scopus AuthorID, knowledge of languages, and primary scientific publications. For each project, competition assessments are processed in accordance with the ordering party's demands in the form of appraised inventories, commentaries (50-250 characters), and an overall review (1500 characters) in which the expert states the absence of a conflict of interest. Evaluation is conducted as follows: as applications are added to the database, the expert officer selects experts, generally two per application. Experts are selected according to the keywords; this method proved to be good, unlike the OECD classifier. At the last stage, the choice of experts is approved by the supervisor, and e-mails are sent to the experts inviting them to assess the project. An expert supervisor controls the experts' report writing so that all formalities are in place (time frame, propriety, correspondence). If the difference in assessments exceeds four points, a third evaluation is appointed. When the expert finishes work on his expert opinion, the system shows a contract marked 'new'; managers process the contract, and the expert receives an e-mail that the contract is formed and ready to be signed. All formalities are concluded, and the expert receives remuneration for his work. The specifics of the interaction of the examination officer with other experts will be presented in the report.Keywords: expertise, management of research evaluation, method of expert evaluations, research evaluation
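The keyword-based selection step can be pictured as ranking experts by the overlap between their declared keywords and the application's keywords, then taking the top two. This is a hypothetical sketch, not the system's actual code (the field names and the overlap score are our assumptions):

```python
def pick_experts(application_keywords, experts, n=2):
    """Rank experts by keyword overlap with the application and
    return the top n (two per application, as in the workflow)."""
    def score(expert):
        # size of the intersection between declared and requested keywords
        return len(set(expert["keywords"]) & set(application_keywords))
    ranked = sorted(experts, key=score, reverse=True)
    return [expert["name"] for expert in ranked[:n]]
```

A production system would add the constraints described in the abstract, such as conflict-of-interest declarations and supervisor approval of the selected pair.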
Procedia PDF Downloads 2085740 Brief Guide to Cloud-Based AI Prototyping: Key Insights from Selected Case Studies Using Google Cloud Platform
Authors: Kamellia Reshadi, Pranav Ragji, Theodoros Soldatos
Abstract:
Recent advancements in cloud computing and storage, along with rapid progress in artificial intelligence (AI), have transformed approaches to developing efficient, scalable applications. However, integrating AI with cloud computing poses challenges, as these fields are often disjointed, and many advancements remain difficult to access, obscured in complex documentation or scattered across research reports. For this reason, we share experiences from prototype projects combining these technologies. Specifically, we focus on Google Cloud Platform (GCP) functionalities and describe vision and speech activities applied to labeling, subtitling, and urban traffic flow tasks. We describe challenges, pricing, architecture, and other key features, with real-time performance as the goal. We hope our demonstrations not only provide essential guidelines for using these functionalities but also enable more approaches of this kind.Keywords: artificial intelligence, cloud computing, real-time applications, case studies, knowledge management, research and development, text labeling, video annotation, urban traffic analysis, public safety, prototyping, Google Cloud Platform
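For example, a label-detection call against Vision's `images:annotate` REST endpoint boils down to posting a small JSON body. The sketch below only builds that body; its shape follows the public API documentation as we recall it, so verify the field names against the current docs before use:

```python
import base64

def label_detection_request(image_bytes, max_results=10):
    """Build the JSON body for a LABEL_DETECTION call to the
    images:annotate REST endpoint (field names per public docs)."""
    return {
        "requests": [{
            # image bytes go in base64 under image.content
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [{"type": "LABEL_DETECTION",
                          "maxResults": max_results}],
        }]
    }
```

Posting this body with an authenticated HTTP client (credentials omitted here) returns label annotations; the same request shape, with a different `features[].type`, covers the other vision tasks mentioned above.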
Procedia PDF Downloads 23