Search results for: multivariate time series data
35088 A Real-Time Simulation Environment for Avionics Software Development and Qualification
Authors: Ferdinando Montemari, Antonio Vitale, Nicola Genito, Luca Garbarino, Urbano Tancredi, Domenico Accardo, Michele Grassi, Giancarmine Fasano, Anna Elena Tirri
Abstract:
The development of guidance, navigation and control algorithms and avionic procedures requires the availability of suitable analysis and verification tools, such as simulation environments, which support the design process and allow potential problems to be detected prior to flight testing, in order to make new technologies available at reduced cost, time and risk. This paper presents a simulation environment for avionic software development and qualification, aimed especially at equipment for general aviation aircraft and unmanned aerial systems. The simulation environment includes models for short- and medium-range radio-navigation aids, flight assistance systems, and ground control stations. All the software modules can simulate the modeled systems in both fast-time and real-time tests, and were implemented following component-oriented modeling techniques and a requirement-based approach. The paper describes the features of the specific models, the architectures of the implemented software systems and their validation process. The validation tests performed highlighted the capability of the simulation environment to guarantee in real time the required functionalities and performance of the simulated avionics systems, as well as to reproduce the interaction between these systems, thus permitting a realistic and reliable simulation of a complete mission scenario.
Keywords: ADS-B, avionics, NAVAIDs, real-time simulation, TCAS, UAS ground control station
Procedia PDF Downloads 228
35087 Petrology Investigation of Apatite Minerals in the Esfordi Mine
Authors: Haleh Rezaei Zanjirabadi, Fatemeh Saberi, Bahman Rahimzadeh, Fariborz Masoudi, Mohammad Rahgosha
Abstract:
In this study, apatite minerals from the iron-phosphate deposit of Yazd have been investigated within the microcontinent zone of Iran in the Zagros structural zone. The geological units in the Esfordi area belong to the pre-Cambrian to lower-Cambrian age, consisting of a succession of carbonate rocks (dolomite), shale, tuff, sandstone, and volcanic rocks. In addition to these sedimentary and volcanic rocks, the Bahabad granitoid mass, the largest intrusive body in the region, has intruded into the eastern part of this series and caused its metamorphism and alteration. After collecting the available data, various samples of Esfordi’s apatite were prepared, and their mineralogy and crystallography were investigated using laboratory methods such as petrographic microscopy, Raman spectroscopy, EDS, and SEM. In non-destructive Raman spectroscopy, the molecular structure of the apatite minerals was revealed in four distinct spectral ranges. Initially, the spectra of phosphate and aluminum bonds with H₂O and OH were observed, followed by the identification of Cl, OH, Al, Na, Ca and hydroxyl units depending on the apatite mineral family. In SEM analysis, based on the various shapes and different phases of the apatites, their major constituent elements were identified through EDS, indicating that the samples from the Esfordi mining area exhibit a dense and coherent texture with smooth surfaces. Based on the elemental analysis results by EDS, the apatites of the Esfordi area are classified into the calcic apatite group.
Keywords: petrology, apatite, Esfordi, EDS, SEM, Raman spectroscopy
Procedia PDF Downloads 61
35086 Minimizing Total Completion Time in No-Wait Flowshops with Setup Times
Authors: Ali Allahverdi
Abstract:
The m-machine no-wait flowshop scheduling problem is addressed in this paper. The objective is to minimize total completion time subject to the constraint that the makespan does not exceed a given value. Setup times are treated as separate from processing times. Several recent algorithms are adapted and proposed for the problem, and an extensive computational analysis has been conducted to evaluate them. The computational analysis indicates that the best proposed algorithm performs significantly better than the best previously existing algorithm.
Keywords: scheduling, no-wait flowshop, algorithm, setup times, total completion time, makespan
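As a sketch of the objective being minimized, the total completion time of a no-wait flowshop under a given job permutation can be computed from cumulative processing times: each job's start must be delayed just enough that it never waits between machines. The jobs and processing times below are hypothetical, and the paper's separate setup times are omitted for brevity.

```python
from itertools import permutations

def no_wait_total_completion(jobs):
    """Total completion time of a no-wait flowshop for the given job order.
    jobs: sequence of jobs, each a sequence of processing times on machines 1..m."""
    m = len(jobs[0])
    # cum[j][k] = total processing time of job j on its first k machines
    cum = [[0] * (m + 1) for _ in jobs]
    for j, job in enumerate(jobs):
        for k in range(m):
            cum[j][k + 1] = cum[j][k] + job[k]
    start, total = 0, 0
    for j in range(len(jobs)):
        if j > 0:
            # minimum start offset so job j never waits behind job j-1 on any machine
            start += max(cum[j - 1][k + 1] - cum[j][k] for k in range(m))
        total += start + cum[j][m]  # completion time on the last machine
    return total

def best_order(jobs):
    """Brute-force the permutation minimizing total completion time (small instances only)."""
    return min(permutations(jobs), key=no_wait_total_completion)
```

For two jobs with processing times (2, 3) and (4, 1) on two machines, sequencing the shorter-first job yields total completion time 12 rather than 14, which is the kind of trade-off the adapted algorithms search over heuristically.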
Procedia PDF Downloads 340
35085 Automated Prepaid Billing Subscription System
Authors: Adekunle K. O, Adeniyi A. E, Kolawole E
Abstract:
One of the most dramatic trends in the communications market in recent years has been the growth of prepaid services. Today, prepaid no longer constitutes the low-revenue, basic-service segment. It is driven by high-margin, value-added service customers who view it as a convenient way of retaining control over their usage and communication spending while expecting high service levels. To service providers, prepaid services offer the advantage of reducing bad accounts while allowing them to predict usage and plan network resources. Yet, the real-time demands of prepaid services require a scalable, real-time platform to manage customers through their entire life cycle. Such a platform delivers integrated real-time rating, voucher management, recharge management, customer care and service provisioning for the generation of new prepaid services, and it scales to handle millions of prepaid customers in real time through their entire life cycle.
Keywords: prepaid billing, voucher management, customers, automated, security
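The prepaid life cycle described above (recharge via vouchers, real-time rating against the balance) can be illustrated with a minimal sketch; the class and method names are hypothetical, not the system's actual API.

```python
class PrepaidAccount:
    """Minimal prepaid life-cycle sketch: vouchers credit the balance,
    real-time rating debits it and rejects usage the balance cannot cover."""

    def __init__(self):
        self.balance = 0.0
        self.redeemed = set()   # voucher ids already used, to block double redemption

    def recharge(self, voucher_id, amount):
        """Credit a voucher once; a second redemption of the same voucher is refused."""
        if voucher_id in self.redeemed:
            raise ValueError("voucher already redeemed")
        self.redeemed.add(voucher_id)
        self.balance += amount
        return self.balance

    def rate(self, units, unit_price):
        """Debit a usage event before delivering service (this is what prevents bad accounts)."""
        cost = units * unit_price
        if cost > self.balance:
            raise ValueError("insufficient balance")
        self.balance -= cost
        return self.balance
```

Rating before service delivery, rather than invoicing after it, is the design choice that lets the provider avoid bad accounts at the cost of requiring a real-time balance check on every event.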
Procedia PDF Downloads 115
35084 Sourcing and Compiling a Maltese Traffic Dataset MalTra
Authors: Gabriele Borg, Alexei De Bono, Charlie Abela
Abstract:
There is a constant rise in the availability of high volumes of data gathered from multiple sources, resulting in an abundance of unprocessed information that can be used to monitor patterns and trends in user behaviour. Similarly, year after year, Malta is experiencing ongoing population growth and an increase in mobility demand. This research takes advantage of data which is continuously being sourced and converts it into useful information related to the traffic problem on the Maltese roads. The scope of this paper is to provide a methodology to create a custom dataset (MalTra - Malta Traffic) compiled from multiple participants at various locations across the island, in order to identify the most common routes taken and expose the main areas of activity. Such use of big data underpins Intelligent Transportation Systems (ITSs), and we conclude that there is significant potential in utilising such sources of data on a nationwide scale.
Keywords: Big Data, vehicular traffic, traffic management, mobile data patterns
Procedia PDF Downloads 109
35083 In vivo Determination of Anticoagulant Property of the Tentacle Extract of Aurelia aurita (Moon Jellyfish) Using Sprague-Dawley Rats
Authors: Bea Carmel H. Casiding, Charmaine A. Guy, Funny Jovis P. Malasan, Katrina Chelsea B. Manlutac, Danielle Ann N. Novilla, Marianne R. Oliveros, Magnolia C. Sibulo
Abstract:
Moon jellyfish, Aurelia aurita, has become a popular research organism for diverse studies. Recent studies have verified the anticoagulant properties of moon jellyfish tentacle extract through in vitro methods. The purpose of this study was to validate the anticoagulant activity of A. aurita tentacle extract using an in vivo method of experimentation. The tentacles of A. aurita were excised, filtered, then centrifuged at 3000 x g for 10 minutes. The crude nematocyst extract was suspended in a 1:6 ratio with phosphate buffer solution and sonicated for three periods of 20 seconds each at 50 Hz. Protein concentration of the extract was determined using the Bradford assay. Bovine serum albumin was used as the standard solution at the following concentrations: 35.0, 70.0, 105.0, 140.0, 175.0, 210.0, 245.0, and 280.0 µg/mL. The absorbance was read at 595 nm. Toxicity testing was adapted from the OECD guidelines. The extract, suspended in phosphate-buffered saline, was arbitrarily set at three doses (0.1 mg/kg, 0.3 mg/kg, 0.5 mg/kg) and administered daily for five days to experimental groups of five male Sprague-Dawley rats (one dose per group). Before and after the administration period, bleeding time and clotting time tests were performed. One-way Analysis of Variance (ANOVA) was used to analyze the differences in bleeding time and clotting time, before and after administration, among the three treatment groups and the positive and negative control groups. The average protein concentration of the sonicated crude tentacle extract was 206.5 µg/mL. The highest dose administered (0.5 mg/kg) produced a significant increase in time for both the bleeding and clotting tests, whereas the next lower dose (0.3 mg/kg) was significantly effective only for the clotting time test. The protein contained in the tentacle extract, at a concentration of 206.5 µg/mL and doses of 0.3 mg/kg and 0.5 mg/kg of A. aurita, elicited anticoagulant activity.
Keywords: anticoagulant, bleeding time test, clotting time test, moon jellyfish
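The Bradford quantification step can be sketched as a least-squares standard curve over the BSA concentrations listed above, inverted to read a sample's concentration from its absorbance at 595 nm. The absorbance readings below are made-up illustrations (a hypothetical linear response), not the study's measurements.

```python
def fit_line(conc, absorbance):
    """Least-squares slope and intercept of a standard curve (absorbance vs. concentration)."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(absorbance) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, absorbance)) \
        / sum((x - mx) ** 2 for x in conc)
    return slope, my - slope * mx

def concentration(a595, slope, intercept):
    """Invert the standard curve: absorbance at 595 nm -> protein concentration (ug/mL)."""
    return (a595 - intercept) / slope

# BSA standards from the abstract; readings follow a hypothetical response A = 0.002*c + 0.05
standards = [35.0, 70.0, 105.0, 140.0, 175.0, 210.0, 245.0, 280.0]
readings = [0.002 * c + 0.05 for c in standards]
```

Under that assumed curve, an absorbance of 0.463 would map back to 206.5 µg/mL, the concentration reported for the sonicated extract.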
Procedia PDF Downloads 397
35082 Comparative Study of Accuracy of Land Cover/Land Use Mapping Using Medium Resolution Satellite Imagery: A Case Study
Authors: M. C. Paliwal, A. K. Jain, S. K. Katiyar
Abstract:
Accuracy assessment is an essential step in the classification of satellite imagery. In order to determine the accuracy of a classified image, assumed-true data are usually derived from ground truth data collected using the Global Positioning System. The data from the classified satellite imagery and the ground truth data are then compared, error matrices are prepared, and overall and individual (per-class) accuracies are calculated using different methods. The study illustrates advanced classification and accuracy assessment of land use/land cover mapping using satellite imagery. IRS-1C LISS-IV data were used for the classification. The satellite image was classified into fourteen classes, including water bodies, agricultural fields, forest land, urban settlement, barren land and unclassified area. Classification of the satellite imagery and calculation of accuracy were carried out using ERDAS Imagine software to find out the best method. This study is based on data collected within the Bhopal city boundaries of Madhya Pradesh State, India.
Keywords: resolution, accuracy assessment, land use mapping, satellite imagery, ground truth data, error matrices
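The accuracy bookkeeping described above can be sketched directly from an error (confusion) matrix: overall accuracy from the diagonal, producer's and user's accuracies from row and column sums, and the kappa coefficient correcting for chance agreement. The two-class matrix used below is hypothetical, not the study's data.

```python
def accuracy_metrics(cm):
    """Accuracy measures from an error matrix.
    cm[i][j]: number of pixels of reference class i assigned to map class j."""
    n = sum(sum(row) for row in cm)
    k = len(cm)
    diag = sum(cm[i][i] for i in range(k))
    overall = diag / n
    # producer's accuracy: diagonal over reference-class (row) totals
    producers = [cm[i][i] / sum(cm[i]) for i in range(k)]
    # user's accuracy: diagonal over map-class (column) totals
    users = [cm[i][i] / sum(row[i] for row in cm) for i in range(k)]
    # kappa compares observed agreement with agreement expected by chance
    chance = sum(sum(cm[i]) * sum(row[i] for row in cm) for i in range(k)) / n ** 2
    kappa = (overall - chance) / (1 - chance)
    return overall, producers, users, kappa
```

For example, the matrix [[50, 10], [5, 35]] gives an overall accuracy of 0.85 but a kappa of about 0.69, showing why reporting overall accuracy alone can overstate agreement.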
Procedia PDF Downloads 507
35081 Short-Term Effects of Extreme Temperatures on Cause Specific Cardiovascular Admissions in Beijing, China
Authors: Deginet Aklilu, Tianqi Wang, Endwoke Amsalu, Wei Feng, Zhiwei Li, Xia Li, Lixin Tao, Yanxia Luo, Moning Guo, Xiangtong Liu, Xiuhua Guo
Abstract:
Extreme temperature-related cardiovascular diseases (CVDs) have become a growing public health concern, but the impact of temperature on cause-specific CVDs has not been well studied in the study area. The objective of this study was to assess the impact of temperature on cause-specific cardiovascular hospital admissions in Beijing, China. We obtained data from 172 large general hospitals, via the Beijing Public Health Information Center Cardiovascular Case Database and the China Meteorological Administration, covering 16 districts in Beijing from 2013 to 2017. We used a time-stratified case-crossover design with a distributed lag non-linear model (DLNM), with lags of up to 27 days, to derive the impact of temperature on CVD admissions. The temperature data were stratified as cold (extreme and moderate) and hot (moderate and extreme). Within the five years (January 2013-December 2017), a total of 460,938 (male 54.9% and female 45.1%) CVD admission cases were reported. The exposure-response relationship for hospitalization was described by a "J" shape for both total and cause-specific admissions. An increase in the six-day moving average temperature from moderate hot (30.2 °C) to extreme hot (36.9 °C) resulted in a significant increase in CVD admissions of 16.1% (95% CI = 12.8%-28.9%). However, the effect of cold temperature exposure on CVD admissions over a lag of 0-27 days was found to be non-significant, with a relative risk of 0.45 (95% CI = 0.378-0.55) for extreme cold (-8.5 °C) and 0.53 (95% CI = 0.47-0.60) for moderate cold (-5.6 °C). The results of this study indicate that exposure to extremely high temperatures is strongly associated with an increase in cause-specific CVD admissions. These findings may help raise awareness among the general population, government and private sectors regarding the effects of extreme temperatures on CVD.
Keywords: admission, Beijing, cardiovascular diseases, distributed lag non linear model, temperature
Procedia PDF Downloads 62
35080 Estimation of Time Loss and Costs of Traffic Congestion: The Contingent Valuation Method
Authors: Amira Mabrouk, Chokri Abdennadher
Abstract:
The reduction of the road congestion inherent in the use of vehicles is an obvious priority for public authorities. Assessing individuals' willingness to pay to save trip time is therefore akin to estimating the change in price that would result from setting up a new transport policy to increase network fluidity and improve social welfare. This study takes an innovative perspective: it initiates an economic calculation aimed at estimating the monetized value of time for trips made in Sfax. The research pursues three objectives: i) to estimate the monetized value of an hour dedicated to trips, ii) to determine whether or not consumers consider the environmental variables to be significant, and iii) to analyze the impact of public congestion management via city-toll taxation of urban dwellers. The article is built upon a rich field survey conducted in the city of Sfax. Using the contingent valuation method, we analyze the stated time preferences of 450 drivers during rush hours. Giving due consideration to the biases attributed to the method, we highlight the sensitivity of this approach to the revelation mode and the interrogation techniques, following the NOAA panel recommendations (with the exception of the valuation point) and other similar studies on the estimation of transportation externalities.
Keywords: willingness to pay, contingent valuation, time value, city toll
Procedia PDF Downloads 434
35079 Application of Compressed Sensing and Different Sampling Trajectories for Data Reduction of Small Animal Magnetic Resonance Image
Authors: Matheus Madureira Matos, Alexandre Rodrigues Farias
Abstract:
Magnetic Resonance Imaging (MRI) is a vital imaging technique used in both clinical and pre-clinical settings to obtain detailed anatomical and functional information. However, MRI scans can be expensive and time-consuming, and animals must usually be anesthetized to keep them still during the imaging process. Prolonged or repeated exposure to anesthetics can have adverse effects on animals, including physiological alterations and potential toxicity, so minimizing the duration and frequency of anesthesia is crucial for the well-being of research animals. In recent years, various sampling trajectories have been investigated to reduce the number of MRI measurements, leading to shorter scanning times and thereby minimizing animal exposure to anesthetics. Compressed sensing (CS) combined with sampling trajectories such as Cartesian, spiral, and radial has emerged as a powerful tool to reduce MRI data while preserving diagnostic quality. This work applies CS with Cartesian, spiral, and radial sampling trajectories to the reconstruction of MRI of the abdomen of mice sub-sampled below the rate defined by the Nyquist theorem. The methodology uses a fully sampled reference MRI of a female C57BL/6 mouse, acquired experimentally in a 4.7 Tesla small-animal MRI scanner using spin-echo pulse sequences. The image is down-sampled along Cartesian, radial, and spiral sampling paths and then reconstructed by CS. The quality of the reconstructed images is objectively assessed by three quality assessment metrics: RMSE (root mean square error), PSNR (peak signal-to-noise ratio), and SSIM (structural similarity index measure). The use of optimized sampling trajectories and the CS technique demonstrated the potential for a significant reduction, of up to 70%, in acquired image data. This result translates into shorter scan times, minimizing the duration and frequency of anesthesia administration and reducing the risks associated with it.
Keywords: compressed sensing, magnetic resonance, sampling trajectories, small animals
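Two of the objective metrics used above can be sketched directly (RMSE, and PSNR derived from it; SSIM involves windowed local statistics and is omitted here). The toy arrays in the test are illustrative, not MRI data.

```python
import numpy as np

def rmse(ref, rec):
    """Root mean square error between a reference image and its reconstruction."""
    ref = np.asarray(ref, dtype=float)
    rec = np.asarray(rec, dtype=float)
    return float(np.sqrt(np.mean((ref - rec) ** 2)))

def psnr(ref, rec, peak=None):
    """Peak signal-to-noise ratio in dB; the peak defaults to the reference maximum."""
    ref = np.asarray(ref, dtype=float)
    peak = float(ref.max()) if peak is None else peak
    return float(20.0 * np.log10(peak / rmse(ref, rec)))
```

A uniform reconstruction error of 0.5 on an image whose peak is 4 gives an RMSE of 0.5 and a PSNR of 20·log10(8) ≈ 18.06 dB; higher PSNR and lower RMSE indicate a reconstruction closer to the fully sampled reference.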
Procedia PDF Downloads 73
35078 A Longitudinal Survey Study of Izmir Commuter Rail System (IZBAN)
Authors: Samet Sen, Yalcin Alver
Abstract:
Before the Izmir Commuter Rail System (IZBAN) opened, most of the respondents along the railway made their trips by city bus, minibus or private car. After IZBAN was put into service, some people changed their previous trip behavior and started travelling by IZBAN, so a large travel demand for IZBAN emerged. In this study, the characteristics of passengers and their trip behaviors are derived from longitudinal data collected via a two-wave trip survey. One year after IZBAN's opening, the first wave of the survey was carried out among 539 passengers at six stations during the morning peak hours of 07.00 am-09.30 am. The second wave was carried out two years after the first, among 669 passengers at the same six stations during the same morning peak hours. The study yields the respondents' socio-economic characteristics, the distribution of trips by region, the impact of IZBAN on transport modes, the changes in travel time and travel cost, and satisfaction data. These data enabled a comparison of the two waves and an explanation of the changes in socio-economic factors and trip behaviors. In both waves, 10% of the respondents stopped driving their own cars and started to take IZBAN. This is an important development in solving traffic problems: more public transportation means less traffic congestion.
Keywords: commuter rail system, comparative study, longitudinal survey, public transportation
Procedia PDF Downloads 434
35077 Technology in the Calculation of People Health Level: Design of a Computational Tool
Authors: Sara Herrero Jaén, José María Santamaría García, María Lourdes Jiménez Rodríguez, Jorge Luis Gómez González, Adriana Cercas Duque, Alexandra González Aguna
Abstract:
Background: The concept of health has evolved throughout history. An individual's health level is determined by their own perception; it is a dynamic process, so variations can be seen from one moment to the next. Knowing the health of the patients in one's care therefore facilitates decision making about their treatment. Objective: To design a technological tool that calculates a person's health level sequentially over time. Material and Methods: Deductive methodology through text analysis, extraction and logical formalization of knowledge, and evaluation with an expert group. Study period: September 2015 to the present. Results: A computational tool for use by health personnel has been designed. It has 11 variables, each of which can be given a value from 1 to 5, with 1 being the minimum value and 5 the maximum. By adding the results of the 11 variables, we obtain a magnitude at a given time: the health level of the person. The health calculator can represent a person's health level at successive points in time, establishing temporal cuts that are useful for determining the evolution of the individual. Conclusion: Information and Communication Technologies (ICT) enable training and help in various disciplinary areas, and their relevance in the field of health should be highlighted. Based on this formalization of health, care acts can be directed towards the propositional elements of the concept above, and these care acts will modify the person's health level. The health calculator allows the prioritization and prediction of different health care strategies in hospital units.
Keywords: calculator, care, eHealth, health
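The scoring rule described in the Results (11 variables, each rated 1 to 5, summed into a single magnitude) can be sketched as follows; the function names are illustrative, not the tool's actual interface.

```python
NUM_VARIABLES = 11

def health_level(scores):
    """Sum the 11 variable scores (each 1-5) into one health-level magnitude (range 11-55)."""
    if len(scores) != NUM_VARIABLES:
        raise ValueError("expected exactly 11 variable scores")
    if any(not 1 <= s <= 5 for s in scores):
        raise ValueError("each score must be between 1 and 5")
    return sum(scores)

def evolution(levels):
    """Change in health level between consecutive temporal cuts."""
    return [b - a for a, b in zip(levels, levels[1:])]
```

A patient scoring 3 on every variable has a health level of 33; recording 33, 36, 31 at three temporal cuts yields an evolution of +3 then -5, the kind of trend the abstract says the calculator makes visible over time.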
Procedia PDF Downloads 264
35076 Effect of Genuine Missing Data Imputation on Prediction of Urinary Incontinence
Authors: Suzan Arslanturk, Mohammad-Reza Siadat, Theophilus Ogunyemi, Ananias Diokno
Abstract:
Missing data is a common challenge in the statistical analysis of clinical survey datasets, and a variety of methods have been developed to deal with missing values, of which imputation is the most commonly used. However, to minimize the bias introduced by imputation, one must choose the right imputation technique and apply it to the correct type of missing data. In this paper, we identify different types of missing values: missing data due to skip pattern (SPMD), undetermined missing data (UMD), and genuine missing data (GMD), and apply rough set imputation to only the GMD portion of the missing data. To evaluate the effect of such imputation on prediction, we generated several simulation datasets based on an existing epidemiological dataset (MESA). To measure how well each dataset lends itself to the prediction model (logistic regression), we used p-values from the Wald test; to evaluate the precision of the prediction, we considered the width of the 95% confidence interval for the probability of incontinence. Both the imputed and non-imputed simulation datasets were fit to the prediction model, and both turned out to be significant (p-value < 0.05). However, the Wald score shows a better fit for the imputed than for the non-imputed datasets (28.7 vs. 23.4), and the average confidence interval width decreased by 10.4% when the imputed dataset was used, meaning higher precision. The results show that using the rough set method for imputation of GMD improves the predictive capability of the logistic regression. Further studies are required to generalize this conclusion to other clinical survey datasets.
Keywords: rough set, imputation, clinical survey data simulation, genuine missing data, predictive index
Procedia PDF Downloads 168
35075 Optimal Hedging of a Portfolio of European Options in an Extended Binomial Model under Proportional Transaction Costs
Authors: Norm Josephy, Lucy Kimball, Victoria Steblovskaya
Abstract:
Hedging of a portfolio of European options under proportional transaction costs is considered. Our discrete-time financial market model extends the binomial market model with transaction costs to the case where the underlying stock price ratios are distributed over a bounded interval rather than over a two-point set. An optimal hedging strategy is chosen from a set of admissible non-self-financing hedging strategies. Our approach to optimal hedging of a portfolio of options is based on a theoretical foundation that includes the determination of a no-arbitrage option price interval as well as properties of the non-self-financing strategies and their residuals. A computational algorithm for optimizing an investor-relevant criterion over the set of admissible non-self-financing hedging strategies is developed. The applicability of our approach is demonstrated using both simulated data and real market data.
Keywords: extended binomial model, non-self-financing hedging, optimization, proportional transaction costs
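For context, the classical two-point binomial model that the paper generalizes prices a European option by risk-neutral backward induction. The sketch below is that textbook baseline with no transaction costs, not the authors' algorithm; parameter values in the test are illustrative.

```python
def binomial_call(S, K, r, u, d, steps):
    """European call price in the classical CRR two-point binomial model:
    at each step the price ratio is u (up) or d (down), r is the per-step rate."""
    q = (1 + r - d) / (u - d)  # risk-neutral probability of an up move (needs d < 1+r < u)
    # terminal payoffs over the steps+1 possible end nodes
    values = [max(S * u ** k * d ** (steps - k) - K, 0.0) for k in range(steps + 1)]
    # discounted risk-neutral expectation, folded back one step at a time
    for _ in range(steps):
        values = [(q * values[k + 1] + (1 - q) * values[k]) / (1 + r)
                  for k in range(len(values) - 1)]
    return values[0]
```

In the extended model of the paper, the price ratio can take any value in a bounded interval rather than just {u, d}, which is why a single no-arbitrage price is replaced by a price interval and hedging is sought among non-self-financing strategies.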
Procedia PDF Downloads 252
35074 Agro Morphological Characterization of Vicia Faba L. Accessions in the Kingdom of Saudi Arabia
Authors: Zia Amjad, Salem S. Alghamdi
Abstract:
This experiment was carried out at the student educational farm of the College of Food and Agriculture, KSU, Kingdom of Saudi Arabia, in order to characterize 154 V. faba accessions based on UPOV and IBPGR descriptors. Twenty-four agro-morphological characters, 11 quantitative and 13 qualitative, were observed for genetic variation, and the results were analyzed by multivariate analysis, i.e. principal component analysis (PCA). The first six principal components (PCs) had eigenvalues greater than one and accounted for 72% of the available V. faba genetic diversity; the first three components each revealed more than 10% of the genetic diversity, i.e. 22.36%, 15.86% and 10.89% respectively. PCA distributed the V. faba accessions into different groups based on their performance for the characters under observation. PC-1, which represented 22.36% of the genetic diversity, was positively associated with stipule spot pigmentation, intensity of streaks, pod degree of curvature and, to some extent, 100-seed weight. PC-2 covered 15.86% of the genetic diversity and showed positive associations for average seed weight per plant, pod length, number of seeds per plant, 100-seed weight, stipule spot pigmentation, intensity of streaks (as in PC-1) and, to some extent, pod degree of curvature and number of pods per plant. PC-3 revealed 10.89% of the genetic diversity and expressed positive associations for number of pods per plant and number of leaflets per plant.
Keywords: agro morphological characterization, diversity, vicia faba, PCA
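The PCA step (eigenvalues of the correlation matrix, percent variance explained per component, and the eigenvalue-greater-than-one retention rule) can be sketched as follows; the toy two-character matrix in the test is illustrative, not the accession data.

```python
import numpy as np

def pca_summary(X):
    """Eigenvalues of the correlation matrix of X (rows = accessions, columns = characters),
    the percent of total variance each component explains, and the number of components
    the eigenvalue > 1 (Kaiser) rule would retain."""
    corr = np.corrcoef(np.asarray(X, dtype=float), rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]   # sorted descending
    pct = 100.0 * eigvals / eigvals.sum()               # percent variance explained
    retained = int((eigvals > 1.0).sum())               # Kaiser criterion
    return eigvals, pct, retained
```

With two nearly proportional characters, almost all variance loads on the first component and only one eigenvalue exceeds one, mirroring how the abstract's first six components pass the eigenvalue-one threshold and jointly explain 72% of the diversity.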
Procedia PDF Downloads 114
35073 A Unique Exact Approach to Handle a Time-Delayed State-Space System: The Extraction of Juice Process
Authors: Mohamed T. Faheem Saidahmed, Ahmed M. Attiya Ibrahim, Basma GH. Elkilany
Abstract:
This paper discusses the application of a Time Delay Control (TDC) compensation technique to the juice extraction process in a sugar mill, with the objective of improving the control performance of the process and increasing extraction efficiency. The paper presents the mathematical model of the juice extraction process and the design of the TDC compensation controller. Simulation results show that the TDC compensation technique can effectively suppress the time-delay effect in the process and improve control performance; the extraction efficiency is also significantly increased. The proposed approach, implemented in MATLAB, provides a practical solution for improving the juice extraction process in sugar mills.
Keywords: time delay control (TDC), exact and unique state space model, delay compensation, Smith predictor
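The keyword list mentions a Smith predictor, the classical structure behind delay compensation: the controller acts on a delay-free internal model, corrected by the mismatch between the measured output and a delayed copy of that model. The discrete-time simulation below is a generic illustration with a hypothetical first-order plant and PI gains, not the paper's extraction-process model.

```python
def simulate(delay, horizon=400, a=0.9, b=0.1, kp=2.0, ki=0.05, r=1.0):
    """PI control of the delayed first-order plant y[k+1] = a*y[k] + b*u[k-delay],
    with a Smith predictor built from a (here perfect) internal model of the plant."""
    y = ym = ymd = 0.0        # plant output, delay-free model, delayed model copy
    integ = 0.0               # integrator state of the PI controller
    u_hist = [0.0] * delay    # transport delay on the control input
    m_hist = [0.0] * delay    # the same delay applied to the model output
    out = []
    for _ in range(horizon):
        # Smith feedback: delay-free model prediction plus measured/model mismatch
        e = r - (y + ym - ymd)
        integ += e
        u = kp * e + ki * integ
        u_hist.append(u)
        ud = u_hist.pop(0)    # input actually reaching the plant, delayed
        y = a * y + b * ud
        ym = a * ym + b * u   # delay-free internal model
        m_hist.append(ym)
        ymd = m_hist.pop(0)
        out.append(y)
    return out
```

When the internal model is perfect, the mismatch term cancels and the controller effectively regulates the delay-free model, so the output tracks the setpoint despite the transport delay; this is the suppression of the time-delay effect that the abstract reports in simulation.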
Procedia PDF Downloads 92
35072 Femicide: The Political and Social Blind Spot in the Legal and Welfare State of Germany
Authors: Kristina F. Wolff
Abstract:
Background: In the Federal Republic of Germany, violence against women is deeply embedded in society. Germany is, as of March 2020, the most populous member state of the European Union with 83.2 million inhabitants, and although more than half of its inhabitants are women, gender equality was not enshrined in the Basic Law until 1957. Women have been allowed to enter paid employment without their husband's consent only since 1977, and marital rape has been prosecuted only since 1997. While the lack of equality between men and women is named in the preamble of the Istanbul Convention as the cause of gender-specific, structural, traditional violence against women, Germany continues to slip on the latest Gender Equality Index. According to the Police Crime Statistics (PCS), women are significantly more often victims of lethal violence emanating from men than vice versa. The PCS, which since 2015 have also collected gender-specific data on violent crimes, are kept by the Federal Criminal Police Office, but without taking into account the criteria relevant for targeted prevention, such as the perpetrator's history of violence, the weapon, the motivation, etc. Institutions such as EIGE or the World Health Organization have been asking Germany for years, in vain, for comparable data on violence against women in order to gain an overview or to develop cross-border synergies. The PCS are the only official data collection on violence against women; all players involved depend on this data set, which is published only in November of the following year and is thus already completely outdated at the time of publication. To combat German femicides causally, purposefully and efficiently, evidence-based data was urgently needed. Methodology: Beginning in January 2019, a database was set up that now tracks more than 600 German femicides, broken down by more than 100 crime-related individual criteria, which in turn go far beyond the official PCS.
These data are evaluated on the one hand by daily media research, and on the other hand by case-specific inquiries at the respective public prosecutors' offices and courts nationwide. This quantitative long-term study covers domestic violence as well as a variety of different types of gender-specific lethal violence, including femicides committed by German citizens abroad. Additionally, alcohol and/or narcotic abuse, infanticides and the gender aspect in the judiciary are considered. Results: Since November 2020, evidence-based data from a scientific survey have been available for the first time in Germany, supplementing the rudimentary picture of reality provided by the PCS with a number of relevant parameters. The most important goal of the study is to identify "red flags" that enable general preventive awareness, serve increasingly precise hazard assessment in acute situations, and yield concrete instructions for action. Already at a very early stage of the study it could be shown that in more than half of all femicides with a sexual perpetrator-victim constellation there was an age difference of five years or more. Summary: Without reliable data and an understanding of the nature and extent, cause and effect, it is impossible to sustainably curb violence against girls and women, which increasingly often culminates in femicide. In Germany, valid data from a scientific survey have been available for the first time since November 2020, supplementing the rudimentary picture given by the official and, to date, sole crime statistics with several relevant parameters. The basic research provides insights into geo-concentration, monthly peaks and the modus operandi of male excesses of violence.
A significant increase in child homicides in the course of femicides, and/or child homicide as an instrument of violence against the mother, could be demonstrated, as could the elevated risk faced by those affected when the age difference is five years or more. In view of the steadily increasing wave of violence against women, these study results are an eminent contribution to the preventive containment of German femicides.
Keywords: femicide, violence against women, gender-specific data, rule of law, Istanbul Convention, gender equality, gender-based violence
Procedia PDF Downloads 89
35071 An Unified Model for Longshore Sediment Transport Rate Estimation
Authors: Aleksandra Dudkowska, Gabriela Gic-Grusza
Abstract:
Wind-wave-induced sediment transport is an important multidimensional and multiscale dynamic process affecting coastal seabed changes and coastline evolution, and knowledge of the sediment transport rate is important for solving many environmental and geotechnical issues. There are many types of sediment transport models, but none of them is widely accepted, because the process is not fully defined; another problem is the lack of sufficient measurement data to verify the proposed hypotheses. Different types of models exist for longshore sediment transport (LST, which is discussed in this work) and cross-shore transport, related to the different time and space scales of the processes, and there are models describing bed-load transport (discussed in this work), suspended transport and total sediment transport. LST models use, among other things, information about (i) the flow velocity near the bottom, which in the case of wave-current interaction in the coastal zone is a separate problem, and (ii) the critical bed shear stress, which strongly depends on the type of sediment and becomes complicated for heterogeneous sediment. Moreover, the LST rate depends strongly on the local environmental conditions. To organize the existing knowledge, a series of sediment transport model intercomparisons was carried out as part of the project "Development of a predictive model of morphodynamic changes in the coastal zone". Four classical one-grid-point models were studied and intercompared over a wide range of bottom shear stress conditions, corresponding to wind-wave conditions appropriate for the coastal zone in Polish marine areas. The set of models comprises classical theories that assume a simplified influence of turbulence on the sediment transport (Du Boys, Meyer-Peter & Müller, Ribberink, Engelund & Hansen). It turned out that the values of the estimated longshore instantaneous mass sediment transport are in general in agreement with earlier studies and measurements conducted in the area of interest.
However, none of the formulas really stands out from the rest as being particularly suitable for the test location over the whole analyzed flow velocity range. Therefore, based on the models discussed, a new unified formula for estimating the longshore sediment transport rate is introduced, which constitutes the main original result of this study. The sediment transport rate is calculated from the bed shear stress and the critical bed shear stress. The dependence on environmental conditions is expressed by a single coefficient (a constant or a function), so the model can be adjusted to local conditions quite easily. The importance of each model parameter for specific velocity ranges is discussed. Moreover, it is shown that the near-bottom flow velocity is the main determinant of longshore bed-load transport in storm conditions. Thus, the accuracy of the results depends less on the sediment transport model itself and more on the appropriate modeling of near-bottom velocities. Keywords: bedload transport, longshore sediment transport, sediment transport models, coastal zone
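Of the four classical formulas compared, the Meyer-Peter & Muller relation illustrates the bed shear stress / critical bed shear stress structure on which the proposed unified formula is also built. The sketch below is a minimal implementation, not the study's calibrated model; the grain size, densities and Shields threshold are illustrative defaults.

```python
import math

def mpm_bedload(tau, d50, rho_s=2650.0, rho_w=1000.0, g=9.81, theta_cr=0.047):
    """Meyer-Peter & Muller (1948) bed-load transport rate [m^2/s per unit width].

    tau      -- bed shear stress [Pa]
    d50      -- median grain diameter [m]
    theta_cr -- critical Shields parameter (0.047 in the original formula)
    """
    s = rho_s / rho_w                               # relative sediment density
    theta = tau / ((rho_s - rho_w) * g * d50)       # Shields parameter
    if theta <= theta_cr:                           # below threshold: no transport
        return 0.0
    # dimensionless transport rate Phi = 8 (theta - theta_cr)^1.5
    phi = 8.0 * (theta - theta_cr) ** 1.5
    return phi * math.sqrt((s - 1.0) * g * d50 ** 3)

# example: medium sand (d50 = 0.2 mm) under a 2 Pa bed shear stress
q_b = mpm_bedload(tau=2.0, d50=2.0e-4)
```

In the unified formula proposed by the study, the fixed constant and exponent would effectively be replaced by the single locally adjustable coefficient described above.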
Procedia PDF Downloads 387
35070 Factors Relating to Motivation to Change Behaviors in Individuals Who Are Overweight
Authors: Teresa Wills, Geraldine Mccarthy, Nicola Cornally
Abstract:
Background: Obesity is an emerging healthcare epidemic affecting virtually all age and socio-economic groups and is one of the most serious and prevalent diseases of the 21st century. It is a public health challenge because of its prevalence, associated costs and health effects. The increasing prevalence of obesity has created a social perception that overweight body sizes are healthy and normal. This normalization of obesity within our society and the acceptance of higher body weights have left individuals unaware of the reality of their weight status and the gravity of the situation, thus impeding recognition of obesity. Given the escalating global health problem of obesity and its co-morbidities, the need to re-appraise its management is more compelling than ever. It is widely accepted that the causes of obesity are complex and multi-factorial. Engagement of individuals in weight management programmes is difficult if they do not perceive they have a problem with their weight. Recognition of the problem is a key component of obesity management, and identifying the main predictors of behaviour is key to designing health behaviour interventions. Aim: The aim of the research was to determine factors relating to motivation to change behaviours in individuals who perceive themselves to be overweight. Method: The research design was quantitative, correlational and cross-sectional, guided by the Health Belief Model. Data were collected online using a multi-section, multi-item questionnaire developed from a review of the theoretical and empirical research. A sample of 202 men and women who perceived themselves to be overweight participated in the research. Descriptive and inferential statistical analyses were employed to describe relationships between variables. Findings: Following multivariate regression analysis, perceived barriers to weight loss and perceived benefits of weight loss were significant predictors of motivation to change behaviour.
The significant perceived barriers to weight loss were psychological barriers (p < 0.019) and environmental barriers to physical activity (p < 0.032). The greatest predictor of motivation to change behaviour was the perceived benefits of weight loss (p < 0.001). Perceived susceptibility to obesity and perceived severity of obesity did not emerge as significant predictors in this model. The total variance explained by the model was 33.5%. Conclusion: Perceived barriers to weight loss and perceived benefits of weight loss are important determinants of motivation to change behaviour. These findings have important implications for health professionals, to help inform their practice, and for the development of intervention programmes to prevent and control obesity. Keywords: motivation to change behaviours, obesity, predictors of behavior, interventions, overweight
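The regression step described above can be sketched as follows. The data here are synthetic (the study's survey responses are not reproduced), the noise level is arbitrary, and the variable names are illustrative stand-ins for the questionnaire scales; only the overall shape of the analysis (multivariate regression with an R-squared summary) mirrors the abstract.

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares with an intercept; returns (coefficients, R^2)."""
    Xd = np.column_stack([np.ones(len(X)), X])      # prepend intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return beta, r2

# synthetic illustration: motivation driven mostly by perceived benefits,
# with smaller negative contributions from the two barrier scores
rng = np.random.default_rng(0)
n = 202                                             # sample size reported in the study
benefits = rng.normal(size=n)
psych_barriers = rng.normal(size=n)
env_barriers = rng.normal(size=n)
motivation = (0.5 * benefits - 0.2 * psych_barriers - 0.2 * env_barriers
              + rng.normal(scale=0.5, size=n))
beta, r2 = fit_ols(np.column_stack([benefits, psych_barriers, env_barriers]),
                   motivation)
```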
Procedia PDF Downloads 414
35069 Optimal ECG Sampling Frequency for Multiscale Entropy-Based HRV
Authors: Manjit Singh
Abstract:
Multiscale entropy (MSE) is an extensively used index for characterizing the complexity of the physiologic mechanisms underlying heart rate variability (HRV), which operate over a wide range of time scales. Accurate selection of the electrocardiogram (ECG) sampling frequency is an essential concern for clinically significant HRV quantification; high ECG sampling rates increase memory requirements and processing time, whereas low sampling rates degrade signal quality and result in clinically misinterpreted HRV. In this work, the impact of ECG sampling frequency on MSE-based HRV has been quantified. MSE measures are found to be sensitive to ECG sampling frequency, and the effect of sampling frequency is a function of time scale. Keywords: ECG (electrocardiogram), heart rate variability (HRV), multiscale entropy, sampling frequency
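The MSE procedure referenced above (Costa-style coarse-graining followed by sample entropy at each scale) can be sketched as below. The input here is white noise purely to exercise the code; in the study the input would be an RR-interval series extracted from ECG at a given sampling frequency, and the parameter choices (m = 2, r = 0.2) are conventional defaults, not the paper's settings.

```python
import math
import random

def sample_entropy(x, m, tol):
    """SampEn(m, tol): negative log of the conditional template-match probability."""
    n = len(x)
    def matches(length):
        c = 0
        for i in range(n - length):
            for j in range(i + 1, n - length):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= tol:
                    c += 1
        return c
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a and b else float("inf")

def coarse_grain(x, scale):
    """Non-overlapping window averages (Costa et al. coarse-graining)."""
    return [sum(x[i:i + scale]) / scale for i in range(0, len(x) - scale + 1, scale)]

def multiscale_entropy(x, scales=(1, 2, 3), m=2, r=0.2):
    # tolerance is fixed from the SD of the ORIGINAL series, as in standard MSE
    mean = sum(x) / len(x)
    sd = (sum((v - mean) ** 2 for v in x) / len(x)) ** 0.5
    return [sample_entropy(coarse_grain(x, s), m, r * sd) for s in scales]

# white noise: its apparent complexity (SampEn) falls as the scale increases
random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(600)]
mse_curve = multiscale_entropy(noise)
```

Because coarse-graining shortens the series, the number of samples per beat (i.e. the ECG sampling frequency feeding the RR extraction) interacts with scale, which is the sensitivity the study quantifies.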
Procedia PDF Downloads 271
35068 Data Gathering and Analysis for Arabic Historical Documents
Authors: Ali Dulla
Abstract:
This paper introduces a new dataset (and the methodology used to generate it) based on a wide range of historical Arabic documents with clean data and simple, homogeneous page layouts. The experiments are implemented on printed and handwritten documents obtained from important libraries such as the Qatar Digital Library, the British Library and the Library of Congress. We have gathered and annotated 150 archival document images from different locations and time periods, covering documents from the 17th to the 19th century. The dataset comprises differing page layouts and degradations that challenge text line segmentation methods. Ground truth is produced using the Aletheia tool by PRImA and stored in an XML representation, in the PAGE (Page Analysis and Ground truth Elements) format. The dataset will be made easily available to researchers worldwide for research into the obstacles facing historical Arabic documents, such as the geometric correction of historical Arabic documents. Keywords: dataset production, ground truth production, historical documents, arbitrary warping, geometric correction
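Ground truth in the PAGE format is plain XML, so it can be consumed with standard tools. The sketch below is illustrative only: the snippet is heavily abbreviated (real Aletheia output carries metadata, reading order and text content), the element names follow the 2013-07-15 PAGE schema, and the file name and coordinates are invented.

```python
import xml.etree.ElementTree as ET

# a minimal, abbreviated PAGE-format ground-truth snippet (hypothetical content)
PAGE_XML = """<PcGts xmlns="http://schema.primaresearch.org/PAGE/gts/pagecontent/2013-07-15">
  <Page imageFilename="doc_001.tif" imageWidth="2000" imageHeight="3000">
    <TextRegion id="r1">
      <Coords points="100,100 1900,100 1900,500 100,500"/>
      <TextLine id="r1l1"><Coords points="100,100 1900,100 1900,160 100,160"/></TextLine>
      <TextLine id="r1l2"><Coords points="100,180 1900,180 1900,240 100,240"/></TextLine>
    </TextRegion>
  </Page>
</PcGts>
"""

NS = {"pc": "http://schema.primaresearch.org/PAGE/gts/pagecontent/2013-07-15"}

def text_line_polygons(xml_string):
    """Return {text-line id: list of (x, y) vertices} from a PAGE document."""
    root = ET.fromstring(xml_string)
    lines = {}
    for line in root.iterfind(".//pc:TextLine", NS):
        coords = line.find("pc:Coords", NS).get("points")
        lines[line.get("id")] = [tuple(map(int, p.split(","))) for p in coords.split()]
    return lines

polys = text_line_polygons(PAGE_XML)
```

Line polygons extracted this way are what a text line segmentation method would be evaluated against.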
Procedia PDF Downloads 168
35067 A Quantitative Analysis of Rural to Urban Migration in Morocco
Authors: Donald Wright
Abstract:
The ultimate goal of this study is to reinvigorate the philosophical underpinnings of the study of urbanization with scientific data, in order to circumvent what seems an inevitable future clash between rural and urban populations. To that end, urban infrastructure must be sustainable economically, politically and ecologically over the course of several generations as cities continue to grow with the incorporation of climate refugees. Our research will provide data concerning the projected increase in population in Morocco over the coming two decades and the shift of population from rural areas to urban centers during that period. As a result, urban infrastructure will need to be adapted, developed or built to meet the demand of future internal migrations from rural to urban centers in Morocco. This paper will also examine how past experiences of internally displaced people give insight into the challenges faced by future migrants and, beyond the gathering of data, how people react to internal migration. This study employs four different sets of research tools. First, a large part of this study is archival, which involves compiling the relevant literature on the topic and its complex history. This step also includes gathering data about migrations in Morocco from public data sources. Once the datasets are collected, the next part of the project involves populating the attribute fields and preprocessing the data to make it understandable and usable by machine learning algorithms. In tandem with the mathematical interpretation of data and projected migrations, this study benefits from a theoretical understanding of the critical apparatus surrounding urban development in the 20th and 21st centuries, which gives insight into past infrastructure development and the rationale behind it.
Once the data is ready to be analyzed, different machine learning algorithms (k-means clustering, support vector regression, random forest analysis) will be tested and their results compared for visualization of the data. The final computational part of this study involves analyzing the data and determining what we can learn from it. This paper helps us to understand future trends of population movements within and between regions of North Africa, which will have an impact on various sectors such as urban development, food distribution and water purification, not to mention the creation of public policy in the countries of this region. One of the strengths of this project is its multi-pronged, cross-disciplinary methodology, which enables an interchange of knowledge and experiences to facilitate innovative solutions to this complex problem. Multiple and diverse intersecting viewpoints allow an exchange of methodological models that provide fresh and informed interpretations of otherwise objective data. Keywords: climate change, machine learning, migration, Morocco, urban development
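The model-comparison step can be sketched with scikit-learn as below. The dataset is a synthetic stand-in, since the Moroccan migration attribute fields are not reproduced here, and the hyperparameters are illustrative rather than tuned; the point is only the compare-by-cross-validation pattern the abstract describes.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# synthetic stand-in for the migration dataset: 4 hypothetical attribute
# fields and a target (e.g. net out-migration per region-year)
rng = np.random.default_rng(42)
X = rng.uniform(0.0, 1.0, size=(300, 4))
y = 10.0 * X[:, 0] + 5.0 * np.sin(6.0 * X[:, 1]) + rng.normal(0.0, 0.5, 300)

models = {
    "svr": SVR(kernel="rbf", C=10.0),
    "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
}
# 5-fold cross-validated R^2 for each candidate model
scores = {name: cross_val_score(m, X, y, cv=5, scoring="r2").mean()
          for name, m in models.items()}
```

Clustering (k-means) would be applied separately, e.g. to group regions by migration profile before regression.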
Procedia PDF Downloads 150
35066 Using Implicit Data to Improve E-Learning Systems
Authors: Slah Alsaleh
Abstract:
In recent years, with the popularity of the internet and technology, e-learning has become a major part of most education systems. One of the advantages e-learning systems provide is the large amount of information available about students' behavior while communicating with the system. Such information is very rich, and it can be used to improve the capability and efficiency of e-learning systems. This paper discusses how e-learning can benefit from implicit data in different ways, including creating homogeneous groups of students, evaluating students' learning, creating behavior profiles for students, and identifying students through their behaviors. Keywords: e-learning, implicit data, user behavior, data mining
Procedia PDF Downloads 309
35065 Enabling Quantitative Urban Sustainability Assessment with Big Data
Authors: Changfeng Fu
Abstract:
Sustainable urban development has been widely accepted as common sense in modern urban planning and design. However, the measurement and assessment of urban sustainability, especially quantitative assessment, has always been an issue troubling planning and design professionals. This paper presents on-going research on the principles and technologies for quantitative urban sustainability assessment, which aims to integrate indicators, geospatial and geo-referenced data, and assessment techniques into a single mechanism. It is based on the principles and techniques of geospatial analysis with GIS and statistical analysis methods. Decision-making technologies and methods such as AHP and SMART are also adopted to reach overall assessment conclusions. The possible interfaces and presentation of data and quantitative assessment results are also described. This research is based on the knowledge, situations and data sources of the UK, but it is potentially adaptable to other countries or regions. The implementation potential of the mechanism is also discussed. Keywords: urban sustainability assessment, quantitative analysis, sustainability indicator, geospatial data, big data
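The AHP step mentioned above reduces pairwise expert judgements to indicator weights. A minimal sketch using the geometric-mean approximation of the principal eigenvector is given below; the three-indicator comparison matrix is hypothetical, not taken from the research described.

```python
import numpy as np

def ahp_weights(pairwise):
    """AHP priority weights via the geometric-mean (approximate eigenvector)
    method, plus Saaty's consistency ratio CR."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    gm = A.prod(axis=1) ** (1.0 / n)            # row geometric means
    w = gm / gm.sum()                           # normalized priority weights
    lam = (A @ w / w).mean()                    # principal eigenvalue estimate
    ci = (lam - n) / (n - 1)                    # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
    cr = ci / ri if ri else 0.0
    return w, cr

# hypothetical pairwise judgements for three sustainability indicators
# (values on Saaty's 1-9 scale; reciprocals below the diagonal)
A = [[1.0,     3.0, 5.0],
     [1.0 / 3, 1.0, 2.0],
     [1.0 / 5, 0.5, 1.0]]
w, cr = ahp_weights(A)
```

A CR below 0.1 is the usual threshold for accepting the judgements as consistent.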
Procedia PDF Downloads 357
35064 Through Additive Manufacturing. A New Perspective for the Mass Production of Made in Italy Products
Authors: Elisabetta Cianfanelli, Paolo Pupparo, Maria Claudia Coppola
Abstract:
Recent evolutions in innovation processes and in the intrinsic tendencies of the product development process lead to new considerations on the design flow. The instability and complexity that characterize contemporary life define new problems in the production of products, stimulating at the same time the adoption of new solutions across the entire design process. The advent of additive manufacturing, but also of IoT and AI technologies, continuously confronts us with new paradigms regarding design as a social activity. From the point of view of application, these technologies raise a whole series of problems and considerations immanent to design thinking. Addressing these problems may require some initial intuition and the use of a provisional set of rules or plausible strategies, i.e., heuristic reasoning. At the same time, however, the evolution of digital technology and the computational speed of new design tools describe a new and contrasting design framework in which to operate. It is therefore interesting to understand the opportunities and boundaries of the new man-algorithm relationship. This contribution investigates the man-algorithm relationship starting from the state of the art of the Made in Italy model: the best-known fields of application are described, followed by a focus on specific cases in which the mutual relationship between man and AI becomes a new driving force of innovation for entire production chains. On the other hand, the use of algorithms could absorb many design phases, such as the definition of shape, dimensions, proportions, materials, static verifications, and simulations. Operating in this context therefore becomes a strategic action, capable of defining fundamental choices for the design of product systems in the near future. If there is a human-algorithm combination within a new integrated system, quantitative values can be controlled in relation to qualitative and material values.
The trajectory described thus becomes a new design horizon in which to operate, where it is interesting to highlight the good practices that already exist. In this context, the designer developing new forms can experiment with ways still unexpressed in the project and can define a new synthesis and simplification of algorithms, so that each artifact carries a signature defining it in all its parts, emotional and structural. This signature of the designer, a combination of values and design culture, will be internal to the algorithms and able to relate to digital technologies, creating a generative dialogue for design purposes. The result envisaged indicates a new vision of digital technologies, no longer understood only as custodians of vast quantities of information, but also as valid integrated tools in close relationship with the design culture. Keywords: decision making, design heuristics, product design, product design process, design paradigms
Procedia PDF Downloads 119
35063 Coastal Modelling Studies for Jumeirah First Beach Stabilization
Authors: Zongyan Yang, Gagan K. Jena, Sankar B. Karanam, Noora M. A. Hokal
Abstract:
Jumeirah First beach, a segment of coastline 1.5 km in length, is one of the popular public beaches in Dubai, UAE. The stability of the beach has been affected by several coastal development projects, including The World, Island 2 and La Mer. A comprehensive stabilization scheme comprising two composite groynes (of lengths 90 m and 125 m), modification of the northern breakwater of Jumeirah Fishing Harbour and beach re-nourishment was implemented by Dubai Municipality in 2012. However, the performance of the implemented stabilization scheme has been compromised by the La Mer project (built in 2016), which modified the wave climate at Jumeirah First beach. The objective of the coastal modelling studies is to establish a design basis for further beach stabilization scheme(s). Comprehensive coastal modelling studies were conducted to establish the nearshore wave climate, equilibrium beach orientations and stable beach plan forms. Based on the outcomes of the modelling studies, a recommendation was made to extend the composite groynes to stabilize the Jumeirah First beach. Wave transformation was performed following an interpolation approach, with wave transformation matrices derived from simulations of the possible range of wave conditions in the region. The Dubai coastal wave model is developed with MIKE21 SW. The offshore wave conditions were determined from PERGOS wave data at 4 offshore locations, with consideration of the spatial variation. The lateral boundary conditions corresponding to the offshore conditions, at the Dubai/Abu Dhabi and Dubai/Sharjah borders, were derived with the LitDrift 1D wave transformation module. The Dubai coastal wave model was calibrated against wave records at monitoring stations operated by Dubai Municipality. The wave transformation matrix approach was validated with nearshore wave measurements at a Dubai Municipality monitoring station in the vicinity of the Jumeirah First beach.
A typical one-year wave time series was transformed to 7 locations in front of the beach to account for the variation of wave conditions, which are affected by adjacent and offshore developments. Equilibrium beach orientations were estimated with LitDrift by finding the beach orientations with null annual littoral transport at the 7 selected locations. The littoral transport calculation results were compared with the beach erosion/accretion quantities estimated from the beach monitoring program (twice a year, including bathymetric and topographical surveys). An innovative integral method was developed to outline the stable beach plan forms from the estimated equilibrium beach orientations, with a predetermined minimum beach width. The optimal lengths for the composite groyne extensions were recommended based on the stable beach plan forms. Keywords: composite groyne, equilibrium beach orientation, stable beach plan form, wave transformation matrix
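The equilibrium-orientation search described above, i.e. finding the shore-normal direction with null annual littoral transport, can be sketched as a root-finding problem. The transport kernel below is a generic CERC-type stand-in (Q proportional to H^2.5 sin 2a), not the LitDrift engine used in the study, and the two-component wave climate is a toy example.

```python
import math

def annual_net_transport(beach_normal_deg, wave_climate):
    """Net annual littoral drift (arbitrary units) for a given shore-normal
    bearing, summing a CERC-type term Q ~ H^2.5 * sin(2*alpha) over records."""
    q = 0.0
    for h_s, wave_dir_deg in wave_climate:
        alpha = math.radians(wave_dir_deg - beach_normal_deg)  # angle to shore normal
        q += h_s ** 2.5 * math.sin(2.0 * alpha)
    return q

def equilibrium_orientation(wave_climate, lo=0.0, hi=90.0, tol=1e-6):
    """Bisection for the shore-normal bearing with zero net annual transport."""
    f_lo = annual_net_transport(lo, wave_climate)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        f_mid = annual_net_transport(mid, wave_climate)
        if f_mid * f_lo > 0:          # root still to the right of mid
            lo, f_lo = mid, f_mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# toy bimodal climate: (significant wave height [m], mean direction [deg])
climate = [(1.2, 30.0), (0.8, 60.0)]
theta_eq = equilibrium_orientation(climate)
```

The resulting bearing sits between the two wave directions, weighted toward the more energetic component, as expected.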
Procedia PDF Downloads 263
35062 A 15 Minute-Based Approach for Berth Allocation and Quay Crane Assignment
Authors: Hoi-Lam Ma, Sai-Ho Chung
Abstract:
In traditional integrated berth allocation and quay crane assignment models, the time dimension is usually assumed to be hourly based. Nowadays, however, transshipment has become the main business of many container terminals, especially in Southeast Asia (e.g. Hong Kong and Singapore). In these terminals, vessel arrivals are usually very frequent, with small handling volumes and very short staying times. Therefore, the traditional hourly-based modeling approach may cause significant berth and quay crane idling and consequently cannot meet practical needs. In this connection, a 15-minute-based modeling approach is requested by industrial practitioners. Accordingly, a Three-level Genetic Algorithm (3LGA) with Quay Crane (QC) shifting heuristics is designed to fill the research gap. The objective function here is to minimize the total service time. Preliminary numerical results show that the proposed 15-minute-based approach can reduce berth and QC idling significantly. Keywords: transshipment, integrated berth allocation, variable-in-time quay crane assignment, quay crane assignment
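The motivation for the 15-minute discretization can be illustrated directly: padding each short transshipment call up to whole time slots wastes far more berth time at hourly resolution. The sketch below is only this rounding argument, not the 3LGA itself, and the handling times are illustrative.

```python
import math

def slots_needed(minutes, slot_len):
    """Number of discrete berth time slots a job occupies at a given slot length."""
    return math.ceil(minutes / slot_len)

def total_idle_minutes(handling_times, slot_len):
    """Berth time lost to rounding when every job is padded to whole slots."""
    return sum(slots_needed(t, slot_len) * slot_len - t for t in handling_times)

# hypothetical short transshipment calls (handling time in minutes)
calls = [70, 95, 130, 40, 155, 80]
idle_hourly = total_idle_minutes(calls, 60)   # hourly-based discretization
idle_15min = total_idle_minutes(calls, 15)    # 15-minute-based discretization
```

For these six calls, hourly slots waste 210 berth-minutes against 45 for 15-minute slots, which is exactly the idling the finer-grained model recovers.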
Procedia PDF Downloads 169
35061 Poverty Dynamics in Thailand: Evidence from Household Panel Data
Authors: Nattabhorn Leamcharaskul
Abstract:
This study aims to examine the determining factors of poverty dynamics in Thailand by using panel data on 3,567 households over 2007-2017. Four estimation techniques are employed to analyze the situation of poverty across households and time periods: the multinomial logit model, the sequential logit model, the quantile regression model, and the difference-in-differences model. Households are categorized based on their experiences into 5 groups, namely chronically poor, falling into poverty, re-entering into poverty, exiting from poverty and never poor households. Estimation results emphasize the effects of demographic and socioeconomic factors, as well as unexpected events, on the economic status of a household. It is found that remittances have a positive impact on a household's economic status, in that they are likely to lower the probability of falling into poverty or being trapped in poverty, while they tend to increase the probability of exiting from poverty. In addition, not only does receiving a secondary source of household income raise the probability of being a never poor household, but it also significantly increases the household income per capita of the chronically poor and falling into poverty households. Public work programs are recommended as an important tool to relieve household financial burden and uncertainty and thus increase the chance for households to escape from poverty. Keywords: difference in difference, dynamic, multinomial logit model, panel data, poverty, quantile regression, remittance, sequential logit model, Thailand, transfer
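The five household categories used above can be derived mechanically from a household's poverty indicator across the panel waves. The sketch below is a simplified illustration: the abstract does not spell out the study's exact classification rules, so the transition logic here is an assumption.

```python
def classify_poverty_path(poor_flags):
    """Classify a household's poverty trajectory across panel waves.

    poor_flags -- sequence of booleans, one per survey wave (True = poor).
    The five labels follow the groups named in the study; the decision
    rules themselves are an illustrative assumption.
    """
    if all(poor_flags):
        return "chronically poor"
    if not any(poor_flags):
        return "never poor"
    # count non-poor -> poor switches (entries into poverty)
    entries = sum(1 for a, b in zip(poor_flags, poor_flags[1:]) if not a and b)
    if entries >= 2 or (entries >= 1 and poor_flags[0]):
        return "re-entering into poverty"   # was poor, escaped, became poor again
    if poor_flags[-1]:
        return "falling into poverty"        # started non-poor, ended poor
    return "exiting from poverty"            # started poor, ended non-poor
```

Category labels produced this way would then serve as the outcome variable of the multinomial logit model.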
Procedia PDF Downloads 112
35060 Factors Affecting Employee’s Effectiveness at Job in Banking Sectors of Pakistan
Authors: Sajid Aman
Abstract:
Jobs in the banking sector in Pakistan are perceived as very demanding, due to which employee turnover is very high. The managerial role, however, is very important in influencing employees’ attitudes toward their work. This paper explores the manager’s role in influencing employees’ effectiveness on the job. The paper adopted a pragmatic approach combining both qualitative and quantitative data, employing an exploratory sequential strategy under a mixed-methods research design. Qualitative data were analyzed using thematic analysis. Five major themes emerged as key factors increasing employees’ effectiveness in the banking sector: the manager’s attitude towards employees, his leadership style, listening to employees’ personal problems, the provision of interest-free personal loans, and future career prospects. The quantitative data revealed that a manager’s attitude, leadership style, willingness to listen to employees’ personal problems, and future career prospects are strongly associated with employees’ effectiveness at the job. However, interest-free personal loans were noted as having no significant association with employees’ effectiveness at the job. The study concludes that the manager’s role is highly important for the effectiveness of employees at their job in the banking sector. It is suggested that managers should have a positive attitude towards employees and make time to listen to employees’ problems, even personal ones. Keywords: banking sector, employee’s effectiveness, manager’s role, leadership style
Procedia PDF Downloads 32
35059 The Experimental Study on Reducing and Carbonizing Titanium-Containing Slag by Iron-Containing Coke
Authors: Yadong Liu
Abstract:
The reduction and carbonization of titanium-bearing slag, obtained from the smelting reduction of synthetic sea sand ore, by iron-containing coke with particle sizes of <0.3 mm, 0.3-0.6 mm and 0.6-0.9 mm were studied at 1500℃ with holding times of up to 6 h. The effects of the iron-containing coke particle size and the holding time on the formation of TiC and the size of TiC crystals were studied by XRD, SEM and EDS. The results show that particle sizes that are too small or too large are unfavorable for the formation, concentration and growth of TiC crystals; the suitable particle size is 0.3-0.6 mm. A holding time of 2 h essentially ensures that all the TiO2 component in the slag is reduced, carbonized and converted to TiC. The size of the TiC crystals increases as the holding time is prolonged; the thickness of the TiC layer can reach 20 μm at a holding time of 6 h. Keywords: coke containing iron, formation and concentration and growth of TiC, reduction and carbonization, titanium-bearing slag
Procedia PDF Downloads 149