2787 Qualitative Study Method on Case Assignment Adopted by Singapore Medical Social Workers
Authors: Joleen L. H. Lee, K. F. Yen, Janette W. P. Ng, D. Woon, Mandy M. Y. Lau, Ivan M. H. Woo, S. N. Goh
Abstract:
Case assignment systems are created to meet a need for equity in work distribution and a better match between medical social workers' (MSWs) competencies and patients' problems. However, no known study has explored how MSWs in Singapore assign cases to achieve equity in work distribution. Focus group discussions were conducted with MSWs from public hospitals to understand their perceptions of equitable workload and case allocation. Three approaches to case allocation were found. The first is the point system, where points are allocated to cases based on a checklist of presenting issues identified, most of the time, by non-MSWs. The intensity of the case is taken into consideration, but the allocation of points is often subject to variation in the referral source's appreciation of the roles of MSWs. The second is the round-robin system, where all MSWs are allocated cases based on a roster. This approach results in perceived equity due to an element of luck, but it does not match case complexity with the competencies of MSWs. The third approach is unit-based allocation, where MSWs are assigned to attend to cases from a specific unit. This approach helps facilitate specialization among MSWs but may result in MSWs having difficulty providing transdisciplinary care due to a narrow set of knowledge and skills. Trade-offs exist across the existing approaches to case allocation by MSWs. Conversations are needed among Singapore MSWs to decide on a case allocation system whose trade-offs are acceptable to patients and other key stakeholders of the care delivery system.
Keywords: case allocation, equity, medical social worker, work distribution
Procedia PDF Downloads 125
2786 Net Neutrality and Asymmetric Platform Competition
Authors: Romain Lestage, Marc Bourreau
Abstract:
In this paper we analyze the interplay between access to the last-mile network and net neutrality in the market for Internet access. We consider two Internet Service Providers (ISPs), which act as platforms between Internet users and Content Providers (CPs). One of the ISPs is vertically integrated and provides access to its last-mile network to the other (non-integrated) ISP. We show that a lower access price increases the integrated ISP's incentives to charge CPs positive termination fees (i.e., to deviate from net neutrality), and decreases the non-integrated ISP's incentives to charge positive termination fees.
Keywords: net neutrality, access regulation, internet access, two-sided markets
Procedia PDF Downloads 376
2785 Frictional Effects on the Dynamics of a Truncated Double-Cone Gravitational Motor
Authors: Barenten Suciu
Abstract:
In this work, the effects of friction and truncation on the dynamics of a double-cone gravitational motor, self-propelled on a straight V-shaped horizontal rail, are evaluated. Such a mechanism has a variable radius of contact and, on one hand, is similar to a pulley mechanism that changes potential energy into the kinetic energy of rotation, but, on the other hand, is similar to a pendulum mechanism that converts the potential energy of the suspended body into the kinetic energy of translation along a circular path. Movies of the self-propelled double-cones, made of S45C carbon steel and wood, along rails made of aluminum alloy, were shot for various opening angles of the rails. Kinematical features of the double-cones were estimated through slow-motion processing of the recorded movies. Then, a kinematical model was derived under the assumption that the distance traveled by the contact points on the rectilinear rails is identical to the distance traveled by the contact points on the truncated conical surface. Additionally, a dynamic model for this particular contact problem was proposed and validated against the experimental results. Based on this model, the traction force and the traction torque acting on the double-cone are identified. It was proved that the rolling traction force is always smaller than the sliding friction force, i.e., the double-cone rolls without slipping. Results obtained in this work can be used to achieve the proper design of such a gravitational motor.
Keywords: truncated double-cone, friction, rolling and sliding, dynamic model, gravitational motor
Procedia PDF Downloads 275
2784 A Robust and Efficient Segmentation Method Applied for Cardiac Left Ventricle with Abnormal Shapes
Authors: Peifei Zhu, Zisheng Li, Yasuki Kakishita, Mayumi Suzuki, Tomoaki Chono
Abstract:
Segmentation of left ventricle (LV) from cardiac ultrasound images provides a quantitative functional analysis of the heart to diagnose disease. Active Shape Model (ASM) is a widely used approach for LV segmentation but suffers from the drawback that initialization of the shape model is not sufficiently close to the target, especially when dealing with abnormal shapes in disease. In this work, a two-step framework is proposed to improve the accuracy and speed of the model-based segmentation. Firstly, a robust and efficient detector based on Hough forest is proposed to localize cardiac feature points, and such points are used to predict the initial fitting of the LV shape model. Secondly, to achieve more accurate and detailed segmentation, ASM is applied to further fit the LV shape model to the cardiac ultrasound image. The performance of the proposed method is evaluated on a dataset of 800 cardiac ultrasound images that are mostly of abnormal shapes. The proposed method is compared to several combinations of ASM and existing initialization methods. The experiment results demonstrate that the accuracy of feature point detection for initialization was improved by 40% compared to the existing methods. Moreover, the proposed method significantly reduces the number of necessary ASM fitting loops, thus speeding up the whole segmentation process. Therefore, the proposed method is able to achieve more accurate and efficient segmentation results and is applicable to unusual shapes of heart with cardiac diseases, such as left atrial enlargement.
Keywords: Hough forest, active shape model, segmentation, cardiac left ventricle
Procedia PDF Downloads 340
2783 Applying Kinect on the Development of a Customized 3D Mannequin
Authors: Shih-Wen Hsiao, Rong-Qi Chen
Abstract:
In the field of fashion design, the 3D mannequin is an assisting tool that can rapidly realize design concepts. When the concept of the 3D mannequin is applied to computer-aided fashion design, it connects with the development and application of design platforms and systems. It is therefore critical to develop a 3D mannequin module that corresponds to the necessities of fashion design. This research proposes a concrete plan for developing and constructing a 3D mannequin system with Kinect. Ergonomic measurements of objective human features can be attained in real time through the depth camera of the Kinect, and mesh morphing can then be implemented by transforming the locations of the control points on the model according to those ergonomic data, to obtain an exclusive 3D mannequin model. In the proposed methodology, after the scanned points from the Kinect are revised for accuracy and smoothness, a complete human feature is reconstructed by the ICP (iterative closest point) algorithm together with image processing methods. The objective human feature can then be recognized to analyze and obtain real measurements. Furthermore, the ergonomic measurements can be applied to shape morphing for the division of the 3D mannequin reconstructed by feature curves. Since a standardized and customer-oriented 3D mannequin is generated by the implementation of subdivision, the research can be applied to fashion design or to the presentation and display of 3D virtual clothes. In order to examine the practicality of the research structure, a 3D mannequin system was constructed with a JAVA program in this study, and the practicability of the research results was verified through experiments.
Keywords: 3D mannequin, Kinect scanner, iterative closest point, shape morphing, subdivision
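The ICP alignment step named in the abstract above can be sketched in miniature. The following is an illustrative 2D version under simplifying assumptions (brute-force nearest neighbours, rigid transform only); it is not the authors' implementation, and the point sets in the usage example are invented:

```python
import math

def best_rigid_2d(src, dst):
    """Closed-form least-squares rotation + translation taking src onto dst."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    sxx = sxy = syx = syy = 0.0
    for (x, y), (u, v) in zip(src, dst):
        x -= csx; y -= csy; u -= cdx; v -= cdy
        sxx += x * u; sxy += x * v; syx += y * u; syy += y * v
    theta = math.atan2(sxy - syx, sxx + syy)
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty

def icp(src, dst, iters=10):
    """Iterative closest point: alternate correspondence search and alignment."""
    pts = list(src)
    for _ in range(iters):
        # brute-force nearest-neighbour correspondences
        corr = [min(dst, key=lambda q, p=p: (q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2)
                for p in pts]
        theta, tx, ty = best_rigid_2d(pts, corr)
        c, s = math.cos(theta), math.sin(theta)
        pts = [(c * x - s * y + tx, s * x + c * y + ty) for x, y in pts]
    return pts
```

A scanned point cloud that is a slightly rotated and shifted copy of the model is pulled back onto the model within a few iterations; real Kinect data would additionally need outlier rejection and 3D transforms.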
Procedia PDF Downloads 306
2782 Quantum Statistical Machine Learning and Quantum Time Series
Authors: Omar Alzeley, Sergey Utev
Abstract:
Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of optimization, and this optimization is central to learning theory. One approach to complex systems, where the dynamics of the system are inferred by a statistical analysis of the fluctuations in time of some associated observable, is time series analysis. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool to detect deterministic chaos, other approaches emerge. The quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that was applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo have been used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between the quantum statistical machine and quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos was to find a model that explains chaotic behaviour. This model may reveal further insight into quantum chaos.
Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series
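The Kalman filter mentioned above for parameter estimation can be illustrated with a minimal one-dimensional sketch. This is a generic textbook filter for a random-walk state observed with noise, not the authors' QTS estimator; the noise variances q and r are assumed inputs:

```python
def kalman_1d(observations, q, r, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state observed with noise.

    q: assumed process-noise variance, r: assumed measurement-noise variance.
    Returns the filtered state estimate after each observation.
    """
    x, p = x0, p0
    estimates = []
    for z in observations:
        p = p + q                # predict: state modelled as a random walk
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update with the innovation z - x
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates
```

Fed a constant noisy signal, the estimate converges towards the true level; the gain k trades off trust in the model against trust in the measurements.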
Procedia PDF Downloads 469
2781 Attitude towards Doping of High-Performance Athletes in a Sports Institute of the City of Medellin, Colombia
Authors: Yuban Sebastian Cuartas-Agudelo, Sandra Marcela López-Hincapié, Vivianna Alexandra Garrido-Altamar, María de los Ángeles Rodríguez-Gázquez, Camilo Ruiz-Mejía, Lina María Martínez-Sánchez, Gloria Inés Martínez-Domínguez, Luis Eduardo Contreras, Felipe Eduardo Marino-Isaza
Abstract:
Introduction: Doping is a prohibited practice in competitive sports with potential adverse effects; therefore, it is crucial to describe the attitudes of athletes towards this behavior and to determine which of these increase the susceptibility to carry out this practice. Objective: To determine the attitude of high-performance athletes towards doping in a sports institute in the city of Medellin, Colombia. Methods: We performed a cross-sectional study during 2016, with a convenience sample consisting of athletes over 18 years old enrolled in a sports institute of the city of Medellin (Colombia). The athletes completed by themselves the Petroczi and Aidman questionnaire, the Performance Enhancement Attitude Scale (PEAS), adapted to the Spanish language by Morente-Sánchez et al. This scale has 17 items with Likert answer options, each scored from 1 to 6, with a higher total indicating a stronger tendency towards doping practices. Results: 112 athletes were included, with an average age of 21.6 years; 60% of them were male, and the most frequent sports were karate (17%), judo (12.5%) and athletics (9.8%). The average score on the questionnaire was 35.5 points out of a possible 102. The lowest scores were obtained on the following items: 'Doping is necessary' (1.4) and 'Doping isn't cheating, everyone does it' (1.5). Conclusion: In our population, there is a low tendency towards doping practices.
Keywords: sports, doping in sports, athletic performance, attitude
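The PEAS scoring described above (17 Likert items scored 1 to 6, totals from 17 to 102) can be expressed directly. The function below is an illustrative sketch of that arithmetic, not an official scoring tool:

```python
def peas_score(responses):
    """Total PEAS score: 17 Likert items, each scored 1 (disagree) to 6 (agree).

    Totals range from 17 to 102; a higher total indicates a stronger
    tendency towards doping practices.
    """
    if len(responses) != 17:
        raise ValueError("PEAS has 17 items")
    if any(not 1 <= r <= 6 for r in responses):
        raise ValueError("each item is scored from 1 to 6")
    return sum(responses)
```

The study's mean of 35.5 out of 102 sits near the low end of the attainable range, which is what supports the "low tendency" conclusion.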
Procedia PDF Downloads 230
2780 Overview of Risk Management in Electricity Markets Using Financial Derivatives
Authors: Aparna Viswanath
Abstract:
Electricity spot prices are highly volatile under optimal generation capacity scenarios due to factors such as the non-storability of electricity, peak demand at certain periods, generator outages, fuel uncertainty for renewable energy generators, and the huge investments and time needed for generation capacity expansion. As a result, market participants are exposed to price and volume risk, which has led to the development of risk management practices. This paper provides an overview of risk management practices by market participants in electricity markets using financial derivatives.
Keywords: financial derivatives, forward, futures, options, risk management
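As an illustration of the hedging role of the derivatives surveyed here, a cash-settled forward fixes the buyer's effective price regardless of the spot outcome. The sketch below uses invented prices and volumes; it shows only the settlement arithmetic, not any particular market's rules:

```python
def forward_hedge_pnl(spot_at_delivery, forward_price, volume_mwh):
    """Cash-settled forward, seen from the buyer's side.

    Returns (settlement, net_cost): the settlement payment received on the
    forward, and the all-in cost of buying volume_mwh at spot while
    holding the forward.
    """
    settlement = (spot_at_delivery - forward_price) * volume_mwh
    net_cost = spot_at_delivery * volume_mwh - settlement
    return settlement, net_cost
```

Whether spot ends high or low, the net cost always equals forward_price times volume, which is precisely the price-risk removal the abstract refers to.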
Procedia PDF Downloads 479
2779 Development of Standard Thai Appetizer in Rattanakosin Era's Standard: Case Study of Thai Steamed Dumpling
Authors: Nunyong Fuengkajornfung, Pattama Hirunyophat, Tidarat Sanphom
Abstract:
The objectives of this research were: to study the recipe standard of Thai steamed dumpling, to study the ratio of modified starch in Thai steamed dumpling, and to analyze the chemical composition of, and test for Escherichia coli in, Thai steamed dumpling. The experimental processes were designed in two stages: studying the recipe standard of Thai steamed dumpling, and studying the ratio of rice flour to modified starch at three levels, 90:10, 73:30, and 50:50. The sensory evaluation used the 9-point hedonic scale method for attributes such as color, smell, taste, texture and overall liking, in an experiment arranged in a randomized complete block design (RCBD). The statistics used in data analyses were means, standard deviation, one-way ANOVA, Duncan's new multiple range test, and regression equations, at a statistically significant level of .05. The recipe standard was studied from three recipes using sensory evaluation of color, odor, taste, spiciness, texture and total acceptance; the results showed that the second recipe was suitable for development. Of the three rice flour to modified starch ratios (90:10, 73:30, and 50:50), the 50:50 condition received the best scores (like moderately to like very much on the 9-point hedonic scale). Chemical analysis showed 58.63% moisture, 5.45% fat, 4.35% protein, 30.45% carbohydrate, and 1.12% ash. Escherichia coli was not found in laboratory testing.
Keywords: Thai snack in Rattanakosin era, Thai steamed dumpling, modified starch, recipe standard
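The one-way ANOVA used for the sensory scores above can be sketched as a minimal F-statistic computation for equal or unequal group sizes. The example data in the test are invented hedonic ratings, not the study's measurements:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over lists of (e.g. hedonic) scores."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    # between-treatments and within-treatments (error) sums of squares
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) indicates that at least one formulation's mean score differs; a post-hoc test such as Duncan's then locates which ratios differ.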
Procedia PDF Downloads 324
2778 Energy Consumption, Population and Economic Development Dynamics in Nigeria: An Empirical Evidence
Authors: Evelyn Nwamaka Ogbeide-Osaretin, Bright Orhewere
Abstract:
This study examined the role of population in the linkage between energy consumption and economic development in Nigeria. Time series data on energy consumption, population, and economic development were used for the period 1995 to 2020, and the autoregressive distributed lag error correction model (ARDL-ECM) was employed. Economic development had a negative and substantial impact on energy consumption in the long run, while population growth had a positive and significant effect on energy consumption. Government expenditure was also found to affect the level of energy consumption, whereas energy consumption is not a function of the oil price in Nigeria.
Keywords: dynamic analysis, energy consumption, population, economic development, Nigeria
Procedia PDF Downloads 182
2777 Topographic Coast Monitoring Using UAV Photogrammetry: A Case Study in Port of Veracruz Expansion Project
Authors: Francisco Liaño-Carrera, Jorge Enrique Baños-Illana, Arturo Gómez-Barrero, José Isaac Ramírez-Macías, Erik Omar Paredes-Juárez, David Salas-Monreal, Mayra Lorena Riveron-Enzastiga
Abstract:
Topographical changes in coastal areas are usually assessed with airborne LIDAR and conventional photogrammetry. In recent times, Unmanned Aerial Vehicles (UAV) have been used in several photogrammetric applications, including coastline evolution. However, their use goes further: the associated point clouds can be used to generate beach Digital Elevation Models (DEM). We present a methodology for monitoring coastal topographic changes along a 50 km coastline in Veracruz, Mexico, using high-resolution images (less than 10 cm ground resolution) and dense point clouds captured with a UAV. This monitoring takes place in the context of the port of Veracruz expansion project, whose construction began in 2015, and intends to characterize coast evolution and to prevent and mitigate project impacts on coastal environments. The monitoring began with a historical coastline reconstruction from 1979 to 2015 using aerial photography and Landsat imagery. We could define some patterns: the northern part of the study area showed accretion, while the southern part showed erosion. Since the study area is located off the port of Veracruz, a touristic and economically important Mexican city where coastal development structures have been built continuously since 1979, the local beaches of the touristic area are being refilled constantly. Those areas were not described as accretion, since sand-filled trucks refill the beaches located in front of the hotel area every month. The marinas and the commercial port of Veracruz, both the old port and the new expansion, were built in the eroding part of the area. Northward from the city of Veracruz the beaches were described as accretion areas, while southward from the city the beaches were described as erosion areas. One of the problems is the expansion of new development in the southern area of the city, which uses the beach view as an incentive to buy beachfront houses. We assessed coastal changes between seasons using high-resolution images and point clouds during 2016, and preliminary results confirm that UAVs can be used in permanent coast monitoring programs with excellent performance and detail.
Keywords: digital elevation model, high-resolution images, topographic coast monitoring, unmanned aerial vehicle
Procedia PDF Downloads 270
2776 Detection of Trends and Break Points in Climatic Indices: The Case of Umbria Region in Italy
Authors: A. Flammini, R. Morbidelli, C. Saltalippi
Abstract:
The increase of air surface temperature at the global scale is a fact, with values around 0.85 ºC since the late nineteenth century, as is a significant change in the main features of the rainfall regime. Nevertheless, the detected climatic changes are not equally distributed all over the world, but exhibit specific characteristics in different regions. Therefore, studying the evolution of climatic indices in different geographical areas with a prefixed standard approach becomes very useful in order to analyze the existence of climatic trends and compare results. In this work, a methodology to investigate climatic change and its effects on a wide set of climatic indices is proposed and applied at the regional scale in the case study of a Mediterranean area, the Umbria region in Italy. From data of the available temperature stations, nine temperature indices have been obtained, and the existence of trends has been checked by applying the non-parametric Mann-Kendall test, while the non-parametric Pettitt test and the parametric Standard Normal Homogeneity Test (SNHT) have been applied to detect the presence of break points. In addition, aiming to characterize the rainfall regime, data from 11 rainfall stations have been used and a trend analysis has been performed on cumulative annual rainfall depth, daily rainfall, rainy days, and dry period length. The results show a general increase in all temperature indices, even if with a trend pattern dependent on indices and stations, and a general decrease of cumulative annual rainfall and average daily rainfall, with a rainfall distribution over the year different from the past.
Keywords: climatic change, temperature, rainfall regime, trend analysis
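The non-parametric Mann-Kendall trend test applied above can be sketched in a few lines. This is the standard no-ties form; a full analysis of station data would also apply the tie correction to the variance:

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test (no-ties variance).

    Returns (S, Z): S > 0 suggests an upward trend, S < 0 a downward one;
    |Z| > 1.96 is significant at roughly the 5% level.
    """
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)   # sign of each pairwise difference
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

Applied to an annual temperature index, a significant positive Z corresponds to the general warming the abstract reports.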
Procedia PDF Downloads 120
2775 A Review on the Re-Usage of Single-Use Medical Devices
Authors: Lucas B. Naves, Maria José Abreu
Abstract:
Reprocessing of single-use devices has attracted interest in the medical environment over the last decades. Reprocessing is sought in order to reduce the cost of purchasing new medical devices, since a new device can cost almost double the price of a reprocessed product. In this manuscript, we present a literature review on the reuse of medical devices that were originally designed for single use only but whose reprocessing procedures have become more and more effective. We also cover the regulations, the countries that allow this procedure, the classification of these devices, and the most important issue concerning the reutilization of medical devices: how to minimize the risk of gram-positive and gram-negative bacteria, avoid cross-contamination, and prevent transmission of hepatitis B virus (HBV), hepatitis C virus (HCV), and human immunodeficiency virus (HIV).
Keywords: reusing, reprocessing, single-use medical device, HIV, hepatitis B and C
Procedia PDF Downloads 392
2774 The Use of the Limit Cycles of Dynamic Systems for Formation of Program Trajectories of Points Feet of the Anthropomorphous Robot
Authors: A. S. Gorobtsov, A. S. Polyanina, A. E. Andreev
Abstract:
The movement of the points of the feet of an anthropomorphous robot in space occurs along some stable trajectory of a known form. The large number of modifications to the methods of control of biped robots indicates the fundamental complexity of the problem of stability of the program trajectory and, consequently, of the stability of the control of deviations from this trajectory. Existing gait generators use piecewise interpolation of program trajectories, which leads to jumps in the acceleration at the boundaries of the pieces. Another interpolation can be realized using differential equations with fractional derivatives. In this work, an approach to the synthesis of generators of program trajectories is considered. The resulting system of nonlinear differential equations describes a smooth trajectory of movement having rectilinear sites. The method is based on the theory of asymptotic stability of invariant sets. The stability of such systems in the area of localization of oscillatory processes is investigated; the boundary of the area is a bounded closed surface. In the corresponding subspaces of the oscillatory circuits, the resulting stable limit cycles are curves having rectilinear sites. The solution of the problem is carried out by means of the synthesis of a set of continuous smooth controls with feedback. The necessary geometry of closed trajectories of movement is obtained through the introduction of high-order nonlinearities in the control of the stabilization systems. The offered method was used for the generation of trajectories of movement of the points of the feet of an anthropomorphous robot; the synthesis of the robot's program movement was carried out by means of the inverse method.
Keywords: control, limit cycle, robot, stability
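As a generic illustration of the stable limit cycles used here to shape trajectories (and emphatically not the authors' system), the Van der Pol oscillator settles onto a closed orbit from almost any initial state; a forward-Euler sketch, with step size and iteration count chosen for the example:

```python
def van_der_pol(mu=1.0, x0=0.1, y0=0.0, dt=0.001, steps=100_000):
    """Forward-Euler integration of x'' - mu*(1 - x^2)*x' + x = 0.

    Starting near the (unstable) origin, the state is attracted onto a
    stable limit cycle of amplitude about 2.
    """
    x, y = x0, y0
    for _ in range(steps):
        x, y = x + dt * y, y + dt * (mu * (1.0 - x * x) * y - x)
    return x, y
```

A trajectory generator built on such an attractor inherits the cycle's stability: small perturbations decay back onto the programmed closed path, which is the property the abstract exploits for foot trajectories.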
Procedia PDF Downloads 331
2773 Variability Management of Contextual Feature Model in Multi-Software Product Line
Authors: Muhammad Fezan Afzal, Asad Abbas, Imran Khan, Salma Imtiaz
Abstract:
The Software Product Line (SPL) paradigm is used for the development of a family of software products that share common and variable features. The feature model is a domain engineering artifact of SPL that consists of common and variable features with predefined relationships and constraints. Multiple SPLs, such as those for mobile phones and tablets, consist of a number of similar common and variable features. Reusing common and variable features from different SPL domains is a complex task due to the external relationships and constraints of features in the feature model. To increase the reusability of feature model resources from domain engineering, it is required to manage the commonality of features at the level of SPL application development. In this research, we have proposed an approach that combines multiple SPLs into a single domain and converts them to a common feature model. Extracting the common features from different feature models is more effective and reduces cost and time to market for application development. For extracting features from multiple SPLs, the proposed framework consists of three steps: 1) find the variation points, 2) find the constraints, and 3) combine the feature models into a single feature model on the basis of variation points and constraints. By using this approach, the reusability of features across multiple feature models can be increased. The impact of this research is to reduce development cost and time to market and to increase the products of an SPL.
Keywords: software product line, feature model, variability management, multi-SPLs
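The first step of the proposed framework, finding common features versus variation points across feature models, can be sketched with plain set operations. The feature and model names below are invented, and real feature models would additionally carry the relationships and constraints the abstract mentions:

```python
def combine_feature_models(models):
    """Split the features of several SPL feature models into common
    features and variation points (features that differ between SPLs).

    models: dict mapping an SPL name to its set of feature names.
    """
    sets = list(models.values())
    common = set.intersection(*sets)            # shared across every SPL
    variation_points = set.union(*sets) - common  # present in only some SPLs
    return common, variation_points
```

For example, merging a phone SPL and a tablet SPL yields the shared features as mandatory candidates of the combined model and the rest as variation points.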
Procedia PDF Downloads 69
2772 Analysis of Trends and Challenges of Using Renewable Biomass for Bioplastics
Authors: Namasivayam Navaranjan, Eric Dimla
Abstract:
The world needs more quality food, shelter and transportation to meet the demands of a growing population and to improve the living standard of those who currently live below the poverty line. Materials are essential commodities for various applications including food and pharmaceutical packaging, building and automobiles. Petroleum-based plastics are widely used materials for these applications, and their demand is expected to increase. The use of plastics raises environmental issues because a considerable amount of the plastic used worldwide is disposed of in landfills, where its resources are wasted, the material takes up valuable space and blights communities. Some countries have been implementing regulations and/or legislation to increase the reuse, recycling, renewal and remanufacture of materials as well as to minimise the use of non-environmentally friendly materials such as petroleum plastics. However, material waste is still a concern in countries with weak environmental regulation. The development of materials, mostly bioplastics, from renewable biomass resources has become popular in the last decade. It is widely believed that substitution of up to 90% of total plastics consumption by bioplastics is technically possible, and the global demand for bioplastics is estimated to be approximately six times larger than in 2010. Recently, standard polymers like polyethylene (PE), polypropylene (PP), polyvinyl chloride (PVC) or polyethylene terephthalate (PET), but also high-performance polymers such as polyamides or polyesters, have been totally or partially substituted by their renewable equivalents. An example is polylactide (PLA) being used as a substitute in films and injection-moulded products otherwise made of petroleum plastics, e.g. PET. The starting raw materials for bio-based materials are usually sugars or starches that are mostly derived from food resources, and partially also recycled materials from food or wood processing. The risk of lower food availability through increasing prices of basic grains, as a result of competition with biomass-based product sectors for feedstock, also needs to be considered for future bioplastic production. Manufacturing of bioplastic materials is often still reliant upon petroleum as an energy and materials source. Life Cycle Assessment (LCA) of bioplastic products has been conducted to determine the sustainability of a production route; however, the accuracy of LCA depends on several factors and needs improvement. Low oil prices and high production costs may also limit the technically possible growth of these plastics in the coming years.
Keywords: bioplastics, plastics, renewable resources, biomass
Procedia PDF Downloads 308
2771 The Relationship between Violence against Women and Levels of Self-Esteem in Urban Informal Settlements of Mumbai, India: A Cross-Sectional Study
Authors: A. Bentley, A. Prost, N. Daruwalla, D. Osrin
Abstract:
Background: This study aims to investigate the relationship between experiences of violence against women in the family, and levels of self-esteem in women residing in informal settlement (slum) areas of Mumbai, India. The authors hypothesise that violence against women in Indian households extends beyond that of intimate partner violence (IPV), to include other members of the family and that experiences of violence are associated with lower levels of self-esteem. Methods: Experiences of violence were assessed through a cross-sectional survey of 598 women, including questions about specific acts of emotional, economic, physical and sexual violence across different time points, and the main perpetrator of each. Self-esteem was assessed using the Rosenberg self-esteem questionnaire. A global score for self-esteem was calculated and the relationship between violence in the past year and Rosenberg self-esteem score was assessed using multivariable linear regression models, adjusted for years of education completed, and clustering using robust standard errors. Results: 482 (81%) women consented to interview. On average, they were 28.5 years old, had completed 6 years of education and had been married 9.5 years. 88% were Muslim and 46% lived in joint families. 44% of women had experienced at least one act of violence in their lifetime (33% emotional, 22% economic, 24% physical, 12% sexual). Of the women who experienced violence after marriage, 70% cited a perpetrator other than the husband for at least one of the acts. 5% had low self-esteem (Rosenberg score < 15). For women who experienced emotional violence in the past year, the Rosenberg score was 2.6 points lower (p < 0.001). It was 1.2 points lower (p = 0.03) for women who experienced economic violence. For physical or sexual violence in the past year, no statistically significant relationship with Rosenberg score was seen. 
However, for a one-unit increase in the number of different acts of each type of violence experienced in the past year, a decrease in Rosenberg score was seen (-0.62 for emotional, -0.76 for economic, -0.53 for physical and -0.47 for sexual; p < 0.05 for all). Discussion: The high lifetime prevalence of experiences of violence was likely due to the detailed assessment of violence and the inclusion of perpetrators within the family other than the husband. Experiences of emotional or economic violence in the past year were associated with lower Rosenberg scores and therefore lower self-esteem, but no relationship was seen between experiences of physical or sexual violence and Rosenberg score overall. For all types of violence in the past year, a greater number of different acts was associated with a decrease in Rosenberg score. Emotional violence showed the strongest relationship with self-esteem, but for all types of violence, the more complex the pattern of perpetration, with different methods used, the lower the levels of self-esteem. Due to the cross-sectional nature of the study, causal directionality cannot be attributed. Further work to investigate the relationship between the severity of violence and self-esteem, and whether self-esteem mediates relationships between violence and poorer mental health, would be beneficial.
Keywords: family violence, India, informal settlements, Rosenberg self-esteem scale, self-esteem, violence against women
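The Rosenberg scoring used above can be sketched as follows. Note that the 0-3 item coding and the particular set of reverse-scored items are common conventions assumed here for illustration, not details taken from the paper:

```python
# 0-based indices of negatively worded items under one common keying
# (assumed here; published versions differ in item ordering).
REVERSED_ITEMS = {2, 5, 6, 8, 9}

def rosenberg_score(responses):
    """Rosenberg self-esteem total: 10 items coded 0-3, reversed items flipped.

    Totals range from 0 to 30; totals below 15 are often read as low
    self-esteem, matching the cutoff used in the study above.
    """
    if len(responses) != 10:
        raise ValueError("the Rosenberg scale has 10 items")
    return sum(3 - r if i in REVERSED_ITEMS else r
               for i, r in enumerate(responses))
```

Reverse-scoring the negatively worded items is what makes a higher total consistently mean higher self-esteem.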
Procedia PDF Downloads 126
2770 Optimal Pricing Based on Real Estate Demand Data
Authors: Vanessa Kummer, Maik Meusel
Abstract:
Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information: for example, many users do not specify how many rooms they would like or what price they would be willing to pay. Economic analyses often use only complete records. Usually, however, the proportion of complete records is rather small, so most of the available information is neglected. In addition, the complete records may themselves be strongly biased, and the reason that data is missing might itself contain information, which is ignored by that approach. An interesting question is, therefore, whether for economic analyses such as the one at hand there is added value in using the whole data set with imputed missing values, compared to using the usually small share of complete records (the baseline), and how the choice of algorithm affects the result. The imputation of the missing data is done using unsupervised learning. Out of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data, and thereby including the information of all records, the distortion of the first training set (the complete data) vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with each algorithm under several parameter combinations, and comparing the estimates to the actual data.
After finding the optimal parameter set for each algorithm, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on the imputed data sets do not differ significantly from each other; the demand estimate derived from the baseline data, however, does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Demand estimates derived from the whole data set are also much more accurate than the baseline estimate. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning
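The iterative imputation loop described above can be sketched minimally. The version below uses an iterated rank-1 PCA/SVD fit, one of the unsupervised techniques the abstract mentions; the standardization step, rank, iteration count, and toy data are illustrative assumptions, not the study's actual algorithms or settings:

```python
import numpy as np

def iterative_pca_impute(X, rank=1, n_iter=100):
    """Fill missing entries (NaN) by iterating a low-rank PCA/SVD fit.

    Columns are standardized, missing cells start at the column mean
    (z-score 0), and each pass refits a rank-k SVD reconstruction and
    overwrites only the missing cells, so observed values are preserved.
    """
    X = np.asarray(X, dtype=float).copy()
    miss = np.isnan(X)
    mu = np.nanmean(X, axis=0)
    sd = np.nanstd(X, axis=0)
    Z = (X - mu) / sd
    Z[miss] = 0.0
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        Z_hat = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        Z[miss] = Z_hat[miss]          # refill only the missing cells
    return Z * sd + mu

# Toy demand-like data: price is strongly tied to the number of rooms.
rng = np.random.default_rng(0)
rooms = rng.integers(1, 6, size=100).astype(float)
price = 500.0 * rooms + rng.normal(0.0, 10.0, size=100)
X_true = np.column_stack([rooms, price])
X_obs = X_true.copy()
X_obs[0, 1] = np.nan                   # hide one user's price limit
X_filled = iterative_pca_impute(X_obs, rank=1)
print(abs(X_filled[0, 1] - X_true[0, 1]) < 100.0)
```

On this toy two-column data set the hidden price is recovered close to its true value because the columns are strongly correlated; with real search-subscription data the rank and iteration count would be tuned by the hold-out procedure the abstract describes.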
Procedia PDF Downloads 285
2769 Choice Analysis of Ground Access to São Paulo/Guarulhos International Airport Using Adaptive Choice-Based Conjoint Analysis (ACBC)
Authors: Carolina Silva Ansélmo
Abstract:
Airports are demand-generating poles that affect the flow of traffic around them. The airport access system must be fast, convenient, and adequately planned, considering its potential users. An airport with good ground access conditions can provide the user with a more satisfactory access experience. When several transport options are available, service providers must understand users' preferences and the expected quality of service. The present study focuses on airport access in a comparative scenario between bus, private vehicle, subway, taxi, and urban mobility transport applications to São Paulo/Guarulhos International Airport. The objectives are (i) to identify the factors that influence the choice, (ii) to measure Willingness to Pay (WTP), and (iii) to estimate the market share for each mode. The applied method was the Adaptive Choice-Based Conjoint Analysis (ACBC) technique, using Sawtooth Software. Conjoint analysis, rooted in Utility Theory, is a survey technique that quantifies the customer's perceived utility when choosing among alternatives. Assessing user preferences provides insights into their priorities for product or service attributes. An additional advantage of conjoint analysis is its requirement for a smaller sample size compared to other methods. Furthermore, ACBC provides valuable insights into consumers' preferences, willingness to pay, and market dynamics, aiding strategic decision-making on customer experience, pricing, and market segmentation. In the present research, the ACBC questionnaire had the following variables: (i) access time to the boarding point, (ii) comfort in the vehicle, (iii) number of travelers together, (iv) price, (v) supply power, and (vi) type of vehicle. The questionnaire reached 213 valid responses for the scenario of access from the São Paulo city center to São Paulo/Guarulhos International Airport.
As a result, price and the number of travelers are the most relevant attributes for the sample when choosing airport access. The largest estimated market share goes to urban mobility transport applications, followed by buses, private vehicles, taxis, and subways.
Keywords: adaptive choice-based conjoint analysis, ground access to airport, market share, willingness to pay
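The mechanics linking part-worth utilities, market shares, and WTP can be sketched as follows. The part-worths, attribute levels, and the three modes below are invented for illustration, not the study's ACBC estimates; shares use the standard multinomial-logit rule:

```python
import numpy as np

# Hypothetical part-worth utilities per attribute unit (NOT the study's
# ACBC estimates, just an illustration of the mechanics).
partworths = {
    "price":       -0.01,   # utility per BRL of fare
    "access_time": -0.02,   # utility per minute to the boarding point
    "comfort":      0.50,   # utility per comfort level (1-3)
}

# Three of the five access modes, with made-up attribute levels.
alternatives = {
    "ride-hailing app": {"price": 90.0, "access_time": 5.0,  "comfort": 3},
    "bus":              {"price": 7.0,  "access_time": 25.0, "comfort": 1},
    "subway":           {"price": 5.0,  "access_time": 35.0, "comfort": 2},
}

def utility(attrs):
    # total utility = sum of part-worth * attribute level
    return sum(partworths[k] * v for k, v in attrs.items())

# Multinomial-logit market shares: share_i = exp(U_i) / sum_j exp(U_j)
u = np.array([utility(a) for a in alternatives.values()])
shares = np.exp(u) / np.exp(u).sum()

# Willingness to pay for one extra comfort level: the price change that
# exactly offsets the utility gain of +1 comfort.
wtp_comfort = -partworths["comfort"] / partworths["price"]

for name, s in zip(alternatives, shares):
    print(f"{name}: {s:.1%}")
print(f"WTP for +1 comfort level: {wtp_comfort:.0f} BRL")
```

With these invented numbers the ride-hailing app takes the largest share, mirroring the ordering the abstract reports; the real shares depend on the estimated part-worths.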
Procedia PDF Downloads 78
2768 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis
Authors: Meng Su
Abstract:
High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis
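The diffusion-map step described above can be sketched minimally. The construction below assumes a Gaussian kernel and the standard row-stochastic normalization; the kernel bandwidth and the noisy-circle test data are illustrative, and the article's exact construction may differ:

```python
import numpy as np

def diffusion_map(X, n_components=2, epsilon=1.0):
    """Minimal diffusion-map embedding.

    Build a Gaussian-kernel affinity matrix, row-normalize it into a
    Markov transition matrix, and embed each point with the leading
    non-trivial eigenvectors (the constant eigenvector with eigenvalue 1
    is skipped), weighted by their eigenvalues.
    """
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)  # pairwise sq. dists
    K = np.exp(-d2 / epsilon)                # Gaussian kernel affinities
    P = K / K.sum(axis=1, keepdims=True)     # row-stochastic transition matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)           # sort eigenpairs, largest first
    vals, vecs = vals.real[order], vecs.real[:, order]
    return vecs[:, 1:n_components + 1] * vals[1:n_components + 1]

# noisy circle: a one-dimensional manifold embedded in 2-D
t = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
rng = np.random.default_rng(1)
X = np.column_stack([np.cos(t), np.sin(t)]) + 0.01 * rng.normal(size=(60, 2))
Y = diffusion_map(X, n_components=2, epsilon=0.5)
print(Y.shape)  # (60, 2)
```

Because P is similar to a symmetric matrix, its eigenvalues are real; on the noisy circle, the two leading non-trivial eigenvectors recover the ring structure.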
Procedia PDF Downloads 108
2767 Finding Optimal Operation Condition in a Biological Nutrient Removal Process with Balancing Effluent Quality, Economic Cost and GHG Emissions
Authors: Seungchul Lee, Minjeong Kim, Iman Janghorban Esfahani, Jeong Tai Kim, ChangKyoo Yoo
Abstract:
It is hard to maintain the effluent quality of wastewater treatment plants (WWTPs) under fixed types of operational control because the influent flow rate and pollutant load change continuously. The aim of this study is the development of a multi-loop multi-objective control (ML-MOC) strategy in a plant-wide scope, targeting four objectives: 1) maximization of nutrient removal efficiency, 2) minimization of operational cost, 3) maximization of CH4 production in anaerobic digestion (AD) for reuse of CH4 as a heat and energy source, and 4) minimization of N2O gas emission to cope with global warming. First, the benchmark simulation model is modified to describe N2O dynamics in the biological process, namely the benchmark simulation model for greenhouse gases (BSM2G). Then, three types of single-loop proportional-integral (PI) controllers, for DO, NO3, and CH4, are implemented. The optimal set-points of the controllers are found by using a multi-objective genetic algorithm (MOGA). Finally, the ML-MOC is implemented and evaluated in BSM2G. Compared with the reference case, the ML-MOC with the optimal set-points showed the best control performance, improving effluent quality, CH4 productivity, and N2O emission by 34%, 5%, and 79%, respectively, while decreasing operational cost by 65%.
Keywords: benchmark simulation model for greenhouse gases, multi-loop multi-objective controller, multi-objective genetic algorithm, wastewater treatment plant
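One of the single-loop PI controllers described above can be sketched generically. The gains, set-point, and first-order process below are illustrative stand-ins, not BSM2G dynamics; in the study, the MOGA searches over set-points such as the `setpoint` argument here:

```python
# Minimal discrete PI loop on a first-order process, standing in for one of
# the single-loop controllers (e.g., the DO loop). Gains, set-point, and
# process constants are illustrative, not taken from BSM2G.
def simulate_pi(setpoint, kp=2.0, ki=0.5, dt=0.1, steps=600, tau=1.0):
    y, integral = 0.0, 0.0                  # process output and integrator state
    for _ in range(steps):
        error = setpoint - y
        integral += error * dt              # accumulate the integral term
        u = kp * error + ki * integral      # PI control law
        y += dt * (-y + u) / tau            # first-order process: tau*y' = -y + u
    return y

y_final = simulate_pi(setpoint=2.0)         # a MOGA would search over such set-points
print(round(y_final, 3))
```

The integral term removes the steady-state offset, so the output settles at the commanded set-point; a multi-objective optimizer then scores each candidate set-point vector on effluent quality, cost, and emissions.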
Procedia PDF Downloads 503
2766 Evaluation of Knowledge and Acceptance of Food Irradiated by Individual from Food Bank of Brazil
Authors: Juliana Altavista Sagretti Gallo, Susy Frey Sabato
Abstract:
Despite the poverty in the world, a third of all food produced is wasted. FAO, the Food and Agriculture Organization of the United Nations, points out the need to combine actions and new technologies to combat hunger and waste, in contrast to the high production of food in the world. Treating food with ionizing radiation has brought many positive results, such as increased shelf life and control of insect infestation. Food banks are organizations that act at various points of the food chain to collect and distribute food to the needy. The aim of this study was to initiate a partnership between irradiation and the food bank through the development of a questionnaire to evaluate and disseminate the knowledge and acceptance of individuals in a food bank in Brazil. This study also aimed to standardize a base questionnaire for future research assessing irradiated foods. For the construction of the questionnaire as a measuring instrument, a comprehensive and rigorous literature review was made, covering qualitative research, questionnaires, sensory evaluation, and irradiated food. Three stages of pre-tests were necessary, and experts in related fields were consulted. As a result, the questionnaire has three parts: personal questions, assertive and multiple-choice questions, and finally an informative question. The questionnaire was applied at the Ceagesp food bank, in the biggest food center in Brazil. In conclusion, 30% of participants of the Ceagesp bank had already heard of food irradiation but did not know about the mechanism and rejected the idea, associating it with radioactivity and danger. The video shown in the last question of the questionnaire disseminated the idea of safety. After watching it, all individuals declared that they understood the goal of the treatment and would accept buying and consuming irradiated food.
Keywords: food bank, questionnaire, irradiated food, acceptance of irradiated food
Procedia PDF Downloads 323
2765 An Adaptive Controller Method Based on Full-State Linear Model of Variable Cycle Engine
Authors: Jia Li, Huacong Li, Xiaobao Han
Abstract:
Because a VCE (variable cycle engine) has more variable geometry parameters than a conventional engine, this paper presents an adaptive controller method based on a full-state linear model of the VCE and simulates it to solve the multivariable controller design problem over the whole flight envelope. First, the static and dynamic performance of the bypass ratio and other state parameters caused by the variable geometry components is analyzed, and a nonlinear component model of the VCE is developed. Then, based on the component model, small-deviation linearization with respect to the main fuel flow (Wf), the tail nozzle throat area (A8), and the rear bypass ejector angle (A163) is used to set up multiple linear models in which the variable geometry parameters can be inputs. Second, adaptive controllers are designed for the VCE linear models at different nominal points. Considering modeling uncertainties and external disturbances, the adaptive law is derived via a Lyapunov function. The simulation results showed that the adaptive controller method based on the full-state linear model, using the rear bypass ejector angle as an input, effectively solved the multivariable control problems of the VCE. At all nominal points, the performance could track the desired closed-loop reference instructions. The settling time was less than 1.2 s and the system overshoot was less than 1%; at the same time, the steady-state errors were less than 0.5% and the dynamic tracking errors were less than 1%. In addition, the designed controller could effectively suppress interference and reach the desired commands under different external random noise signals.
Keywords: variable cycle engine (VCE), full-state linear model, adaptive control, bypass ratio
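A Lyapunov-derived adaptive law of the kind described above can be illustrated on a scalar model-reference problem. This is a generic MRAC sketch with invented constants, not the VCE model (whose states and inputs Wf, A8, A163 are multivariable); the adaptive law follows from the Lyapunov function noted in the comment:

```python
# Scalar MRAC sketch: plant x' = a*x + b*u with pole a unknown to the
# controller, reference model x_m' = a_m*x_m + b_m*r. The Lyapunov-derived
# law adapts the feedback gain k so that x tracks x_m. All constants are
# illustrative; this is NOT the VCE model.
a, b = 1.0, 1.0            # plant: unstable pole a (unknown), known gain b > 0
a_m, b_m = -2.0, 2.0       # stable reference model
gamma = 5.0                # adaptation rate
dt, steps = 0.001, 20000   # 20 s of simulated time

x, x_m, k = 0.0, 0.0, 0.0
for _ in range(steps):
    r = 1.0                              # constant reference command
    u = -k * x + (b_m / b) * r           # feedback + feedforward control law
    e = x - x_m                          # tracking error
    k += dt * gamma * e * x              # law from V = e^2/2 + b*(k - k*)^2/(2*gamma)
    x += dt * (a * x + b * u)            # Euler step of the plant
    x_m += dt * (a_m * x_m + b_m * r)    # Euler step of the reference model

print(round(abs(x - x_m), 4))            # tracking error after 20 s
```

With V as above, V' = a_m*e^2 <= 0, so the tracking error decays and the gain converges to k* = (a - a_m)/b = 3 for this constant command.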
Procedia PDF Downloads 318
2764 Classifying Students for E-Learning in Information Technology Course Using ANN
Authors: Sirilak Areerachakul, Nat Ployong, Supayothin Na Songkla
Abstract:
The objective of this research is to select the most accurate model, using a neural network technique, as a way to filter potential students who enroll in the IT course by electronic learning at Suan Sunandha Rajabhat University. It is designed to help students select the appropriate courses by themselves. The results showed that the most accurate model used 100-fold cross-validation, achieving 73.58% accuracy.
Keywords: artificial neural network, classification, students, e-learning
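The k-fold cross-validation procedure used for model selection above can be sketched generically. The classifier below is a nearest-centroid stand-in (not the paper's ANN), and the data are synthetic; only the fold mechanics are the point:

```python
import numpy as np

def kfold_accuracy(X, y, k=10, seed=0):
    """Estimate classification accuracy with k-fold cross-validation.

    Each fold is held out once as a test set while a nearest-centroid
    classifier (a stand-in for the paper's neural network) is fit on the
    remaining folds; the k fold accuracies are averaged.
    """
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(X)), k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        classes = np.unique(y[train])
        # fit: one centroid per class
        centroids = np.array([X[train][y[train] == c].mean(axis=0) for c in classes])
        # predict: label of the nearest centroid
        d = ((X[test][:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        pred = classes[d.argmin(axis=1)]
        accs.append((pred == y[test]).mean())
    return float(np.mean(accs))

# two well-separated synthetic classes (100 samples, 2 features)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(5.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(kfold_accuracy(X, y, k=10))
```

A "100-fold" setup simply passes k=100 with enough samples per fold; the averaged hold-out accuracy is what the paper reports as 73.58%.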
Procedia PDF Downloads 426
2763 Iris Recognition Based on the Low Order Norms of Gradient Components
Authors: Iman A. Saad, Loay E. George
Abstract:
The iris pattern is an important biometric feature of the human body; it has become a very active topic in both research and practical applications. In this paper, an algorithm is proposed for iris recognition, and a simple, efficient, and fast method is introduced to extract a set of discriminatory features using a first-order gradient operator applied to grayscale images. The gradient-based features are robust, to a certain extent, against the variations that may occur in the contrast or brightness of iris image samples; these variations mostly occur due to lighting differences and camera changes. At first, the iris region is located; after that, it is remapped to a rectangular area of size 360x60 pixels. Also, a new method is proposed for detecting eyelash and eyelid points; it depends on statistical analysis of the image to mark the eyelash and eyelid as noise points. In order to cover feature localization variations, the rectangular iris image is partitioned into N overlapped sub-images (blocks); then, from each block, a set of different average directional gradient density values is calculated to be used as a texture feature vector. The gradient operators are applied along the horizontal, vertical, and diagonal directions. The low-order norms of the gradient components are used to establish the feature vector. A Euclidean-distance-based classifier is used as the matching metric for determining the degree of similarity between the feature vector extracted from the tested iris image and the template feature vectors stored in the database. Experimental tests were performed using 2639 iris images from the CASIA V4-Interval database; the attained recognition accuracy reached up to 99.92%.
Keywords: iris recognition, contrast stretching, gradient features, texture features, Euclidean metric
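The block-wise directional-gradient features and Euclidean matching can be sketched as follows. The block count, overlap, norm order, and the square toy "iris strips" (the paper uses 360x60 strips) are illustrative guesses, not the paper's exact configuration:

```python
import numpy as np

def gradient_block_features(img, blocks=4, p=1):
    """Directional-gradient features with a low-order norm.

    The grayscale image is split into overlapping blocks; in each block,
    the average p-norm density of the horizontal, vertical, and diagonal
    first-order gradients is taken as one feature. Block count, overlap,
    and norm order are illustrative choices, not the paper's exact setup.
    """
    img = img.astype(float)
    gx = np.abs(np.diff(img, axis=1))            # horizontal gradient
    gy = np.abs(np.diff(img, axis=0))            # vertical gradient
    gd = np.abs(img[1:, 1:] - img[:-1, :-1])     # diagonal gradient
    h = img.shape[0]
    step = h // blocks
    size = step + step // 2                      # roughly 50% block overlap
    feats = []
    for g in (gx, gy, gd):
        for i in range(blocks):
            for j in range(blocks):
                blk = g[i * step:i * step + size, j * step:j * step + size]
                feats.append((blk ** p).mean() ** (1.0 / p))
    return np.array(feats)

def match(f1, f2):
    """Euclidean distance between feature vectors (smaller = more similar)."""
    return float(np.linalg.norm(f1 - f2))

rng = np.random.default_rng(2)
iris_a = rng.random((60, 60))                    # stand-in for an iris strip
iris_a_noisy = np.clip(iris_a + 0.02 * rng.normal(size=(60, 60)), 0.0, 1.0)
iris_b = rng.random((60, 60))                    # a different "iris"
same = match(gradient_block_features(iris_a), gradient_block_features(iris_a_noisy))
diff = match(gradient_block_features(iris_a), gradient_block_features(iris_b))
print(same < diff)                               # same eye under noise is closer
```

Because the features are block averages of gradient magnitudes, small brightness perturbations barely move them, while a different texture does, which is the robustness property the abstract claims.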
Procedia PDF Downloads 335
2762 Advantages of Computer Navigation in Knee Arthroplasty
Authors: Mohammad Ali Al Qatawneh, Bespalchuk Pavel Ivanovich
Abstract:
Computer navigation has been introduced in total knee arthroplasty to improve the accuracy of the procedure, in particular the accuracy of bone resection in the coronal and sagittal planes. It also normalizes the rotational alignment of the femoral component and allows the deformation of soft tissues in the coronal plane to be fully assessed and balanced. This work is devoted to the advantages of using computer navigation technology in total knee arthroplasty in 62 patients (11 men and 51 women) suffering from gonarthrosis, aged 51 to 83 years, operated on using a computer navigation system and followed up to 3 years from the moment of surgery. During the examination, the deformity variant was determined, and radiometric parameters of the knee joints were measured using the Knee Society Score (KSS), Functional Knee Society Score (FKSS), and Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) scales. Functional stress tests were also performed to assess the stability of the knee joint in the frontal plane, together with functional indicators of the range of motion. After surgery, improvement was observed on all scales: the WOMAC score decreased 5.90-fold, to a median of 11 points (p < 0.001); the KSS increased 3.91-fold, reaching 86 points (p < 0.001); and the FKSS increased 2.08-fold, reaching 94 points (p < 0.001). After TKA, axis deviation of the lower limbs of more than 3 degrees was observed in 4 patients (6.5%), and frontal instability of the knee joint in only 2 cases (3.2%). The incidence of sagittal instability of the knee joint after the operation was 9.6%. The range of motion increased 1.25-fold; the volume of movement averaged 125 degrees (p < 0.001).
Computer navigation increases the accuracy of the spatial orientation of the endoprosthesis components in all planes, reduces the variability of the axis of the lower limbs to within ±3°, and allows the best results of surgical interventions to be achieved; it can be used to solve most basic tasks, achieving excellent and good outcomes in 100% of cases according to the WOMAC scale. With diaphyseal deformities of the femur and/or tibia, as well as with obstruction of their medullary canal, the use of computer navigation is the method of choice. Computer navigation prevents the occurrence of flexion contracture and hyperextension of the knee joint during the distal femoral cut. Using a navigation system achieves high-precision implantation of the endoprosthesis components and an adequate balance of the ligaments, which contributes to the stability of the joint, reduces pain, and allows a good functional result of the treatment to be achieved.
Keywords: knee joint, arthroplasty, computer navigation, advantages
Procedia PDF Downloads 90
2761 Risk Tolerance in Youth With Emerging Mood Disorders
Authors: Ange Weinrabe, James Tran, Ian B. Hickie
Abstract:
Risk-taking behaviour is common during youth. In the period between adolescence and early adulthood, young people (aged 15-25 years) are more vulnerable to mood disorders, such as anxiety and depression. What impact does an emerging mood disorder have on decision-making in youth at critical decision points in their lives? In this article, we explore the impact of risk and ambiguity on youth decision-making in a clinical setting using a well-known economic experiment. At two time points, separated by six to eight weeks, we measured risky and ambiguous choices concurrently with findings from three psychological questionnaires, the 10-item Kessler Psychological Distress Scale (K10), the 17-item Quick Inventory of Depressive Symptomatology Adolescent Version (QIDS-A17), and the 12-item Somatic and Psychological Health Report (SPHERE-12), for young help seekers aged 16-25 (n=30, mean age 19.22 years, 19 males). When first arriving for care, 50% (n=15) of participants experienced severe anxiety (K10 ≥ 30) and were severely depressed (QIDS-A17 ≥ 16). In Session 2, taking attrition (n=5) into account, 44% (n=11) remained severe across the full battery of questionnaires. When applying multiple regression analyses to the pooled sample of observations (N=55) across both sessions, we found that participants who rated as severely anxious avoided making risky decisions. We suggest there is a weak, marginally significant (p=0.09) relation between risk avoidance and severe anxiety scores as measured by the K10. Our findings may support working with novel tools with which to evaluate youth experiencing an emerging mood disorder and the cognitive capacities influencing their decision-making.
Keywords: anxiety, decision-making, risk, adolescence
Procedia PDF Downloads 116
2760 Parabolic Impact Law of High Frequency Exchanges on Price Formation in Commodities Market
Authors: L. Maiza, A. Cantagrel, M. Forestier, G. Laucoin, T. Regali
Abstract:
Evaluation of the impact of High Frequency Trading (HFT) on financial markets is very important for traders who use market analysis to detect winning transaction opportunities. Analysis of HFT data on the tobacco commodity market is discussed here, and an interesting linear relationship has been shown between trading frequency and the difference between the averaged trading prices above and below the considered trading frequency. This may open new perspectives on the understanding of market data and could provide a possible interpretation of Adam Smith's invisible hand.
Keywords: financial market, high frequency trading, analysis, impacts, Adam Smith's invisible hand
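A relationship of the kind reported above can be checked with an ordinary least-squares fit. The data below are synthetic stand-ins for the frequency/price-difference observations (the actual tobacco-market values are not reproduced here):

```python
import numpy as np

# Synthetic stand-in for the frequency / price-difference observations
# (the actual tobacco-market values are not reproduced here).
rng = np.random.default_rng(3)
freq = np.linspace(1.0, 100.0, 50)                         # trading-frequency bins
price_diff = 0.8 * freq + 5.0 + rng.normal(0.0, 2.0, 50)   # above-vs-below price gap

# Ordinary least squares: price_diff ~ slope * freq + intercept
A = np.column_stack([freq, np.ones_like(freq)])
(slope, intercept), *_ = np.linalg.lstsq(A, price_diff, rcond=None)
r = np.corrcoef(freq, price_diff)[0, 1]                    # strength of linearity
print(round(slope, 2), round(r, 3))
```

A correlation coefficient close to 1 together with a stable slope across subsamples is what would support the linear claim; curvature in the residuals would instead point toward the parabolic law named in the title.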
Procedia PDF Downloads 359
2759 AAV-Mediated Human α-Synuclein Expression in a Rat Model of Parkinson's Disease – Further Characterization of PD Phenotype, Fine Motor Functional Effects as Well as Neurochemical and Neuropathological Changes over Time
Authors: R. Pussinen, V. Jankovic, U. Herzberg, M. Cerrada-Gimenez, T. Huhtala, A. Nurmi, T. Ahtoniemi
Abstract:
Targeted over-expression of human α-synuclein using viral-vector-mediated gene delivery into the substantia nigra of rats and non-human primates has been reported to lead to dopaminergic cell loss and the formation of α-synuclein aggregates reminiscent of Lewy bodies. We have previously shown how AAV-mediated expression of α-synuclein is seen in the chronic phenotype of the rats over a 16-week follow-up period. In the context of these findings, we attempted to further characterize the long-term PD-related functional and motor deficits as well as the neurochemical and neuropathological changes in the AAV-mediated α-synuclein transfection model in rats during a chronic follow-up period. Different titers of recombinant AAV expressing human α-synuclein (A53T) were stereotaxically injected unilaterally into the substantia nigra of Wistar rats. Rats were allowed to recover for 3 weeks prior to initial baseline behavioral testing with the rotational asymmetry test, stepping test, and cylinder test. A similar behavioral test battery was applied again at weeks 5, 9, 12 and 15. In addition to traditionally used rat PD model tests, the MotoRater system, a high-speed kinematic gait monitoring platform, was applied during the follow-up period, with the evaluation focused on differences in gait between groups. Tremor analysis was performed on weeks 9, 12 and 15. In addition to the behavioral end-points, dopamine and its metabolites were evaluated neurochemically in the striatum. Furthermore, the integrity of the dopamine active transport (DAT) system was evaluated by using 123I-β-CIT and SPECT/CT imaging on weeks 3, 8 and 12 after AAV-α-synuclein transfection. Histopathology was examined from end-point samples at 3 or 12 weeks after AAV-α-synuclein transfection to evaluate dopaminergic cell viability and microglial (Iba-1) activation status in the substantia nigra by using stereological analysis techniques.
This study focused on the characterization and validation of the previously published AAV-α-synuclein transfection model in rats, with the addition of novel end-points. We present the long-term phenotype of AAV-α-synuclein-transfected rats with traditionally used behavioral tests but also by using novel fine motor analysis techniques and tremor analysis, which provide new insight into the unilateral effects of AAV-α-synuclein transfection. We also present data on neurochemical and neuropathological end-points for the dopaminergic system in the model and how well they correlate with the behavioral phenotype.
Keywords: adeno-associated virus, alpha-synuclein, animal model, Parkinson's disease
Procedia PDF Downloads 295
2758 Correlation Results Based on Magnetic Susceptibility Measurements by In-Situ and Ex-Situ Measurements as Indicators of Environmental Changes Due to the Fertilizer Industry
Authors: Nurin Amalina Widityani, Adinda Syifa Azhari, Twin Aji Kusumagiani, Eleonora Agustine
Abstract:
Fertilizer industry activities contribute to environmental changes, which have become a significant problem in this era of globalization. The parameters used as criteria to identify changes in the environment can be drawn from aspects of physics, chemistry, and biology. One aspect that can be assessed quickly and efficiently to describe environmental change is a physical one: the magnetic susceptibility (χ) value. The rock magnetism method can be used as a proxy indicator of environmental changes, seen from the value of magnetic susceptibility; it is based on magnetic susceptibility studies to measure and classify the degree of pollutant elements that cause changes in the environment. This research was conducted in the area around a fertilizer plant, with five coring points on each track, each cored to a depth of 15 cm. Magnetic susceptibility measurements were performed both in-situ and ex-situ. In-situ measurements were carried out directly with the SM30 instrument by placing it on the soil surface at each measurement point, thereby obtaining the magnetic susceptibility value. Ex-situ measurements were performed in the laboratory with the Bartington MS2B susceptibility instrument on coring samples taken every 5 cm. The in-situ measurements show that the magnetic susceptibility at the surface varies, with the lowest values at the second and fifth points (-0.81) and the highest value at the third point (0.345). The ex-situ measurements reveal the variation of magnetic susceptibility at each coring depth. At a depth of 0-5 cm, the highest XLF value, 494.8 (x10-8 m³/kg), is at the third point, while the lowest, 187.1 (x10-8 m³/kg), is at the first point.
At a depth of 6-10 cm, the highest XLF value is at the second point, at 832.7 (x10-8 m³/kg), while the lowest is at the first point, at 211 (x10-8 m³/kg). At a depth of 11-15 cm, the highest XLF value, 857.7 (x10-8 m³/kg), is at the second point, whereas the lowest, 83.3 (x10-8 m³/kg), is at the fifth point. Based on the in-situ and ex-situ measurements, it can be seen that the highest magnetic susceptibility values from the surface samples are at the third point.
Keywords: magnetic susceptibility, fertilizer plant, Bartington MS2B, SM30
Procedia PDF Downloads 342