Search results for: Paul Gee model
15804 Finite Element Modelling and Analysis of Human Knee Joint
Authors: R. Ranjith Kumar
Abstract:
Computer modeling and simulation of human movement plays an important role in sports and rehabilitation. Accurate modeling and analysis of the human knee joint is complex because of its complicated structure, whose geometry is not easy to represent with a solid model. As part of this project, surface reconstruction from a series of CT scan images of the human knee joint is carried out using 3D Slicer, an open-source software package. From this surface reconstruction, triangular meshes are created on the reconstructed surface using MeshLab (another open-source package). The final triangular mesh model is imported into SolidWorks, a 3D mechanical CAD modeling package. Finally, this CAD model is imported into ABAQUS, a finite element analysis package, for analyzing the knee joint. The results obtained are encouraging and provide an accurate way of modeling and analyzing biological parts without human intervention.
Keywords: SolidWorks, CATIA, Pro-E, CAD
Procedia PDF Downloads 123
15803 Logistic Regression Based Model for Predicting Students’ Academic Performance in Higher Institutions
Authors: Emmanuel Osaze Oshoiribhor, Adetokunbo MacGregor John-Otumu
Abstract:
In recent years, there has been a desire to forecast student academic achievement prior to graduation, to help students improve their grades, particularly those with poor performance. The goal of this study is to employ supervised learning techniques to construct a predictive model of student academic achievement. Many academics have already constructed models that predict student academic achievement based on factors such as smoking, demography, culture, social media, parents' educational background, parents' finances, and family background, to name a few. These features and the models employed may not have correctly classified students in terms of their academic performance. The model in this study is built using a logistic regression classifier with basic features such as the previous semester's course score, class attendance, class participation, and the total number of course materials or resources the student is able to cover per semester, to predict whether the student will perform well in related courses in the future. The model outperformed other classifiers such as Naive Bayes, support vector machine (SVM), decision tree, random forest, and AdaBoost, returning 96.7% accuracy. The model is available as a desktop application, allowing both instructors and students to benefit from a user-friendly interface for predicting student academic achievement. As a result, it is recommended that both students and professors use this tool to better forecast outcomes.
Keywords: artificial intelligence, ML, logistic regression, performance, prediction
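A minimal sketch of the kind of classifier the abstract describes, written with scikit-learn's LogisticRegression; the feature names (previous-semester score, attendance, participation, materials covered) follow the abstract, but the synthetic data, label rule, and train/test split are illustrative assumptions rather than the authors' dataset.

```python
# Illustrative sketch only: synthetic records stand in for the authors' student data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 500
# Features named after the abstract: previous course score, attendance rate,
# class participation, and fraction of course materials covered.
X = np.column_stack([
    rng.uniform(30, 100, n),   # previous semester course score
    rng.uniform(0.4, 1.0, n),  # attendance to class
    rng.uniform(0.0, 1.0, n),  # class participation
    rng.uniform(0.2, 1.0, n),  # share of course materials covered
])
# Hypothetical label: "performs well" if a weighted combination exceeds a cut-off.
y = (0.5 * X[:, 0] / 100 + 0.2 * X[:, 1] + 0.15 * X[:, 2] + 0.15 * X[:, 3] > 0.6).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```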
Procedia PDF Downloads 96
15802 A Curricular Approach to Organizational Mentoring Programs: The Integrated Mentoring Curriculum Model
Authors: Christopher Webb
Abstract:
This work presents a new model of mentoring in an organizational environment and has important implications for both practice and research. The model frames the organizational environment as organizational curriculum, which includes the elements that affect learning within the organization: the organizational structure and culture, roles within the organization, and accessibility of knowledge. The program curriculum includes the elements of the mentoring program, including materials, training, and scheduled events for the program participants. The term dyadic curriculum is coined in this work. The dyadic curriculum describes the participation, behavior, and identities of the pairs participating in mentorships, including the identity work of the participants and their views of each other. Much of this curriculum is unprescribed and is unique within each dyad. It describes how participants mediate the elements of the organizational and program curricula. These three curricula interact and affect each other in predictable ways. A detailed example of a mentoring program framed in this model is provided.
Keywords: curriculum, mentoring, organizational learning and development, social learning
Procedia PDF Downloads 201
15801 Using Lean Six-Sigma in the Improvement of Service Quality at Aviation Industry: Case Study at the Departure Area in KKIA
Authors: Tareq Al Muhareb, Jasper Graham-Jones
Abstract:
Service quality is a significant element in the aviation industry, especially in international airports. In this paper, the researchers built a model based on Lean Six Sigma methodologies and applied it in the departure area at KKIA (King Khalid International Airport) in order to assess it. This model is characterized by many special features that can overcome the cultural differences in the aviation industry, which are considered the most critical circumstance in this field. Applying the model of this study depends on following the DMAIC procedure systemized within lean thinking. This Lean Six Sigma model, as a managerial procedure, is mostly focused on a change management culture that requires a high level of planning, organizing, modifying, and controlling in order to benefit from strengths as well as remedy weaknesses.
Keywords: Lean Six Sigma, service quality, aviation industry, KKIA (King Khalid International Airport), SERVQUAL
Procedia PDF Downloads 424
15800 Optimization of Syngas Quality for Fischer-Tropsch Synthesis
Authors: Ali Rabah
Abstract:
This research received no grant or financial support from any public, commercial, or non-governmental agency. The author conducted this work as part of his normal research activities as a professor of chemical engineering at the University of Khartoum, Sudan. While fossil oil reserves have been receding, the demand for diesel and gasoline has been growing. In recent years, syngas of biomass origin has been emerging as a viable feedstock for Fischer-Tropsch (FT) synthesis, a process for manufacturing synthetic gasoline and diesel. This paper reports the optimization of syngas quality to match FT synthesis requirements. The optimization model maximizes the thermal efficiency under the constraint H2/CO ≥ 2.0 and operating conditions of equivalence ratio (0 ≤ ER ≤ 1.0), steam-to-biomass ratio (0 ≤ SB ≤ 5), and gasification temperature (500 °C ≤ Tg ≤ 1300 °C). The optimization model is executed using the optimization section of the Model Analysis Tools of the Aspen Plus simulator and is tested using eleven (11) types of MSW. The optimum operating conditions under which the objective function and the constraint are satisfied are ER = 0, SB = 0.66-1.22, and Tg = 679-763 °C. Under these optimum operating conditions, the syngas quality is H2 = 52.38-58.67 mole percent, LHV = 12.55-17.15 MJ/kg, N2 = 0.38-2.33 mole percent, and H2/CO ≥ 2.15. The generalized optimization model reported could be extended to any other type of biomass and coal.
Keywords: syngas, MSW, optimization, Fischer-Tropsch
Procedia PDF Downloads 80
15799 Extension of a Competitive Location Model Considering a Given Number of Servers and Proposing a Heuristic for Solving
Authors: Mehdi Seifbarghy, Zahra Nasiri
Abstract:
The competitive location problem deals with locating new facilities to provide a service (or goods) to the customers of a given geographical area where other facilities (competitors) offering the same service are already present. The new facilities will have to compete with the existing facilities for capturing the market share. This paper proposes a new model to maximize the market share in which customers choose facilities based on traveling time, waiting time, and attractiveness. The attractiveness of a facility is considered as a parameter in the model. A heuristic is proposed to solve the problem.
Keywords: competitive location, market share, facility attractiveness, heuristic
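A toy greedy heuristic in the spirit of the abstract (not the authors' heuristic), assuming a Huff-style choice rule in which customers split demand in proportion to attractiveness divided by travel-plus-waiting time; all site names and numbers are invented.

```python
# Toy greedy heuristic for a competitive location problem (illustrative only).
# Existing competitors are fixed; we greedily pick p new sites that maximise
# the market share captured under a Huff-style utility.
customers = {"c1": 100, "c2": 80, "c3": 120}            # demand weights
competitors = {"k1": 1.0}                                # attractiveness of existing facilities
candidates = {"s1": 1.2, "s2": 0.9, "s3": 1.5}           # attractiveness of candidate sites
travel = {  # travel + waiting time from each customer to each facility (assumed data)
    ("c1", "k1"): 10, ("c2", "k1"): 5, ("c3", "k1"): 8,
    ("c1", "s1"): 4, ("c2", "s1"): 9, ("c3", "s1"): 7,
    ("c1", "s2"): 6, ("c2", "s2"): 3, ("c3", "s2"): 9,
    ("c1", "s3"): 8, ("c2", "s3"): 7, ("c3", "s3"): 3,
}

def utility(c, f, attract):
    return attract[f] / travel[(c, f)]

def captured_share(new_sites):
    total = 0.0
    for c, demand in customers.items():
        new_u = sum(utility(c, s, candidates) for s in new_sites)
        old_u = sum(utility(c, k, competitors) for k in competitors)
        total += demand * new_u / (new_u + old_u)
    return total

def greedy(p):
    chosen = []
    for _ in range(p):
        best = max((s for s in candidates if s not in chosen),
                   key=lambda s: captured_share(chosen + [s]))
        chosen.append(best)
    return chosen

sites = greedy(p=2)
print("chosen sites:", sites, "captured demand:", round(captured_share(sites), 1))
```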
Procedia PDF Downloads 520
15798 Research on Air pollution Spatiotemporal Forecast Model Based on LSTM
Authors: JingWei Yu, Hong Yang Yu
Abstract:
At present, the increasingly serious air pollution in various cities of China has made people pay more attention to the air quality index (hereinafter referred to as AQI) of their living areas. Facing this situation, it is of great significance to predict air pollution in heavily polluted areas. In this paper, based on the LSTM time series model, a spatiotemporal prediction model of PM2.5 concentration in Mianyang, Sichuan Province, is established. The model fully considers the temporal variability and spatial distribution characteristics of PM2.5 concentration. The spatial correlation of air quality at different locations is captured by using the air quality status of other nearby monitoring stations, including AQI and meteorological data, to predict the air quality at a given monitoring station. The experimental results show that the method has good prediction accuracy: the fitting degree with the actual measured data reaches more than 0.7, so it can be applied to the modeling and prediction of the spatial and temporal distribution of regional PM2.5 concentration.
Keywords: LSTM, PM2.5, neural networks, spatio-temporal prediction
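A minimal sketch of an LSTM regressor of the kind described, using the TensorFlow/Keras API; the window length, feature count, and random data stand in for the Mianyang station records and are assumptions only.

```python
# Sketch of an LSTM regressor for PM2.5 forecasting (synthetic data; the station
# layout, window length, and feature set are assumptions, not the paper's).
import numpy as np
import tensorflow as tf

timesteps, n_features = 24, 6   # e.g. 24 hourly records of AQI/meteorology from nearby stations
X = np.random.rand(1000, timesteps, n_features).astype("float32")
y = np.random.rand(1000, 1).astype("float32")   # next-hour PM2.5 at the target station

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(timesteps, n_features)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),                    # predicted PM2.5 concentration
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1], verbose=0))
```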
Procedia PDF Downloads 132
15797 Evaluation of Ceres Wheat and Rice Model for Climatic Conditions in Haryana, India
Authors: Mamta Rana, K. K. Singh, Nisha Kumari
Abstract:
Simulation models, with their soil-weather-plant-atmosphere interacting systems, are important tools for assessing crops under changing climate conditions. The CERES-Wheat and CERES-Rice models (v4.6, DSSAT) were calibrated and evaluated for one of the major wheat- and rice-producing states of India, Haryana. The simulation runs were made under irrigated conditions and three N-P-K fertilizer application doses to estimate crop yield and other growth parameters, along with the phenological development of the crop. The genetic coefficients were derived by iteratively manipulating the relevant coefficients that characterize the phenological processes of the wheat and rice crops until the best match was obtained between the simulated and observed anthesis, physiological maturity, and final grain yield. The model was validated by plotting the simulated and remote-sensing-derived LAI; the LAI product from remote sensing provides the advantage of spatial, timely, and accurate assessment of the crop. For validating the yield and yield components, the percentage error between the observed and simulated data was calculated. The analysis shows that the model can be used to simulate crop yield and yield components for wheat and rice cultivars under different management practices. During validation, the error percentage was less than 10%, indicating the utility of the calibrated model for climate risk assessment in the selected region.
Keywords: simulation model, CERES-Wheat and Rice model, crop yield, genetic coefficient
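The validation step described above reduces to a simple percentage-error check; the sketch below shows it for placeholder values of anthesis, maturity, and grain yield (not the study's observations).

```python
# Minimal sketch of the validation step: percentage error between observed and
# simulated values (the numbers here are placeholders, not the study's data).
observed = {"anthesis_days": 95, "maturity_days": 135, "grain_yield_kg_ha": 4800}
simulated = {"anthesis_days": 97, "maturity_days": 132, "grain_yield_kg_ha": 5100}

for var in observed:
    err = abs(simulated[var] - observed[var]) / observed[var] * 100
    status = "acceptable (<10%)" if err < 10 else "re-calibrate"
    print(f"{var}: error = {err:.1f}% -> {status}")
```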
Procedia PDF Downloads 303
15796 Single Ended Primary Inductance Converter with Internal Model Controller
Authors: Fatih Suleyman Taskincan, Ahmet Karaarslan
Abstract:
In this article, the study and analysis of the single-ended primary inductance converter (SEPIC) are presented for battery charging applications to be used in military applications. This kind of converter is chosen for its advantage of non-reversed output polarity. As the capacitors charge and discharge through an inductance, peak currents do not occur on the capacitors; therefore, the efficiency is high compared to buck-boost converters. In this study, the SEPIC converter is designed to be operated with an internal model controller (IMC). Traditional controllers such as the proportional-integral controller are not preferred because of their linear behavior; hence, an IMC is designed for this converter. This controller is a model-based control and provides more robustness and better set-point tracking. Moreover, it can be used for an unstable process where a conventional controller cannot handle the dynamic operation. The Matlab/Simulink environment is used to simulate the converter and its controller, and then the results are shown and discussed.
Keywords: DC/DC converter, single ended primary inductance converter, SEPIC, internal model controller, IMC, switched mode power supply
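A back-of-the-envelope sketch of two design steps implied by the abstract: the ideal continuous-conduction SEPIC conversion ratio and a textbook IMC-PI tuning for an assumed first-order small-signal plant; every numeric value is an illustrative assumption, not the authors' design.

```python
# Back-of-the-envelope SEPIC sketch: ideal CCM conversion ratio and an IMC-style
# PI tuning for an assumed first-order small-signal model (all numbers illustrative).
def sepic_duty(v_in, v_out):
    # Ideal SEPIC in continuous conduction: V_out / V_in = D / (1 - D)
    return v_out / (v_in + v_out)

def imc_pi(gain, tau, lam):
    # Classic IMC-PI rules for a first-order plant K/(tau*s + 1) with filter constant lam:
    # Kc = tau / (K * lam), Ti = tau
    return tau / (gain * lam), tau

d = sepic_duty(v_in=12.0, v_out=16.8)            # e.g. charging a 4-cell battery from 12 V
kc, ti = imc_pi(gain=20.0, tau=1e-3, lam=2e-4)   # assumed plant parameters
print(f"duty cycle ~ {d:.3f}, IMC-PI: Kc = {kc:.3f}, Ti = {ti*1e3:.2f} ms")
```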
Procedia PDF Downloads 625
15795 Study of Inhibition of the End Effect Based on AR Model Predict of Combined Data Extension and Window Function
Authors: Pan Hongxia, Wang Zhenhua
Abstract:
In this paper, the end effect that arises during empirical mode decomposition (EMD) is effectively suppressed by combining two methods: data extension based on AR model prediction and a window function. Simulations on a synthetic signal show that the combined method achieves the desired effect, and applying it to gearbox test data also gives good results, improving the calculation accuracy of the subsequent data processing. This lays a good foundation for gearbox fault diagnosis under various working conditions.
Keywords: gearbox, fault diagnosis, AR model, end effect
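A sketch of the mitigation idea on a synthetic signal: extend both ends with an AR-model forecast (statsmodels AutoReg), apply a window function, run EMD on the extended record, and crop back; the EMD call itself is omitted, and the lag order and extension length are assumptions.

```python
# Sketch of the end-effect mitigation idea: extend the signal at both ends with an
# AR-model forecast, taper with a window, run EMD on the extended record, then crop.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 13 * t) + 0.5 * np.sin(2 * np.pi * 48 * t)  # synthetic signal

def ar_extend(sig, n_ext=100, lags=30):
    fwd = AutoReg(sig, lags=lags).fit()
    right = fwd.predict(start=len(sig), end=len(sig) + n_ext - 1)
    bwd = AutoReg(sig[::-1], lags=lags).fit()
    left = bwd.predict(start=len(sig), end=len(sig) + n_ext - 1)[::-1]
    return np.concatenate([left, sig, right])

x_ext = ar_extend(x)
x_win = x_ext * np.hanning(len(x_ext))   # window function applied to the extended record
# ... run EMD on the extended record here, then keep only the middle len(x) samples ...
print(len(x), "->", len(x_ext))
```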
Procedia PDF Downloads 365
15794 An Ontological Approach to Existentialist Theatre and Theatre of the Absurd in the Works of Jean-Paul Sartre and Samuel Beckett
Authors: Gülten Silindir Keretli
Abstract:
The aim of this study is to analyse the works of the two playwrights within the framework of existential philosophy and to observe ontological existence in the plays No Exit and Endgame. The literary works will be discussed separately in each section of this study. The despair of the post-war generation of Europe problematized the 'human condition' in every field of literature, the very product of social upheaval. With this concern in mind, Sartre's creative works portrayed man as a lonely being, burdened with a terrifying freedom to choose and create his own meaning in an apparently meaningless world. Traces of existential thought are to be found throughout the history of philosophy and literature. The theatre of the absurd, on the other hand, is a form of drama showing the absurdity of the human condition, and it is heavily influenced by existential philosophy. Beckett is the most influential playwright of the theatre of the absurd, and the themes and thoughts in his plays share many tenets of existential philosophy. Existential philosophy posits the meaninglessness of existence and regards man as being thrown into the universe and into desolate isolation. To overcome loneliness and isolation, the human ego needs recognition from other people; Sartre calls this need for recognition the need for 'the Look' (le regard) from the Other. In this paper, existentialist philosophy and existentialist angst will be elaborated, and then the works of existentialist theatre and the theatre of the absurd will be discussed within the framework of existential philosophy.
Keywords: consciousness, existentialism, the notion of the absurd, the other
Procedia PDF Downloads 157
15793 Media Planning Decisions and Preferences through a Goal Programming Model: An Application to a Media Campaign for a Mature Product in Italy
Authors: Cinzia Colapinto, Davide La Torre
Abstract:
Goal programming (GP) and its variants have been applied to marketing and to specific marketing issues, such as media scheduling problems, over the last decades. The concept of satisfaction functions has been widely utilized in GP models to explicitly integrate the decision-maker's preferences, which can be guided by the available information regarding the decision-making situation. A GP model with satisfaction functions for media planning decisions is proposed and then illustrated through a case study related to a marketing/media campaign in the Italian market.
Keywords: goal programming, satisfaction functions, media planning, tourism management
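A stripped-down weighted goal program for a media-planning decision, solved with scipy's linprog; the media data, goal targets, and deviation weights are invented, and the paper's satisfaction functions are replaced here by simple linear deviation penalties.

```python
# Illustrative weighted goal program for media planning (all data invented).
from scipy.optimize import linprog

# Decision variables: x = [x_tv, x_press, d1m, d1p, d2m, d2p]
# Goal 1 (audience reach): 300*x_tv + 120*x_press + d1m - d1p = 5000 (thousand contacts)
# Goal 2 (budget):          40*x_tv +  10*x_press + d2m - d2p = 600  (thousand EUR)
c = [0, 0, 3, 0, 0, 1]          # penalise under-achieving reach (d1m) and over-spending (d2p)
A_eq = [
    [300, 120, 1, -1, 0, 0],
    [40,  10,  0,  0, 1, -1],
]
b_eq = [5000, 600]
bounds = [(0, 30), (0, 60)] + [(0, None)] * 4   # media insertion limits, deviations >= 0

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
x_tv, x_press = res.x[:2]
print(f"TV spots: {x_tv:.1f}, press insertions: {x_press:.1f}, weighted deviation: {res.fun:.1f}")
```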
Procedia PDF Downloads 398
15792 Measurement Tools of the Maturity Model for IT Service Outsourcing in Higher Education Institutions
Authors: Victoriano Valencia García, Luis Usero Aragonés, Eugenio J. Fernández Vicente
Abstract:
Nowadays, the successful implementation of ICTs is vital for almost any kind of organization. Good governance and ICT management are essential for delivering value, managing technological risks, managing resources and performance measurement. In addition, outsourcing is a strategic IT service solution which complements IT services provided internally in organizations. This paper proposes the measurement tools of a new holistic maturity model based on standards ISO/IEC 20000 and ISO/IEC 38500, and the frameworks and best practices of ITIL and COBIT, with a specific focus on IT outsourcing. These measurement tools allow independent validation and practical application in the field of higher education, using a questionnaire, metrics tables, and continuous improvement plan tables as part of the measurement process. Guidelines and standards are proposed in the model for facilitating adaptation to universities and achieving excellence in the outsourcing of IT services.
Keywords: IT governance, IT management, IT services, outsourcing, maturity model, measurement tools
Procedia PDF Downloads 589
15791 Variability Management of Contextual Feature Model in Multi-Software Product Line
Authors: Muhammad Fezan Afzal, Asad Abbas, Imran Khan, Salma Imtiaz
Abstract:
The software product line (SPL) paradigm is used for the development of a family of software products that share common and variable features. The feature model is a domain artifact of the SPL that consists of common and variable features with predefined relationships and constraints. Multiple SPLs, such as those for mobile phones and tablets, contain many similar common and variable features. Reusing common and variable features across different SPL domains is a complex task due to the external relationships and constraints of features in the feature model. To increase the reusability of feature model resources from domain engineering, the commonality of features must be managed at the level of SPL application development. In this research, we propose an approach that combines multiple SPLs into a single domain and converts them to a common feature model. Extracting the common features from different feature models is more effective and reduces cost and time to market for application development. For extracting features from multiple SPLs, the proposed framework consists of three steps: 1) find the variation points, 2) find the constraints, and 3) combine the feature models into a single feature model on the basis of the variation points and constraints. By using this approach, the reusability of features from multiple feature models can be increased. The impact of this research is to reduce development cost and time to market and to increase the number of SPL products.
Keywords: software product line, feature model, variability management, multi-SPLs
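A simplified sketch of the three-step merge on two toy feature models (phone and tablet); real feature models carry richer relationships (mandatory/optional, or/xor groups), so features are reduced to plain sets and cross-tree constraints to tuples here.

```python
# Simplified sketch of the three-step merge described above, on two toy feature models.
phone_fm = {
    "features": {"Device", "Call", "Camera", "GPS", "Screen"},
    "constraints": {("requires", "GPS", "Screen"), ("requires", "Camera", "Screen")},
}
tab_fm = {
    "features": {"Device", "Camera", "WiFi", "Screen", "Stylus"},
    "constraints": {("requires", "Stylus", "Screen"), ("requires", "Camera", "Screen")},
}

def merge(fm_a, fm_b):
    common = fm_a["features"] & fm_b["features"]                 # step 1: variation points
    variable = (fm_a["features"] | fm_b["features"]) - common
    constraints = fm_a["constraints"] | fm_b["constraints"]      # step 2: collect constraints
    return {  # step 3: single combined feature model
        "common": common,
        "variable": variable,
        "constraints": constraints,
    }

merged = merge(phone_fm, tab_fm)
print("common:", sorted(merged["common"]))
print("variable:", sorted(merged["variable"]))
```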
Procedia PDF Downloads 65
15790 Use of Two-Dimensional Hydraulics Modeling for Design of Erosion Remedy
Authors: Ayoub. El Bourtali, Abdessamed.Najine, Amrou Moussa. Benmoussa
Abstract:
One of the main goals of river engineering is river training, which is defined as controlling and predicting the behavior of a river and taking effective measures to eliminate the related risks and thus improve the river system. In some rivers, the riverbed continues to erode and degrade, so equilibrium will never be reached. In general, river geometric characteristics and riverbed erosion analysis are among the most complex but critical topics in river engineering and sediment hydraulics; riverbank erosion is a key response process in river hydrodynamics that has a major impact on the ecological chain and on socio-economic processes. This study aims to integrate new computer technology that can analyze erosion and hydraulic problems through computer simulation and modeling. Choosing the right model remains a difficult and sensitive job for field engineers. This paper makes use of version 5.0.4 of the HEC-RAS model; the river section is adopted according to the gauged station and the proximity of the adjustment. In this work, we demonstrate how 2D hydraulic modeling helped clarify the design and provided visualizations of depths and velocities at the riverbanks and around hydraulic structures. The Hydrologic Engineering Center's River Analysis System (HEC-RAS) 2D model was used to create a hydraulic study of the erosion model. The geometric data were generated from a 12.5 m x 12.5 m resolution digital elevation model. In addition to showing eroded or overturned river sections, the model output also shows patterns of riverbank change, which can help reduce problems caused by erosion.
Keywords: 2D hydraulics model, erosion, floodplain, hydrodynamic, HEC-RAS, riverbed erosion, river morphology, resolution digital data, sediment
Procedia PDF Downloads 188
15789 Numerical Simulation of the Kurtosis Effect on the EHL Problem
Authors: S. Gao, S. Srirattayawong
Abstract:
In this study, a computational fluid dynamics (CFD) model has been developed for studying the effect of the surface roughness profile on the EHL problem. The cylinder contact geometry, meshing, and solution of the mass and momentum conservation equations are carried out using the commercial software packages ICEM CFD and ANSYS Fluent. User-defined functions (UDFs) for the density, viscosity, and elastic deformation of the cylinders as functions of pressure and temperature have been defined for the CFD model. Three different surface roughness profiles are created and incorporated into the CFD model. It is found that the developed CFD model can predict the characteristics of fluid flow and heat transfer in the EHL problem, including leading parameters such as the pressure distribution, minimum film thickness, viscosity, and density changes. The obtained results show that the pressure profile at the center of the contact area is directly related to the roughness amplitude, and the rough surface with a kurtosis value above 3 produces a more strongly fluctuating pressure distribution than the other cases.
Keywords: CFD, EHL, kurtosis, surface roughness
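A quick sketch of how the roughness statistics behind the study can be computed for a generated height profile with scipy; the two synthetic profiles below are assumptions, not the paper's three surfaces.

```python
# Roughness statistics for synthetic height profiles (illustrative only).
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(1)
gaussian_profile = rng.normal(0.0, 0.4e-6, 10_000)                      # heights in metres
spiky_profile = gaussian_profile * rng.choice([1, 4], 10_000, p=[0.95, 0.05])

for name, z in [("gaussian", gaussian_profile), ("spiky", spiky_profile)]:
    ra = np.mean(np.abs(z - z.mean()))            # arithmetic mean roughness Ra
    ku = kurtosis(z, fisher=False)                # Pearson kurtosis (3 = Gaussian heights)
    print(f"{name}: Ra = {ra*1e6:.3f} um, kurtosis = {ku:.2f}")
```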
Procedia PDF Downloads 319
15788 Risk of Fatal and Non-Fatal Coronary Heart Disease and Stroke Events among Adult Patients with Hypertension: Basic Markov Model Inputs for Evaluating Cost-Effectiveness of Hypertension Treatment: Systematic Review of Cohort Studies
Authors: Mende Mensa Sorato, Majid Davari, Abbas Kebriaeezadeh, Nizal Sarrafzadegan, Tamiru Shibru, Behzad Fatemi
Abstract:
Markov models, such as simulations based on the cardiovascular disease (CVD) policy model, are used for evaluating the cost-effectiveness of hypertension treatment. Stroke, angina, myocardial infarction (MI), cardiac arrest, and all-cause mortality are included in this model. Hypertension is a risk factor for a number of vascular and cardiac complications and CVD outcomes. Objective: This systematic review was conducted to evaluate the comprehensiveness of this model across different regions globally. Methods: We searched articles written in the English language from PubMed/Medline, Ovid/Medline, Embase, Scopus, Web of Science, and Google Scholar with a systematic search query. Results: Thirteen cohort studies involving a total of 2,165,770 adults (1,666,554 with hypertension and 499,226 with treatment-resistant hypertension) were included in this scoping review. Hypertension is clearly associated with coronary heart disease (CHD) and stroke mortality, unstable angina, stable angina, MI, heart failure (HF), sudden cardiac death, transient ischemic attack, ischemic stroke, subarachnoid hemorrhage, intracranial hemorrhage, peripheral arterial disease (PAD), and abdominal aortic aneurysm (AAA). The association between HF and hypertension is variable across regions. Treatment-resistant hypertension is associated with a higher relative risk of major cardiovascular events and all-cause mortality when compared with non-resistant hypertension; however, it is not included in the previous CVD policy model. Conclusion: The CVD policy model can be used in most regions for the evaluation of the cost-effectiveness of hypertension treatment. However, hypertension is highly associated with HF in Latin America, the Caribbean, Eastern Europe, and Sub-Saharan Africa; therefore, it is important to consider HF in the CVD policy model for evaluating the cost-effectiveness of hypertension treatment in these regions. We do not suggest the inclusion of PAD and AAA in the CVD policy model due to a lack of sufficient evidence. Researchers should consider the effect of treatment-resistant hypertension, either by including it in the basic model or when setting the model assumptions.
Keywords: cardiovascular disease policy model, cost-effectiveness analysis, hypertension, systematic review, twelve major cardiovascular events
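For readers unfamiliar with the underlying machinery, a heavily simplified Markov cohort sketch in the spirit of a CVD policy model is given below; the states and one-year transition probabilities are invented placeholders, not values from the reviewed cohorts.

```python
# Heavily simplified Markov cohort sketch (placeholder states and probabilities only).
import numpy as np

states = ["well", "post_CHD", "post_stroke", "dead"]
# Rows: current state, columns: next state (probabilities per one-year cycle, rows sum to 1).
P = np.array([
    [0.94, 0.03, 0.02, 0.01],
    [0.00, 0.88, 0.04, 0.08],
    [0.00, 0.03, 0.87, 0.10],
    [0.00, 0.00, 0.00, 1.00],
])

cohort = np.array([1.0, 0.0, 0.0, 0.0])   # everyone starts in "well"
for year in range(10):
    cohort = cohort @ P                   # advance the cohort one cycle
print({s: round(p, 3) for s, p in zip(states, cohort)})
```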
Procedia PDF Downloads 69
15787 Efficiency of Using E-Wallets as Payment Method in Marikina City During COVID-19 Pandemic
Authors: Noel Paolo Domingo, James Paul Menina, Laurente Ferrer
Abstract:
Most people were forced to stay at home and limit their physical contact during the COVID-19 pandemic. Due to this situation, the strict implementation of government policies and safety protocols encouraged consumers to use cashless or digital transactions through e-wallets. In this study, the researchers investigate the efficiency of using e-wallets as a payment method during the COVID-19 pandemic in Marikina City. The study examined the efficiency of e-wallets in terms of usefulness, convenience, and safety and security based on the respondents' assessment. Questionnaires developed by the researchers were distributed to a total of 400 e-wallet users in Marikina City aged 15 and above, using a purposive sampling technique. The data collected were processed using SPSS version 26. Frequency, percentage, and mean were used to describe the profile of the respondents and their assessment of e-wallets in terms of the three constructs. ANOVA and t-tests were also employed to test for significant differences in the respondents' assessments when the demographic profile was considered. The study revealed that in terms of usefulness e-wallets are efficient, while in terms of convenience and of safety and security they have proven to be very efficient. During the COVID-19 pandemic, e-wallets have been embraced by most consumers, and by enhancing their features, more people will be satisfied with using them.
Keywords: efficiency of e-wallets, usefulness, convenience, safety and security
Procedia PDF Downloads 137
15786 Characterization of Zn-Ni Alloy Elaborated Under Low and High Magnetic Field Immersed in Corrosive Medium
Authors: Sabiha Chouchane, Azzedine Hani, Jean-Paul Chopart, Alexandra Levesque
Abstract:
The electrodeposition of Zn-Ni alloys is widely studied for their high corrosion resistance and good mechanical properties. In this work, zinc-nickel alloy coatings were electrodeposited from a sulfate bath under low and high applied magnetic fields, and the effect of the alloy structural parameters on corrosion behavior is studied. It has been found that the magnetically induced convection changes the phase composition, promoting the zinc phase at the expense of γ-Ni₅Zn₂₁. A low magnetic field also acts on the morphology of the deposits as a levelling and refining agent, lowering the deposit roughness Ra and the spot size. For alloys obtained under a low magnetic field (up to 1 T), the surface morphology modification has no significant influence on corrosion behavior, whereas for low-nickel-content alloys the modification of phase composition induced by the applied magnetic field favours a higher polarization resistance. When a high magnetic field (up to 12 T) is involved, the phase composition modifications are the same as for a low applied field, and the morphology is not greatly modified. In this case, the hydrogen reduction current dramatically decreases, which leads to a large shift of the corrosion potential. It is suggested that the surface reactivity of the electrodeposited alloys depends on the magnetically induced convection that is at work during the codeposition process.
Keywords: magnetic field, Zn-Ni alloy, corrosion, corrosive medium
Procedia PDF Downloads 49
15785 Standard Resource Parameter Based Trust Model in Cloud Computing
Authors: Shyamlal Kumawat
Abstract:
Cloud computing is shifting the way IT capital is utilized. It dynamically delivers convenient, on-demand access to shared pools of software resources, platforms, and hardware as a service through the internet, a model made possible by sophisticated automation, provisioning, and virtualization technologies. Users want the ability to access these services, including infrastructure resources, how and when they choose. To accommodate this shift in the consumption model, the technology has to deal with the security, compatibility, and trust issues associated with delivering that convenience to application business owners, developers, and users. Given these issues, trust has attracted extensive attention in cloud computing as a solution to enhance security. This paper proposes a trusted computing approach, a standard-resource-parameter-based trust model in cloud computing, to select appropriate cloud service providers. The direct trust of cloud entities is computed on the basis of past interaction evidence and sustained by their present performance. Various SLA parameters between consumer and provider are considered in the trust computation and compliance process. Simulations are performed using the CloudSim framework, and the experimental results show that the proposed model is effective and extensible.
Keywords: cloud, IaaS, SaaS, PaaS
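A toy version of a direct-trust computation from SLA compliance evidence; the chosen SLA parameters, weights, and time-decay factor are illustrative assumptions, not the model's actual formulas.

```python
# Toy direct-trust score from SLA compliance evidence (all parameters assumed).
past_interactions = [   # measured vs. promised SLA values, oldest first
    {"availability": 0.998, "response_ms": 180, "throughput": 0.95},
    {"availability": 0.991, "response_ms": 260, "throughput": 0.90},
    {"availability": 0.999, "response_ms": 150, "throughput": 0.97},
]
sla = {"availability": 0.995, "response_ms": 200, "throughput": 0.92}
weights = {"availability": 0.5, "response_ms": 0.3, "throughput": 0.2}

def compliance(obs):
    scores = {
        "availability": min(obs["availability"] / sla["availability"], 1.0),
        "response_ms": min(sla["response_ms"] / obs["response_ms"], 1.0),  # lower is better
        "throughput": min(obs["throughput"] / sla["throughput"], 1.0),
    }
    return sum(weights[k] * scores[k] for k in weights)

decay = 0.7   # recent interactions count more
values = [compliance(o) for o in past_interactions]
trust = sum(v * decay ** (len(values) - 1 - i) for i, v in enumerate(values)) \
        / sum(decay ** (len(values) - 1 - i) for i in range(len(values)))
print(f"direct trust of provider: {trust:.3f}")
```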
Procedia PDF Downloads 328
15784 Verification & Validation of Map Reduce Program Model for Parallel K-Mediod Algorithm on Hadoop Cluster
Authors: Trapti Sharma, Devesh Kumar Srivastava
Abstract:
This paper analyses, verifies, and validates the MapReduce solution model for the parallel K-Medoid algorithm on a Hadoop cluster. MapReduce is a programming model that enables the processing of huge amounts of data in parallel on a large number of devices; it is especially well suited to static or moderately changing data sets, and it has gradually become the framework of choice for 'big data'. The MapReduce model allows the systematic and rapid processing of large-scale data with a cluster of compute nodes. One of the primary concerns in Hadoop is how to minimize the completion time (i.e., makespan) of a set of MapReduce jobs. In this paper, we have verified and validated various MapReduce applications, namely wordcount, grep, terasort, and the parallel K-Medoid clustering algorithm. We have found that as the number of nodes increases, the completion time decreases.
Keywords: Hadoop, MapReduce, K-Medoid, validation, verification
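A stand-alone illustration of one K-Medoid iteration phrased as map and reduce steps; it runs in-process and only mimics how the job would be expressed on a Hadoop cluster, with toy points and Manhattan distance as assumptions.

```python
# One K-Medoid iteration expressed as map and reduce phases (in-process illustration only).
points = [(1, 2), (2, 1), (1, 1), (8, 8), (9, 8), (8, 9), (15, 2), (16, 1)]
medoids = [(1, 2), (8, 8), (15, 2)]

def dist(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])   # Manhattan distance

def map_phase(point):
    # Emit (nearest medoid, point) -- what each mapper would output.
    nearest = min(medoids, key=lambda m: dist(point, m))
    return nearest, point

def reduce_phase(medoid, members):
    # Pick the member that minimises total distance to the others as the new medoid.
    return min(members, key=lambda c: sum(dist(c, p) for p in members))

groups = {}
for key, value in map(map_phase, points):
    groups.setdefault(key, []).append(value)
new_medoids = [reduce_phase(m, pts) for m, pts in groups.items()]
print("updated medoids:", new_medoids)
```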
Procedia PDF Downloads 365
15783 An Energy-Efficient Model of Integrating Telehealth IoT Devices with Fog and Cloud Computing-Based Platform
Authors: Yunyong Guo, Sudhakar Ganti, Bryan Guo
Abstract:
The rapid growth of telehealth Internet of Things (IoT) devices has raised concerns about energy consumption and efficient data processing. This paper introduces an energy-efficient model that integrates telehealth IoT devices with a fog and cloud computing-based platform, offering a sustainable and robust solution to overcome these challenges. Our model employs fog computing as a localized data processing layer while leveraging cloud computing for resource-intensive tasks, significantly reducing energy consumption. We incorporate adaptive energy-saving strategies. Simulation analysis validates our approach's effectiveness in enhancing energy efficiency for telehealth IoT systems integrated with localized fog nodes and both private and public cloud infrastructures. Future research will focus on further optimization of the energy-saving model, exploring additional functional enhancements, and assessing its broader applicability in other healthcare and industry sectors.
Keywords: energy-efficient, fog computing, IoT, telehealth
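A back-of-the-envelope comparison of per-record energy for cloud-only versus fog-assisted processing; every constant below is an assumption used only to illustrate why local data reduction at the fog layer can save energy.

```python
# Cloud-only vs. fog-assisted per-record energy estimate (all constants assumed).
RAW_BYTES = 50_000          # one sensor burst
REDUCED_BYTES = 2_000       # after local filtering/aggregation at the fog node
E_TX_PER_BYTE = 2e-7        # J/byte for wide-area transmission (assumption)
E_FOG_PER_BYTE = 5e-8       # J/byte of local processing (assumption)
E_CLOUD_PER_BYTE = 3e-8     # J/byte of data-centre processing (assumption)

cloud_only = RAW_BYTES * (E_TX_PER_BYTE + E_CLOUD_PER_BYTE)
fog_assisted = (RAW_BYTES * E_FOG_PER_BYTE
                + REDUCED_BYTES * (E_TX_PER_BYTE + E_CLOUD_PER_BYTE))
print(f"cloud only : {cloud_only*1e3:.2f} mJ per record")
print(f"fog + cloud: {fog_assisted*1e3:.2f} mJ per record")
```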
Procedia PDF Downloads 86
15782 Petri Net Modeling and Simulation of a Call-Taxi System
Authors: T. Godwin
Abstract:
A call-taxi system is a type of taxi service where a taxi can be requested through a phone call or mobile app. The schematic functioning of a call-taxi system is modeled using a Petri net, which provides the necessary conditions for a taxi to be assigned by a dispatcher to pick up a customer as well as the conditions for the taxi to be released by the customer. A Petri net is a graphical modeling tool used to understand sequences, concurrences, and confluences of activities in the working of discrete event systems. It uses tokens on a directed bipartite multi-graph to simulate the activities of a system. The Petri net model is translated into a simulation model, and a call-taxi system is simulated. The simulation model helps in evaluating the operation of a call-taxi system based on the fleet size as well as the operating policies for call-taxi assignment and empty call-taxi repositioning. The developed Petri-net-based simulation model can be used to decide the fleet size as well as the call-taxi assignment policies for a call-taxi system.
Keywords: call-taxi, discrete event system, Petri net, simulation modeling
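A minimal token-game sketch of a call-taxi Petri net with assign and release transitions; the places, arc weights, and initial marking are simplified assumptions, not the paper's full model.

```python
# Minimal token game for a call-taxi Petri net (simplified assumptions).
places = {"idle_taxis": 3, "waiting_calls": 2, "assigned": 0, "served": 0}
transitions = {
    # name: (input places with arc weights, output places with arc weights)
    "assign": ({"idle_taxis": 1, "waiting_calls": 1}, {"assigned": 1}),
    "release": ({"assigned": 1}, {"idle_taxis": 1, "served": 1}),
}

def enabled(t):
    ins, _ = transitions[t]
    return all(places[p] >= w for p, w in ins.items())

def fire(t):
    ins, outs = transitions[t]
    for p, w in ins.items():
        places[p] -= w
    for p, w in outs.items():
        places[p] += w

for t in ["assign", "assign", "release"]:
    if enabled(t):
        fire(t)
print(places)   # {'idle_taxis': 2, 'waiting_calls': 0, 'assigned': 1, 'served': 1}
```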
Procedia PDF Downloads 422
15781 Modeling Waiting and Service Time for Patients: A Case Study of Matawale Health Centre, Zomba, Malawi
Authors: Moses Aron, Elias Mwakilama, Jimmy Namangale
Abstract:
Spending a long time in queues for a basic service remains a common challenge in most developing countries, including Malawi. In the health sector in particular, the Out-Patient Department (OPD) experiences long queues, which puts the lives of patients at risk. However, by using queuing analysis to understand the nature of the problem and the efficiency of the service system, such problems can be abated. Depending on the kind of service, the literature proposes different possible queuing models. However, rather than using the generalized assumed models proposed in the literature, real-time case study data can give a deeper understanding of the particular problem's model, of how that model can vary from one day to another, and of how it varies from case to case. As such, this study uses data obtained from one urban health centre (HC) for BP, paediatric, and general OPD cases to investigate the average queuing time for patients within the system. It seeks to identify the proper queuing model by investigating the kinds of distribution functions of patients' arrival times, inter-arrival times, waiting times, and service times. Compared with the standard values set by the WHO, the study found that patients at this HC spend more time waiting than being served. On model investigation, different days presented different models, ranging from an assumed M/M/1 and M/M/2 to M/Er/2. Through sensitivity analysis, the commonly assumed M/M/1 model in general failed to fit the data, whereas an M/Er/2 model fitted well. An M/Er/3 model seemed good in terms of resource utilization, suggesting a need to increase medical personnel at this HC, whereas an M/Er/4 model was shown to cause more idleness of human resources.
Keywords: health care, out-patient department, queuing model, sensitivity analysis
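A sketch of the kind of queue metrics used in such an assessment: an M/M/c calculator based on the Erlang-C formula (the study found Erlang-distributed service, M/Er/c, to fit better, so this is only the simplest baseline); the arrival and service rates are placeholders, not the Matawale data.

```python
# M/M/c waiting-time calculator via the Erlang-C formula (placeholder rates).
from math import factorial

def mmc_metrics(lam, mu, c):
    rho = lam / (c * mu)                 # server utilisation (must be < 1 for stability)
    a = lam / mu                         # offered load in Erlangs
    p0 = 1 / (sum(a**k / factorial(k) for k in range(c))
              + a**c / (factorial(c) * (1 - rho)))
    erlang_c = (a**c / (factorial(c) * (1 - rho))) * p0   # P(an arrival must wait)
    wq = erlang_c / (c * mu - lam)       # mean waiting time in queue
    return rho, wq, wq + 1 / mu          # utilisation, Wq, total time in system W

for servers in (2, 3, 4):
    rho, wq, w = mmc_metrics(lam=6/60, mu=4/60, c=servers)   # per-minute rates (assumed)
    print(f"c={servers}: utilisation={rho:.2f}, Wq={wq:.1f} min, W={w:.1f} min")
```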
Procedia PDF Downloads 432
15780 An Improved Multiple Scattering Reflectance Model Based on Specular V-Cavity
Authors: Hongbin Yang, Mingxue Liao, Changwen Zheng, Mengyao Kong, Chaohui Liu
Abstract:
Microfacet-based reflection models are widely used to model light reflection from rough surfaces and have become the standard surface-material building block for describing specular components with varying roughness. Yet, while they possess many desirable properties and produce convincing results, their design ignores important sources of scattering, which can cause a significant loss of energy: they simulate only the single scattering on the microfacets and ignore the subsequent interactions. As the roughness increases, these interactions become more and more important. A multiple-scattering microfacet model based on the specular V-cavity is therefore presented for this important open problem. However, such a model spends much unnecessary rendering time if the same number of scattering events is set for surfaces of different roughness. In this paper, we design a geometric attenuation term G to compute the BRDF (bidirectional reflectance distribution function) of multiple scattering on rough surfaces. Moreover, we determine the number of scattering events by deterministic heuristics for surfaces of different roughness. As a result, our model produces an appearance similar to that of the state-of-the-art model with significantly improved rendering efficiency. Finally, we derive a multiple-scattering BRDF based on the original microfacet framework.
Keywords: bidirectional reflection distribution function, BRDF, geometric attenuation term, multiple scattering, V-cavity model
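For context, the sketch below evaluates the classical single-scattering specular V-cavity term (Beckmann distribution, Torrance-Sparrow/Cook-Torrance geometric attenuation, Schlick Fresnel) that the abstract builds on; the paper's multiple-scattering extension and heuristic scattering count are not reproduced, and all parameters are arbitrary.

```python
# Single-scattering microfacet evaluation with the classical specular V-cavity G term.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def beckmann_D(n_dot_h, alpha):
    c2 = n_dot_h * n_dot_h
    t2 = (1.0 - c2) / c2
    return np.exp(-t2 / (alpha * alpha)) / (np.pi * alpha * alpha * c2 * c2)

def v_cavity_G(n_dot_h, n_dot_v, n_dot_l, v_dot_h):
    # Torrance-Sparrow / Cook-Torrance V-cavity shadowing-masking term.
    return min(1.0, 2.0 * n_dot_h * n_dot_v / v_dot_h,
                    2.0 * n_dot_h * n_dot_l / v_dot_h)

def brdf_specular(n, v, l, alpha, f0=0.9):
    h = normalize(v + l)
    n_dot_h, n_dot_v, n_dot_l, v_dot_h = n @ h, n @ v, n @ l, v @ h
    D = beckmann_D(n_dot_h, alpha)
    G = v_cavity_G(n_dot_h, n_dot_v, n_dot_l, v_dot_h)
    F = f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5     # Schlick Fresnel
    return D * G * F / (4.0 * n_dot_v * n_dot_l)

n = np.array([0.0, 0.0, 1.0])
v = normalize(np.array([0.3, 0.0, 1.0]))
l = normalize(np.array([-0.4, 0.2, 1.0]))
for alpha in (0.1, 0.4, 0.8):   # more energy is lost to the ignored bounces as roughness grows
    print(f"alpha={alpha}: f_spec = {brdf_specular(n, v, l, alpha):.4f}")
```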
Procedia PDF Downloads 113
15779 Reinforcement Learning Optimization: Unraveling Trends and Advancements in Metaheuristic Algorithms
Authors: Rahul Paul, Kedar Nath Das
Abstract:
The field of machine learning (ML) is experiencing rapid development, resulting in a multitude of theoretical advancements and extensive practical implementations across various disciplines. The objective of ML is to facilitate the ability of machines to perform cognitive tasks by leveraging knowledge gained from prior experiences and effectively addressing complex problems, even in situations that deviate from previously encountered instances. Reinforcement learning (RL) has emerged as a prominent subfield within ML and has recently gained considerable attention from researchers. This surge in interest can be attributed to the practical applications of RL, the increasing availability of data, and the rapid advancements in computing power. At the same time, optimization algorithms play a pivotal role in the field of ML and have attracted considerable interest from researchers, and a multitude of proposals have been put forth to address optimization problems or improve optimization techniques within the domain of ML. A thorough examination and implementation of optimization algorithms within the context of ML is therefore of utmost importance in order to provide guidance for the advancement of research in both optimization and ML. This article provides a comprehensive overview of the application of metaheuristic evolutionary optimization algorithms in conjunction with RL to address a diverse range of scientific challenges. Furthermore, it delves into the various challenges and unresolved issues pertaining to the optimization of RL models.
Keywords: machine learning, reinforcement learning, loss function, evolutionary optimization techniques
Procedia PDF Downloads 72
15778 Numerical Study on Parallel Rear-Spoiler on Super Cars
Authors: Anshul Ashu
Abstract:
Computers are applied to vehicle aerodynamics in two ways: computational fluid dynamics (CFD) and computer-aided flow visualization (CAFV). Of the two, CFD is chosen here because it presents results with computer graphics. The simulation of the flow field around the vehicle is one of the important CFD applications; the flow field can be solved numerically using panel methods, the k-ε method, and direct simulation methods. The spoiler is a tool in vehicle aerodynamics used to minimize unfavorable aerodynamic effects around the vehicle, and the parallel spoiler is a set of two spoilers designed in such a manner that it can effectively reduce the drag. In this study, the standard k-ε model is used to simulate the external flow field around simplified versions of the Bugatti Veyron, Audi R8, and Porsche 911. The flow simulation is done for variable Reynolds numbers and consists of three different levels: first over the model without a rear spoiler, second over the model with a single rear spoiler, and third over the model with the parallel rear spoiler. The second and third levels have the following parameters: the shape of the spoiler, the angle of attack, and the attachment position. A thorough analysis of the simulation results has been carried out, and a new parallel spoiler has been designed. It shows a small improvement in vehicle aerodynamics, with a decrease in aerodynamic drag and lift, which leads to better fuel economy and traction force for the model.
Keywords: drag, lift, flow simulation, spoiler
Procedia PDF Downloads 498
15777 De Novo Design of a Minimal Catalytic Di-Nickel Peptide Capable of Sustained Hydrogen Evolution
Authors: Saroj Poudel, Joshua Mancini, Douglas Pike, Jennifer Timm, Alexei Tyryshkin, Vikas Nanda, Paul Falkowski
Abstract:
On the early Earth, protein-metal complexes likely harvested energy from a reduced environment. These complexes would have been precursors to the metabolic enzymes of ancient organisms. Hydrogenase is an essential enzyme in most anaerobic organisms for the reduction and oxidation of hydrogen in the environment and is likely one of the earliest evolved enzymes. To attempt to reinvent a precursor to modern hydrogenase, we computationally designed a short thirteen amino acid peptide that binds the often-required catalytic transition metal Nickel in hydrogenase. This simple complex can achieve hundreds of hydrogen evolution cycles using light energy in a broad range of temperature and pH. Biophysical and structural investigations strongly indicate the peptide forms a di-nickel active site analogous to Acetyl-CoA synthase, an ancient protein central to carbon reduction in the Wood-Ljungdahl pathway and capable of hydrogen evolution. This work demonstrates that prior to the complex evolution of multidomain enzymes, early peptide-metal complexes could have catalyzed energy transfer from the environment on the early Earth and enabled the evolution of modern metabolism.
Keywords: hydrogenase, prebiotic enzyme, metalloenzyme, computational design
Procedia PDF Downloads 215
15776 Predicting Returns Volatilities and Correlations of Stock Indices Using Multivariate Conditional Autoregressive Range and Return Models
Authors: Shay Kee Tan, Kok Haur Ng, Jennifer So-Kuen Chan
Abstract:
This paper extends the conditional autoregressive range (CARR) model to a multivariate CARR (MCARR) model and further to a two-stage MCARR-return model to model and forecast the volatilities, correlations, and returns of multiple financial assets. The first-stage model fits the scaled realised Parkinson volatility measures of the individual series and of their pairwise sums of indices to the MCARR model to obtain in-sample estimates and forecasts of the volatilities of these individual and pairwise-sum series. Covariances are then calculated to construct the fitted variance-covariance matrix of returns, which is imputed into the stage-two return model to capture the heteroskedasticity of the assets' returns. We investigate different choices of mean function to describe the volatility dynamics. The empirical applications are based on the Standard and Poor's 500, Dow Jones Industrial Average, and Dow Jones United States Financial Services indices. Results show that the stage-one MCARR models using asymmetric mean functions give better in-sample fits than those based on symmetric mean functions. They also provide better out-of-sample volatility forecasts than CARR models based on two robust loss functions, with the scaled realised open-to-close volatility measure as the proxy for the unobserved true volatility. We also find that the stage-two return models with constant means and multivariate Student-t errors give better in-sample fits than the Baba-Engle-Kraft-Kroner type of generalized autoregressive conditional heteroskedasticity (BEKK-GARCH) models. The estimates and forecasts of value-at-risk (VaR) and conditional VaR based on the best MCARR-return model for each asset are provided and tested using the Kupiec test to confirm the accuracy of the VaR forecasts.
Keywords: range-based volatility, correlation, multivariate CARR-return model, value-at-risk, conditional value-at-risk
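A small sketch of the volatility proxy feeding the stage-one model: the Parkinson range-based estimator computed from daily highs and lows; the prices are made up, not the index data used in the paper.

```python
# Parkinson range-based volatility estimator from daily highs and lows (made-up prices).
import numpy as np

high = np.array([4510.2, 4525.8, 4498.7, 4533.1, 4541.0])
low = np.array([4478.9, 4490.3, 4460.2, 4502.6, 4511.4])

log_range = np.log(high / low)
parkinson_var = log_range**2 / (4.0 * np.log(2.0))     # daily variance estimate
parkinson_vol = np.sqrt(parkinson_var)
print("daily Parkinson volatility (%):", np.round(parkinson_vol * 100, 3))
```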
Procedia PDF Downloads 98
15775 Analysis of Aerodynamic Forces Acting on a Train Passing Through a Tornado
Authors: Masahiro Suzuki, Nobuyuki Okura
Abstract:
The crosswind effect on ground transportation has been extensively investigated for decades. The effect of tornadoes, however, has hardly been studied, in spite of the fact that even heavy ground vehicles, namely trains, have been overturned by tornadoes with casualties in the past. Therefore, the aerodynamic effects of a tornado on a train were studied by several approaches in this study. First, an experimental facility was developed to clarify the aerodynamic forces acting on a vehicle running through a tornado. Our experimental set-up consists of two apparatuses: a tornado simulator and a moving model rig. PIV measurements showed that the tornado simulator can generate a swirling-flow field similar to those of natural tornadoes; the flow field has a maximum tangential velocity of 7.4 m/s and a vortex core radius of 96 mm. The moving model rig makes a 1/40 scale model train of a single-car or three-car unit run through the swirling flow with a maximum speed of 4.3 m/s. The model car has 72 pressure ports on its surface to estimate the aerodynamic forces. The experimental results show that the aerodynamic forces vary in magnitude and direction depending on the location of the vehicle in the flow field. Second, the aerodynamic forces on the train were estimated by using the Rankine vortex model, a simple tornado model widely used in the field of civil engineering. The estimated aerodynamic forces on the middle car were in fairly good agreement with the experimental results. The effects of the vortex core radius and the path of the train on the aerodynamic forces were investigated using the Rankine vortex model. The results show that the side and lift forces increase as the vortex core radius increases, while the yawing moment is maximum when the core radius is 0.3875 times the car length. Third, a computational simulation was conducted to clarify the flow field around the train; the simulated results qualitatively agreed with the experimental ones.
Keywords: aerodynamic force, experimental method, tornado, train
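A sketch of the Rankine vortex tangential-velocity profile using the swirl parameters quoted above (maximum tangential velocity 7.4 m/s, core radius 96 mm); how the aerodynamic forces are then derived from this profile is not reproduced here.

```python
# Rankine vortex tangential velocity: solid-body rotation inside the core, ~1/r outside.
def rankine_tangential(r, v_max=7.4, r_core=0.096):
    if r <= r_core:
        return v_max * r / r_core
    return v_max * r_core / r

for r in (0.02, 0.05, 0.096, 0.2, 0.5):
    print(f"r = {r:5.3f} m -> v_theta = {rankine_tangential(r):.2f} m/s")
```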
Procedia PDF Downloads 235