Search results for: General Linear Model


17799 A Mixed-Integer Nonlinear Program to Optimally Pace and Fuel Ultramarathons

Authors: Kristopher A. Pruitt, Justin M. Hill

Abstract:

The purpose of this research is to determine the pacing and nutrition strategies which minimize completion time and carbohydrate intake for athletes competing in ultramarathon races. The model formulation consists of a two-phase optimization. The first-phase mixed-integer nonlinear program (MINLP) determines the minimum completion time subject to the altitude, terrain, and distance of the race, as well as the mass and cardiovascular fitness of the athlete. The second-phase MINLP determines the minimum total carbohydrate intake required for the athlete to achieve the completion time prescribed by the first phase, subject to the flow of carbohydrates through the stomach, liver, and muscles. Consequently, the second-phase model provides the optimal pacing and nutrition strategies for a particular athlete for each kilometer of a particular race. Validation of the model results over a wide range of athlete parameters against completion times for real competitive events suggests strong agreement. Additionally, the kilometer-by-kilometer pacing and nutrition strategies the model prescribes for a particular athlete suggest that unconventional approaches could result in lower completion times. Thus, the MINLP provides prescriptive guidance that athletes can leverage when developing pacing and nutrition strategies prior to competing in ultramarathon races. Given the highly variable topographical characteristics common to many ultramarathon courses and the potential inexperience of many athletes with such courses, the model provides valuable insight to competitors who might otherwise fail to complete the event due to exhaustion or carbohydrate depletion.

Keywords: nutrition, optimization, pacing, ultramarathons

Procedia PDF Downloads 183
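
The two-phase optimization described in the abstract above can be illustrated, very roughly, with a single-phase pacing sketch: minimize completion time over per-kilometer speeds subject to an aggregate energy (carbohydrate) budget. The grade profile, energy-cost coefficients, and budget below are hypothetical placeholders rather than values from the paper, and the continuous relaxation omits the integer variables of the actual MINLP.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical 5 km course: per-kilometer grade (%) and an aggregate energy budget.
grade = np.array([0.0, 4.0, -2.0, 6.0, 1.0])   # assumed profile
energy_budget = 95.0                            # assumed energy units

def total_time(v):
    # Completion time in minutes for per-kilometer speeds v (km/min).
    return np.sum(1.0 / v)

def energy_used(v):
    # Toy energy model: cost grows with speed squared and uphill grade.
    return np.sum(60.0 * v**2 + 8.0 * np.maximum(grade, 0.0))

cons = ({"type": "ineq", "fun": lambda v: energy_budget - energy_used(v)},)
bounds = [(0.05, 0.25)] * len(grade)            # pace window of 4 to 20 min/km
res = minimize(total_time, x0=np.full(len(grade), 0.12),
               bounds=bounds, constraints=cons)

print("pace per km (min):", np.round(1.0 / res.x, 2))
print("completion time (min):", round(total_time(res.x), 1))
```
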
17798 Mathematical Modelling and AI-Based Degradation Analysis of the Second-Life Lithium-Ion Battery Packs for Stationary Applications

Authors: Farhad Salek, Shahaboddin Resalati

Abstract:

The production of electric vehicles (EVs) featuring lithium-ion battery technology has substantially escalated over the past decade, demonstrating a steady and persistent upward trajectory. The imminent retirement of electric vehicle (EV) batteries after approximately eight years underscores the critical need for their redirection towards recycling, a task complicated by the current inadequacy of recycling infrastructures globally. A potential solution for such concerns involves extending the operational lifespan of electric vehicle (EV) batteries through their utilization in stationary energy storage systems during secondary applications. Such adoptions, however, require addressing the safety concerns associated with batteries’ knee points and thermal runaways. This paper develops an accurate mathematical model representative of the second-life battery packs from a cell-to-pack scale using an equivalent circuit model (ECM) methodology. Neural network algorithms are employed to forecast the degradation parameters based on the EV batteries' aging history to develop a degradation model. The degradation model is integrated with the ECM to reflect the impacts of the cycle aging mechanism on battery parameters during operation. The developed model is tested under real-life load profiles to evaluate the life span of the batteries in various operating conditions. The methodology and the algorithms introduced in this paper can be considered the basis for Battery Management System (BMS) design and techno-economic analysis of such technologies.

Keywords: second life battery, electric vehicles, degradation, neural network

Procedia PDF Downloads 52
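
One common form of the equivalent circuit model (ECM) mentioned above is a first-order Thevenin circuit; the sketch below simulates the terminal voltage of a single cell under constant-current discharge. The parameter values (R0, R1, C1, capacity, OCV curve) are illustrative placeholders rather than fitted second-life pack parameters, and the neural-network degradation forecasting is omitted.

```python
import numpy as np

# Assumed first-order Thevenin ECM parameters for a single cell.
R0, R1, C1 = 0.015, 0.010, 2000.0     # ohm, ohm, farad
capacity_Ah = 50.0
dt, I_load = 1.0, 25.0                # s, A (constant-current discharge)

def ocv(soc):
    # Placeholder open-circuit-voltage curve (V) as a function of state of charge.
    return 3.2 + 0.9 * soc

soc, v_rc = 1.0, 0.0
for step in range(3600):
    soc -= I_load * dt / (capacity_Ah * 3600.0)
    # RC branch voltage: first-order response of the polarization element.
    v_rc += dt * (I_load / C1 - v_rc / (R1 * C1))
    v_term = ocv(soc) - I_load * R0 - v_rc

print(f"after 1 hour: SOC = {soc:.2f}, terminal voltage = {v_term:.3f} V")
```
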
17797 Computational Methods in Official Statistics with an Example on Calculating and Predicting Diabetes Mellitus [DM] Prevalence in Different Age Groups within Australia in Future Years, in Light of the Aging Population

Authors: D. Hilton

Abstract:

An analysis of the Australian Diabetes Screening Study estimated undiagnosed diabetes mellitus [DM] prevalence in a high-risk, general-practice-based cohort. DM prevalence varied from 9.4% to 18.1% depending upon the diagnostic criteria utilised, with age being a highly significant risk factor. Utilising the gold-standard oral glucose tolerance test, the prevalence of DM was 22-23% in those aged >= 70 years and <15% in those aged 40-59 years. Opportunistic screening in Australian general practice can potentially identify many persons with undiagnosed type 2 DM. An Australian Bureau of Statistics document published three years ago reported the highest rate of DM in men aged 65-74 years [19%], whereas the rate for women was highest in those over 75 years [13%]. The Australian Bureau of Statistics report in 2007 found that 13% of the population was over 65 years of age, a share projected to increase to 23-25% by 2056 and further to 25-28% by 2101; this information has to be factored into the equation when age-related diabetes prevalence predictions are calculated. This 10-15% proportional increase of elderly persons within the population demographics has dramatic implications for the estimated number of elderly persons with DM in these age groupings. Computational methods are applied to the age-related demographic changes reported in these official statistical documents to produce estimates for 2056 and 2101 for different age groups. This has relevance for future diabetes prevalence rates and shows that, along with many countries worldwide, Australia is facing an increasing pandemic. In contrast, Japan is expected to have a decrease in the number of persons with diabetes in the next twenty years.

Keywords: epidemiological methods, aging, prevalence, diabetes mellitus

Procedia PDF Downloads 371
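
The age-standardised projection described in the abstract above reduces to a weighted sum of age-group prevalence rates over projected population shares. The short sketch below uses the figures quoted in the abstract where available and clearly hypothetical fillers elsewhere (the 40-59 and 60-69 shares and rates are placeholders, not official statistics).

```python
# Age-group DM prevalence rates (fraction of each group); 70+ from the quoted 22-23%, others illustrative.
prevalence = {"40-59": 0.12, "60-69": 0.19, "70+": 0.225}

# Hypothetical population shares per age group for two projection years
# (anchored loosely to the 13% aged 65+ in 2007 and ~23-25% projected for 2056).
shares = {
    2007: {"40-59": 0.28, "60-69": 0.09, "70+": 0.13},
    2056: {"40-59": 0.24, "60-69": 0.11, "70+": 0.24},
}

for year, dist in shares.items():
    cases_share = sum(prevalence[g] * dist[g] for g in dist)
    print(f"{year}: DM cases among adults 40+ as a share of the whole population ≈ {cases_share:.1%}")
```
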
17796 Local Interpretable Model-agnostic Explanations (LIME) Approach to Email Spam Detection

Authors: Rohini Hariharan, Yazhini R., Blessy Maria Mathew

Abstract:

The task of detecting email spam is a very important one in the era of digital technology that needs effective ways of curbing unwanted messages. This paper presents an approach aimed at making email spam categorization algorithms transparent, reliable and more trustworthy by incorporating Local Interpretable Model-agnostic Explanations (LIME). Our technique assists in providing interpretable explanations for specific classifications of emails to help users understand the model's decision-making process. In this study, we developed a complete pipeline that incorporates LIME into the spam classification framework and allows the creation of simplified, interpretable models tailored to individual emails. LIME identifies influential terms, pointing out key elements that drive classification results, thus reducing the opacity inherent in conventional machine learning models. Additionally, we suggest a visualization scheme for displaying keywords that will improve users' understanding of categorization decisions. We test our method on a diverse email dataset and compare its performance with various baseline models, such as Gaussian Naive Bayes, Multinomial Naive Bayes, Bernoulli Naive Bayes, Support Vector Classifier, K-Nearest Neighbors, Decision Tree, and Logistic Regression. Our testing results show that our model surpasses all other models, achieving an accuracy of 96.59% and a precision of 99.12%.

Keywords: text classification, LIME (local interpretable model-agnostic explanations), stemming, tokenization, logistic regression

Procedia PDF Downloads 40
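
A minimal version of the pipeline described in the abstract above can be assembled with scikit-learn and the lime package; the two example messages and the Multinomial Naive Bayes baseline below are placeholders standing in for the paper's email dataset and full model comparison.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from lime.lime_text import LimeTextExplainer

# Toy corpus standing in for a labelled email dataset (1 = spam, 0 = ham).
texts = ["win a free prize now", "meeting agenda attached",
         "claim your free reward", "project report due friday"]
labels = [1, 0, 1, 0]

pipeline = make_pipeline(TfidfVectorizer(), MultinomialNB())
pipeline.fit(texts, labels)

explainer = LimeTextExplainer(class_names=["ham", "spam"])
exp = explainer.explain_instance("free prize waiting, claim now",
                                 pipeline.predict_proba, num_features=4)
# Influential terms and their weights towards the predicted class.
print(exp.as_list())
```
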
17795 Heat Source Temperature for Centered Heat Source on Isotropic Plate with Lower Surface Forced Cooling Using Neural Network and Three Different Materials

Authors: Fadwa Haraka, Ahmad Elouatouati, Mourad Taha Janan

Abstract:

In this study, we propose a neural network-based method to calculate the heat source temperature of an isotropic plate with lower-surface forced cooling. To validate the proposed model, the heat source temperature values are compared to the analytical method (separation of variables) and a finite element model. The mathematical simulation is carried out through 3D numerical simulation in the COMSOL software, considering three different materials: aluminum, copper, and graphite. The proposed method leads to a formulation of the heat source temperature based on the thermal and geometric properties of the base plate.

Keywords: thermal model, thermal resistance, finite element simulation, neural network

Procedia PDF Downloads 348
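
As a rough illustration of the neural-network regression step above, the sketch below fits an MLP that maps plate properties (thermal conductivity, thickness, heat source power) to a heat source temperature. The synthetic data generator is a stand-in, not the COMSOL simulations or the separation-of-variables solution used in the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
k = rng.uniform(100, 400, n)        # thermal conductivity, W/(m.K) (Al / Cu / graphite range)
t = rng.uniform(0.002, 0.010, n)    # plate thickness, m
q = rng.uniform(5, 50, n)           # heat source power, W

# Placeholder relation standing in for the simulated source temperature (degC).
T_source = 25 + 0.8 * q / (k * t) + rng.normal(0, 0.5, n)

X = np.column_stack([k, t, q])
X_train, X_test, y_train, y_test = train_test_split(X, T_source, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0))
model.fit(X_train, y_train)
print("R2 on held-out samples:", round(model.score(X_test, y_test), 3))
```
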
17794 Computed Tomography Differential Diagnosis of Intraventricular Masses in the Emergency Department

Authors: Angelis P. Barlampas

Abstract:

Purpose: A 29-year-old woman presented in the emergency department with psychiatric symptoms. The psychiatrist ordered a computed tomography scan as part of a general examination. Material and methods: The CT showed bilateral enlarged choroid plexus structures mimicking papillomata and situated in the trigones of the lateral ventricles. The left choroid plexus was heavily calcified, but the right one had no obvious calcifications. Results: It is well known that any brain mass can present with behavioral changes and even psychiatric symptomatology. Papillomata of the ventricular system have been described to cause psychotic episodes. According to the literature, choroid plexus papillomas are rare neuroepithelial intraventricular tumors, which are benign and categorized as WHO grade 1 tumors. They are more common in the pediatric population, but they can occur in adults, too. In addition, the distinction between choroid plexus papilloma and carcinoma is very difficult and impossible by imaging alone. It can only be implied with more advanced imaging, such as arterial spin labeling and MRI. The final diagnosis is, of course, made after surgical excision. The usual location in adults is the fourth ventricle, but in children, it is the lateral ventricles. Their imaging appearance is that of a solid vascular tumor, which enhances intensely after the intravenous administration of contrast material. One out of four tumors presents speckled calcifications. In our case, there are symmetrically sized masses at the trigones, and there are no calcifications in one of them, whereas the other one is grossly calcified. Also, there is no obvious hydrocephalus or any other evidence of increased intracranial pressure. General conclusions: A new psychiatric patient should undergo every appropriate examination, including a brain CT study, to exclude any rare organic causes that may be responsible for the disease.

Keywords: psychosis, intraventricular masses, CT, brain calcifications

Procedia PDF Downloads 55
17793 The Greek Version of the Southampton Nostalgia Scale: Psychometric Properties in Young Adults and Associations with Life Satisfaction, Positive and Negative Emotions, Time Perspective and Wellbeing

Authors: Eirini Petratou, Pezirkianidis Christos, Anastassios Stalikas

Abstract:

Nostalgia is characterized as a mental state of emotional longing for the past that activates both positive and negative emotions. The bittersweet emotions activated by nostalgia serve psychological functions and depend on the type of stimuli that evoke nostalgia as well as on the context in which nostalgia is activated. In general, although nostalgia can be activated and experienced by all people, it differs both in terms of the nostalgia experience and in nostalgia frequency. As a matter of fact, nostalgia experience and frequency differ according to the level of nostalgia proneness. People with high nostalgia proneness tend to experience nostalgia more intensely and frequently than people with low nostalgia proneness. Nostalgia proneness is considered a basic individual difference that affects the experience of nostalgia, and it can be measured by the Southampton Nostalgia Scale (SNS), a psychometric instrument consisting of seven questions that assess a person's attitude towards nostalgia, the degree of experience or tendency towards nostalgic feelings, and the nostalgia frequency. In the current study, we translated, validated and calibrated the SNS in a Greek population (N = 267). For the calibration process, we used several scales relevant to positive dimensions, such as life satisfaction, positive and negative emotions, time perspective and wellbeing. A confirmatory factor analysis revealed the factors that provide a good Southampton Nostalgia Proneness model fit for a young adult Greek population.

Keywords: nostalgia proneness, nostalgia, psychometric instruments, psychometric properties

Procedia PDF Downloads 143
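
Alongside the confirmatory factor analysis reported above, a standard first check when validating a translated scale is internal consistency; the sketch below computes Cronbach's alpha for hypothetical responses to the seven SNS items. The response matrix is randomly generated and is not the Greek sample data.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical responses: 267 participants x 7 SNS items on a 1-7 scale.
latent = rng.normal(0, 1, (267, 1))
items = np.clip(np.rint(4 + 1.2 * latent + rng.normal(0, 1, (267, 7))), 1, 7)

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1).sum()
total_var = items.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_vars / total_var)   # Cronbach's alpha
print(f"Cronbach's alpha for the 7-item scale: {alpha:.2f}")
```
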
17792 Media Coverage on Child Sexual Abuse in Developing Countries

Authors: Hayam Qayyum

Abstract:

Print and broadcast media are considered to be among the most powerful agents of social change and an effective medium that can transform society into a civilized, responsible, composed one. Among its major roles, an imperative role of the media is to highlight human rights violations in order to raise awareness and to protect society from social evils and injustice. By pointing out such wrongs, the media can lessen the magnitude of what happens within society. For centuries, the 'Silent Crime', i.e. Child Sexual Abuse (CSA), has been consuming developing countries. This study explores how appropriate print and broadcast media coverage can help eliminate child sexual abuse from society. The immense challenge faced by journalists today is accurate and ethical reporting and appropriate coverage that discloses the facts and delivers the right message at the right time to lessen social evils in developing countries without harming the dignity of the victim. In the case of CSA, most victims and their families are not in favour of exposing their children to the media due to family norms and respect in the society. The media should focus on in-depth information about CSA and use this coverage to draw the attention of the concerned authorities to look into the matter for reforms and reviews of the system. Moreover, the media as a change agent can bring such issues to the knowledge of the international community to make collective efforts with the affected country to eliminate the 'Silent Crime' from society. The model country selected for this research paper is South Africa. The purpose of this research is not only to examine the existing reporting patterns and content of print and broadcast media coverage in South Africa but also to create awareness to eliminate child sexual abuse and, indirectly, to improve the condition of stakeholders to overcome this social evil. The literature review method is used to formulate this paper. Trends in media content on CSA will be identified in terms of the amount and nature of information made available to the public through the media. A general view of media coverage of child sexual abuse in developing countries such as India and Pakistan will also be considered. This research is limited to the role of print and broadcast media coverage in eliminating child sexual abuse in South Africa. In developing countries, the CSA issue needs to be addressed on an immediate basis. The study will explore the CSA content of the most influential broadcast and print media outlets of South Africa, with broadcast media comprising TV channels and print media comprising influential newspapers.

Keywords: child sexual abuse, developing countries, print and broadcast media, South Africa

Procedia PDF Downloads 570
17791 A Phenomenological Approach to Computational Modeling of Analogy

Authors: José Eduardo García-Mendiola

Abstract:

In this work, a phenomenological approach to computational modeling of analogy processing is carried out. The paper considers the structure of analogy, based on the possibility of grounding the genesis of its elements in Husserl's genetic theory of association. Among the particular processes which take place in order to obtain analogical inferences, one is crucial for enabling efficient retrieval of base cases from long-term memory, namely analogical transference grounded on familiarity. In general, it has been argued that analogical reasoning is a way by which a conscious agent tries to determine or define a certain scope of objects and relationships between them using previous knowledge of another familiar domain of objects and relations. However, looking for a complete description of the analogy process, a deeper consideration of a phenomenological nature is required insofar as its simulation by computational programs is the aim. It also gives an idea of how complex it would be to have a fully computational account of the elements of analogy. In fact, familiarity is not the result of a mere chain of repetitions of objects or events but is generated insofar as the object/attribute or event in question is integrable into a certain context that takes shape as functionalities and functional approaches or perspectives of the object are being defined. Its familiarity is generated not by the identification of its parts or objective determinations as if they were isolated from those functionalities and approaches. Rather, at the core of such a familiarity between entities of different kinds lies the way they are functionally encoded. Hoping to make deeper inroads into these topics, this essay considers how cognitive-computational perspectives can draw on a phenomenological projection of the analogy process, reviewing achievements already obtained as well as exploring new theoretical-experimental configurations for implementing analogy models in special-purpose as well as general-purpose machines.

Keywords: analogy, association, encoding, retrieval

Procedia PDF Downloads 113
17790 A Structuring and Classification Method for Assigning Application Areas to Suitable Digital Factory Models

Authors: R. Hellmuth

Abstract:

The method of factory planning has changed a lot, especially when it comes to planning the factory building itself. Factory planning has the task of designing products, plants, processes, organization, areas, and the building of a factory. Regular restructuring is becoming more important in order to maintain the competitiveness of a factory. Restrictions in new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity and Ambiguity), lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuilding measures and becomes an indispensable tool. Furthermore, digital building models are increasingly being used in factories to support facility management and manufacturing processes. The main research question of this paper is, therefore: What kind of digital factory model is suitable for the different areas of application during the operation of a factory? First, different types of digital factory models are investigated, and their properties and usability for use cases are analysed. Within the scope of the investigation are point cloud models, building information models, photogrammetry models, and versions of these enriched with sensor data. It is investigated which digital models allow a simple integration of sensor data and where the differences lie. Subsequently, possible application areas of digital factory models are determined by means of a survey, and the respective digital factory models are assigned to the application areas. Finally, an application case from maintenance is selected and implemented with the help of the appropriate digital factory model. It is shown how a completely digitalized maintenance process can be supported by a digital factory model by providing information. Among other purposes, the digital factory model is used for indoor navigation, information provision, and display of sensor data. In summary, the paper shows a structuring of digital factory models that concentrates on the geometric representation of a factory building and its technical facilities. A practical application case is shown and implemented. Thus, the systematic selection of digital factory models with the corresponding application cases is evaluated.

Keywords: building information modeling, digital factory model, factory planning, maintenance

Procedia PDF Downloads 103
17789 Developing Laser Spot Position Determination and PRF Code Detection with Quadrant Detector

Authors: Mohamed Fathy Heweage, Xiao Wen, Ayman Mokhtar, Ahmed Eldamarawy

Abstract:

In this paper, we are interested in the modeling, simulation, and measurement of the laser spot position with a quadrant detector. We enhance the detection and tracking of a semi-active laser weapon decoding system based on a microcontroller. The system receives the reflected pulse through the quadrant detector and processes the laser pulses through a processing circuit, with a microcontroller decoding the laser pulses reflected by the target. The decoding system enhances the seeker accuracy, reduces the laser detection time based on the number of received pulses, and uses a gate to limit the laser pulse width. The model is implemented based on the Pulse Repetition Frequency (PRF) technique with two microcontroller units (MCU). MCU1 generates laser pulses with different codes. MCU2 decodes the laser code and locks the system at the specific code. The codes are selected based on the two selector switches. The system is implemented and tested in the Proteus ISIS software. The full position determination circuit with the detector is implemented. A general system for spot position determination was built, using the laser PRF for the incident radiation and a mechanical system for adjusting the setup at different angles. The system test results show that the system can detect the laser code with only three received pulses based on the narrow gate signal, and good agreement between the simulated and measured system performance is obtained.

Keywords: four quadrant detector, pulse code detection, laser guided weapons, pulse repetition frequency (PRF), Atmega 32 microcontrollers

Procedia PDF Downloads 379
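
The spot-position step for a quadrant detector is commonly computed from normalized differences of the four quadrant signals; the sketch below uses that standard relation with an assumed quadrant labelling (A upper-right, B upper-left, C lower-left, D lower-right) and an arbitrary scale factor, since the paper's exact calibration and microcontroller decoding logic are not reproduced here.

```python
def spot_position(A, B, C, D, k=1.0):
    """Estimate the laser spot offset (x, y) from the four quadrant photocurrents.

    Assumes quadrants A (upper-right), B (upper-left), C (lower-left),
    D (lower-right); k is a detector-specific calibration factor.
    """
    total = A + B + C + D
    x = k * ((A + D) - (B + C)) / total   # right minus left
    y = k * ((A + B) - (C + D)) / total   # top minus bottom
    return x, y

# Example: spot slightly to the right of and above the detector center.
print(spot_position(A=1.3, B=0.9, C=0.7, D=1.1))
```
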
17788 Life Cycle Assessment of Almond Processing: Off-ground Harvesting Scenarios

Authors: Jessica Bain, Greg Thoma, Marty Matlock, Jeyam Subbiah, Ebenezer Kwofie

Abstract:

The environmental impact and particulate matter (PM) emissions associated with the production and packaging of 1 kg of almonds were evaluated using life cycle assessment (LCA). The assessment began at the point of ready-to-harvest, with a cradle-to-gate system boundary ending at almond packaging in California. The assessment included three scenarios of off-ground harvesting of almonds. The three general off-ground harvesting scenarios, with variations, are: the harvested almonds solar dried on a paper tarp in the orchard, the harvested almonds solar dried on the floor in a separate lot, and the harvested almonds dried mechanically. The life cycle inventory (LCI) data for almond production were based on previously published literature and data provided by the Almond Board of California (ABC). The ReCiPe 2016 method was used to calculate the midpoint impacts. Using a consequential LCA model, the global warming potential (GWP) for the three harvesting scenarios is 2.90, 2.86, and 3.09 kg CO2 eq/kg of packaged almond for scenarios 1, 2a, and 3a, respectively. The global warming potential for the conventional harvesting method was 2.89 kg CO2 eq/kg of packaged almond. The particulate matter emissions per hectare are 77.14, 9.56, 66.86, and 8.75 for conventional harvesting and scenarios 1, 2, and 3, respectively. The most significant contributions to the overall emissions were from almond production. The farm-gate almond production had a global warming potential of 2.12 kg CO2 eq/kg of packaged almond, approximately 73% of the overall emissions. Based on comparisons between the GWP and PM emissions, scenario 2a was the best tradeoff between GHG and PM production.

Keywords: life cycle assessment, low moisture foods, sustainability, LCA

Procedia PDF Downloads 79
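
The tradeoff statement above can be checked directly from the figures quoted in the abstract; the short calculation below expresses each off-ground scenario's GWP and PM relative to conventional harvesting, using only the numbers given (no additional LCI data).

```python
# Figures quoted in the abstract.
gwp = {"scenario 1": 2.90, "scenario 2a": 2.86, "scenario 3a": 3.09}  # kg CO2 eq / kg packaged almond
pm = {"scenario 1": 9.56, "scenario 2": 66.86, "scenario 3": 8.75}    # PM per hectare, units as reported
base_gwp, base_pm = 2.89, 77.14                                       # conventional harvesting

for name, value in gwp.items():
    print(f"{name}: GWP change vs conventional = {100 * (value / base_gwp - 1):+.1f}%")
for name, value in pm.items():
    print(f"{name}: PM change vs conventional  = {100 * (value / base_pm - 1):+.1f}%")
```
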
17787 Stock Price Prediction with 'Earnings' Conference Call Sentiment

Authors: Sungzoon Cho, Hye Jin Lee, Sungwhan Jeon, Dongyoung Min, Sungwon Lyu

Abstract:

Major public corporations worldwide use conference calls to report their quarterly earnings. These 'earnings' conference calls allow for questions from stock analysts. We investigated whether it is possible to identify sentiment from the call script and use it to predict stock price movement. We analyzed call scripts from six companies, two each from Korea, China and Indonesia, over the six years from 2011Q1 to 2017Q2. A random forest with frequency-based sentiment scores using the Loughran-McDonald dictionary did better than a control model with only financial indicators. When the stock prices went up 20 days after the earnings release, our model predicted this correctly 77% of the time. When the model predicted 'up', actual stock prices went up 65% of the time. This preliminary result encourages us to investigate advanced sentiment scoring methodologies such as topic modeling, auto-encoder, and word2vec variants.

Keywords: earnings call script, random forest, sentiment analysis, stock price prediction

Procedia PDF Downloads 287
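
A minimal version of the frequency-based sentiment feature plus random forest setup described above might look like the sketch below; the small word lists stand in for the Loughran-McDonald dictionary categories, and the call scripts, labels, and omitted financial indicators are synthetic placeholders rather than the six companies' data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Tiny stand-ins for Loughran-McDonald positive/negative word lists.
POS = {"growth", "improve", "strong", "record"}
NEG = {"decline", "loss", "weak", "uncertain"}

def sentiment_features(script):
    # Frequency-based sentiment scores: positive and negative word rates.
    words = script.lower().split()
    n = max(len(words), 1)
    return [sum(w in POS for w in words) / n, sum(w in NEG for w in words) / n]

scripts = ["strong growth this quarter with record sales",
           "weak demand and a loss amid uncertain outlook",
           "margins improve on strong volumes",
           "revenue decline and weak guidance"] * 25      # synthetic corpus
labels = [1, 0, 1, 0] * 25                                # 1 = price up after release

X = np.array([sentiment_features(s) for s in scripts])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```
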
17786 Experimental and Semi-Analytical Investigation of Wave Interaction with Double Vertical Slotted Walls

Authors: H. Ahmed, A. Schlenkhoff, R. Rousta, R. Abdelaziz

Abstract:

Vertical slotted walls can be used as permeable breakwaters to provide economical and environmental protection from undesirable waves and currents inside the port. Permeable breakwaters provide partial protection and have been suggested to overcome the environmental disadvantages of fully protective breakwaters. For regular waves, a semi-analytical model based on an eigenfunction expansion method, utilizing a boundary condition at the surface of each wall, is developed to determine the energy dissipation through the slots. Extensive laboratory tests are carried out to validate the semi-analytical model. The physical model contains two walls, each consisting of an impermeable upper and lower part, where the draft is a decimal multiple of the total depth. The middle part is permeable with a porosity of 50%. The second barrier is located at a distance of 0.5, 1, 1.5 and 2 times the water depth from the first one. A comparison of the theoretical results with previous studies and with the experimental measurements of the present study shows good agreement and indicates that the semi-analytical model is able to adequately reproduce most of the important features of the experiments.

Keywords: permeable breakwater, double vertical slotted walls, semi-analytical model, transmission coefficient, reflection coefficient, energy dissipation coefficient

Procedia PDF Downloads 380
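
A common bookkeeping step in such permeable-breakwater studies is the wave energy balance, in which the energy dissipation coefficient is obtained from the reflection and transmission coefficients; the short check below uses hypothetical coefficient values, not the measurements of this study.

```python
# Hypothetical reflection (Cr) and transmission (Ct) coefficients for one wave condition.
Cr, Ct = 0.45, 0.55

# Energy balance for a permeable barrier: reflected + transmitted + dissipated energy = incident energy.
energy_dissipation = 1.0 - Cr**2 - Ct**2
print(f"energy dissipation coefficient = {energy_dissipation:.2f}")
```
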
17785 Development of a Data-Driven Method for Diagnosing the State of Health of Battery Cells, Based on the Use of an Electrochemical Aging Model, with a View to Their Use in Second Life

Authors: Desplanches Maxime

Abstract:

Accurate estimation of the remaining useful life of lithium-ion batteries for electronic devices is crucial. Data-driven methodologies encounter challenges related to data volume and acquisition protocols, particularly in capturing a comprehensive range of aging indicators. To address these limitations, we propose a hybrid approach that integrates an electrochemical model with state-of-the-art data analysis techniques, yielding a comprehensive database. Our methodology involves infusing an aging phenomenon into a Newman model, leading to the creation of an extensive database capturing various aging states based on non-destructive parameters. This database serves as a robust foundation for subsequent analysis. Leveraging advanced data analysis techniques, notably principal component analysis and t-Distributed Stochastic Neighbor Embedding, we extract pivotal information from the data. This information is harnessed to construct a regression function using either random forest or support vector machine algorithms. The resulting predictor demonstrates a 5% error margin in estimating remaining battery life, providing actionable insights for optimizing usage. Furthermore, the database was built from the Newman model calibrated for aging and performance using data from a European project called Teesmat. The model was then initialized numerous times with different aging values, for instance, with varying thicknesses of SEI (Solid Electrolyte Interphase). This comprehensive approach ensures a thorough exploration of battery aging dynamics, enhancing the accuracy and reliability of our predictive model. Of particular importance is our reliance on the database generated through the integration of the electrochemical model. This database serves as a crucial asset in advancing our understanding of aging states. Beyond its capability for precise remaining life predictions, this database-driven approach offers valuable insights for optimizing battery usage and adapting the predictor to various scenarios. This underscores the practical significance of our method in facilitating better decision-making regarding lithium-ion battery management.

Keywords: Li-ion battery, aging, diagnostics, data analysis, prediction, machine learning, electrochemical model, regression

Procedia PDF Downloads 63
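
The analysis chain described above (model-generated aging database, dimensionality reduction, then regression of remaining life) can be sketched as a scikit-learn pipeline; the synthetic feature matrix below stands in for the Newman-model database with varying SEI thickness, and the PCA and random-forest settings are illustrative defaults rather than the study's configuration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 800
# Synthetic non-destructive features standing in for the model-generated aging database.
sei_thickness = rng.uniform(5, 80, n)                                  # assumed aging driver
resistance = 0.01 + 0.0004 * sei_thickness + rng.normal(0, 0.001, n)
capacity = 1.0 - 0.006 * sei_thickness + rng.normal(0, 0.01, n)
X = np.column_stack([sei_thickness, resistance, capacity, rng.normal(size=(n, 5))])
remaining_life = 2000 - 20 * sei_thickness + rng.normal(0, 50, n)      # cycles, placeholder

X_tr, X_te, y_tr, y_te = train_test_split(X, remaining_life, random_state=0)
model = make_pipeline(StandardScaler(), PCA(n_components=4),
                      RandomForestRegressor(n_estimators=300, random_state=0))
model.fit(X_tr, y_tr)
err = np.mean(np.abs(model.predict(X_te) - y_te) / y_te)
print(f"mean relative error on held-out data: {err:.1%}")
```
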
17784 Designing an Effective Accountability Model for Islamic Azad University Using the Qualitative Approach of Grounded Theory

Authors: Davoud Maleki, Neda Zamani

Abstract:

The present study aims at exploring an effective accountability model for Islamic Azad University using the qualitative approach of grounded theory. The data of this study were obtained from semi-structured interviews with 25 professors and scholars at the Islamic Azad University of Tehran, who were selected by the theoretical sampling method. In the data analysis, the stepwise method and the Strauss and Corbin (1992) analytical methods were used. After identification of the main component (balanced response to stakeholders' needs), it was used to bring the categories together, along with expressions and ideas representing the relationships between the main component and the subcomponents; finally, the revealed components were categorized into the six dimensions of the paradigm model, with the relationships among them: causal conditions (7 components), the main component (balanced response to stakeholders' needs), strategies (5 components), environmental conditions (5 components), intervention features (4 components), and consequences (3 components). The research findings show an exploratory model describing the relationships between causal conditions, the main component, accountability strategies, environmental conditions, university environmental features, and consequences.

Keywords: accountability, effectiveness, Islamic Azad University, grounded theory

Procedia PDF Downloads 79
17783 Quantification Model for Capability Evaluation of Optical-Based in-Situ Monitoring System for Laser Powder Bed Fusion (LPBF) Process

Authors: Song Zhang, Hui Wang, Johannes Henrich Schleifenbaum

Abstract:

Due to the increasing demand for quality assurance and reliability in additive manufacturing, the development of an advanced in-situ monitoring system is required to monitor process anomalies as input for further process control. Optical-based monitoring systems, such as CMOS cameras and NIR cameras, have proved to be effective ways to monitor geometrical distortion and exceptional thermal distribution. Therefore, many studies and applications focus on the availability of optical-based monitoring systems for detecting various types of defects. However, the capability of the monitoring setup is usually not quantified. In this study, a quantification model to evaluate the capability of monitoring setups for the LPBF machine, based on acquired monitoring data of a designed test artifact, is presented, and the design of the relevant test artifacts is discussed. The monitoring setup is evaluated based on its hardware properties, the location of the integration, and the light condition. The methodology of data processing to quantify the capability for each aspect is discussed. The minimal detectable feature size of the monitoring setup in the application is estimated by quantifying its resolution and accuracy. The quantification model is validated using a CCD camera-based monitoring system for LPBF machines in the laboratory with different setups. The results show that the model quantifies the monitoring system's performance, which makes the evaluation of monitoring systems with the same concept but different setups possible for the LPBF process and provides direction for improving the setups.

Keywords: data processing, in-situ monitoring, LPBF process, optical system, quantification model, test artifact

Procedia PDF Downloads 194
17782 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models

Authors: Jay L. Fu

Abstract:

Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow down its progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Various machine learning models, including logistic regression, k-nearest neighbor, support vector machine, random forest, and neural network, were developed. Data were divided into training and testing sets, where the training sets were used to build the predictive models and the testing sets were used to assess the accuracy of prediction. Key risk factors were identified, and the various models were compared to arrive at the best prediction model. Among these models, the random forest model appeared to be the best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important contributing factors to the detection of Alzheimer's. Across all the models used, the percentage of testing inputs for which at least 4 of the 5 models shared the same diagnosis was 90.42%. These machine learning models allow early detection of Alzheimer's with good accuracy, which ultimately leads to earlier treatment of these patients.

Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction

Procedia PDF Downloads 138
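
The modelling workflow described above can be reproduced in outline with scikit-learn; the feature matrix below is randomly generated around the three factors highlighted in the abstract (MMSE, nWBV, gender) and is not the MRI dataset actually used, so the printed accuracy is illustrative only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 373                                   # number of imaging sessions, as in the abstract
mmse = rng.integers(16, 31, n)            # Mini-Mental State Examination score
nwbv = rng.uniform(0.65, 0.85, n)         # normalized whole-brain volume
gender = rng.integers(0, 2, n)
# Placeholder labelling rule: lower MMSE and nWBV make dementia more likely.
p = 1 / (1 + np.exp(0.5 * (mmse - 26) + 40 * (nwbv - 0.73)))
dementia = (rng.random(n) < p).astype(int)

X = np.column_stack([mmse, nwbv, gender])
X_tr, X_te, y_tr, y_te = train_test_split(X, dementia, stratify=dementia, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", round(clf.score(X_te, y_te), 3))
print("feature importances (MMSE, nWBV, gender):", np.round(clf.feature_importances_, 2))
```
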
17781 Estimation of Relative Subsidence of Collapsible Soils Using Electromagnetic Measurements

Authors: Henok Hailemariam, Frank Wuttke

Abstract:

Collapsible soils are weak soils that appear to be stable in their natural state, normally dry condition, but rapidly deform under saturation (wetting), thus generating large and unexpected settlements which often yield disastrous consequences for structures unwittingly built on such deposits. In this study, a prediction model for the relative subsidence of stressed collapsible soils based on dielectric permittivity measurement is presented. Unlike most existing methods for soil subsidence prediction, this model does not require moisture content as an input parameter, thus providing the opportunity to obtain accurate estimation of the relative subsidence of collapsible soils using dielectric measurement only. The prediction model is developed based on an existing relative subsidence prediction model (which is dependent on soil moisture condition) and an advanced theoretical frequency and temperature-dependent electromagnetic mixing equation (which effectively removes the moisture content dependence of the original relative subsidence prediction model). For large scale sub-surface soil exploration purposes, the spatial sub-surface soil dielectric data over wide areas and high depths of weak (collapsible) soil deposits can be obtained using non-destructive high frequency electromagnetic (HF-EM) measurement techniques such as ground penetrating radar (GPR). For laboratory or small scale in-situ measurements, techniques such as an open-ended coaxial line with widely applicable time domain reflectometry (TDR) or vector network analysers (VNAs) are usually employed to obtain the soil dielectric data. By using soil dielectric data obtained from small or large scale non-destructive HF-EM investigations, the new model can effectively predict the relative subsidence of weak soils without the need to extract samples for moisture content measurement. Some of the resulting benefits are the preservation of the undisturbed nature of the soil as well as a reduction in the investigation costs and analysis time in the identification of weak (problematic) soils. The accuracy of prediction of the presented model is assessed by conducting relative subsidence tests on a collapsible soil at various initial soil conditions and a good match between the model prediction and experimental results is obtained.

Keywords: collapsible soil, dielectric permittivity, moisture content, relative subsidence

Procedia PDF Downloads 356
17780 Hybrid Risk Assessment Model for Construction Based on Multicriteria Decision Making Methods

Authors: J. Tamosaitiene

Abstract:

The article focuses on the identification and classification of key risk management criteria that represent the most important sustainability aspects of the construction industry. The construction sector is one of the most important sectors in Lithuania. Nowadays, the assessment of the risk level of a construction project is especially important for the quality of construction projects, the growth of enterprises and the sector. To establish the most important criteria for successful growth of the sector, a questionnaire for experts was developed. The analytic hierarchy process (AHP), the expert judgement method and other multicriteria decision making (MCDM) methods were used to develop the hybrid model. The results were used to develop an integrated knowledge system for the measurement of a risk level particular to construction projects. The article presents a practical case that details the developed system, sustainable aspects, and risk assessment.

Keywords: risk, system, model, construction

Procedia PDF Downloads 162
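
The AHP step mentioned above derives criterion weights from an expert pairwise-comparison matrix via its principal eigenvector; the 4x4 matrix below is a made-up illustration of the calculation, not the criteria or judgements collected in the study, together with the standard consistency-ratio check.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for four risk criteria (Saaty 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0, 1.0],
    [1/3, 1.0, 3.0, 1/3],
    [1/5, 1/3, 1.0, 1/5],
    [1.0, 3.0, 5.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                 # priority vector (criterion weights)

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)     # consistency index
cr = ci / 0.90                           # Saaty's random index for n = 4 is 0.90
print("weights:", np.round(weights, 3), " consistency ratio:", round(cr, 3))
```
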
17779 Job Characteristics, Emotion Regulation and University Teachers' Well-Being: A Job Demands-Resources Analysis

Authors: Jiying Han

Abstract:

Teaching is widely known to be an emotional endeavor, and teachers’ ability to regulate their emotions is important for their well-being and the effectiveness of their classroom management. Considering that teachers’ emotion regulation is an underexplored issue in the field of educational research, some studies have attempted to explore the role of emotion regulation in teachers’ work and to explore the links between teachers’ emotion regulation, job characteristics, and well-being, based on the Job Demands-Resources (JD-R) model. However, those studies targeted primary or secondary teachers. So far, very little is known about the relationships between university teachers’ emotion regulation and its antecedents and effects on teacher well-being. Based on the job demands-resources model and emotion regulation theory, this study examined the relationships between job characteristics of university teaching (i.e., emotional job demands and teaching support), emotion regulation strategies (i.e., reappraisal and suppression), and university teachers’ well-being. Data collected from a questionnaire survey of 643 university teachers in China were analysed. The results indicated that (1) both emotional job demands and teaching support had desirable effects on university teachers’ well-being; (2) both emotional job demands and teaching support facilitated university teachers’ use of reappraisal strategies; and (3) reappraisal was beneficial to university teachers’ well-being, whereas suppression was harmful. These findings support the applicability of the job demands-resources model to the contexts of higher education and highlight the mediating role of emotion regulation.

Keywords: emotional job demands, teaching support, emotion regulation strategies, the job demands-resources model

Procedia PDF Downloads 148
17778 Attention-Based Spatio-Temporal Approach for Fire and Smoke Detection

Authors: Alireza Mirrashid, Mohammad Khoshbin, Ali Atghaei, Hassan Shahbazi

Abstract:

In various industries, smoke and fire are two of the most important threats in the workplace. One of the common methods for detecting smoke and fire is the use of infrared thermal and smoke sensors, which cannot be used in outdoor applications. Therefore, the use of vision-based methods seems necessary. The problem of smoke and fire detection is spatiotemporal and requires spatiotemporal solutions. This paper presents a method that uses spatial features along with temporal-based features to detect smoke and fire in the scene. It consists of three main parts; the task of each part is to reduce the error of the previous part so that the final model has a robust performance. This method also uses transformer modules to increase the accuracy of the model. The results of our model show the proper performance of the proposed approach in solving the problem of smoke and fire detection and can be used to increase workplace safety.

Keywords: attention, fire detection, smoke detection, spatio-temporal

Procedia PDF Downloads 194
17777 Prediction of Malawi Rainfall from Global Sea Surface Temperature Using a Simple Multiple Regression Model

Authors: Chisomo Patrick Kumbuyo, Katsuyuki Shimizu, Hiroshi Yasuda, Yoshinobu Kitamura

Abstract:

This study deals with a way of predicting Malawi rainfall from global sea surface temperature (SST) using a simple multiple regression model. Monthly rainfall data from nine stations in Malawi, grouped into two zones on the basis of inter-station rainfall correlations, were used in the study. Zone 1 consisted of Karonga and Nkhatabay stations, located in northern Malawi; Zone 2 consisted of Bolero, located in northern Malawi; Kasungu, Dedza, and Salima, located in central Malawi; and Mangochi, Makoka and Ngabu stations, located in southern Malawi. Links between Malawi rainfall and SST based on statistical correlations were evaluated, and significant results were selected as predictors for the regression models. The predictors for the Zone 1 model were identified from the Atlantic, Indian and Pacific oceans, while those for Zone 2 were identified from the Pacific Ocean. The correlation between the predicted and observed rainfall values of the models was satisfactory, with r = 0.81 and 0.54 for Zones 1 and 2, respectively (significant at less than 99.99%). The results of the models are in agreement with other findings that suggest that SST anomalies in the Atlantic, Indian and Pacific oceans have an influence on the rainfall patterns of Southern Africa.

Keywords: Malawi rainfall, forecast model, predictors, SST

Procedia PDF Downloads 383
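
The zone models described above are ordinary multiple linear regressions of monthly rainfall on selected SST predictors; the sketch below shows the fitting step with a synthetic SST matrix (three predictor regions) in place of the actual Atlantic, Indian, and Pacific indices used for Zone 1.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n_months = 240
# Synthetic SST anomaly predictors standing in for the selected ocean regions.
sst = rng.normal(0, 0.8, (n_months, 3))
# Placeholder rainfall response with noise (mm/month).
rain = 120 + 35 * sst[:, 0] - 20 * sst[:, 1] + 10 * sst[:, 2] + rng.normal(0, 40, n_months)

model = LinearRegression().fit(sst, rain)
r = np.corrcoef(model.predict(sst), rain)[0, 1]
print("regression coefficients:", np.round(model.coef_, 1))
print("correlation between fitted and observed rainfall r =", round(r, 2))
```
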
17776 Transdisciplinarity Research Approach and Transit-Oriented Development Model for Urban Development Integration in South African Cities

Authors: Thendo Mafame

Abstract:

There is a need for academic research to focus on solving, or contributing to solving, real-world societal problems. Transdisciplinary research (TDR) provides a way to produce functional and applicable research findings, which can be used to advance developmental causes. This TDR study explores ways in which South Africa's spatial divide, entrenched through decades of discriminatory planning policies, can be restructured to bring about equitable access to places of employment, business, leisure, and services for previously marginalised South Africans. It does so by exploring the potential of the transit-oriented development (TOD) model to restructure and revitalise urban spaces in a collaborative manner. The study focuses, through a case study, on the Du Toit station precinct in the town of Stellenbosch, on the peri-urban edge of the city of Cape Town, South Africa. The TOD model is increasingly viewed as an effective strategy for creating sustainable urban redevelopment initiatives, and it has been deployed successfully in other parts of the world. The model, which emphasises development density, diversity of land-use and infrastructure, and transformative design, is customisable to a variety of country contexts. This study made use of a case study approach with mixed methods to collect and analyse data. The research methods used include focus group discussions and interviews, as well as observation and transect walks. This research contributes to the development of TDR studies that are focused on urbanisation issues.

Keywords: case study, integrated urban development, land-use, stakeholder collaboration, transit-oriented development, transdisciplinary research

Procedia PDF Downloads 127
17775 Electro-Thermo-Mechanical Behaviour of Functionally Graded Material Usage in Lead Acid Storage Batteries and the Benefits

Authors: Sandeep Das

Abstract:

The terminal post is one of the most important features of a battery. The design and manufacturing of the post are critical, especially when threaded inserts (bolt-on type) are used, since all the collected energy is delivered from the lead part to the threaded insert (Cu or Cu alloy). Any imperfection at the interface may cause voltage drop, high resistance, high heat generation, etc. This may be because of the sudden change of material properties from lead to Cu alloys. To avoid this problem, a scheme of material gradation is proposed for achieving continuous variation of material properties for the post used in commercially available lead-acid batteries. The functionally graded (FG) material for the post is considered to be composed of different layers of homogeneous material. The volume fraction of the materials used, corresponding to each layer, is calculated by considering its variation along the direction of current flow (z) according to a power law. Accordingly, the effective properties of the homogeneous layers are estimated, and the post composed of this FG material is modeled using the commercially available ANSYS software. The SOLID186 layered structural solid element has been used for discretization of the model of the FG post. A thermal-electric analysis is performed on the layered FG model. The developed model has been validated by comparing its results with those of the existing post model and experimental analysis.

Keywords: ANSYS, functionally graded material, lead-acid battery, terminal post

Procedia PDF Downloads 129
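
The power-law gradation mentioned above is typically written as a volume fraction varying along the current-flow direction, with layer properties estimated by a rule of mixtures; the layer count, exponent, and property values below are illustrative assumptions, not the modelled post geometry or the paper's material data.

```python
import numpy as np

n_layers, p = 10, 2.0                        # assumed layer count and power-law exponent
z = (np.arange(n_layers) + 0.5) / n_layers   # normalized position along current flow

# Volume fraction of Cu alloy grows from the lead side to the insert side: V_cu = z**p.
V_cu = z**p
V_pb = 1.0 - V_cu

# Rule-of-mixtures estimate of each layer's electrical conductivity (S/m).
sigma_pb, sigma_cu = 4.8e6, 5.8e7            # handbook-order values for lead and copper
sigma_layer = V_pb * sigma_pb + V_cu * sigma_cu

for zi, vc, s in zip(z, V_cu, sigma_layer):
    print(f"z={zi:.2f}  V_cu={vc:.2f}  sigma={s:.2e} S/m")
```
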
17774 A National Survey of Clinical Psychology Graduate Student Attitudes toward Psychotherapy Treatment Manuals: A Replication Study

Authors: B. Bergström, A. Ladd, A. Jones, L. Rosso, P. Michael

Abstract:

Attitudes toward treatment manuals serve as a meaningful predictor of general attitudes toward evidence-based practice. Despite demonstrating high effectiveness in treating many mental disorders, manualized treatments have been underutilized by practitioners. Thus, one can assess the state of the field regarding the adoption of evidence-based practices by surveying practitioner attitudes towards manualized treatments. This study is an adapted replication that assesses psychology graduate student attitudes towards manualized treatments, as a general marker for attitudes towards evidence-based practice. Training programs provide future clinicians with the foundation for critical skills in clinical practice. Research demonstrates that post-graduate continuing education has little to no effect on clinical practice; thus, graduate programs serve as the primary, and often final, platform for all future practice. However, there are few empirical data identifying the attitudes and training of graduate students in utilizing manualized treatments. The empirical analysis of this study indicates an increase in positive attitudes towards manualized treatments among graduate students (within the United States) when compared to past surveys of professional psychologists. Findings from this study may inform graduate programs of barriers for students in developing positive attitudes toward manualized treatments and evidence-based practice. This study also serves as a preliminary predictor of the state of the field with regard to professional psychologists' attitudes towards evidence-based practice, if attitudes remain stable. This study indicates that attitudes toward utilizing evidence-based practices, such as treatment manuals, have become more positive since 2000.

Keywords: exposure therapy, evidence based practice, manualized treatments, student attitudes

Procedia PDF Downloads 159
17773 Interest Rate Prediction with Taylor Rule

Authors: T. Bouchabchoub, A. Bendahmane, A. Haouriqui, N. Attou

Abstract:

This paper presents simulation results of Forex predicting model equations in order to give an approximate prediction of interest rates. First, Hall-Taylor (HT) equations have been used with the Taylor rule (TR) to adapt them to the European and American Forex markets. The initial Taylor rule equation is conceived for all Forex transactions in every state: it includes only one equation and six parameters. Here, the model has been used with the Hall-Taylor equations, initially including twelve equations, which have been reduced to only three equations. The analysis has been developed on the following base macroeconomic variables: real exchange rate, investment wages, anticipated inflation, realized inflation, real production, interest rates, production gap and potential production. This model has been used specifically to study the impact of an inflation shock on key policy interest rates.

Keywords: interest rate, Forex, Taylor rule, production, European Central Bank (ECB), Federal Reserve System (FED)

Procedia PDF Downloads 519
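
For reference, the original single-equation Taylor rule can be evaluated directly; the sketch below uses the standard Taylor (1993) weights of 0.5 on the inflation and output gaps with illustrative inputs, and does not reproduce the reduced three-equation Hall-Taylor system of the paper.

```python
def taylor_rate(inflation, target_inflation, output_gap, neutral_real_rate=2.0,
                w_pi=0.5, w_y=0.5):
    """Policy interest rate implied by the Taylor rule (percent)."""
    return (neutral_real_rate + inflation
            + w_pi * (inflation - target_inflation)
            + w_y * output_gap)

# Illustrative inputs: 3% inflation vs. a 2% target, output 1% above potential.
print(taylor_rate(inflation=3.0, target_inflation=2.0, output_gap=1.0))  # -> 6.0

# Inflation shock of +2 percentage points with an unchanged output gap.
print(taylor_rate(inflation=5.0, target_inflation=2.0, output_gap=1.0))  # -> 9.0
```
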
17772 A Translation Criticism of the Persian Translation of “A**Hole No More” Written by Xavier Crement

Authors: Mehrnoosh Pirhayati

Abstract:

Translation can be affected by different meta-textual factors of the target context, such as ideology, politics, and culture, so the rule of fidelity, or being faithful to the source text, can be ignored by the translator. On the other hand, critical discourse analysis, derived from applied linguistics, has entered the field of translation studies and is used by scholars for revealing hidden deviations and possible roots of manipulation. This study focused on the well-known Persian translation of the bestseller book "A**hole No More," written by Xavier Crement (1990) and performed by Mahmud Farjami, to comparatively and critically analyze it against its corresponding English original. The researcher applied Pirhayati's model and framework of translation criticism at the textual and semiotic levels for this qualitative study. It should be noted that Kress and Van Leeuwen's semiotic model, along with Machin's model of typographical analysis, was also used at the semiotic level. The results of the comparisons and analyses indicate that this Persian translation of the book is affected by the factors of ideology and economics and reveal that the Islamic attitude causes the translator to employ strategies such as substitution and deletion. Those who may benefit from this research are translation trainers, students of translation studies, critics, and scholars.

Keywords: Farjami (2013), ideology, manipulation, Pirhayati's (2013) model of translation criticism, Xavier Crement (1990)

Procedia PDF Downloads 210
17771 Clarifications on the Damping Mechanism Related to the Hunting Motion of the Wheel Axle of a High-Speed Railway Vehicle

Authors: Barenten Suciu

Abstract:

In order to explain the damping mechanism related to the hunting motion of the wheel axle of a high-speed railway vehicle, a generalized dynamic model is proposed. Based on such a model, analytic expressions for the damping coefficient and damped natural frequency are derived, without imposing restrictions on the ratio between the lateral and vertical creep coefficients. The influence of the travelling speed, wheel conicity, dimensionless mass of the wheel axle, ratio of the creep coefficients, ratio of the track span to the yawing diameter, etc., on the damping coefficient and damped natural frequency is clarified.

Keywords: high-speed railway vehicle, hunting motion, wheel axle, damping, creep, vibration model, analysis.

Procedia PDF Downloads 289
17770 AquaCrop Model Simulation for Water Productivity of Teff (Eragrostis tef): A Case Study in the Central Rift Valley of Ethiopia

Authors: Yenesew Mengiste Yihun, Abraham Mehari Haile, Teklu Erkossa, Bart Schultz

Abstract:

Teff (Eragrostis tef) is a staple food in Ethiopia. The local and international demand for the crop is ever increasing, pushing the current price to five times that of 2006. To meet this escalating demand, increasing production, including through the use of irrigation, is imperative. Optimum application of irrigation water, especially in semi-arid areas, is profoundly important. AquaCrop model application in irrigation water scheduling and simulation of water productivity helps both irrigation planners and agricultural water managers. This paper presents the simulation and evaluation of the AquaCrop model in optimizing the yield and biomass response to variation in the timing and rate of irrigation water application. Canopy expansion, canopy senescence and harvest index are the key physiological processes sensitive to water stress. For the full irrigation water application treatment, there was a strong relationship between the measured and simulated canopy and biomass, with r2 and d values of 0.87 and 0.96 for canopy and 0.97 and 0.74 for biomass, respectively. However, the model underestimated the simulated yield and biomass at higher water stress levels. For the treatment receiving full irrigation, the harvest index obtained was 29%. The harvest index generally shows a decreasing trend under water stress conditions. AquaCrop model calibration and validation using the dry-season field experiments of 2010/2011 and 2011/2012 show that AquaCrop adequately simulated the yield response to different irrigation water scenarios. We conclude that the AquaCrop model can be used in irrigation water scheduling and in optimizing the water productivity of teff grown under water-scarce, semi-arid conditions.

Keywords: AquaCrop, climate smart agriculture, simulation, teff, water security, water stress regions

Procedia PDF Downloads 395
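
The goodness-of-fit statistics reported above (r2 and Willmott's index of agreement d) can be computed as shown below; the observed and simulated canopy-cover series here are made-up numbers used only to illustrate the calculation, not the experimental data.

```python
import numpy as np

# Hypothetical observed and AquaCrop-simulated canopy cover (%) over a season.
obs = np.array([12, 25, 44, 63, 78, 85, 80, 66, 48])
sim = np.array([10, 27, 41, 60, 80, 88, 78, 63, 45])

r2 = np.corrcoef(obs, sim)[0, 1] ** 2

# Willmott's index of agreement d.
o_bar = obs.mean()
d = 1 - np.sum((sim - obs) ** 2) / np.sum((np.abs(sim - o_bar) + np.abs(obs - o_bar)) ** 2)

print(f"r2 = {r2:.2f}, d = {d:.2f}")
```
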