Search results for: cloud service models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10573

8593 Developing A Third Degree Of Freedom For Opinion Dynamics Models Using Scales

Authors: Dino Carpentras, Alejandro Dinkelberg, Michael Quayle

Abstract:

Opinion dynamics models use an agent-based modeling approach to model people’s opinions. A model's properties are usually explored by testing its two 'degrees of freedom': the interaction rule and the network topology. The latter defines the connections, and thus the possible interactions, among agents. The interaction rule, instead, determines how agents select each other and update their own opinions. Here we show the existence of a third degree of freedom. This can be used to turn one model into another or to change the model’s output by up to 100% of its initial value. Opinion dynamics models represent the evolution of real-world opinions parsimoniously. Thus, it is fundamental to know how a real-world opinion (e.g., supporting a candidate) can be turned into a number. Specifically, we want to know whether, by choosing a different opinion-to-number transformation, the model’s dynamics would be preserved. This transformation is typically not addressed in the opinion dynamics literature. However, it has already been studied in psychometrics, a branch of psychology. In this field, real-world opinions are converted into numbers using abstract objects called 'scales.' These scales can be converted one into the other, in the same way as we convert meters to feet. Thus, in our work, we analyze how this scale transformation may affect opinion dynamics models. We perform our analysis both using mathematical modeling and validating it via agent-based simulations. To distinguish between scale transformation and measurement error, we first analyze the case of perfect scales (i.e., no error or noise). Here we show that a scale transformation may change the model’s dynamics up to a qualitative level, meaning that a researcher may reach a totally different conclusion from the same dataset just by slightly changing the way the data are pre-processed. Indeed, we quantify that this effect may alter the model’s output by 100%. By using two models from the standard literature, we show that a scale transformation can transform one model into the other. This transformation is exact, and it holds for every result. Lastly, we also test the case of using real-world data (i.e., finite precision). We perform this test using a 7-point Likert scale, showing how even a small scale change may result in different predictions or a different number of opinion clusters. Because of this, we think that scale transformation should be considered a third degree of freedom for opinion dynamics. Indeed, its properties have a strong impact both on theoretical models and on their application to real-world data.
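
A minimal sketch of the idea (not the authors' exact models): a Deffuant-style bounded-confidence update run on the same initial opinions before and after a monotone, nonlinear rescaling of the opinion scale. The parameter values, the power-law transformation and the crude cluster count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def deffuant(opinions, eps=0.2, mu=0.5, steps=20000):
    """Bounded-confidence (Deffuant-style) dynamics on opinions in [0, 1]."""
    x = opinions.copy()
    n = len(x)
    for _ in range(steps):
        i, j = rng.integers(n, size=2)
        if abs(x[i] - x[j]) < eps:  # interact only if opinions are close enough
            x[i], x[j] = x[i] + mu * (x[j] - x[i]), x[j] + mu * (x[i] - x[j])
    return x

def rescale(x, gamma=2.5):
    """A monotone 'scale transformation' (illustrative): same ranking, new spacing."""
    return x ** gamma

def n_clusters(x, bin_width=0.05):
    """Crude cluster count: number of occupied bins on the opinion axis."""
    return len(np.unique(np.floor(x / bin_width)))

x0 = rng.uniform(0, 1, 200)
final_raw = deffuant(x0)                    # dynamics on the original scale
final_transformed = deffuant(rescale(x0))   # dynamics after rescaling first

print("clusters, original scale:   ", n_clusters(final_raw))
print("clusters, transformed scale:", n_clusters(final_transformed))
```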

Keywords: degrees of freedom, empirical validation, opinion scale, opinion dynamics

Procedia PDF Downloads 156
8592 Understanding the Role of Gas Hydrate Morphology on the Producibility of a Hydrate-Bearing Reservoir

Authors: David Lall, Vikram Vishal, P. G. Ranjith

Abstract:

Numerical modeling of gas production from hydrate-bearing reservoirs requires the solution of various thermal, hydrological, chemical, and mechanical phenomena in a coupled manner. Among the various reservoir properties that influence gas production estimates, the distribution of permeability across the domain is one of the most crucial parameters since it determines both heat transfer and mass transfer. The aspect of permeability in hydrate-bearing reservoirs is particularly complex compared to conventional reservoirs since it depends on the saturation of gas hydrates and hence, is dynamic during production. The dependence of permeability on hydrate saturation is mathematically represented using permeability-reduction models, which are specific to the expected morphology of hydrate accumulations (such as grain-coating or pore-filling hydrates). In this study, we demonstrate the impact of various permeability-reduction models, and consequently, different morphologies of hydrate deposits on the estimates of gas production using depressurization at the reservoir scale. We observe significant differences in produced water volumes and cumulative mass of produced gas between the models, thereby highlighting the uncertainty in production behavior arising from the ambiguity in the prevalent gas hydrate morphology.
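
As a rough illustration of what a permeability-reduction model looks like (the specific relations evaluated in the study are not given in the abstract), a simple power-law form k = k0 (1 - S_h)^N is often used, with the exponent N chosen to reflect the assumed hydrate morphology. The exponent values and intrinsic permeability below are placeholders, not the parameters of any real reservoir.

```python
def hydrate_permeability(k0, s_h, morphology="pore_filling"):
    """Effective permeability of a hydrate-bearing medium, k = k0 * (1 - S_h)^N.

    Generic power-law permeability-reduction relation; the exponents below are
    illustrative placeholders for two morphologies, not the models of the study.
    """
    exponents = {
        "grain_coating": 3.0,   # placeholder exponent for grain-coating hydrate
        "pore_filling": 10.0,   # placeholder exponent for pore-filling hydrate
    }
    n = exponents[morphology]
    return k0 * (1.0 - s_h) ** n

k0 = 1e-13  # intrinsic sediment permeability in m^2 (illustrative)
for s_h in (0.2, 0.5, 0.8):
    print(f"S_h = {s_h}: grain-coating k = {hydrate_permeability(k0, s_h, 'grain_coating'):.2e},"
          f" pore-filling k = {hydrate_permeability(k0, s_h, 'pore_filling'):.2e}")
```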

Keywords: gas hydrate morphology, multi-scale modeling, THMC, fluid flow in porous media

Procedia PDF Downloads 221
8591 Hybrid Direct Numerical Simulation and Large Eddy Simulating Wall Models Approach for the Analysis of Turbulence Entropy

Authors: Samuel Ahamefula

Abstract:

Turbulent motion is a highly nonlinear and complex phenomenon, and its modelling is still very challenging. In this study, we developed a hybrid computational approach to accurately simulate fluid turbulence. The focus is the coupling and transitioning between the Direct Numerical Simulation (DNS) and Large Eddy Simulating Wall Models (LES-WM) regions. In the framework, high-order, high-fidelity fluid dynamics methods are utilized to simulate the unsteady compressible Navier-Stokes equations in Eulerian form on unstructured moving grids. The coupling and transitioning of DNS and LES-WM are conducted through a linearly staggered Dirichlet-Neumann coupling scheme. The high-fidelity framework is verified and validated against the ability of DNS to capture the full range of turbulent scales and give accurate results, and against the efficiency of LES-WM in simulating the near-wall turbulent boundary layer using wall models.

Keywords: computational methods, turbulence modelling, turbulence entropy, Navier-Stokes equations

Procedia PDF Downloads 102
8590 Comparison of Spiking Neuron Models in Terms of Biological Neuron Behaviours

Authors: Fikret Yalcinkaya, Hamza Unsal

Abstract:

To understand how neurons work, it is necessary to combine experimental studies in neuroscience with numerical simulations of neuron models in a computer environment. In this regard, the simplicity and applicability of spiking neuron modeling functions have been of great interest in computational and numerical neuroscience in recent years. Spiking neuron models can be classified by the various neuronal behaviors they exhibit, such as spiking and bursting. These classifications are important for researchers working in theoretical neuroscience. In this paper, three different spiking neuron models, Izhikevich, Adaptive Exponential Integrate and Fire (AEIF) and Hindmarsh-Rose (HR), which are based on first-order differential equations, are discussed and compared. First, the physical meanings, derivatives, and differential equations of each model are provided and simulated in the Matlab environment. Then, by selecting appropriate parameters, the models were visually examined in the Matlab environment, with the aim of demonstrating which model can simulate well-known biological neuron behaviours such as Tonic Spiking, Tonic Bursting, Mixed Mode Firing, Spike Frequency Adaptation, Resonator and Integrator. As a result, the Izhikevich model has been shown to reproduce Regular Spiking, Continuous Bursting, Intrinsically Bursting, Thalamo-Cortical, Low-Threshold Spiking and Resonator behaviours. The Adaptive Exponential Integrate and Fire model has been able to produce firing patterns such as Regular Firing, Adaptive Firing, Initial Bursting, Regular Bursting, Delayed Firing, Delayed Regular Bursting, Transient Firing and Irregular Firing. The Hindmarsh-Rose model showed three different dynamic neuron behaviours: Spiking, Bursting and Chaotic. From these results, the Izhikevich cell model may be preferred due to its ability to reflect the true behavior of the nerve cell, to produce different types of spikes, and its suitability for use in larger-scale brain models. The most important reason for choosing the Adaptive Exponential Integrate and Fire model is that it can create rich firing patterns with fewer parameters. The chaotic behaviour of the Hindmarsh-Rose neuron model, like that of other chaotic systems, is thought to be applicable in many scientific and engineering fields such as physics, secure communication and signal processing.
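
As an example of the kind of simulation described here, translated from a Matlab-style workflow into a Python sketch, the Izhikevich model can be integrated with a simple Euler scheme. The parameter set below is the commonly published regular-spiking set; the time step, input current and duration are illustrative choices, not those of the paper.

```python
import numpy as np

def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, T=500.0, dt=0.25):
    """Euler integration of the Izhikevich model (regular-spiking parameters).

    dv/dt = 0.04 v^2 + 5 v + 140 - u + I,  du/dt = a (b v - u);
    on v >= 30 mV: v <- c, u <- u + d.
    """
    steps = int(T / dt)
    v, u = -65.0, b * -65.0
    spikes, trace = [], np.empty(steps)
    for k in range(steps):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:               # spike detected: record time and reset
            spikes.append(k * dt)
            v, u = c, u + d
        trace[k] = v
    return np.array(spikes), trace

spike_times, membrane_v = izhikevich()
print(f"{len(spike_times)} spikes in 500 ms of tonic (regular) spiking")
```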

Keywords: Izhikevich, adaptive exponential integrate fire, Hindmarsh Rose, biological neuron behaviours, spiking neuron models

Procedia PDF Downloads 183
8589 Designing an Online Case-Based Library for Technology Integration in Teacher Education

Authors: Mustafa Tevfik Hebebci, Sirin Kucuk, Ismail Celik, A. Oguz Akturk, Ismail Sahin, Fetah Eren

Abstract:

The purpose of this paper is to introduce an interactive online case-study library website developed in a national project. The design goal of the website is to provide an interactive, enhanced, case-based online educational resource for educators within the scope of the project. The ADDIE instructional design model was used in the development of the interactive case-based library website. The library is developed on a web-based platform, which is important in terms of the manageability, accessibility, and updateability of the data. Users are able to sort the displayed case studies by their titles, dates, ratings, view counts, etc. A usability test and expert opinion were used to evaluate the website. This website is a tool for integrating technology into education. It is believed that the website will be beneficial for pre-service and in-service teachers in terms of their professional development.

Keywords: ADDIE, case-based library, design, technology integration

Procedia PDF Downloads 447
8588 Aggregate Production Planning Framework in a Multi-Product Factory: A Case Study

Authors: Ignatio Madanhire, Charles Mbohwa

Abstract:

This study looks at the best model of the aggregate planning activity in an industrial entity and uses the trial-and-error method on spreadsheets to solve aggregate production planning problems. A linear programming model is also introduced to optimize the aggregate production planning problem. Application of the models in a furniture production firm is evaluated to demonstrate that practical and beneficial solutions can be obtained from the models. Finally, some benchmarking against other furniture manufacturers was undertaken to assess the relevance and level of use of such models in other furniture firms.
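
A minimal sketch of the kind of linear programming formulation mentioned above (the case firm's actual products, demand and cost figures are not reproduced): minimize production plus inventory holding costs subject to inventory balance and capacity constraints, solved with scipy's linprog. All numbers are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative 4-period data: demand (units), unit production and holding costs.
demand = np.array([100, 150, 120, 180])
T = len(demand)
prod_cost, hold_cost, capacity, init_inv = 10.0, 2.0, 160.0, 0.0

# Decision vector x = [P_1..P_T, I_1..I_T]: production and end-of-period inventory.
c = np.concatenate([np.full(T, prod_cost), np.full(T, hold_cost)])

# Inventory balance: I_t - I_{t-1} - P_t = -D_t  (with I_0 = init_inv).
A_eq = np.zeros((T, 2 * T))
b_eq = -demand.astype(float)
for t in range(T):
    A_eq[t, t] = -1.0              # -P_t
    A_eq[t, T + t] = 1.0           # +I_t
    if t > 0:
        A_eq[t, T + t - 1] = -1.0  # -I_{t-1}
b_eq[0] += init_inv                # carry initial inventory into the first balance

bounds = [(0, capacity)] * T + [(0, None)] * T  # capacity on production, I_t >= 0
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("production plan:  ", np.round(res.x[:T], 1))
print("inventory levels: ", np.round(res.x[T:], 1))
print("total cost:       ", round(res.fun, 1))
```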

Keywords: aggregate production planning, trial and error, linear programming, furniture industry

Procedia PDF Downloads 560
8587 Machine Learning Techniques for Estimating Ground Motion Parameters

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site conditions. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and potentially in the subsequent risk assessment of different types of structures. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as statistical methods in ground motion prediction, such as Artificial Neural Networks, Random Forests, and Support Vector Machines. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events with magnitude 3 to 5.8, recorded over the hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The main reason for considering this database stems from the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states. The accuracy of the models in predicting intensity measures, the generalization capability of the models for future data, and the usability of the models are discussed in the evaluation process. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data is available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method, and in particular, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data is available.
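
A hedged sketch of the Random Forest variant of such a ground-motion model. The actual feature set, the random-effects treatment and the Oklahoma/Kansas/Texas database are not reproduced here; the synthetic records, the use of Vs30 as the site term and the toy attenuation relation are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000

# Synthetic stand-in for a ground-motion database: magnitude, hypocentral
# distance (km) and Vs30 (m/s) as the site parameter.
mag = rng.uniform(3.0, 5.8, n)
dist = rng.uniform(4.0, 500.0, n)
vs30 = rng.uniform(200.0, 800.0, n)
# Toy attenuation relation plus noise, standing in for recorded log10(PGA).
log_pga = 0.9 * mag - 1.6 * np.log10(dist) - 0.0004 * vs30 + rng.normal(0, 0.3, n)

X = np.column_stack([mag, np.log10(dist), vs30])
X_tr, X_te, y_tr, y_te = train_test_split(X, log_pga, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=300, min_samples_leaf=5, random_state=0)
model.fit(X_tr, y_tr)

print("R^2 on held-out records:", round(model.score(X_te, y_te), 3))
print("feature importances (M, logR, Vs30):", np.round(model.feature_importances_, 2))
```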

Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine

Procedia PDF Downloads 123
8586 Enhancing Mental Health Services Through Strategic Planning: The East Tennessee State University Counseling Center’s 2024-2028 Plan

Authors: R. M. Kilonzo, S. Bedingfield, K. Smith, K. Hudgins Smith, K. Couper, R. Ratley, Z. Taylor, A. Engelman, M. Renne

Abstract:

Introduction: The mental health needs of university students continue to evolve, necessitating a strategic approach to service delivery. The East Tennessee State University (ETSU) Counseling Center developed its inaugural Strategic Plan (2024-2028) to enhance student mental health services. The plan focuses on improving access, quality of care, and service visibility, aligning with the university’s mission to support academic success and student well-being. Aim: This strategic plan aims to establish a comprehensive framework for delivering high-quality, evidence-based mental health services to ETSU students, addressing current challenges and anticipating future needs. Methods: The development of the strategic plan was a collaborative effort involving the Counseling Center’s leadership and staff, with technical support from a Doctor of Public Health community and behavioral health intern. Multiple workshops, online/offline reviews, and stakeholder consultations were held to ensure a robust and inclusive process. A SWOT analysis and stakeholder mapping were conducted to identify strengths, weaknesses, opportunities, and challenges. Key performance indicators (KPIs) were set to measure service utilization, satisfaction, and outcomes. Results: The plan resulted in four strategic priorities: service application, visibility/accessibility, safety and satisfaction, and training programs. Key objectives include expanding counseling services, improving service access through outreach, reducing stigma, and increasing peer support programs. The plan also focuses on continuous quality improvement through data-driven assessments and research initiatives. Immediate outcomes include expanded group therapy, enhanced staff training, and increased mental health literacy across campus. Conclusion and Recommendation: The strategic plan provides a roadmap for addressing the mental health needs of ETSU students, with a clear focus on accessibility, inclusivity, and evidence-based practices. Implementing the plan will strengthen the Counseling Center’s capacity to meet the diverse needs of the student population. To ensure sustainability, it is recommended that the center continuously assess student needs, foster partnerships with university and external stakeholders, and advocate for increased funding to expand services and staff capacity.

Keywords: strategic plan, university counseling center, mental health, students

Procedia PDF Downloads 22
8585 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models

Authors: I. V. Pinto, M. R. Sooriyarachchi

Abstract:

It is frequently observed that data arising in our environment have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of order 1 and order 2 (MQL1, MQL2) and penalized quasi-likelihood of order 1 and order 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset. Therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to its use, it is equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of a GOF test developed for binary-response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes and intra-cluster correlations. The test maintained the desired Type-I error for models estimated using PQL2, and it failed for almost all the combinations under MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted to a real-life dataset using the four methods, and the performance of the test was compared for each model.
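
The simulation setup described here (varying numbers of clusters, cluster sizes and intra-cluster correlations) can be illustrated with a short data-generating sketch for a two-level random-intercept logistic model. The actual MQL/PQL estimation was done in MLwiN and is not reproduced; all parameter values below are illustrative assumptions.

```python
import numpy as np

def simulate_multilevel_binary(n_clusters=50, cluster_size=20,
                               beta0=-0.5, beta1=1.0, sigma_u=1.0, seed=0):
    """Two-level random-intercept logistic data:
    logit(p_ij) = beta0 + beta1 * x_ij + u_j,  u_j ~ N(0, sigma_u^2),  y_ij ~ Bernoulli(p_ij)."""
    rng = np.random.default_rng(seed)
    u = rng.normal(0.0, sigma_u, n_clusters)              # cluster random intercepts
    cluster = np.repeat(np.arange(n_clusters), cluster_size)
    x = rng.normal(size=n_clusters * cluster_size)
    eta = beta0 + beta1 * x + u[cluster]
    p = 1.0 / (1.0 + np.exp(-eta))
    y = rng.binomial(1, p)
    return cluster, x, y

# Latent-scale intra-cluster correlation for a logistic link: sigma_u^2 / (sigma_u^2 + pi^2 / 3).
sigma_u = 1.0
icc = sigma_u**2 / (sigma_u**2 + np.pi**2 / 3)
cluster, x, y = simulate_multilevel_binary(sigma_u=sigma_u)
print(f"simulated {len(y)} observations, latent ICC = {icc:.2f}, event rate = {y.mean():.2f}")
```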

Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error

Procedia PDF Downloads 143
8584 Future Student Service Organization - Road Map

Authors: Michael Postert

Abstract:

The Studierendenwerke are legally independent public foundations with a one-century-old history in the German university community. Like the French CROUS, the Italian ANDISU or the Japanese University COOPs, they are set up to serve university and student needs. They are legally independent of their client institutions and student stakeholders. Initially set up as support organizations by students for students, they have evolved into public business institutions with an annual turnover of EUR 100 million or more. They are usually engaged in business areas such as student housing, restaurants, student grants, governmental scholarships and counselling services. These institutions are facing major changes over the next few years. The COVID-19 pandemic and its impact on the educational system will unavoidably have an immense impact on the German student service organizations (Studierendenwerke). Issues such as digitalization and sustainability will have a huge impact on what the future business model of the Studierendenwerke will look like. The paper will discuss the aims and challenges of this development, which started already before the COVID-19 pandemic. In light of how the educational system of the future will look, the Studierendenwerke have to evolve as well.

Keywords: business model, digitalization, education, student services

Procedia PDF Downloads 234
8583 Using Machine Learning to Classify Different Body Parts and Determine Healthiness

Authors: Zachary Pan

Abstract:

Our general mission is to solve the problem of classifying images into different body part types and deciding whether each of them is healthy or not. However, for now, we will determine healthiness for only one-sixth of the body parts, specifically the chest: we will detect pneumonia in X-ray scans of those chest images. Doctors can use this type of AI as a second opinion when they are taking CT or X-ray scans of their patients. Another advantage of using this machine learning classifier is that it has no human weaknesses like fatigue. The overall approach to this problem is to split it into two parts: first, classify the image, then determine if it is healthy. In order to classify the image into a specific body part class, the body parts dataset must be split into test and training sets. We can then use many models, like neural networks or logistic regression models, and fit them using the training set. Now, using the test set, we can obtain a realistic estimate of the accuracy the models will have on images in the real world, since these testing images have never been seen by the models before. In order to increase this testing accuracy, we can also apply more complex algorithms to the models, like multiplicative weight update. For the second part of the problem, to determine if the body part is healthy, we can have another dataset consisting of healthy and non-healthy images of the specific body part and once again split it into test and training sets. We then use another neural network to train on those training set images and use the testing set to figure out its accuracy. We will do this process only for the chest images. A major conclusion reached is that convolutional neural networks are the most reliable and accurate at image classification. In classifying the images, the logistic regression model, the neural network, neural networks with multiplicative weight update, neural networks with the black box algorithm, and the convolutional neural network achieved 96.83 percent, 97.33 percent, 97.83 percent, 96.67 percent, and 98.83 percent accuracy, respectively. On the other hand, the overall accuracy of the model that determines whether the images are healthy or not is around 78.37 percent.
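
A compact sketch of the second-stage classifier described here, a convolutional network deciding healthy versus pneumonia on chest X-rays. The layer sizes, input resolution and training setup are illustrative assumptions, not the architecture that produced the reported accuracies.

```python
import tensorflow as tf

def build_chest_xray_cnn(input_shape=(128, 128, 1)):
    """Small CNN for binary healthy / pneumonia classification of chest X-rays."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # P(pneumonia)
    ])

model = build_chest_xray_cnn()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
# Training would use a labelled chest X-ray dataset (hypothetical variable names):
# model.fit(train_images, train_labels, validation_data=(val_images, val_labels), epochs=10)
```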

Keywords: body part, healthcare, machine learning, neural networks

Procedia PDF Downloads 109
8582 Review of Hydrologic Applications of Conceptual Models for Precipitation-Runoff Process

Authors: Oluwatosin Olofintoye, Josiah Adeyemo, Gbemileke Shomade

Abstract:

The relationship between rainfall and runoff is an important issue in surface water hydrology; therefore, the understanding and development of accurate rainfall-runoff models and their applications in water resources planning, management and operation are of paramount importance in hydrological studies. This paper reviews some of the previous work on modeling the rainfall-runoff process. The hydrologic applications of conceptual models and artificial neural networks (ANNs) for precipitation-runoff modeling were studied. Gradient-based training methods such as error back-propagation (BP), as well as evolutionary algorithms (EAs), are discussed in relation to the training of artificial neural networks, and it is shown that the application of EAs to artificial neural network training could be an alternative to other training methods. Therefore, further research is needed to exploit the abundant expert knowledge in the area of artificial intelligence for the solution of hydrologic and water resources planning and management problems.

Keywords: artificial intelligence, artificial neural networks, evolutionary algorithms, gradient training method, rainfall-runoff model

Procedia PDF Downloads 456
8581 The Effect of Symmetry on the Perception of Happiness and Boredom in Design Products

Authors: Michele Sinico

Abstract:

The present research investigates the effect of symmetry on the perception of happiness and boredom in design products. Three experiments were carried out in order to verify the degree of visual expressive value across different models of bookcases, wall clocks, and chairs. Sixty participants directly indicated the degree of happiness and boredom using 7-point rating scales. The findings show that the participants acknowledged a different value of expressive quality in the different product models. The results also show that symmetry is not a significant constraint for an emotional design project.

Keywords: product experience, emotional design, symmetry, expressive qualities

Procedia PDF Downloads 148
8580 Airliner-UAV Flight Formation in Climb Regime

Authors: Pavel Zikmund, Robert Popela

Abstract:

Extreme formation is a theoretical concept of self-sustained flight in which a large airliner is followed by a small UAV glider flying in the airliner’s wake vortex. The paper presents the results of a climb analysis with the goal of lifting the gliding UAV to the airliner’s cruise altitude. Wake vortex models, the UAV’s drag polar and basic parameters, and the airliner’s climb profile are introduced first. Then, the flight performance of the UAV in the wake vortex is evaluated by analytical methods. The time history of the optimal distance between the airliner and the UAV during the climb is determined. The results are encouraging; therefore, the UAV drag margin available for electricity generation is determined for different vortex models.
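
As an illustration of the kind of wake vortex model referred to above (the specific models compared in the paper are not named in the abstract), a commonly used two-parameter vortex gives the tangential velocity induced at a radial distance from the vortex core. The circulation and core radius values below are placeholders, not those of the airliner studied.

```python
import numpy as np

def tangential_velocity(r, gamma=450.0, r_core=3.0):
    """Burnham-Hallock-type vortex: induced tangential velocity at radius r [m]
    for circulation gamma [m^2/s] and core radius r_core [m] (placeholder values)."""
    r = np.asarray(r, dtype=float)
    return gamma / (2.0 * np.pi * r) * r**2 / (r**2 + r_core**2)

# Induced velocity available to a trailing UAV at a few offsets from the vortex core.
for offset in (5.0, 10.0, 20.0, 40.0):
    print(f"r = {offset:5.1f} m  ->  induced velocity approx. {tangential_velocity(offset):5.2f} m/s")
```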

Keywords: flight in formation, self-sustained flight, UAV, wake vortex

Procedia PDF Downloads 444
8579 New Approaches to Guest Engagement Leading to Guest Satisfaction and Driving Guest Loyalty

Authors: Vaibhav Garg

Abstract:

The concept of guest engagement lies in the rigor of operational excellence and the emotional engagement of associates who perform their tasks genuinely from their hearts, in word and in deed, in intent and through gestures; great hospitality is always genuine, attentive, passionate, caring and warm, where engaged associates deliver exceptional service experiences and create memories for the guests that last forever. One out of every five guests says that their decision to come back to the same hotel is influenced by the opportunity to “experience and be engaged.” A key question is: what does a guest mean by experience and be engaged? Most hotels are highly concerned about guest satisfaction. Therefore, they have brand standards which guide the associate to ensure consistent implementation of set service and product standards to satisfy a guest. However, satisfaction of basic guest needs does not necessarily lead to engagement. For example, an absolutely clean room and an in-room dining order delivered on time can satisfy a guest but may not engage him. Absence of these standards can certainly lead to guest dissatisfaction; however, the presence of these standards does not necessarily lead to guest engagement or guest delight.

Keywords: guest engagement, guest satisfaction, hospitality, hotel operations, operational excellence

Procedia PDF Downloads 242
8578 Derivation of a Risk-Based Level of Service Index for Surface Street Network Using Reliability Analysis

Authors: Chang-Jen Lan

Abstract:

The current Level of Service (LOS) index adopted in the Highway Capacity Manual (HCM) for signalized intersections on surface streets is based on the intersection average delay. The delay thresholds for defining LOS grades are subjective and unrelated to critical traffic conditions. For example, an intersection delay of 80 sec per vehicle for the failing LOS grade F does not necessarily correspond to the intersection capacity. Also, a specific measure of average delay may result from delay minimization, delay equality, or other meaningful optimization criteria. To that end, a reliability version of the intersection critical degree of saturation (v/c) is introduced as the LOS index. Traditionally, the level of saturation at a signalized intersection is defined as the ratio of the critical volume sum (per lane) to the average saturation flow (per lane) during all available effective green time within a cycle. The critical sum is the sum of the maximal conflicting movement-pair volumes in the northbound-southbound and eastbound-westbound rights of way. In this study, both movement volumes and saturation flows are assumed to follow log-normal distributions. Because, when the conditions of the central limit theorem hold, the product of independent, positive random variables tends toward a log-normal distribution in the limit, the critical degree of saturation is expected to be log-normally distributed as well. Derivation of the risk index predictive limits is complex due to the maximum and absolute value operators, as well as the ratio of random variables. A fairly accurate functional form for the predictive limit at a user-specified significance level is obtained. The predictive limit is then compared with the designated LOS thresholds for the intersection critical degree of saturation (denoted as X
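
A Monte Carlo sketch of the quantity described above, not the paper's analytical derivation: with log-normal movement volumes and saturation flows, the critical degree of saturation is simulated and an upper predictive limit at a chosen significance level is read off empirically. The movement layout, parameter values and the 95% level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def lognormal(mean, cv, size):
    """Log-normal samples parameterized by mean and coefficient of variation."""
    sigma2 = np.log(1.0 + cv**2)
    mu = np.log(mean) - 0.5 * sigma2
    return rng.lognormal(mu, np.sqrt(sigma2), size)

n = 100_000
# Per-lane volumes (veh/h) of the conflicting movement pairs (illustrative values);
# the critical movement in each right of way is the larger of the conflicting pair.
ns_critical = np.maximum(lognormal(420, 0.15, n), lognormal(380, 0.15, n))
ew_critical = np.maximum(lognormal(350, 0.15, n), lognormal(300, 0.15, n))
sat_flow = lognormal(1800, 0.10, n)   # saturation flow per lane (veh/h of green)
eff_green_ratio = 0.85                # total effective green / cycle (assumed fixed)

x_c = (ns_critical + ew_critical) / (sat_flow * eff_green_ratio)  # critical degree of saturation

upper_limit = np.quantile(x_c, 0.95)
print(f"mean X_c = {x_c.mean():.3f}, 95% upper predictive limit = {upper_limit:.3f}")
```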

Keywords: reliability analysis, level of service, intersection critical degree of saturation, risk based index

Procedia PDF Downloads 131
8577 Social Accountability: Persuasion and Debate to Contain Corruption

Authors: A. Lambert-Mogiliansky

Abstract:

In this paper, we investigate the properties of simple rules for reappointment aimed at holding a public official accountable and monitoring his activity. The public official allocates budget resources to various activities, which results in the delivery of public services to citizens. He has discretion over the use of resources, so he can divert some of them for private ends. Because of a liability constraint, zero diversion can never be secured in all states. The optimal reappointment mechanism under complete information is shown to exhibit some leniency, thus departing from the zero-tolerance principle. Under asymmetric information (about the state), a rule with random verification in a pre-announced subset is shown to be optimal in a class of common rules. Surprisingly, those common rules make little use of hard information about service delivery when it is available. Similarly, the public official's claims about his record are of no value for improving the performance of the examined rules. In contrast, requesting that the public official defend his record publicly can be very useful if the service users are given the chance to refute false claims with cheap-talk complaints: the first-best complete-information outcome can be approached in the absence of any observation by the manager of the accountability mechanism.

Keywords: accountability, corruption, persuasion, debate

Procedia PDF Downloads 382
8576 Probing Syntax Information in Word Representations with Deep Metric Learning

Authors: Bowen Ding, Yihao Kuang

Abstract:

In recent years, with the development of large-scale pre-trained language models, building vector representations of text through deep neural network models has become a standard practice for natural language processing tasks. From the performance on downstream tasks, we know that the text representations constructed by these models contain linguistic information, but how and to what extent it is encoded remain unclear. In this work, a structural probe is proposed to detect whether the vector representations produced by a deep neural network embed a syntax tree. The probe is trained with a deep metric learning method, so that the distance between word vectors in the metric space it defines encodes the distance between words on the syntax tree, and the norm of a word vector encodes the depth of the word on the syntax tree. The experimental results on ELMo and BERT show that the syntax tree is encoded in their parameters and in the word representations they produce.
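
A condensed sketch of a distance-style structural probe of the kind described: a linear map is learned so that squared distances between projected word vectors match syntax-tree distances. The embedding dimension, probe rank, loss and training loop are illustrative; a real experiment would feed ELMo or BERT vectors and gold parse-tree distances instead of the random stand-ins used here.

```python
import torch

class DistanceProbe(torch.nn.Module):
    """Linear map B; ||B(h_i - h_j)||^2 should approximate the tree distance d_T(i, j)."""
    def __init__(self, model_dim=768, probe_rank=128):
        super().__init__()
        self.proj = torch.nn.Linear(model_dim, probe_rank, bias=False)

    def forward(self, h):                       # h: (seq_len, model_dim)
        z = self.proj(h)                        # (seq_len, rank)
        diff = z.unsqueeze(0) - z.unsqueeze(1)  # all pairwise differences
        return (diff ** 2).sum(-1)              # (seq_len, seq_len) predicted squared distances

# Toy stand-ins: random "contextual embeddings" and a random symmetric "tree distance" matrix.
seq_len, model_dim = 12, 768
h = torch.randn(seq_len, model_dim)
tree_dist = torch.randint(1, 6, (seq_len, seq_len)).float()
tree_dist = (tree_dist + tree_dist.T) / 2
tree_dist.fill_diagonal_(0)

probe = DistanceProbe(model_dim)
opt = torch.optim.Adam(probe.parameters(), lr=1e-3)
for step in range(200):
    opt.zero_grad()
    loss = (probe(h) - tree_dist).abs().mean()  # L1 gap between predicted and tree distances
    loss.backward()
    opt.step()
print("final probe loss on toy data:", float(loss))
```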

Keywords: deep metric learning, syntax tree probing, natural language processing, word representations

Procedia PDF Downloads 69
8575 Assessment of Access to Water, Sanitation and Hygiene, in Relation to the SDG 6, in Small Towns in Senegal: The Case of the Town of Foundiougne

Authors: Elhadji Mamadou Sonko, Ndiogou Sankhare, Jean Birane Gning, Cheikh Diop

Abstract:

In Senegal, small towns have problems with access to water, hygiene, and sanitation. This study aims to assess the situation in Foundiougne. The methodology includes a literature review, semi-structured interviews with stakeholders, surveys of 100 households, and observation. The results show that 35% of households have unimproved water services, 46% have limited service, and 19% have basic service. Regarding sanitation, 77% of households have basic sanitation services, and 23% have limited sanitation services. Manual emptying alone is practiced by 4% of households, while 17% combine it with mechanical emptying. Household wastewater is disposed of in streets, vacant land, and concession yards. The emptied sludge is discharged into the environment without treatment. Hand washing is practiced by 98% of households. These results show that there is real work to be done at the small-town level to close the water and sanitation gap in order to achieve the SDG 6 targets in Senegal.

Keywords: Foundiougne, SDG 6, Senegal, small towns, water sanitation and hygiene

Procedia PDF Downloads 129
8574 Prediction of Bodyweight of Cattle by Artificial Neural Networks Using Digital Images

Authors: Yalçın Bozkurt

Abstract:

Prediction models were developed for the accurate prediction of bodyweight (BW) from digital images of beef cattle body dimensions using Artificial Neural Networks (ANN). For this purpose, animal data were collected at a private slaughterhouse; digital images and the weight of each live animal were taken just before slaughter, and body dimensions such as digital wither height (DJWH), digital body length (DJBL), digital body depth (DJBD), digital hip width (DJHW), digital hip height (DJHH) and digital pin bone length (DJPL) were determined from the images, giving 1,069 observations for each trait. Prediction models were then developed by ANN. The digital body measurements were analysed by ANN for bodyweight prediction, and the R2 values for DJBL, DJWH, DJHW, DJBD, DJHH and DJPL were approximately 94.32, 91.31, 80.70, 83.61, 89.45 and 70.56%, respectively. It can be concluded that, in management situations where BW cannot be measured, it can be predicted accurately by measuring DJBL and DJWH alone or together with DJBD and even DJHH, and that different models may be needed to predict BW under different feeding and environmental conditions and for different breeds.
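
A small sketch of the regression step described above, using a feed-forward neural network to predict bodyweight from digital body measurements. The synthetic measurements, network size and units are illustrative assumptions, not the 1,069-record slaughterhouse dataset.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 1069

# Synthetic stand-ins (cm) for DJWH, DJBL, DJBD, DJHW, DJHH, DJPL.
X = rng.normal(loc=[130, 150, 70, 50, 132, 40], scale=[8, 10, 5, 4, 8, 3], size=(n, 6))
# Toy bodyweight (kg) as a function of body length and wither height plus noise.
y = 2.8 * X[:, 1] + 1.9 * X[:, 0] - 250 + rng.normal(0, 15, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0))
ann.fit(X_tr, y_tr)
print("R^2 on held-out animals:", round(ann.score(X_te, y_te), 3))
```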

Keywords: artificial neural networks, bodyweight, cattle, digital body measurements

Procedia PDF Downloads 375
8573 Forecasting Equity Premium Out-of-Sample with Sophisticated Regression Training Techniques

Authors: Jonathan Iworiso

Abstract:

Forecasting the equity premium out-of-sample is a major concern for researchers in finance and emerging markets. The quest for a superior model that can forecast the equity premium with significant economic gains has resulted in several controversies among scholars regarding the choice of variables and suitable techniques. This research focuses mainly on the application of Regression Training (RT) techniques to forecast the monthly equity premium out-of-sample recursively with an expanding-window method. A broad category of sophisticated regression models that control model complexity was employed. The RT models, which include Ridge, Forward-Backward (FOBA) Ridge, Least Absolute Shrinkage and Selection Operator (LASSO), Relaxed LASSO, Elastic Net, and Least Angle Regression, were trained and used to forecast the equity premium out-of-sample. The empirical investigation of the RT models demonstrates significant evidence of equity premium predictability, both statistically and economically, relative to the benchmark historical average, delivering significant utility gains. They seek to provide meaningful economic information on mean-variance portfolio investment for investors who are timing the market to earn future gains at minimal risk. Thus, the forecasting models appear to reward, at minimal risk, an investor who optimally reallocates a monthly portfolio between equities and risk-free treasury bills using the equity premium forecasts.
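
A compressed sketch of the recursive, expanding-window exercise described above, with Ridge and LASSO as two of the regression-training models. The predictors here are simulated stand-ins rather than the usual predictor variables, and the window sizes and penalty values are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(4)
T, k = 360, 8                       # 30 years of monthly data, 8 toy predictors
X = rng.normal(size=(T, k))
y = 0.2 * X[:, 0] - 0.1 * X[:, 1] + rng.normal(0, 1.0, T)   # stand-in "equity premium"

def expanding_window_forecasts(model, X, y, first_train=120):
    """Refit on all data up to t-1, forecast month t; repeat over the evaluation sample."""
    preds = []
    for t in range(first_train, len(y)):
        model.fit(X[:t], y[:t])
        preds.append(model.predict(X[t:t + 1])[0])
    return np.array(preds)

actual = y[120:]
hist_avg = np.array([y[:t].mean() for t in range(120, T)])  # historical-average benchmark

for name, mdl in [("Ridge", Ridge(alpha=1.0)), ("LASSO", Lasso(alpha=0.01))]:
    pred = expanding_window_forecasts(mdl, X, y)
    # Out-of-sample R^2 relative to the historical-average benchmark.
    r2_oos = 1 - np.sum((actual - pred) ** 2) / np.sum((actual - hist_avg) ** 2)
    print(f"{name}: out-of-sample R^2 vs historical average = {r2_oos:.3f}")
```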

Keywords: regression training, out-of-sample forecasts, expanding window, statistical predictability, economic significance, utility gains

Procedia PDF Downloads 108
8572 Using Lean-Six Sigma Philosophy to Enhance Revenues and Improve Customer Satisfaction: Case Studies from Leading Telecommunications Service Providers in India

Authors: Senthil Kumar Anantharaman

Abstract:

Providing telecommunications-based network services in developing countries like India, which has a population of 1.5 billion people, so that these services reach every individual, is one of the greatest challenges the country has been facing in its journey towards economic growth and development. With a growing number of telecommunications service providers in the country, a constant challenge faced by these providers is providing not only quality but also a delightful customer experience while simultaneously generating enhanced revenues and profits. Thus, the role played by process improvement methodologies like Six Sigma cannot be underestimated, and specifically in telecom service provider operations it has provided substantial benefits, quite comparable to its applications and advantages in other sectors like manufacturing, financial services, information technology-based services and healthcare services. One of the key reasons this methodology has been able to reap great benefits in the telecommunications sector is that it has been combined with many of its competing process improvement techniques, like the Theory of Constraints, Lean and Kaizen, to give the maximum benefit to the service providers, thereby creating a winning combination of organized process improvement methods for operational excellence and, in turn, business excellence. This paper discusses some of the key projects and areas in the end-to-end ‘Quote to Cash’ process at the big three Indian telecommunications companies that have benefited greatly from applying Six Sigma along with other process improvement techniques. While the telecommunications companies considered here are primarily in India and are run by both private operators and government-based setups, the methodology can be applied equally well in other developing countries around the world with a similar context. This study also compares the enhanced revenues that can arise from appropriate opportunities in emerging-market scenarios, which Six Sigma as a philosophy and methodology can provide if applied with vigour and robustness. Finally, the paper proposes a winning framework that combines the Six Sigma methodology with Kaizen, Lean and the Theory of Constraints to enhance both the top line and the bottom line while providing customers a delightful experience.

Keywords: emerging markets, lean, process improvement, six sigma, telecommunications, theory of constraints

Procedia PDF Downloads 164
8571 Point-of-Interest Recommender Systems for Location-Based Social Network Services

Authors: Hoyeon Park, Yunhwan Keon, Kyoung-Jae Kim

Abstract:

Location-Based Social Network services (LBSNs) is a new term that combines location-based services and social network services (SNS). Unlike traditional SNS, LBSNs emphasize empirical elements tied to the user's actual physical location. The Point-of-Interest (POI), the most popular spot in an area, is the most important factor in implementing an LBSN recommendation system. In this study, we recommend POIs to users in a specific area through a recommendation system using collaborative filtering. The process is as follows: first, we use different datasets based on Seoul and New York to find interesting results on human behavior. Second, based on the location-based activity information obtained from the personalized LBSNs, we devise a new rating that defines the user's preference for the area. Finally, we develop an automated rating algorithm from massive raw data using distributed systems to reduce the advertising costs of LBSNs.
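
A minimal sketch of the collaborative-filtering step (user-based, with cosine similarity over a user-by-POI rating matrix). The matrix below is a toy stand-in for the ratings derived from check-in activity in the Seoul and New York datasets.

```python
import numpy as np

# Toy user-by-POI rating matrix (0 = not visited / no derived rating).
R = np.array([
    [5, 3, 0, 1, 0],
    [4, 0, 0, 1, 1],
    [1, 1, 0, 5, 4],
    [0, 1, 5, 4, 0],
    [0, 0, 4, 4, 5],
], dtype=float)

def cosine_sim(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return a @ b / denom if denom else 0.0

def recommend_pois(R, user, top_n=2):
    """User-based collaborative filtering: score unvisited POIs by similarity-weighted ratings."""
    sims = np.array([cosine_sim(R[user], R[v]) if v != user else 0.0 for v in range(len(R))])
    scores = sims @ R / (np.abs(sims).sum() + 1e-9)  # weighted average of other users' ratings
    scores[R[user] > 0] = -np.inf                    # do not re-recommend visited POIs
    return np.argsort(scores)[::-1][:top_n]

print("recommended POI indices for user 0:", recommend_pois(R, user=0))
```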

Keywords: location-based social network services, point-of-interest, recommender systems, business analytics

Procedia PDF Downloads 229
8570 Fast Track to the Physical Internet: A Cross-Industry Project from Upper Austria

Authors: Laura Simmer, Maria Kalt, Oliver Schauer

Abstract:

Freight transport is growing fast, but many vehicles are empty or only partially loaded. The vision and concepts of the Physical Internet (PI) propose to eliminate these inefficiencies. Aiming for a radical sustainability improvement, the PI, inspired by the Digital Internet, is a hyperconnected global logistics system enabling seamless asset sharing and flow consolidation. The implementation of a PI in its full expression will be a huge challenge: the industry needs innovation and implementation support, including change management approaches, awareness creation and diffusion of good practices, legislative action to remove antitrust and international commerce barriers, standardization, and public incentive policies. In order to take a step closer to this future, the project ‘Atropine - Fast Track to the Physical Internet’, funded under the Strategic Economic and Research Program ‘Innovative Upper Austria 2020’, was set up. The two-year research project unites several research partners in this field, but also industrial partners and logistics service providers. With Atropine, the consortium wants to actively shape the mobility landscape in Upper Austria and make an innovative contribution to an energy-efficient, environmentally sound and sustainable development of the transport area. This paper clarifies, on the one hand, what the project Atropine is about and, on the other hand, how a proof of concept will be reached. Awareness building plays an important role in the project, as the PI requires a reorganization of the supply chain and the design of completely new forms of inter-company co-operation. New business models have to be developed and should be verified by simulation. After the simulation process, one of these business models will be chosen and tested in real life with the partner companies. The developed results, a simulation model and a demonstrator, are used to determine how the concept of the PI can be applied in Upper Austria. Atropine shall pave the way for a full-scale development of the PI vision in the next few decades and provide the basis for pushing the industry toward a new level of co-operation with more shared resources and increased standardization.

Keywords: Atropine, inter-company co-operation, Physical Internet, shared resources, sustainable logistics

Procedia PDF Downloads 224
8569 A Review of the Long Term Effects of In-Service Training Towards Inclusive Education

Authors: Meenakshi Srivastava, Anke A. De Boer, Sip Jan Pij

Abstract:

Teachers’ preparedness for the special educational needs (SEN) of students in regular schools is an important factor in making education inclusive, with the goal of providing education for all. The current study measured the long-term effects of an in-service teacher training programme that focused on the inclusion of students with a range of SEN. The programme addressed three particular aspects: teachers’ attitudes, their knowledge about SEN, and their knowledge about teaching methods. A refresher course was also organized for participants of the initial training programme. The long-term effects were examined with a self-report questionnaire completed by teachers (n = 38). The wider effects of the initial training were recorded by interviewing school principals (n = 4). Repeated-measures ANOVA revealed significant effects: more positive attitudes and increased knowledge about SEN among teachers who took the refresher course (n = 18) compared to those who had not (n = 19). Principals also reported a more positive attitude, greater sensitivity and increased awareness about SEN among the participants.

Keywords: inclusion, students with special educational needs, teacher training, follow-up, attitudes change

Procedia PDF Downloads 126
8568 Structure of Turbulent Flow in the Wire-Wrapped Fuel Assemblies of BREST-OD-300

Authors: Dmitry V. Fomichev, Vladimir I. Solonin

Abstract:

In this paper, an experimental and numerical study of the hydrodynamic characteristics of the air coolant flow in a test wire-wrapped assembly is presented. The test assembly has 37 rods, which are geometrically similar to the real fuel pins of the BREST-OD-300 fuel assemblies. An open-loop air test facility installed at the “Nuclear Power Plants and Installations” department of BMSTU was used to obtain the experimental data. The obtained distribution of static pressure along the height of the near-wall region of the test assembly, as well as the velocity and temperature distributions of the coolant flow in the test sections, can give us new knowledge about the mechanism of formation of the turbulent flow structure in wire-wrapped fuel assemblies. Numerical simulations of the turbulent flow have been performed using ANSYS Fluent 14.5. Different non-local turbulence models have been considered, such as the standard and RNG k-ε models and the k-ω SST model. The results of the numerical simulations based on the considered turbulence models are compared against the experimental data to determine which gives the best agreement and to support a detailed analysis of the flow characteristics.

Keywords: wire-spaced fuel assembly, turbulent flow structure, computational fluid dynamics

Procedia PDF Downloads 460
8567 The Relationship between Working Models and Psychological Safety

Authors: Rosyellen Rabelo Szvarça, Pedro Fialho, Auristela Duarte de Lima Moser

Abstract:

Background: New ways of working, such as teleworking or hybrid working, have changed work and have impacted both employees and organizations. To understand individuals' perceptions across different working models, this study aimed to investigate levels of psychological safety among employees working in in-person, hybrid, and remote environments and their correlation with demographic and professional characteristics. Methods: A cross-sectional survey was distributed electronically. The self-administered questionnaire comprised sociodemographic data, academic status, professional context, working model, and the seven-item psychological safety instrument. The reliability of the psychological safety instrument was computed, showing a Cronbach’s alpha of 0.75, a good value compared to the original scale, which was analyzed with 51 teams from a North American company and had a Cronbach's alpha coefficient of 0.82. Results: The survey was completed by 328 individuals, 60% of whom worked in person, 29.3% hybrid, and 10.7% remote. The Chi-square test with the Bonferroni post-test for qualitative variables associated with the working models indicates a significant association (p < 0.001) for academic qualifications. In the in-person group, 29.4% of individuals had secondary education and 38.1% an undergraduate degree; in the hybrid group, 51% had a postgraduate degree and 35.4% an undergraduate degree, which was similar to remote workers, with 48.6% postgraduate and 34.3% undergraduate. There were no significant differences in gender composition between work models (p = 0.738), with most respondents being female in all three work groups. Remote workers predominated in areas such as commerce, marketing, and services; education and the public sector were common in the in-person group, while technology and the financial sector were predominant among hybrid workers (p < 0.001). As for leadership roles, there was no significant association with working models (p = 0.126). The decision on the working model was predominantly made by the organization for in-person and hybrid workers (p < 0.001). Preference for the working model was in line with the workers' arrangement at that time (p < 0.001). The Kruskal-Wallis test with Bonferroni's post hoc test compared the psychological safety scores between working groups, revealing statistically higher scores in the hybrid group (x̃ = 5.64) compared to the in-person group (x̃ = 5), with remote workers showing scores similar to the other groups (x̃ = 5.43) (p = 0.004). Age showed no significant difference between the working groups (p = 0.052). On the other hand, organization tenure and job tenure were higher in the in-person group compared to the hybrid and remote groups (p < 0.001). The hybrid model illustrates a balance between the in-person and remote models. The results highlight that higher levels of psychological safety can be correlated with the flexibility of hybrid work, as well as with physical interaction, spontaneity, and informal relationships, which are considered determinants of high levels of psychological safety. Conclusions: Measuring psychological safety at the group level using the seven-item scale is widely employed in comparison to other commonly used measures. Although psychological safety has been studied for decades, primarily in in-person work contexts, the current findings contribute to expanding research into hybrid and remote settings. Ultimately, this investigation has demonstrated the significance of work models in assessing psychological safety levels.

Keywords: hybrid work, new ways of working, psychological safety, workplace, working models

Procedia PDF Downloads 12
8566 Advances in Design Decision Support Tools for Early-stage Energy-Efficient Architectural Design: A Review

Authors: Maryam Mohammadi, Mohammadjavad Mahdavinejad, Mojtaba Ansari

Abstract:

The main driving forces behind the increasing movement towards the design of High-Performance Buildings (HPB) are building codes and rating systems that address the various components of the building and their impact on the environment and energy conservation through various methods, such as prescriptive methods or simulation-based approaches. The methods and tools developed to meet these needs, which are often based on building performance simulation tools (BPST), have limitations in terms of compatibility with the integrated design process (IDP) and HPB design, as well as in their use by architects in the early stages of design (when the most important decisions are made). To overcome these limitations, efforts have been made in recent years to develop Design Decision Support Systems, which are often based on artificial intelligence. Numerous needs and steps for designing and developing a Decision Support System (DSS) that complies with the early stages of energy-efficient architectural design, consisting of combinations of different methods in an integrated package, have been listed in the literature. While various review studies have been conducted on each of these techniques (such as optimization, sensitivity and uncertainty analysis, etc.) and their integration with specific targets, this article is a critical and holistic review of the research that leads to the development of applicable systems or the introduction of a comprehensive framework for developing models that comply with the IDP. Information resources such as Science Direct and Google Scholar were searched using specific keywords, and the results are divided into two main categories: simulation-based DSSs and meta-simulation-based DSSs. The strengths and limitations of different models are highlighted, two general conceptual models are introduced for each category, and the degree of compliance of these models with the IDP framework is discussed. The research shows a movement towards Multi-Level of Development (MOD) models that combine well with the early stages of integrated design (the schematic design and design development stages), are heuristic, hybrid and meta-simulation-based, and rely on big real-world data (such as Building Energy Management System data or web data). Obtaining, using and combining such data with simulation data to create models that handle higher uncertainty, are more dynamic and more sensitive to context and culture, as well as models that can generate economical and energy-efficient design scenarios using local data (to be more harmonized with circular economy principles), are important research areas in this field. The results of this study form a roadmap for researchers and developers of these tools.

Keywords: integrated design process, design decision support system, meta-simulation based, early stage, big data, energy efficiency

Procedia PDF Downloads 162
8565 A Framework of Virtualized Software Controller for Smart Manufacturing

Authors: Pin Xiu Chen, Shang Liang Chen

Abstract:

A virtualized software controller is developed in this research to replace traditional hardware control units. This virtualized software controller transfers motion interpolation calculations from the motion control units of end devices to edge computing platforms, thereby reducing the end devices' computational load and hardware requirements and making maintenance and updates easier. The study also applies the concept of microservices, dividing the control system into several small functional modules that are then deployed onto a cloud data server. This reduces the interdependency among modules and enhances the overall system's flexibility and scalability. Finally, with containerization technology, the system can be deployed and started in a matter of seconds, which is more efficient than traditional virtual machine deployment methods. Furthermore, this virtualized software controller communicates with end control devices via wireless networks, making the placement of production equipment or the redesign of processes more flexible and no longer limited by physical wiring. To handle the large data flow and maintain low-latency transmission, this study integrates 5G technology, fully utilizing its high speed, wide bandwidth, and low latency to achieve rapid and stable remote machine control. An experimental setup is designed to verify the feasibility and test the performance of this framework. This study designs a smart manufacturing site with a 5G communication architecture, serving as a field for experimental data collection and performance testing. The smart manufacturing site includes one robotic arm, three Computer Numerical Control machine tools, several Input/Output ports, and an edge computing architecture. All machinery information is uploaded to edge computing servers and cloud servers via 5G communication and the Internet of Things framework. After analysis and computation, this information is converted into motion control commands, which are transmitted back to the relevant machinery for motion control through 5G communication. The communication time intervals at each stage are calculated using the C++ chrono library to measure the time difference for each command transmission. The relevant test results will be organized and presented in the full text.

Keywords: 5G, MEC, microservices, virtualized software controller, smart manufacturing

Procedia PDF Downloads 84
8564 Optimization for Autonomous Robotic Construction by Visual Guidance through Machine Learning

Authors: Yangzhi Li

Abstract:

Network transfer of information and performance customization is now a viable method of digital industrial production in the era of Industry 4.0. Robot platforms and network platforms have grown more important in digital design and construction. The pressing need for novel building techniques is driven by the growing labor scarcity problem and increased awareness of construction safety. Robotic approaches in construction research are regarded as an extension of operational and production tools. Several technological theories related to autonomous robot recognition, including high-performance computing, physical system modeling, extensive sensor coordination, and deep learning on large datasets, have not yet been fully explored in intelligent construction, and relevant transdisciplinary theory and practice research still has specific gaps. Optimizing high-performance computing and autonomous visual guidance and recognition technologies improves the robot's grasp of the scene and its capacity for autonomous operation. Intelligent vision guidance technology for industrial robots has a serious issue with camera calibration, and the use of intelligent visual guidance and identification technologies for industrial robots in industrial production has strict accuracy requirements. Visual recognition systems therefore face precision challenges that directly impact the effectiveness and quality of industrial production, necessitating further study of positioning precision in visual guidance and recognition technology. To best facilitate the handling of complicated components, an approach for the visual recognition of parts utilizing machine learning algorithms is proposed. This study identifies the position of target components by detecting the information at the boundary and corners of a dense point cloud and determining the aspect ratio in accordance with the guidelines for the modularization of building components. To collect and use components, the operational processing system assigns them to the same coordinate system based on their locations and postures. Inclination detection from the RGB image and verification from the depth image are used to determine the component's current posture. Finally, a virtual environment model for the robot's obstacle-avoidance route is constructed using the point cloud information.
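
A short sketch of the geometric part of the pipeline described above: take the points belonging to a detected component, fit an axis-aligned bounding box, and compute the aspect ratio that would be matched against modular component guidelines. The synthetic points and the matching threshold are illustrative; a real system would operate on segmented RGB-D point clouds.

```python
import numpy as np

rng = np.random.default_rng(5)

def component_aspect_ratio(points):
    """Axis-aligned bounding box of a component's points and its longest/shortest edge ratio."""
    mins, maxs = points.min(axis=0), points.max(axis=0)
    extents = maxs - mins                        # box edge lengths along x, y, z
    return extents, extents.max() / max(extents.min(), 1e-6)

# Synthetic stand-in for a segmented dense point cloud of a plank-like component (metres).
component = rng.uniform([0, 0, 0], [1.2, 0.1, 0.04], size=(5000, 3))

extents, ratio = component_aspect_ratio(component)
print("bounding box extents (m):", np.round(extents, 3))
print("aspect ratio:", round(ratio, 1))
# Matching against modularization guidelines could be a simple lookup, e.g. ratio > 8 -> "plank".
```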

Keywords: robotic construction, robotic assembly, visual guidance, machine learning

Procedia PDF Downloads 87