Search results for: non-normal data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24201

21531 Groundwater Monitoring Using a Community Science Approach

Authors: Shobha Kumari Yadav, Yubaraj Satyal, Ajaya Dixit

Abstract:

In addressing groundwater depletion, it is important to develop an evidence base that can be used to assess the state of degradation. Groundwater data are limited compared to meteorological data, which impedes groundwater use and management planning. Monitoring of groundwater levels provides an information base for assessing the condition of aquifers and their responses to water extraction, land-use change, and climatic variability. It is important to maintain a network of spatially distributed, long-term monitoring wells to support a groundwater management plan. Monitoring involving the local community is a cost-effective approach that generates real-time data to manage groundwater use effectively. This paper presents the relationship between rainfall and spring flow, the main sources of freshwater for drinking, household consumption, and agriculture in the hills of Nepal. The supply and withdrawal of water from springs depend upon local hydrology and meteorological characteristics such as rainfall, evapotranspiration, and interflow. The study offers evidence of the use of scientific methods and community-based initiatives for managing groundwater and springsheds. The approach presents a method for replicating similar initiatives in other parts of the country to maintain the integrity of springs.
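The rainfall-spring flow relationship described above can be explored with a simple lagged-correlation check. The sketch below uses synthetic monthly data, not the study's measurements; the series, units, and the one-month response lag are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic monthly series: spring flow responding to rainfall with a one-month lag.
n_months = 48
rainfall = rng.gamma(shape=2.0, scale=50.0, size=n_months)              # mm/month
spring_flow = 0.4 * np.roll(rainfall, 1) + rng.normal(0, 5, n_months)   # L/s (illustrative)
spring_flow[0] = spring_flow[1:].mean()  # patch the wrap-around introduced by np.roll

def lagged_corr(x, y, lag):
    """Pearson correlation between x[t - lag] and y[t]."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

# Find the lag (in months) at which rainfall best explains spring flow.
best_lag = max(range(4), key=lambda k: lagged_corr(rainfall, spring_flow, k))
print("best lag (months):", best_lag)
```

In community-collected records the same check helps decide how quickly a spring responds to monsoon rainfall before fitting any fuller model.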

Keywords: citizen science, groundwater, water resource management, Nepal

Procedia PDF Downloads 178
21530 A Dynamic Spatial Panel Data Analysis on Renter-Occupied Multifamily Housing in DC

Authors: Jose Funes, Jeff Sauer, Laixiang Sun

Abstract:

This research examines determinants of multifamily housing development and spillovers in the District of Columbia. A range of socioeconomic factors related to income distribution, productivity, and land-use policies are thought to influence contemporary U.S. multifamily housing markets. The analysis leverages data from the American Community Survey to construct panel datasets spanning 2010 to 2019. Using spatial regression, we identify several socioeconomic measures and land-use policies that are positively and negatively associated with new housing supply. We contextualize housing estimates related to race in relation to uneven development in the contemporary D.C. housing supply.
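A spatial lag specification of the kind used in such analyses can be sketched as follows. This is a hypothetical, self-contained illustration with simulated tracts on a ring and a simple instrumental-variables estimate; it is not the authors' model, weights matrix, or data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50  # hypothetical census tracts arranged on a ring

# Row-normalised contiguity weights: each tract borders its two neighbours.
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5

x = rng.normal(size=n)   # one standardised covariate, e.g. median income
rho, beta = 0.4, 1.5

# Spatial lag model: y = rho*W@y + beta*x + e  =>  y = (I - rho*W)^{-1}(beta*x + e)
e = rng.normal(scale=0.1, size=n)
y = np.linalg.solve(np.eye(n) - rho * W, beta * x + e)

# Simple IV estimate: instrument the endogenous spatial lag W@y with W@x.
Z = np.column_stack([W @ x, x])   # instruments
X = np.column_stack([W @ y, x])   # regressors [spatial lag, covariate]
coef = np.linalg.solve(Z.T @ X, Z.T @ y)
print("rho_hat, beta_hat:", coef)
```

The spatial-lag coefficient rho captures the spillover: a tract's housing supply responds to supply in neighbouring tracts, which is why ordinary least squares on `W @ y` would be biased and an instrument is needed.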

Keywords: neighborhood effect, sorting, spatial spillovers, multifamily housing

Procedia PDF Downloads 63
21529 Potentials and Challenges of Implementing Participatory Irrigation Management, Tanzania

Authors: Pilly Joseph Kagosi

Abstract:

The study aims at assessing the challenges observed during the implementation of the participatory irrigation management (PIM) approach for food security in semi-arid areas of Tanzania. Data were collected through questionnaires, PRA tools, key informant discussions, focus group discussions (FGDs), participant observation, and a literature review. Questionnaire data were analyzed using SPSS, while PRA data were analyzed with the help of local communities during the PRA exercise; data from the other methods were analyzed using content analysis. The study revealed that the PIM approach has contributed to improved food security at the household level, owing to the involvement of communities in water management activities and decision making, which enhanced the availability of water for irrigation and increased crop production. However, several challenges were observed during implementation of the approach: minimal participation of beneficiaries in decision making during the planning and design stages, implying inadequate devolution of power among scheme owners; inadequate transparency on income and expenditure in Water Utilization Associations (WUAs); water conflicts among WUA members, between farmers and livestock keepers, and between WUA leaders and village governments regarding training opportunities and status; WUA rules and regulations not being legally recognized by the national courts; and few farmers being involved in planting trees around water sources. It was nevertheless found that some of these challenges were rectified by the farmers themselves, facilitated by government officials. The study recommends that the identified challenges be rectified so that farmers can realize the importance of the PIM approach, as has been realized in other Asian countries.

Keywords: potentials of implementing participatory approach, challenges of participatory approach, irrigation management, Tanzania

Procedia PDF Downloads 279
21528 Judicial Analysis of the Burden of Proof on the Perpetrator of Corruption Criminal Act

Authors: Rahmayanti, Theresia Simatupang, Ronald H. Sianturi

Abstract:

Corruption has developed rapidly because the transition era exposed weaknesses in the law. Consequently, a few people have had the opportunity to commit fraud and illegal acts, misusing their positions and formal functions to enrich themselves, and these criminal acts are carried out systematically and in sophisticated ways. Some believe that the legal provisions which specifically regulate corruption as a criminal act, namely Law No. 31/1999 in conjunction with Law No. 20/2001 on the Eradication of Corruption Criminal Act, are no longer effective, especially as regards the onus probandi (burden of proof) on corruptors. The research was a descriptive analysis, a method used to describe a certain situation or condition by explaining the data, with conclusions drawn through analysis. The research used a normative judicial approach, relying on secondary data as the main data obtained through library research. The system of the burden of proof, which follows the principle of reversal of the burden of proof stipulated in Article 12B paragraph 1 (a) and (b), Article 37A, and Article 38B of Law No. 20/2001 on the Amendment of Law No. 31/1999, is used only as supporting evidence once the principal case is proved. Meanwhile, how to maximize the implementation of the burden of proof on the perpetrators of corruption when the public prosecutor brings a case to court depends upon the nature of the case and the type of indictment. The system of the burden of proof can be used to eradicate corruption in the courts if policies and general principles of justice, such as independence, impartiality, and legal certainty, are applied.

Keywords: burden of proof, perpetrator, corruption criminal act

Procedia PDF Downloads 291
21527 Bayesian Estimation of Hierarchical Models for Genotypic Differentiation of Arabidopsis thaliana

Authors: Gautier Viaud, Paul-Henry Cournède

Abstract:

Plant growth models have been used extensively for the prediction of the phenotypic performance of plants. However, they most often remain calibrated for a given genotype and therefore do not take into account genotype-by-environment interactions. One way of achieving such an objective is to consider Bayesian hierarchical models. Three levels can be identified in such models: the first level describes how a given growth model describes the phenotype of the plant as a function of individual parameters; the second level describes how these individual parameters are distributed within a plant population; the third level corresponds to the attribution of priors on the population parameters. Thanks to the Bayesian framework, choosing appropriate priors for the population parameters makes it possible to derive analytical expressions for the full conditional distributions of these population parameters. As plant growth models are of a nonlinear nature, individual parameters cannot be sampled explicitly, and a Metropolis step must be performed; this allows for the use of a hybrid Gibbs-Metropolis sampler. A generic approach was devised for the implementation of both general state space models and estimation algorithms within a programming platform. It was designed using the Julia language, which combines an elegant syntax with metaprogramming capabilities and high efficiency. Results were obtained for Arabidopsis thaliana on both simulated and real data. An organ-scale GreenLab model for the latter is presented, in which the surface area of each individual leaf can be simulated. It is assumed that the error made on the measurement of leaf areas is proportional to the leaf area itself; multiplicative normal noises for the observations are therefore used. Real data were obtained via image analysis of zenithal images of Arabidopsis thaliana over a period of 21 days, using a two-step segmentation and tracking algorithm which notably takes advantage of the Arabidopsis thaliana phyllotaxy. Since the model formulation is rather flexible, there is no need for the data for a single individual to be available at all times, nor for the times at which data are available to be the same across individuals. This makes it possible to discard data from image analysis when they are not considered reliable enough, thereby providing low-bias data in large quantity for leaf areas. The proposed model precisely reproduces the dynamics of Arabidopsis thaliana's growth while accounting for the variability between genotypes. In addition to the estimation of the population parameters, the level of variability is an interesting indicator of the genotypic stability of model parameters. A promising perspective is to test whether some of the latter should be considered as fixed effects.
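The hybrid Gibbs-Metropolis scheme can be illustrated on a toy hierarchical model. The sketch below is written in Python rather than the Julia platform described, uses additive rather than multiplicative noise for brevity, and invents a tiny exponential "growth model"; all numbers are illustrative, not the study's.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy hierarchy: log growth rates theta_i ~ N(mu, tau^2); observations are
# nonlinear in theta_i:  y_ij = exp(theta_i * t_j) + noise.
t = np.linspace(0.1, 1.0, 10)
mu_true, tau, sigma = 0.5, 0.2, 0.05
n_ind = 20
theta_true = rng.normal(mu_true, tau, n_ind)
y = np.exp(np.outer(theta_true, t)) + rng.normal(0, sigma, (n_ind, len(t)))

def log_lik(th, y_i):
    return -0.5 * np.sum((y_i - np.exp(th * t)) ** 2) / sigma**2

theta = np.zeros(n_ind)   # individual parameters (Metropolis updates)
mu = 0.0                  # population mean (Gibbs update, flat prior)
mus = []
for _ in range(2000):
    for i in range(n_ind):  # Metropolis step: likelihood is nonlinear in theta_i
        prop = theta[i] + rng.normal(0, 0.05)
        log_ratio = (log_lik(prop, y[i]) - log_lik(theta[i], y[i])
                     - 0.5 * ((prop - mu) ** 2 - (theta[i] - mu) ** 2) / tau**2)
        if np.log(rng.uniform()) < log_ratio:
            theta[i] = prop
    # Gibbs step: the full conditional of mu is Gaussian under the flat prior.
    mu = rng.normal(theta.mean(), tau / np.sqrt(n_ind))
    mus.append(mu)

print("posterior mean of mu:", np.mean(mus[500:]))
```

The structure mirrors the three levels described above: the Metropolis loop handles the nonlinear individual level, while the Gibbs step exploits the analytical full conditional available at the population level.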

Keywords: bayesian, genotypic differentiation, hierarchical models, plant growth models

Procedia PDF Downloads 281
21526 Poultry in Motion: Text Mining Social Media Data for Avian Influenza Surveillance in the UK

Authors: Samuel Munaf, Kevin Swingler, Franz Brülisauer, Anthony O’Hare, George Gunn, Aaron Reeves

Abstract:

Background: Avian influenza, more commonly known as bird flu, is a viral zoonotic respiratory disease affecting various species of poultry, as well as pets and migratory birds. Researchers have argued that the accessibility of health information online, in addition to the low-cost data collection methods the internet provides, has revolutionized the ways in which epidemiological and disease surveillance data are utilized. This paper examines the feasibility of using internet data sources, such as Twitter and livestock forums, for the early detection of avian flu outbreaks, through the use of text mining algorithms and social network analysis. Methods: Social media mining was conducted on Twitter between 01/01/2021 and 31/12/2021 via the Twitter API in Python. The results were filtered first by hashtags (#avianflu, #birdflu) and word occurrences (avian flu, bird flu, H5N1), and then refined further by location to include only results from within the UK. The text was analysed in a time-series manner to determine keyword frequencies, and topic modeling was applied to uncover insights in the text prior to a confirmed outbreak. Further analysis examined mentions of clinical signs (e.g., swollen head, blue comb, dullness) within the time series prior to the avian flu outbreak confirmed by the Animal and Plant Health Agency (APHA). Results: Increased Google search volumes and avian flu-related tweets correlated in time with the confirmed cases. Topic modeling uncovered clusters of word occurrences relating to livestock biosecurity, disposal of dead birds, and prevention measures. Conclusions: Text mining social media data can be useful for analysing discussed topics for epidemiological surveillance purposes, especially given the lack of applied research in the veterinary domain. However, the small number of tweets in certain weekly periods, together with a large amount of textual noise in the data, makes it difficult to draw statistically robust conclusions.
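The keyword filtering and weekly frequency counting step can be sketched as follows. The tweets are stubbed in memory (the study pulled them via the Twitter API), and the texts are invented; only the keyword list mirrors the filters named above.

```python
from collections import Counter
from datetime import date

# Hypothetical pre-fetched tweets; the study retrieved these via the Twitter API.
tweets = [
    {"date": date(2021, 11, 1), "text": "Dead swans found nearby, #birdflu suspected"},
    {"date": date(2021, 11, 3), "text": "APHA confirms H5N1 in a Norfolk flock"},
    {"date": date(2021, 11, 9), "text": "Swollen head and blue comb in my hens"},
    {"date": date(2021, 11, 10), "text": "Avian flu housing order announced today"},
]

KEYWORDS = ("#avianflu", "#birdflu", "avian flu", "bird flu", "h5n1")

def matches(text):
    """Case-insensitive filter mirroring the hashtag/keyword step."""
    lowered = text.lower()
    return any(k in lowered for k in KEYWORDS)

# Weekly keyword-frequency series: ISO week number -> matching tweet count.
weekly = Counter(t["date"].isocalendar()[1] for t in tweets if matches(t["text"]))
print(sorted(weekly.items()))
```

Note that the third tweet describes clinical signs without any of the keywords, which is exactly why the study ran a separate clinical-sign analysis alongside keyword counts.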

Keywords: veterinary epidemiology, disease surveillance, infodemiology, infoveillance, avian influenza, social media

Procedia PDF Downloads 75
21525 Earthquake Risk Assessment Using Out-of-Sequence Thrust Movement

Authors: Rajkumar Ghosh

Abstract:

Earthquakes are natural disasters that pose a significant risk to human life and infrastructure. Effective earthquake mitigation measures require a thorough understanding of the dynamics of seismic occurrences, including thrust movement. Traditionally, estimating thrust movement has relied on techniques that may not capture the full complexity of these events; investigating alternative approaches, such as incorporating out-of-sequence thrust movement data, could therefore enhance earthquake mitigation strategies. This review provides an overview of the applications of out-of-sequence thrust movement in earthquake mitigation. By examining existing research, the objective is to understand how precise estimation of thrust movement can contribute to improving structural design, analyzing infrastructure risk, and developing early warning systems. The study demonstrates how to estimate out-of-sequence thrust movement from multiple data sources, including GPS measurements, satellite imagery, and seismic recordings; by analyzing and synthesizing these diverse datasets with appropriate statistical and computational techniques, researchers can gain a more comprehensive understanding of thrust movement dynamics during seismic occurrences. The review identifies potential advantages of incorporating out-of-sequence data in earthquake mitigation: more efficient structural design, enhanced infrastructure risk analysis, and more accurate early warning systems. By considering out-of-sequence thrust movement estimates, researchers and policymakers can make informed decisions to mitigate the impact of earthquakes. Findings from multiple studies are integrated to provide a comprehensive assessment of the topic, and the review concludes that incorporating out-of-sequence thrust movement data can significantly enhance earthquake mitigation measures. Challenges remain, however, such as data quality issues, modelling uncertainties, and computational complications; to address these obstacles and improve the accuracy of estimates, further research and methodological advances are recommended. Overall, this review serves as a resource for researchers, engineers, and policymakers involved in earthquake mitigation, encouraging the development of innovative strategies based on a better understanding of thrust movement dynamics.

Keywords: earthquake, out-of-sequence thrust, disaster, human life

Procedia PDF Downloads 46
21524 Different Sampling Schemes for Semi-Parametric Frailty Model

Authors: Nursel Koyuncu, Nihal Ata Tutkun

Abstract:

The frailty model is a survival model that takes into account unobserved heterogeneity when exploring the relationship between the survival of an individual and several covariates. In recent years, proposed survival models have become more complex, and this complexity causes convergence problems, especially in large data sets. Therefore, the selection of samples from these big data sets is very important for the estimation of parameters. In the sampling literature, some authors have defined new sampling schemes to estimate parameters correctly. To this end, we examine the effect of sampling design in the semi-parametric frailty model. We conducted a simulation study in R to estimate the parameters of the semi-parametric frailty model for different sample sizes and censoring rates under classical simple random sampling and ranked set sampling schemes. In the simulation study, we used as the population a data set recording 17,260 male civil servants aged 40-64 years with complete 10-year follow-up. Time to death from coronary heart disease is treated as the survival time, and age and systolic blood pressure are used as covariates. We selected 1,000 samples from the population using the different sampling schemes and estimated the parameters. From the simulation study, we concluded that the ranked set sampling design performs better than simple random sampling in each scenario.
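The difference between the two sampling schemes can be sketched as follows. This toy comparison uses a synthetic skewed population rather than the civil-servant data, and assumes perfect ranking within sets, an idealisation; in practice ranking is done by a cheap concomitant variable.

```python
import numpy as np

rng = np.random.default_rng(7)
population = rng.lognormal(mean=4.0, sigma=0.5, size=5000)  # skewed synthetic population

def srs(pop, n):
    """Classical simple random sample of size n."""
    return rng.choice(pop, size=n, replace=False)

def rss(pop, n, set_size=3):
    """Ranked set sample: per cycle, draw set_size sets of set_size units and
    keep the r-th order statistic from the r-th set (perfect ranking assumed)."""
    sample = []
    for _ in range(n // set_size):
        for r in range(set_size):
            candidates = rng.choice(pop, size=set_size, replace=False)
            sample.append(np.sort(candidates)[r])
    return np.array(sample)

# Compare the sampling variability of the mean over repeated draws.
reps = 300
srs_means = [srs(population, 30).mean() for _ in range(reps)]
rss_means = [rss(population, 30).mean() for _ in range(reps)]
print("SRS var:", np.var(srs_means), "RSS var:", np.var(rss_means))
```

The lower variance of the ranked-set means illustrates why, for the same sample size, RSS-based parameter estimates tend to be more efficient, the pattern the simulation study reports for the frailty model.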

Keywords: frailty model, ranked set sampling, efficiency, simple random sampling

Procedia PDF Downloads 187
21523 Space Telemetry Anomaly Detection Based On Statistical PCA Algorithm

Authors: Bassem Nassar, Wessam Hussein, Medhat Mokhtar

Abstract:

The crucial concern of satellite operations is to ensure the health and safety of satellites. The worst case in this respect is probably the loss of a mission, but the more common interruption of satellite functionality can also compromise mission objectives. All the data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e., a status or a measurement) to be checked. As a consequence, TM monitoring systems are continuously improved in order to reduce the time required to respond to changes in a satellite's state of health. A fast assessment of the current state of the satellite is thus very important in order to respond to occurring failures. Statistical multivariate latent techniques are among the vital learning tools used to tackle this problem coherently. Information extraction from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To solve this problem, in this paper we present a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm could successfully differentiate between the two operating conditions. Furthermore, the algorithm provides competent information for prediction, as well as adding more insight and physical interpretation to the ADCS operation.
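A PCA-based anomaly score of the kind described can be sketched with the Q (squared prediction error) statistic. The telemetry below is synthetic rather than ADCS data; the number of channels, retained components, and control limit are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "telemetry": 5 correlated channels driven by 2 latent factors.
n, p = 500, 5
A = rng.normal(size=(2, p))
normal = rng.normal(size=(n, 2)) @ A + rng.normal(scale=0.1, size=(n, p))

# Fit PCA on normal-state data: centre, then keep the top-k loading vectors.
mean = normal.mean(axis=0)
_, _, Vt = np.linalg.svd(normal - mean, full_matrices=False)
P = Vt[:2].T  # retained loadings, k = 2

def spe(samples):
    """Squared prediction error (Q statistic): energy outside the PCA subspace."""
    resid = (samples - mean) @ (np.eye(p) - P @ P.T)
    return np.sum(resid**2, axis=1)

threshold = np.percentile(spe(normal), 99)  # control limit from normal data

# Faulty samples break the learned correlation structure and inflate the SPE.
faulty = normal[:10] + rng.normal(scale=1.0, size=(10, p))
print("flagged fraction:", np.mean(spe(faulty) > threshold))
```

The key idea is that the PCA subspace encodes the correlation structure of healthy operation, so a fault that violates that structure produces a large residual even when every individual channel remains within its own limits.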

Keywords: space telemetry monitoring, multivariate analysis, PCA algorithm, space operations

Procedia PDF Downloads 393
21522 Testing the Life Cycle Theory on the Capital Structure Dynamics of Trade-Off and Pecking Order Theories: A Case of Retail, Industrial and Mining Sectors

Authors: Freddy Munzhelele

Abstract:

Setting: empirical research has shown that the life cycle theory has an impact on firms' financing decisions, particularly dividend pay-outs. The life cycle theory posits that as a firm matures, it reaches a level and capacity at which it distributes more cash as dividends; young firms, by contrast, prioritise investment opportunity sets and their financing and thus pay little or no dividends. Research on firms' financing decisions has also demonstrated, among other things, the adoption of the trade-off and pecking order theories on the dynamics of firms' capital structure. The trade-off theory concerns firms weighing the costs and benefits of debt to hold a favourable debt position, while the pecking order theory holds that firms prefer a hierarchical ordering when choosing financing sources. The life cycle hypothesis as an explanation of financial managers' decisions regarding capital structure dynamics appears to be an interesting link, yet this link has been neglected in corporate finance research. Exploring it empirically would enhance financial decision-making alternatives immensely, since no conclusive evidence has yet been found on the dynamics of capital structure. Aim: the aim of this study is to examine the impact of the life cycle theory on the capital structure dynamics of the trade-off and pecking order theories for firms listed in the retail, industrial and mining sectors of the JSE. These sectors are among the key contributors to GDP in the South African economy. Design and methodology: following the postpositivist research paradigm, the study is quantitative in nature and utilises secondary data obtainable from the financial statements of sampled firms for the period 2010-2022. The firms' financial statements will be extracted from the IRESS database. Since the data will be in panel form, a combination of static and dynamic panel data estimators will be used to analyse them. The overall data analysis will be done using Stata. Value added: this study directly investigates the link between the life cycle theory and the dynamics of capital structure decisions, particularly the trade-off and pecking order theories.

Keywords: life cycle theory, trade-off theory, pecking order theory, capital structure, JSE listed firms

Procedia PDF Downloads 40
21521 Neural Synchronization - The Brain’s Transfer of Sensory Data

Authors: David Edgar

Abstract:

To understand how the brain's subconscious and conscious functions work, we must conquer the physics of Unity, which leads to duality's algorithm, where the subconscious (bottom-up) and conscious (top-down) processes function together to produce and consume intelligence. We use terms like 'time is relative,' but do we really understand the meaning? In the brain, there are different processes and, therefore, different observers, and these different processes experience time at different rates. A sensory system such as the eyes cycles its measurements around every 33 milliseconds, the conscious process of the frontal lobe cycles at 300 milliseconds, and the subconscious process of the thalamus cycles at 5 milliseconds. Three different observers experience time differently. To bridge observers, the thalamus, the fastest of the processes, maintains a synchronous state and entangles the different components of the brain's physical process. The entanglements form a synchronous cohesion between the brain components, allowing them to share the same state and execute in the same measurement cycle. The thalamus uses the shared state to control the firing sequence of the brain's linear subconscious process. Sharing state also allows the brain to cheat on the amount of sensory data that must be exchanged between components: only unpredictable motion is transferred through the synchronous state, because predictable motion already exists in the shared framework. The brain's synchronous subconscious process is entirely based on energy conservation, where prediction regulates energy usage. So, every 33 milliseconds, the eyes dump their sensory data into the thalamus, and the thalamus performs a motion measurement to identify the unpredictable motion in the sensory data. Here is the trick: the thalamus conducts its measurement based on the original observation time of the sensory system (33 ms), not its own process time (5 ms). This creates a data payload of synchronous motion that preserves the original sensory observation, essentially a frozen moment in time (flat 4D). That single moment in time can then be processed through the single state maintained by the synchronous process, and other processes, such as consciousness (300 ms), can interface with the synchronous state to generate awareness of that moment. Synchronous data traveling through a separate, faster synchronous process creates a theoretical time tunnel, where observation time is tunneled through the synchronous process and reproduced on the other side in the original time-relativity. The synchronous process eliminates time dilation by simply removing itself from the equation, so that its own process time does not alter the experience. To the original observer, the measurement appears to be instantaneous, but in the thalamus a linear subconscious process generating sensory perception and thought production is being executed. It all occurs in the time available, because the other observation times are slower than the thalamic measurement time. For life to exist in the physical universe requires a linear measurement process; it simply hides by operating at a faster time relativity. What is interesting is that time dilation is not the problem; it is the solution. Einstein said there was no universal time.

Keywords: neural synchronization, natural intelligence, 99.95% IoT data transmission savings, artificial subconscious intelligence (ASI)

Procedia PDF Downloads 104
21520 The Impact of Electronic Commerce on Organisational Effectiveness: A Study of Zenith Bank Plc

Authors: Olusola Abiodun Arinde

Abstract:

This research work was prompted by the very important role e-commerce plays in every organization, be it private or public. The underlying objective of this study is to critically appraise the extent to which e-commerce impacts organizational effectiveness. The research was carried out using Zenith Bank Plc as a case study. Relevant data were collected through a structured questionnaire, oral interviews, journals, newspapers, and textbooks. The data collected were analyzed, and hypotheses were tested. Based on the results of the hypothesis tests, it was observed that e-commerce is significant to every organization. Through e-commerce, fast service delivery can be guaranteed to customers, leading to higher productivity and profit for organizations. E-commerce should be managed in such a way that it does not alienate customers; it should also guard against the considerable risks associated with e-commerce.

Keywords: e-commerce, fast service, productivity, profit

Procedia PDF Downloads 218
21519 Actionable Personalised Learning Strategies to Improve a Growth-Mindset in an Educational Setting Using Artificial Intelligence

Authors: Garry Gorman, Nigel McKelvey, James Connolly

Abstract:

This study will evaluate a growth mindset intervention with Junior Cycle Coding and Senior Cycle Computer Science students in Ireland, where gamification will be used to incentivise growth mindset behaviour. An artificial intelligence (AI) driven personalised learning system will be developed to present computer programming learning tasks in the manner best suited to individuals' own learning preferences, while incentivising and rewarding the growth mindset behaviours of persistence, a mastery response to challenge, and challenge seeking. This research endeavours to measure mindset with before-and-after surveys (conducted nationally) and by recording growth mindset behaviour while playing a digital game. The study will harness the capabilities of AI and aims to determine how a personalised learning (PL) experience can impact the mindset of a broad range of students. The focus of this study is to determine how personalising the learning experience influences female and disadvantaged students' sense of belonging in the computer science classroom when tasks are presented in the manner best suited to the individual. Whole Brain Learning will underpin this research and will be used as a framework to guide the research in identifying key areas such as thinking and learning styles, cognitive potential, motivators and fears, and emotional intelligence. The research will be conducted in multiple school types over one academic year. Digital games will be played multiple times over this period, and the data gathered will be used to inform the AI algorithm. The three data sets are as follows: (i) before-and-after survey data to determine the grit scores and mindsets of the participants; (ii) growth mindset data from the game, measuring multiple growth mindset behaviours such as persistence, response to challenge, and use of strategy; (iii) the AI data used to guide PL. This study will highlight the effectiveness of an AI-driven personalised learning experience. The data will position AI within the Irish educational landscape, with a specific focus on the teaching of computer science. These findings will benefit coding and computer science teachers by providing a clear pedagogy for the effective delivery of personalised learning strategies in computer science education. This pedagogy will help prevent students from developing a fixed mindset while helping pupils to exhibit persistence of effort, use of strategy, and a mastery response to challenges.

Keywords: computer science education, artificial intelligence, growth mindset, pedagogy

Procedia PDF Downloads 66
21518 Theoretical Study of the Structural and Elastic Properties of the Semiconducting Rare Earth Chalcogenide Sm1-xEuxS under Pressure

Authors: R. Dubey, M. Sarwan, S. Singh

Abstract:

We have investigated the phase transition pressure and associated volume collapse in the Sm1-xEuxS alloy (0 ≤ x ≤ 1), in which the transition changes from discontinuous to continuous as x is reduced. The calculated results from the present approach are in good agreement with the experimental data available for the end-point members (x = 0 and x = 1). The results for the alloy counterparts are also in fair agreement with experimental data generated from Vegard's law. An improved interaction potential model has been developed which includes Coulomb, three-body interaction, polarizability, and overlap repulsive interactions operative up to second-neighbour ions. It is found that the inclusion of the polarizability effect has improved our results.
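Schematically, interaction potentials of this family combine the four contributions named above. The exact three-body function f(r) and the parameter definitions vary between studies, so the following is an indicative form rather than the authors' model:

```latex
U(r) \;=\;
\underbrace{\frac{e^{2}}{2}\sideset{}{'}\sum_{i,j}\frac{Z_i Z_j}{r_{ij}}}_{\text{Coulomb}}
\;+\;
\underbrace{\frac{e^{2}}{2}\sideset{}{'}\sum_{i,j}\frac{Z_i Z_j}{r_{ij}}\,f(r_{ij})}_{\text{three-body interaction}}
\;+\;
\underbrace{b\sideset{}{'}\sum_{i,j}\beta_{ij}\exp\!\left(\frac{r_i + r_j - r_{ij}}{\rho}\right)}_{\text{overlap repulsion (to second neighbours)}}
\;+\; U_{\text{pol}}(r),
```

where Z_i are effective ionic charges, r_i ionic radii, b and ρ hardness parameters, β_ij the Pauling coefficients, and U_pol(r) the polarizability correction that the abstract reports as improving the results.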

Keywords: elastic constants, high pressure, phase transition, rare earth compound

Procedia PDF Downloads 396
21517 Revolutionizing Healthcare Facility Maintenance: A Groundbreaking AI, BIM, and IoT Integration Framework

Authors: Mina Sadat Orooje, Mohammad Mehdi Latifi, Behnam Fereydooni Eftekhari

Abstract:

The integration of cutting-edge Internet of Things (IoT) technologies with advanced Artificial Intelligence (AI) systems is revolutionizing healthcare facility management. However, the current landscape of hospital building maintenance suffers from slow, repetitive, and disjointed processes, leading to significant financial, resource, and time losses. Additionally, the potential of Building Information Modeling (BIM) in facility maintenance is hindered by a lack of data within digital models of built environments, necessitating a more streamlined data collection process. This paper presents a robust framework that harmonizes AI with BIM-IoT technology to elevate healthcare Facility Maintenance Management (FMM) and address these pressing challenges. The methodology begins with a thorough literature review and requirements analysis, providing insights into existing technological landscapes and associated obstacles. Extensive data collection and analysis efforts follow to deepen understanding of hospital infrastructure and maintenance records. Critical AI algorithms are identified to address predictive maintenance, anomaly detection, and optimization needs alongside integration strategies for BIM and IoT technologies, enabling real-time data collection and analysis. The framework outlines protocols for data processing, analysis, and decision-making. A prototype implementation is executed to showcase the framework's functionality, followed by a rigorous validation process to evaluate its efficacy and gather user feedback. Refinement and optimization steps are then undertaken based on evaluation outcomes. Emphasis is placed on the scalability of the framework in real-world scenarios and its potential applications across diverse healthcare facility contexts. Finally, the findings are meticulously documented and shared within the healthcare and facility management communities. 
This framework aims to significantly boost maintenance efficiency, cut costs, provide decision support, enable real-time monitoring, offer data-driven insights, and ultimately enhance patient safety and satisfaction. By tackling current challenges in healthcare facility maintenance management, it paves the way for the adoption of smarter and more efficient maintenance practices in healthcare facilities.

Keywords: artificial intelligence, building information modeling, healthcare facility maintenance, internet of things integration, maintenance efficiency

Procedia PDF Downloads 24
21516 Predicting Personality and Psychological Distress Using Natural Language Processing

Authors: Jihee Jang, Seowon Yoon, Gaeun Son, Minjung Kang, Joon Yeon Choeh, Kee-Hong Choi

Abstract:

Background: Self-report multiple-choice questionnaires have been widely utilized to quantitatively measure one’s personality and psychological constructs. Despite several strengths (e.g., brevity and utility), self-report multiple-choice questionnaires have considerable limitations in nature. With the rise of machine learning (ML) and natural language processing (NLP), researchers in the field of psychology are widely adopting NLP to assess psychological constructs and predict human behaviors. However, there is a lack of connection between the work being performed in computer science and that of psychology, due to small data sets and unvalidated modeling practices. Aims: The current article introduces the study method and procedure of phase II, which includes the interview questions for the five-factor model (FFM) of personality developed in phase I. This study aims to develop the semi-structured interview and open-ended questions for the FFM-based personality assessments, designed with experts in the field of clinical and personality psychology (phase I), and to collect personality-related text data using the interview questions and self-report measures of personality and psychological distress (phase II). The purpose of the study includes examining the relationship between the natural language data obtained from the interview questions, the FFM personality constructs, and psychological distress, to demonstrate the validity of natural language-based personality prediction. Methods: The phase I (pilot) study was conducted on fifty-nine native Korean adults to acquire personality-related text data from the semi-structured interview and open-ended questions based on the FFM of personality. The interview questions were revised and finalized with feedback from the external expert committee, consisting of personality and clinical psychologists.
Based on the established interview questions, a total of 425 Korean adults were recruited using a convenience sampling method via an online survey. The text data collected from interviews were analyzed using natural language processing. The results of the online survey, including demographic data, depression, anxiety, and personality inventories, were analyzed together in the model to predict individuals’ FFM of personality and the level of psychological distress (phase 2).

Keywords: personality prediction, psychological distress prediction, natural language processing, machine learning, the five-factor model of personality

Procedia PDF Downloads 56
21515 Suitable Site Selection of Small Dams Using Geo-Spatial Technique: A Case Study of Dadu Tehsil, Sindh

Authors: Zahid Khalil, Saad Ul Haque, Asif Khan

Abstract:

Decision-making about identifying suitable sites for any project by considering different parameters is difficult. Using GIS and Multi-Criteria Analysis (MCA) can make it easier for such projects; this technology has proved efficient and adequate in acquiring the desired information. In this study, GIS and MCA were employed to identify suitable sites for small dams in Dadu Tehsil, Sindh. GIS software was used to create all the spatial parameters for the analysis. The derived parameters are slope, drainage density, rainfall, land use/land cover, soil groups, Curve Number (CN) and runoff index, with a spatial resolution of 30 m. The data used for deriving the above layers include the 30-meter resolution SRTM DEM, Landsat 8 imagery, rainfall from the National Centers for Environmental Prediction (NCEP) and soil data from the Harmonized World Soil Database (HWSD). The land use/land cover map is derived from Landsat 8 using supervised classification. Slope, drainage network and watershed are delineated by terrain processing of the DEM. The Soil Conservation Service (SCS) method is implemented to estimate surface runoff from rainfall. Prior to this, an SCS-CN grid is developed by integrating the soil and land use/land cover rasters. These layers, together with some technical and ecological constraints, are assigned weights on the basis of suitability criteria. The pairwise comparison method of the Analytic Hierarchy Process (AHP) is used as the MCA technique for assigning weights to each decision element. All the parameters and groups of parameters are integrated using weighted overlay in the GIS environment to produce suitable sites for the dams. The resultant layer is then classified into four classes, namely best suitable, suitable, moderate and less suitable. This study demonstrates a contribution to decision-making about suitable site analysis for small dams using geospatial data with a minimal amount of ground data. These suitability maps can be helpful for water resource management organizations in the determination of feasible rainwater harvesting (RWH) structures.
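The SCS Curve Number runoff estimate mentioned above can be sketched in a few lines. This is a generic illustration of the CN equation with made-up rainfall and CN values, not the study's actual grid computation:

```python
# Sketch of the SCS-CN direct runoff relation:
#   S  = 25400/CN - 254   (potential maximum retention, mm)
#   Ia = 0.2 * S          (initial abstraction)
#   Q  = (P - Ia)^2 / (P - Ia + S)  when P > Ia, else 0

def scs_runoff_mm(p_mm: float, cn: float) -> float:
    """Direct runoff Q (mm) from rainfall P (mm) for a given Curve Number."""
    s = 25400.0 / cn - 254.0      # potential maximum retention (mm)
    ia = 0.2 * s                  # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Hypothetical example: a 50 mm storm on a cell with CN = 80
q = scs_runoff_mm(50.0, 80.0)
```

In the study this calculation would be applied cell by cell over the SCS-CN raster; here a single CN = 80 cell and a 50 mm storm yield roughly 13.8 mm of direct runoff.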

Keywords: remote sensing, GIS, AHP, RWH

Procedia PDF Downloads 364
21514 Travel Behavior Simulation of Bike-Sharing System Users in Kaohsiung City

Authors: Hong-Yi Lin, Feng-Tyan Lin

Abstract:

In a bike-sharing system (BSS), users can easily rent bikes from any station in the city for mid-range or short-range trips. A BSS can also be integrated with other types of transport systems, especially green transportation systems such as rail transit and buses. Since a BSS records the time and place of each pickup and return, the operational data can reflect a more authentic and dynamic state of user behavior. Furthermore, land uses around docking stations are highly associated with the origins and destinations of BSS users. As urban researchers, what concerns us more is to take the BSS into consideration during the urban planning process and enhance the quality of urban life. This research focuses on the simulation of the travel behavior of BSS users in Kaohsiung. First, rules of users’ behavior were derived by analyzing operational data and land use patterns near docking stations. Then, integrated with the Monte Carlo method, these rules were embedded into a travel behavior simulation model, which was implemented in NetLogo, an agent-based modeling tool. The simulation model allows us to foresee the rent-return behavior of the BSS in order to choose potential locations for docking stations. It can also provide insights and recommendations about planning and policies for the future BSS.
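A minimal sketch of how such Monte Carlo trip rules can drive a simulation (the paper uses NetLogo; Python is used here only for illustration). The station names and transition weights below are invented placeholders standing in for the rules derived from the operational data and land use patterns:

```python
import random

# Each docking station gets a probability distribution over destinations,
# reflecting nearby land uses; a trip is one weighted random draw.
TRANSITIONS = {
    "rail_station": {"office_area": 0.6, "park": 0.3, "rail_station": 0.1},
    "office_area":  {"rail_station": 0.7, "park": 0.2, "office_area": 0.1},
    "park":         {"rail_station": 0.5, "office_area": 0.4, "park": 0.1},
}

def simulate_trips(start: str, n_trips: int, seed: int = 42) -> dict:
    """Count bike returns per docking station over n_trips simulated rentals."""
    rng = random.Random(seed)
    counts = {s: 0 for s in TRANSITIONS}
    station = start
    for _ in range(n_trips):
        dests, weights = zip(*TRANSITIONS[station].items())
        station = rng.choices(dests, weights=weights)[0]
        counts[station] += 1
    return counts

returns = simulate_trips("rail_station", 10_000)
```

Aggregating simulated returns per station, as above, is what lets a planner compare candidate docking-station locations before any are built.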

Keywords: agent-based model, bike-sharing system, BSS operational data, simulation

Procedia PDF Downloads 292
21513 Identification of Rainfall Trends in Qatar

Authors: Abdullah Al Mamoon, Ataur Rahman

Abstract:

Due to climate change, future rainfall will change at many locations on earth; however, the spatial and temporal patterns of this change are not easy to predict. One approach to predicting such future changes is to examine the trends in the historical rainfall data of a given region and use the identified trends to make future predictions. For this, a statistical trend test is commonly applied to the historical data. This paper examines the trends of daily extreme rainfall events from 30 rain gauges located in the State of Qatar. Rainfall data covering 1962 to 2011 were used in the analysis. A combination of four non-parametric and parametric tests was applied to identify trends at the 10%, 5%, and 1% significance levels: the Mann-Kendall (MK), Spearman’s Rho (SR), Linear Regression (LR) and CUSUM tests. These tests showed both positive and negative trends throughout the country. Only eight stations showed a positive (upward) trend, which was, however, not statistically significant. In contrast, significant negative (downward) trends were found at the 5% and 10% levels of significance in six stations. The MK, SR and LR tests exhibited very similar results. This finding has important implications for the derivation and upgrade of design rainfall for Qatar, which will affect the design and operation of future urban drainage infrastructure.
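As a rough illustration of the Mann-Kendall test named above, the S statistic and its normal-approximation Z score (ignoring tie corrections) can be computed as follows; the sample series is hypothetical, not the Qatari rainfall data:

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test without tie correction: returns (S, Z).
    Positive S suggests an upward trend, negative S a downward one."""
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)   # sign of each pairwise difference
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A steadily increasing annual-maximum series gives the largest possible S
s_up, z_up = mann_kendall([21, 25, 28, 30, 33, 35, 38, 41, 44, 50])
```

A |Z| above 1.96 corresponds to significance at the 5% level under the normal approximation, which is how a station's trend would be flagged as significant or not.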

Keywords: trends, extreme rainfall, daily rainfall, Mann-Kendall test, climate change, Qatar

Procedia PDF Downloads 530
21512 The Operating Results of the English General Music Course on the Education Platform

Authors: Shan-Ken Chine

Abstract:

This research reports on a one-year run of String Music Appreciation, an international online course launched on a British open education platform. It explains how to present music teaching videos with three main features: music lesson explanations, instrumental playing demonstrations, and live music performances. The course is organized into four major themes with a total of 97 steps. In addition, the paper uses the data provided by the platform's statistics dashboard, comprising course-run measures, total statistics, and statistics by week, to analyze learners' performance and understand the operation of the course. The paper ends with a review of the course's star rating over this one-year run; the findings will inform adjustments when the course runs again in the future.

Keywords: music online courses, MOOCs, ubiquitous learning, string music, general music education

Procedia PDF Downloads 7
21511 Impact of Graduates’ Quality of Education and Research on ICT Adoption at Workplace

Authors: Mohammed Kafaji

Abstract:

This paper aims to investigate the influence of the quality of education and the quality of research, provided by local educational institutions, on the adoption of Information and Communication Technology (ICT) in managing business operations for companies in the Saudi market. A model was developed and tested using data collected from 138 CEOs of foreign companies in diverse business sectors. The data were analysed and managed using multivariate approaches through standard statistical packages. The results showed that educational quality contributes little to ICT adoption, while research quality seems to play a more prominent role. These results are analysed in terms of business environment and market constraints and further extended to the perceived effectiveness of the pedagogical approaches applied in schools and universities.

Keywords: quality of education, quality of research, mediation, domestic competition, ICT adoption

Procedia PDF Downloads 431
21510 Evaluation of the Role of Advocacy and the Quality of Care in Reducing Health Inequalities for People with Autism, Intellectual and Developmental Disabilities at Sheffield Teaching Hospitals

Authors: Jonathan Sahu, Jill Aylott

Abstract:

Individuals with Autism, Intellectual and Developmental Disabilities (AIDD) are one of the most vulnerable groups in society, hampered not only by their own limitations in understanding and interacting with the wider society, but also by societal limitations in perception and understanding. Communication to express their needs and wishes is fundamental to enabling such individuals to live and prosper in society. This research project was designed as an organisational case study in a large secondary care hospital within the National Health Service (NHS), to assess the quality of care provided to people with AIDD and to review the role of advocacy in reducing health inequalities in these individuals. Methods: The research methodology adopted was that of an “insider researcher”. Data collection included both quantitative and qualitative data, i.e. a mixed-methods approach. A semi-structured interview schedule was designed and used to obtain qualitative and quantitative primary data from a wide range of interdisciplinary frontline health care workers, to assess their understanding and awareness of the systems, processes and evidence-based practice needed to offer a quality service to people with AIDD. Secondary data were obtained from sources within the organisation, in keeping with “case study” as a primary method, and organisational performance data were then compared against national benchmarking standards. Further data sources were accessed to help evaluate the effectiveness of the different types of advocacy present in the organisation, gauged by measures of user and carer experience in the form of retrospective survey analysis, incidents and complaints. Results: Secondary data demonstrate near-compliance of the organisation with the current national benchmarking standard (Monitor Compliance Framework). However, primary data demonstrate poor knowledge of the Mental Capacity Act 2005 and poor knowledge of the organisational systems, processes and evidence-based practice applied for people with AIDD. In addition, there was poor knowledge and awareness among frontline health care workers of advocacy and advocacy schemes for this group. Conclusions: A significant amount of work needs to be undertaken to improve the quality of care delivered to individuals with AIDD. An operational strategy promoting the widespread dissemination of information may not be the best approach to delivering quality care, optimal patient experience and patient advocacy. In addition, a more robust set of standards, with appropriate metrics, needs to be developed to assess organisational performance in a way that will stand the test of professional and public scrutiny.

Keywords: advocacy, autism, health inequalities, intellectual developmental disabilities, quality of care

Procedia PDF Downloads 194
21509 Exploring the Development of Communicative Skills in English Teaching Students: A Phenomenological Study During Online Instruction

Authors: Estephanie S. López Contreras, Vicente Aranda Palacios, Daniela Flores Silva, Felipe Oliveros Olivares, Romina Riquelme Escobedo, Iñaki Westerhout Usabiaga

Abstract:

This research explored whether the context of online instruction has influenced the development of first-year English-teaching students' communication skills, namely speaking and listening. The theoretical basis finds its niche in the need to bridge the gap in knowledge about the Chilean online educational context and the development of English communicative skills. An interpretative paradigm and a phenomenological design were implemented in this study. Twenty-two first-year students and two teachers from an English teaching training program participated in the study. The students' ages ranged from 18 to 26 years, and the teachers' experience in the program ranged from 5 to 13 years. For data collection purposes, semi-structured interviews were applied to both students and teachers. Interview questions were based on the initial conceptualization of the central phenomenon. Observations, field notes, and focus groups with the students are also part of the data collection process. Data analysis considered two-cycle methods: the first cycle included descriptive coding for field notes, initial coding for interviews, and the creation of a codebook; the second cycle included axial coding for both field notes and interviews. After data analysis, the findings show that students perceived online classes as instances in which active communication could not always occur. In addition, changes made to the curricula as a consequence of the COVID-19 pandemic have affected students' speaking and listening skills.

Keywords: attitudes, communicative skills, EFL teaching training program, online instruction, perceptions

Procedia PDF Downloads 91
21508 Collaborative Online Learning for Lecturers

Authors: Lee Bih Ni, Emily Doreen Lee, Wee Hui Yean

Abstract:

This paper examines lecturers' perceptions of online collaborative learning, in terms of how lecturers view collaborative learning in higher learning institutions. Adult education enhances a collaborative learning culture by involving learners in the learning process, making teaching and learning at the university more effective and open, and ultimately leading students to assist each other's learning. It also reduces the loneliness and isolation that adult learners might feel. The ways lecturers engage in online collaboration were also examined. Data were collected using questionnaire instruments, then analyzed and interpreted, and the results are reported according to the evidence obtained from the respondents. The results show that shaping good collaborative learning practice depends not only on the lecturer but also on the students. The concepts and patterns for achieving these targets should be clear from the beginning, as reflected in the proposals submitted and in how the higher learning institution trains lecturers for ongoing online teaching. The advantages of online collaborative learning indicate that lecturers should be trained effectively. The study found that lecturers are aware of online collaborative learning, and this positive attitude will encourage the higher learning institution to continue providing the required knowledge and skills.

Keywords: collaborative online learning, lecturers’ training, learning, online

Procedia PDF Downloads 432
21507 Multi-Level Clustering Based Congestion Control Protocol for Cyber Physical Systems

Authors: Manpreet Kaur, Amita Rani, Sanjay Kumar

Abstract:

The Internet of Things (IoT), a cyber-physical paradigm, allows a large number of devices to connect and send sensory data over the network simultaneously. The tremendous amount of data generated leads to a very high network load, consequently resulting in network congestion. This further causes frequent loss of useful information and depletion of a significant amount of the nodes’ energy. Therefore, there is a need to control congestion in the IoT so as to prolong network lifetime and improve the quality of service (QoS). Hence, we propose a two-level clustering-based routing algorithm, considering congestion-score and packet-priority metrics, that focuses on minimizing network congestion. In the proposed Priority-based Congestion Control (PBCC) protocol, the sensor nodes in the IoT network form clusters, which reduces the amount of traffic, and the nodes are prioritized to emphasize important data. Simultaneously, a congestion score determines the occurrence of congestion at a particular node. The proposed protocol outperforms the existing Packet Discard Network Clustering (PDNC) protocol in terms of buffer size, packet transmission range, network region and number of nodes, under various simulation scenarios.
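A minimal sketch of the priority-plus-congestion-score idea: a cluster head serves high-priority packets first and sheds low-priority traffic once its buffer occupancy (used here as a stand-in congestion score) crosses a threshold. The class name, fields and thresholds are assumptions for illustration, not the PBCC specification:

```python
import heapq

class ClusterHead:
    """Toy cluster head: min-heap of (priority, payload), lower = more urgent."""

    def __init__(self, buffer_size=4, congestion_threshold=0.75):
        self.buffer = []
        self.buffer_size = buffer_size
        self.threshold = congestion_threshold
        self.dropped = 0

    @property
    def congestion_score(self):
        # Buffer occupancy as a simple congestion indicator in [0, 1]
        return len(self.buffer) / self.buffer_size

    def enqueue(self, priority, payload):
        """Accept a packet unless congested and the packet is non-critical."""
        if self.congestion_score >= self.threshold and priority > 1:
            self.dropped += 1            # congested: shed low-priority traffic
            return False
        heapq.heappush(self.buffer, (priority, payload))
        return True

    def transmit(self):
        """Forward the most important buffered packet first."""
        return heapq.heappop(self.buffer)[1] if self.buffer else None

head = ClusterHead()
head.enqueue(2, "temperature")
head.enqueue(1, "fall-alarm")
head.enqueue(3, "humidity")                 # occupancy now 0.75: congested
accepted = head.enqueue(5, "debug-log")     # shed as non-critical
first_out = head.transmit()                 # urgent packet leaves first
```

The point of the sketch is the ordering of decisions: the congestion score gates admission, while the priority metric governs both shedding and forwarding order.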

Keywords: internet of things, cyber-physical systems, congestion control, priority, transmission rate

Procedia PDF Downloads 284
21506 Correlation Matrix for Automatic Identification of Meal-Taking Activity

Authors: Ghazi Bouaziz, Abderrahim Derouiche, Damien Brulin, Hélène Pigot, Eric Campo

Abstract:

Automatic ADL classification is a crucial part of ambient assisted living technologies. It allows monitoring of the daily life of the elderly and detection of any changes in their behavior that could be related to a health problem. But detection of ADLs is a challenge, especially because each person has his/her own rhythm for performing them. Therefore, we used a correlation matrix to extract custom rules that enable the detection of ADLs, including eating activity. Data collected from three different individuals over periods of between 35 and 105 days allow the extraction of personalized eating patterns. The comparison of the eating activities extracted from the correlation matrices with the declarative data collected during the survey shows an accuracy of 90%.
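A toy version of the correlation-matrix step: computing pairwise Pearson correlations over hypothetical daily sensor features. The feature names and values below are invented for illustration; the study's actual sensors and extracted rules are not reproduced here:

```python
from statistics import mean, pstdev

def correlation_matrix(columns):
    """Pearson correlation matrix for a dict of equal-length numeric lists."""
    names = list(columns)
    n = len(next(iter(columns.values())))

    def corr(x, y):
        mx, my = mean(x), mean(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
        return cov / (pstdev(x) * pstdev(y))

    return {a: {b: corr(columns[a], columns[b]) for b in names} for a in names}

# Hypothetical per-meal-window features from a home-monitoring setup
features = {
    "kitchen_presence": [1, 0, 1, 1, 0, 1],
    "fridge_openings":  [2, 0, 3, 2, 1, 2],
    "power_draw_kwh":   [0.8, 0.2, 0.9, 0.7, 0.3, 0.8],
}
m = correlation_matrix(features)
```

Feature pairs with strong correlations (here, kitchen presence and fridge openings) are the kind of relationship from which personalized meal-taking rules could be extracted.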

Keywords: elderly monitoring, ADL identification, correlation matrix, meal-taking activity

Procedia PDF Downloads 64
21505 Digital Twin for Retail Store Security

Authors: Rishi Agarwal

Abstract:

Digital twins are emerging as a strong technology used to imitate and monitor physical objects digitally, in real time, across sectors. They not only operate in the digital space but also actuate responses in the physical space, based on digital-space processing such as storage, modeling, learning, simulation, and prediction. This paper explores the application of digital twins to enhancing physical security in retail stores. The retail sector still relies on outdated physical security practices like manual monitoring and metal detectors, which are insufficient for modern needs. There is a lack of real-time data and system integration, leading to ineffective emergency response and preventative measures. As retail automation increases, new digital frameworks must manage safety without human intervention. To address this, the paper proposes implementing an intelligent digital twin framework. It collects diverse data streams from in-store sensors, surveillance, external sources, and customer devices; advanced analytics and simulations then enable real-time monitoring, incident prediction, automated emergency procedures, and stakeholder coordination. Overall, the digital twin improves physical security through automation, adaptability, and comprehensive data sharing. The paper also analyzes the pros and cons of implementing this technology through an Emerging Technology Analysis Canvas, which examines its different aspects through both narrow and wide lenses to help decision makers decide whether to adopt it. On a broader scale, this showcases the value of digital twins in transforming legacy systems across sectors and how data sharing can create a safer world for both retail store customers and owners.

Keywords: digital twin, retail store safety, digital twin in retail, digital twin for physical safety

Procedia PDF Downloads 45
21504 Determinant Factor Analysis of Foreign Direct Investment in Asean-6 Countries Period 2004-2012

Authors: Eleonora Sofilda, Ria Amalia, Muhammad Zilal Hamzah

Abstract:

Foreign direct investment is one of the sources of financing or capital that is important for a country, especially a developing country. This investment also makes a great contribution to development through the transfer of assets, the improvement of management, and the transfer of technology that enhances the economy of a country. Meanwhile, an interesting phenomenon is currently emerging in ASEAN countries, where some big producers are relocating their basic production among those countries. This research aims to analyze the factors that affect capital inflows of foreign direct investment into six ASEAN countries (Indonesia, Malaysia, Singapore, Thailand, the Philippines, and Vietnam) in the period 2004-2012. The study uses panel data analysis to determine the factors that affect foreign direct investment (FDI) in the six ASEAN countries: gross domestic product (GDP), the Global Competitiveness Index (GCI), interest rate, exchange rate and trade openness (TO). Results of the panel data analysis show that three independent variables (GCI, GDP, and TO) have a significant effect on FDI in the six ASEAN countries.
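One common way to estimate such a panel model is the fixed-effects ("within") estimator, sketched below for a single regressor: demean each country's series around its own mean (absorbing the country fixed effect), then estimate the pooled slope on the demeaned data. The numbers are synthetic and the single-regressor form is a simplification; the paper's actual five-regressor specification is not reproduced:

```python
from statistics import mean

# Synthetic panel: country -> [(gdp, fdi_inflow), ...] over three years
panel = {
    "Indonesia": [(1.0, 2.1), (2.0, 4.2), (3.0, 5.9)],
    "Malaysia":  [(1.5, 3.0), (2.5, 5.1), (3.5, 7.2)],
    "Thailand":  [(0.5, 1.1), (1.5, 3.0), (2.5, 5.2)],
}

xs, ys = [], []
for obs in panel.values():
    gdp_bar = mean(g for g, _ in obs)   # country mean absorbs the fixed effect
    fdi_bar = mean(f for _, f in obs)
    for g, f in obs:
        xs.append(g - gdp_bar)          # within-country demeaned GDP
        ys.append(f - fdi_bar)          # within-country demeaned FDI

# Within estimator: OLS slope on demeaned data (intercept is zero by construction)
slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
```

With the synthetic data built so that FDI roughly doubles GDP within each country, the within slope comes out near 2, regardless of the country-level offsets that the demeaning removed.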

Keywords: foreign direct investment, the gross domestic product, global competitiveness, interest rate, exchange rate, trade openness, panel data analysis

Procedia PDF Downloads 443
21503 Assessment of Dimensions and Gully Recovery With GPS Receiver and RPA (Drone)

Authors: Mariana Roberta Ribeiro, Isabela de Cássia Caramello, Roberto Saverio Souza Costa

Abstract:

Currently, one of the most important environmental problems is soil degradation. This wear is the result of inadequate agricultural practices, with water erosion as the main agent. Where runoff water concentrates at certain points, erosion can reach a more advanced stage: gullies. In view of this, the objective of this work was to evaluate which methodology is most suitable for the purpose of elaborating a project for the recovery of a gully, relating work time, data reliability, and final cost. The work was carried out on a rural road in Monte Alto - SP, where 0.30 hectares are under the influence of a gully. For the evaluation, an aerophotogrammetric survey with an RPA was used, both with georeferenced ground points and with a GNSS L1/L2 receiver. To assess the importance of the georeferenced points, altimetric data obtained using the support points were compared with altimetric data using only the aircraft's internal GPS. Another method used was a survey by conventional topography, where coordinates were collected with a total station and an L1/L2 geodetic GPS receiver. Statistical analysis was performed using analysis of variance (ANOVA) with the F test (p<0.05), and the means between treatments were compared using the Tukey test (p<0.05). The results showed no significant difference between the surveys carried out by aerial photogrammetry and by conventional topography for the analyzed parameters. Considering the data presented, it is possible to conclude that, when comparing accuracy, the final volume of the gully, and cost for the purpose of elaborating a gully recovery project, the methodologies of aerial photogrammetric survey and conventional topography do not differ significantly. However, when working time, use of labor, and project detail are compared, the aerial photogrammetric survey proves to be more viable.
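The ANOVA comparison described above can be illustrated with a toy one-way F statistic. The volume estimates below are made up for three hypothetical survey set-ups; with similar group means the F value stays small, mirroring the "no significant difference" finding:

```python
from statistics import mean

def one_way_f(groups):
    """One-way ANOVA F = between-group mean square / within-group mean square."""
    all_vals = [v for g in groups for v in g]
    grand = mean(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical gully-volume estimates (m^3 x 10^2) per survey method
methods = [
    [101.2, 99.8, 100.5],   # RPA with ground control points
    [100.9, 100.1, 100.4],  # RPA with internal GPS only
    [100.7, 99.9, 100.6],   # conventional topography
]
f_stat = one_way_f(methods)
```

An F value this small would fall well below the critical value at p < 0.05, so the null hypothesis of equal means would not be rejected; the Tukey pairwise comparisons would then confirm no pair of methods differs.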

Keywords: drones, erosion, soil conservation, technology in agriculture

Procedia PDF Downloads 85
21502 Novel GPU Approach in Predicting the Directional Trend of the S&P500

Authors: A. J. Regan, F. J. Lidgey, M. Betteridge, P. Georgiou, C. Toumazou, K. Hayatleh, J. R. Dibble

Abstract:

Our goal is the development of an algorithm capable of predicting the directional trend of the Standard and Poor’s 500 index (S&P 500). Extensive research has been published attempting to predict different financial markets using historical data, testing on an in-sample and trend basis, with many authors employing excessively complex mathematical techniques. In reviewing and evaluating these in-sample methodologies, it became evident that this approach was unable to achieve sufficiently reliable prediction performance for commercial exploitation. For these reasons, we moved to an out-of-sample strategy based on linear regression analysis of an extensive set of financial data correlated with historical closing prices of the S&P 500. We are pleased to report a directional trend accuracy of greater than 55% for tomorrow (t+1) in predicting the S&P 500.
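A hedged sketch of the out-of-sample idea: fit a simple one-variable linear regression on a rolling window of past returns, forecast the next day's (t+1) return, and score only the sign of the forecast. The synthetic random-walk prices are purely illustrative; the paper's actual predictor set is far richer than yesterday's return:

```python
import random

def directional_accuracy(prices, window=30):
    """Rolling out-of-sample regression of next-day return on today's return,
    scored by whether the predicted direction (sign) matches reality."""
    hits = trials = 0
    returns = [b - a for a, b in zip(prices, prices[1:])]
    for t in range(window, len(returns) - 1):
        xs = returns[t - window:t]           # predictor: return on day i
        ys = returns[t - window + 1:t + 1]   # response: return on day i + 1
        mx, my = sum(xs) / window, sum(ys) / window
        sxx = sum((x - mx) ** 2 for x in xs) or 1e-12
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
        pred = my + slope * (returns[t] - mx)  # out-of-sample forecast for t+1
        hits += (pred > 0) == (returns[t + 1] > 0)
        trials += 1
    return hits / trials

# Synthetic drifting random walk standing in for an index price series
rng = random.Random(0)
prices = [100.0]
for _ in range(300):
    prices.append(prices[-1] + rng.gauss(0.05, 1.0))
acc = directional_accuracy(prices)
```

The key property is that every forecast uses only data available before day t+1, which is what distinguishes this out-of-sample evaluation from the in-sample fits the abstract criticizes.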

Keywords: financial algorithm, GPU, S&P 500, stock market prediction

Procedia PDF Downloads 327