Search results for: three dimensional data acquisition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26677


21337 Potentials and Challenges of Implementing Participatory Irrigation Management, Tanzania

Authors: Pilly Joseph Kagosi

Abstract:

The study assesses the challenges observed during the implementation of the participatory irrigation management (PIM) approach for food security in semi-arid areas of Tanzania. Data were collected through questionnaires, participatory rural appraisal (PRA) tools, key informant discussions, focus group discussions (FGD), participant observation and literature review. Questionnaire data were analyzed using SPSS, while PRA data were analyzed with the help of local communities during the PRA exercises; data from the other methods were analyzed using content analysis. The study revealed that the PIM approach contributes to improved food security at the household level, because involving communities in water management activities and decision making enhanced the availability of water for irrigation and increased crop production. However, several challenges were observed during implementation of the approach: minimal participation of beneficiaries in decision making during the planning and design stages, indicating inadequate devolution of power among scheme owners; inadequate transparency on income and expenditure in Water Utilization Associations (WUAs); water conflicts among WUA members, between farmers and livestock keepers, and between WUA leaders and village governments regarding training opportunities and status; WUA rules and regulations not being legally recognized by the national courts; and few farmers being involved in planting trees around water sources. Some of these challenges were nevertheless rectified by the farmers themselves, facilitated by government officials. The study recommends that the identified challenges be addressed so that farmers can realize the importance of the PIM approach, as has been realized in other Asian countries.

Keywords: potentials of implementing participatory approach, challenges of participatory approach, irrigation management, Tanzania

Procedia PDF Downloads 296
21336 Synthesis of Low-Cost Porous Silicon Carbide Foams from Renewable Sources

Authors: M. A. Bayona, E. M. Cordoba, V. R. Guiza

Abstract:

Highly porous carbon-based foams are used in a wide range of industrial applications, which include absorption, catalyst supports, thermal insulation, and biomaterials, among others. In particular, silicon carbide (SiC) based foams have shown exceptional potential for catalyst support applications due to their chemical inertness, large frontal area, low resistance to flow, low pressure drop, and high resistance to temperature and corrosion. These properties allow the use of SiC foams in harsh environments with high durability. Commonly, SiC foams are fabricated from polysiloxane, SiC powders and phenolic resins, which can be costly or highly toxic to the environment. In this work, we propose a low-cost method for the fabrication of highly porous, three-dimensional SiC foams via the template replica technique, using recycled polymeric sponges as sacrificial templates. A sucrose-based resin combined with a Si-containing pre-ceramic polymer was used as the precursor. Polymeric templates were impregnated with the precursor solution, followed by thermal treatment at 1500 °C under an inert atmosphere. Several synthesis parameters, such as the viscosity and composition of the precursor solution (Si:sucrose molar ratio) and the porosity of the template, were evaluated in terms of their effect on the morphology, composition and mechanical resistance of the resulting SiC foams. The synthesized composite foams exhibited a highly porous (50-90%) and interconnected structure, containing 30-90% SiC, with a mechanical compressive strength of 0.01-0.1 MPa. The methodology employed here allowed the fabrication of foams with varied SiC concentrations and with morphological and mechanical properties relevant to industry, while using low-cost, renewable sources such as table sugar and providing a recycling alternative for polymeric sponges.

Keywords: catalyst support, polymer replica technique, reticulated porous ceramics, silicon carbide

Procedia PDF Downloads 116
21335 Judicial Analysis of the Burden of Proof on the Perpetrator of Corruption Criminal Act

Authors: Rahmayanti, Theresia Simatupang, Ronald H. Sianturi

Abstract:

Corruption has developed rapidly because, in the transition era, there are weaknesses in the law. Consequently, a few people have the opportunity to commit fraud and illegal acts and to misuse their positions and formal functions in order to enrich themselves, and these criminal acts are carried out systematically and with sophistication. Some believe that the legal provisions which specifically regulate the criminal act of corruption, namely Law No. 31/1999 in conjunction with Law No. 20/2001 on the Eradication of the Criminal Act of Corruption, are no longer effective, especially with respect to onus probandi (the burden of proof) on corruptors. The research is a descriptive analysis, a method used to describe a certain situation or condition by explaining the data, with conclusions drawn through analysis. It adopts a normative juridical approach, relying on secondary data as the main data through library research. The system of the burden of proof, which follows the principle of reversal of the burden of proof stipulated in Article 12B paragraph 1 (a) and (b), Article 37A, and Article 38B of Law No. 20/2001 on the Amendment of Law No. 31/1999, is used only as supporting evidence when the principal case is proved. Meanwhile, how to maximize the implementation of the burden of proof on the perpetrators of corruption when the public prosecutor brings a case to court depends upon the nature of the case and the type of indictment. The system of the burden of proof can be used to eradicate corruption in the courts if policies and general principles of justice, such as independence, impartiality, and legal certainty, are applied.

Keywords: burden of proof, perpetrator, corruption criminal act

Procedia PDF Downloads 314
21334 Bayesian Estimation of Hierarchical Models for Genotypic Differentiation of Arabidopsis thaliana

Authors: Gautier Viaud, Paul-Henry Cournède

Abstract:

Plant growth models have been used extensively for the prediction of the phenotypic performance of plants. However, they most often remain calibrated for a given genotype and therefore do not take genotype-by-environment interactions into account. One way of taking these interactions into account is to consider Bayesian hierarchical models. Three levels can be identified in such models: the first level describes how a given growth model represents the phenotype of the plant as a function of individual parameters; the second level describes how these individual parameters are distributed within a plant population; and the third level corresponds to the assignment of priors on the population parameters. Thanks to the Bayesian framework, choosing appropriate priors for the population parameters permits the derivation of analytical expressions for the full conditional distributions of these population parameters. As plant growth models are nonlinear, individual parameters cannot be sampled explicitly, and a Metropolis step must be performed, which allows the use of a hybrid Gibbs-Metropolis sampler. A generic approach was devised for the implementation of both general state space models and estimation algorithms within a programming platform. It was designed using the Julia language, which combines an elegant syntax and metaprogramming capabilities with high efficiency. Results were obtained for Arabidopsis thaliana on both simulated and real data. An organ-scale GreenLab model for this species is thus presented, in which the surface area of each individual leaf can be simulated. It is assumed that the error made on the measurement of leaf areas is proportional to the leaf area itself; a multiplicative normal noise model for the observations is therefore used. Real data were obtained via image analysis of zenithal images of Arabidopsis thaliana over a period of 21 days using a two-step segmentation and tracking algorithm which notably takes advantage of the Arabidopsis thaliana phyllotaxy. Since the model formulation is rather flexible, the data for a single individual need not be available at all times, nor do the observation times need to be the same for all individuals. This makes it possible to discard image-analysis data that are not considered reliable enough, thereby providing low-bias data in large quantities for leaf areas. The proposed model precisely reproduces the dynamics of Arabidopsis thaliana's growth while accounting for the variability between genotypes. In addition to the estimation of the population parameters, the level of variability is an interesting indicator of the genotypic stability of model parameters. A promising perspective is to test whether some of the latter should be considered as fixed effects.
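
To make the hierarchical structure and the hybrid sampler concrete, the sketch below shows a Metropolis-within-Gibbs loop in Python for a deliberately simplified setting: a toy logistic growth curve stands in for the organ-scale GreenLab model, the population standard deviations are held fixed, and all numerical values are invented for illustration. It is not the authors' Julia implementation.

```python
# Minimal Metropolis-within-Gibbs sketch for a toy hierarchical growth model:
# individual logistic curves, individual parameters drawn from a population
# distribution, and a conjugate update for the population mean.
import numpy as np

rng = np.random.default_rng(0)

def logistic_growth(t, theta):
    """Toy 'growth model': logistic curve with individual parameters theta = (K, r)."""
    K, r = theta
    return K / (1.0 + np.exp(-r * (t - 10.0)))

# Simulate a small data set: 5 individuals, noisy observations over 21 'days'.
t_obs = np.linspace(0, 21, 8)
true_mu = np.array([20.0, 0.5])               # population means of (K, r)
true_tau = np.array([2.0, 0.05])              # population s.d. of (K, r)
thetas_true = true_mu + true_tau * rng.standard_normal((5, 2))
data = [logistic_growth(t_obs, th) * (1 + 0.05 * rng.standard_normal(t_obs.size))
        for th in thetas_true]

def log_lik(theta, y):
    """Multiplicative (log-)normal observation noise, as described in the abstract."""
    pred = np.maximum(logistic_growth(t_obs, theta), 1e-9)
    return -0.5 * np.sum((np.log(y) - np.log(pred)) ** 2 / 0.05 ** 2)

n_iter, n_ind = 2000, len(data)
mu = np.array([15.0, 0.3])                    # current population mean
tau = np.array([2.0, 0.05])                   # population s.d. (kept fixed here)
thetas = np.tile(mu, (n_ind, 1))              # current individual parameters
mu_samples = np.zeros((n_iter, 2))

for it in range(n_iter):
    # Metropolis step for each individual's parameters (non-conjugate level).
    for i in range(n_ind):
        prop = thetas[i] + np.array([0.5, 0.02]) * rng.standard_normal(2)
        log_a = (log_lik(prop, data[i]) - log_lik(thetas[i], data[i])
                 - 0.5 * np.sum(((prop - mu) / tau) ** 2)
                 + 0.5 * np.sum(((thetas[i] - mu) / tau) ** 2))
        if np.log(rng.uniform()) < log_a:
            thetas[i] = prop
    # Gibbs step for the population mean (normal prior -> normal full conditional).
    prior_var = 100.0
    post_var = 1.0 / (n_ind / tau ** 2 + 1.0 / prior_var)
    post_mean = post_var * (thetas.sum(axis=0) / tau ** 2)
    mu = post_mean + np.sqrt(post_var) * rng.standard_normal(2)
    mu_samples[it] = mu

print("posterior mean of population parameters:", mu_samples[n_iter // 2:].mean(axis=0))
```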

Keywords: bayesian, genotypic differentiation, hierarchical models, plant growth models

Procedia PDF Downloads 294
21333 Poultry in Motion: Text Mining Social Media Data for Avian Influenza Surveillance in the UK

Authors: Samuel Munaf, Kevin Swingler, Franz Brülisauer, Anthony O’Hare, George Gunn, Aaron Reeves

Abstract:

Background: Avian influenza, more commonly known as bird flu, is a viral zoonotic respiratory disease affecting various species of poultry, as well as pet and migratory birds. Researchers have argued that the accessibility of health information online, together with the low-cost data collection methods the internet provides, has revolutionized the way epidemiological and disease surveillance data are utilized. This paper examines the feasibility of using internet data sources, such as Twitter and livestock forums, for the early detection of avian flu outbreaks through text mining algorithms and social network analysis. Methods: Social media mining was conducted on Twitter for the period 01/01/2021 to 31/12/2021 via the Twitter API in Python. The results were filtered first by hashtags (#avianflu, #birdflu) and word occurrences (avian flu, bird flu, H5N1), and then refined further by location to include only results from within the UK. The text was analyzed in a time-series manner to determine keyword frequencies, and topic modeling was applied to uncover insights in the text prior to a confirmed outbreak. Further analysis examined mentions of clinical signs (e.g., swollen head, blue comb, dullness) within the time series prior to the avian flu outbreaks confirmed by the Animal and Plant Health Agency (APHA). Results: Increased Google search activity and avian flu-related tweets correlated in time with the confirmed cases. Topic modeling uncovered clusters of word occurrences relating to livestock biosecurity, disposal of dead birds, and prevention measures. Conclusions: Text mining social media data can be useful for analysing discussed topics for epidemiological surveillance purposes, especially given the lack of applied research in the veterinary domain. However, the small sample size of tweets in certain weekly periods, together with a large amount of textual noise in the data, makes it difficult to obtain statistically robust results.
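
A minimal sketch of the keyword filtering and weekly aggregation step is shown below in Python with pandas; the tweets, column names and keyword lists are illustrative stand-ins for data already collected via the Twitter API and filtered to the UK.

```python
import pandas as pd

KEYWORDS = ["avian flu", "bird flu", "h5n1", "#avianflu", "#birdflu"]
CLINICAL_SIGNS = ["swollen head", "blue comb", "dullness"]

# Stand-in for tweets already collected via the Twitter API and filtered to the UK.
tweets = pd.DataFrame({
    "created_at": pd.to_datetime(["2021-10-02", "2021-10-04", "2021-10-12",
                                  "2021-10-14", "2021-11-01"]),
    "text": ["Worried about #birdflu near the farm",
             "Hens showing a swollen head and dullness, could this be bird flu?",
             "Avian flu housing order announced",
             "Nothing to report today",
             "H5N1 confirmed in wild birds nearby"],
})

text = tweets["text"].str.lower()
tweets["is_flu"] = text.apply(lambda s: any(k in s for k in KEYWORDS))
tweets["has_sign"] = text.apply(lambda s: any(k in s for k in CLINICAL_SIGNS))

flu = tweets[tweets["is_flu"]].copy()
flu["week"] = flu["created_at"].dt.to_period("W")

# Weekly counts of avian-flu-related tweets and of tweets mentioning clinical
# signs, which can then be compared against dates of APHA-confirmed outbreaks.
weekly = flu.groupby("week")[["is_flu", "has_sign"]].sum()
print(weekly)
```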

Keywords: veterinary epidemiology, disease surveillance, infodemiology, infoveillance, avian influenza, social media

Procedia PDF Downloads 99
21332 Passing-On Cultural Heritage Knowledge: Entrepreneurial Approaches for a Higher Educational Sustainability

Authors: Ioana Simina Frincu

Abstract:

As institutional initiatives often fail to provide good practices in heritage management, to adapt to the changing environment in which they function, or to adapt to the audiences they address, private actions represent viable strategies for sustainable knowledge acquisition. Disseminating information to future generations is one of the key aspects of preserving cultural heritage and is feasible even in the absence of original artifacts. Combined with the (re)discovery of the natural landscape, open-air exploratory approaches (archeoparks), rather than an enclosed, monodisciplinary, rigid framework (traditional museums), are more likely to 'speak the language' of a larger number of people belonging to a variety of categories, ages, and professions. Interactive sites are efficient ways of stimulating heritage awareness and increasing the number of visitors to non-interactive/static cultural institutions that own original pieces of history, deliver specialized information, and make continuous efforts to preserve historical evidence (relics, manuscripts, etc.). It is high time entrepreneurs took over the role of promoting cultural heritage, be it in a more commercial yet more attractive form (business). Inclusive, participatory activities conceived by experts from different domains/fields (history, anthropology, tourism, sociology, business management, integrative sustainability, etc.) have a better chance of ensuring long-term cultural benefits for both adults and children, especially when and where the educational discourse fails. These unique hands-on leisure activities, which offer everyone the opportunity to recreate history themselves and to relive their ancestors' ways of living, surviving and exploring, should be regarded not as pseudo-scientific approaches but as important pre-steps to museum experiences. To support this theory, the focus is on two examples and their impact on young generations: one dynamic and outdoors (the Boario Terme Archeopark in Italy) and one experimental and indoors (the reconstruction of the Neolithic sanctuary of Parta, Romania, as part of a transdisciplinary academic course). The study concludes that the declining engagement of youth (students) in discovering and understanding history, archaeology, and heritage can be revived by entrepreneurial projects.

Keywords: archeopark, educational tourism, open air museum, Parta sanctuary, prehistory

Procedia PDF Downloads 127
21331 Earthquake Risk Assessment Using Out-of-Sequence Thrust Movement

Authors: Rajkumar Ghosh

Abstract:

Earthquakes are natural disasters that pose a significant risk to human life and infrastructure. Effective earthquake mitigation measures require a thorough understanding of the dynamics of seismic occurrences, including thrust movement. Traditionally, estimating thrust movement has relied on conventional techniques that may not capture the full complexity of these events; investigating alternative approaches, such as incorporating out-of-sequence thrust movement data, could therefore enhance earthquake mitigation strategies. This review provides an overview of the applications of out-of-sequence thrust movement in earthquake mitigation. By examining existing research and studies, the objective is to understand how precise estimation of thrust movement can contribute to improving structural design, analyzing infrastructure risk, and developing early warning systems. The study demonstrates how out-of-sequence thrust movement can be estimated from multiple data sources, including GPS measurements, satellite imagery, and seismic recordings; analyzing and synthesizing these diverse datasets gives researchers a more comprehensive understanding of thrust movement dynamics during seismic occurrences. The review identifies potential advantages of incorporating out-of-sequence data in earthquake mitigation techniques, including more efficient structural design, enhanced infrastructure risk analysis, and more accurate early warning systems. By considering out-of-sequence thrust movement estimates, researchers and policymakers can make informed decisions to mitigate the impact of earthquakes, and by broadening the scope of analysis beyond traditional techniques, they can improve the effectiveness of mitigation measures. The collected datasets are analyzed using appropriate statistical and computational techniques to estimate out-of-sequence thrust movement, and findings from multiple studies are integrated to provide a comprehensive assessment of the topic. The study concludes that incorporating out-of-sequence thrust movement data can significantly enhance earthquake mitigation measures. However, challenges exist, such as data quality difficulties, modelling uncertainties, and computational complications; further research and advances in methodology are recommended to address these obstacles and improve the accuracy of estimates. Overall, this review serves as a valuable resource for researchers, engineers, and policymakers involved in earthquake mitigation, as it encourages the development of innovative strategies based on a better understanding of thrust movement dynamics.

Keywords: earthquake, out-of-sequence thrust, disaster, human life

Procedia PDF Downloads 66
21330 Different Sampling Schemes for Semi-Parametric Frailty Model

Authors: Nursel Koyuncu, Nihal Ata Tutkun

Abstract:

The frailty model is a survival model that takes unobserved heterogeneity into account when exploring the relationship between the survival of an individual and several covariates. In recent years, proposed survival models have become more complex, and this causes convergence problems, especially in large data sets. Therefore, selecting a sample from these big data sets is very important for parameter estimation. In the sampling literature, some authors have defined new sampling schemes to estimate parameters correctly. With this aim, we examine the effect of sampling design on the semi-parametric frailty model. We conducted a simulation study in R to estimate the parameters of the semi-parametric frailty model for different sample sizes and censoring rates under classical simple random sampling and ranked set sampling schemes. In the simulation study, we used as the population a data set recording 17,260 male civil servants aged 40-64 years with complete 10-year follow-up. Time to death from coronary heart disease is treated as the survival time, and age and systolic blood pressure are used as covariates. We selected 1,000 samples from the population using the different sampling schemes and estimated the parameters. From the simulation study, we concluded that the ranked set sampling design performs better than simple random sampling in each scenario.
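
For readers unfamiliar with ranked set sampling, the sketch below contrasts it with simple random sampling in Python; the population, set size and number of cycles are illustrative and do not reproduce the study's frailty-model simulation.

```python
# Minimal sketch of balanced ranked set sampling versus simple random sampling.
import numpy as np

rng = np.random.default_rng(42)

def simple_random_sample(population, n):
    """Draw n units by simple random sampling without replacement."""
    return rng.choice(population, size=n, replace=False)

def ranked_set_sample(population, set_size, n_cycles):
    """Balanced ranked set sampling: in each cycle, draw `set_size` sets of
    `set_size` units, rank each set, and keep the i-th order statistic from
    the i-th set (here ranking is done on the observed values themselves)."""
    sample = []
    for _ in range(n_cycles):
        for i in range(set_size):
            candidate_set = rng.choice(population, size=set_size, replace=False)
            sample.append(np.sort(candidate_set)[i])
    return np.array(sample)

# Toy 'population' of survival times; in the study it was 17,260 civil servants.
population = rng.exponential(scale=10.0, size=17260)

srs = simple_random_sample(population, n=12)
rss = ranked_set_sample(population, set_size=3, n_cycles=4)   # also 12 units
print("SRS mean:", srs.mean(), "RSS mean:", rss.mean(), "true mean:", population.mean())
```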

Keywords: frailty model, ranked set sampling, efficiency, simple random sampling

Procedia PDF Downloads 202
21329 Space Telemetry Anomaly Detection Based On Statistical PCA Algorithm

Authors: Bassem Nassar, Wessam Hussein, Medhat Mokhtar

Abstract:

The crucial concern of satellite operations is to ensure the health and safety of satellites. The worst case from this perspective is probably the loss of a mission, but the more common interruptions of satellite functionality can also compromise mission objectives. All the data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e., a status or a measurement) to be checked. As a consequence, TM monitoring systems are continuously being improved in order to reduce the time required to respond to changes in a satellite's state of health. A fast assessment of the current state of the satellite is thus very important in order to respond to failures as they occur. Statistical multivariate latent-variable techniques are among the vital learning tools used to tackle this problem coherently. Extracting information from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To address this problem, this paper presents a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm successfully differentiates between the two operating conditions. Furthermore, the algorithm provides useful predictive information as well as additional insight and physical interpretation of the ADCS operation.
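
A minimal sketch of PCA-based monitoring of multivariate telemetry is given below in Python; the synthetic data, the number of retained components and the percentile control limit are illustrative assumptions rather than the study's actual ADCS data or thresholds.

```python
# PCA-based anomaly detection on telemetry of shape (n_samples, n_parameters).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Stand-ins for normal-state training telemetry and new telemetry to monitor.
X_train = rng.normal(size=(500, 12))
X_new = np.vstack([rng.normal(size=(50, 12)),
                   rng.normal(loc=4.0, size=(5, 12))])   # injected faulty frames

scaler = StandardScaler().fit(X_train)
pca = PCA(n_components=0.95).fit(scaler.transform(X_train))  # keep 95% variance

def spe(X):
    """Squared prediction error (Q statistic): residual after projecting onto
    the retained principal components."""
    Z = scaler.transform(X)
    recon = pca.inverse_transform(pca.transform(Z))
    return np.sum((Z - recon) ** 2, axis=1)

# Control limit from the normal-state residuals (simple percentile rule here).
limit = np.percentile(spe(X_train), 99)
alarms = spe(X_new) > limit
print(f"{alarms.sum()} of {len(X_new)} new telemetry frames flagged as anomalous")
```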

Keywords: space telemetry monitoring, multivariate analysis, PCA algorithm, space operations

Procedia PDF Downloads 408
21328 Testing the Life Cycle Theory on the Capital Structure Dynamics of Trade-Off and Pecking Order Theories: A Case of Retail, Industrial and Mining Sectors

Authors: Freddy Munzhelele

Abstract:

Setting: empirical research has shown that the life cycle theory has an impact on firms' financing decisions, particularly dividend pay-outs. Accordingly, the life cycle theory posits that as a firm matures, it reaches a level and capacity at which it distributes more cash as dividends. Young firms, on the other hand, prioritise investment opportunity sets and their financing and thus pay little or no dividends. Research on firms' financing decisions has also demonstrated, among other things, the adoption of the trade-off and pecking order theories in the dynamics of firms' capital structure. The trade-off theory concerns firms holding a favourable position regarding debt structures, particularly the costs and benefits thereof, while the pecking order theory concerns firms preferring a hierarchical order when choosing financing sources. The life cycle hypothesis explaining financial managers' decisions regarding firms' capital structure dynamics appears to be an interesting link, yet this link has been neglected in corporate finance research. If this link is explored empirically, financial decision-making alternatives will be enhanced immensely, since no conclusive evidence has yet been found on the dynamics of capital structure. Aim: the aim of this study is to examine the impact of the life cycle theory on the trade-off and pecking order dynamics of the capital structure of firms listed in the retail, industrial and mining sectors of the JSE. These sectors are among the key contributors to GDP in the South African economy. Design and methodology: following the postpositivist research paradigm, the study is quantitative in nature and utilises secondary data obtainable from the financial statements of the sampled firms for the period 2010-2022. The firms' financial statements will be extracted from the IRESS database. Since the data will be in panel form, a combination of static and dynamic panel data estimators will be used to analyse the data. The overall data analysis will be done using the STATA program. Value added: this study directly investigates the link between the life cycle theory and the dynamics of capital structure decisions, particularly the trade-off and pecking order theories.

Keywords: life cycle theory, trade-off theory, pecking order theory, capital structure, JSE listed firms

Procedia PDF Downloads 51
21327 Neural Synchronization - The Brain’s Transfer of Sensory Data

Authors: David Edgar

Abstract:

To understand how the brain's subconscious and conscious functions work, we must conquer the physics of Unity, which leads to duality's algorithm, where the subconscious (bottom-up) and conscious (top-down) processes function together to produce and consume intelligence. We use terms like 'time is relative,' but we really do understand the meaning. In the brain, there are different processes and, therefore, different observers, and these different processes experience time at different rates. A sensory system such as the eyes cycles its measurements around every 33 milliseconds, the conscious process of the frontal lobe cycles at 300 milliseconds, and the subconscious process of the thalamus cycles at 5 milliseconds. Three different observers experience time differently. To bridge observers, the thalamus, the fastest of the processes, maintains a synchronous state and entangles the different components of the brain's physical process. The entanglements form a synchronous cohesion between the brain components, allowing them to share the same state and execute in the same measurement cycle. The thalamus uses the shared state to control the firing sequence of the brain's linear subconscious process. Sharing state also allows the brain to cheat on the amount of sensory data that must be exchanged between components: only unpredictable motion is transferred through the synchronous state, because predictable motion already exists in the shared framework. The brain's synchronous subconscious process is entirely based on energy conservation, where prediction regulates energy usage. So, every 33 milliseconds, the eyes dump their sensory data into the thalamus. The thalamus then performs a motion measurement to identify the unpredictable motion in the sensory data. Here is the trick: the thalamus conducts its measurement based on the original observation time of the sensory system (33 ms), not its own process time (5 ms). This creates a data payload of synchronous motion that preserves the original sensory observation, basically a frozen moment in time (flat 4D). The single moment in time can then be processed through the single state maintained by the synchronous process. Other processes, such as consciousness (300 ms), can interface with the synchronous state to generate awareness of that moment. Synchronous data traveling through a separate, faster synchronous process creates a theoretical time tunnel, where observation time is tunneled through the synchronous process and is reproduced on the other side in the original time-relativity. The synchronous process eliminates time dilation by simply removing itself from the equation so that its own process time does not alter the experience. To the original observer, the measurement appears to be instantaneous, but in the thalamus, a linear subconscious process generating sensory perception and thought production is being executed. It all occurs in the time available because the other observation times are slower than the thalamic measurement time. For life to exist in the physical universe requires a linear measurement process; it just hides by operating at a faster time relativity. What is interesting is that time dilation is not the problem; it is the solution. Einstein said there was no universal time.

Keywords: neural synchronization, natural intelligence, 99.95% IoT data transmission savings, artificial subconscious intelligence (ASI)

Procedia PDF Downloads 118
21330 The Impact of Electronic Commerce on Organisational Effectiveness: A Study of Zenith Bank Plc

Authors: Olusola Abiodun Arinde

Abstract:

This research work was prompted by the very important role e-commerce plays in every organization, be it private or public. The underlying objective of this study is to critically appraise the extent to which e-commerce impacts organizational effectiveness. The research was carried out using Zenith Bank Plc as a case study. Relevant data were collected through structured questionnaires, oral interviews, journals, newspapers, and textbooks. The data collected were analyzed and hypotheses were tested. Based on the results of the hypothesis tests, it was observed that e-commerce is significant to every organization. Through e-commerce, fast service delivery can be guaranteed to customers, which leads to higher productivity and profit for organizations. E-commerce should be managed in such a way that it does not alienate customers; it should also be managed so as to mitigate the significant risks associated with it.

Keywords: e-commerce, fast service, productivity, profit

Procedia PDF Downloads 230
21325 Actionable Personalised Learning Strategies to Improve a Growth-Mindset in an Educational Setting Using Artificial Intelligence

Authors: Garry Gorman, Nigel McKelvey, James Connolly

Abstract:

This study will evaluate a growth mindset intervention with Junior Cycle Coding and Senior Cycle Computer Science students in Ireland, where gamification will be used to incentivise growth mindset behaviour. An artificial intelligence (AI) driven personalised learning system will be developed to present computer programming learning tasks in a manner that is best suited to the individuals’ own learning preferences while incentivising and rewarding growth mindset behaviour of persistence, mastery response to challenge, and challenge seeking. This research endeavours to measure mindset with before and after surveys (conducted nationally) and by recording growth mindset behaviour whilst playing a digital game. This study will harness the capabilities of AI and aims to determine how a personalised learning (PL) experience can impact the mindset of a broad range of students. The focus of this study will be to determine how personalising the learning experience influences female and disadvantaged students' sense of belonging in the computer science classroom when tasks are presented in a manner that is best suited to the individual. Whole Brain Learning will underpin this research and will be used as a framework to guide the research in identifying key areas such as thinking and learning styles, cognitive potential, motivators and fears, and emotional intelligence. This research will be conducted in multiple school types over one academic year. Digital games will be played multiple times over this period, and the data gathered will be used to inform the AI algorithm. The three data sets are described as follows: (i) Before and after survey data to determine the grit scores and mindsets of the participants, (ii) The Growth Mind-Set data from the game, which will measure multiple growth mindset behaviours, such as persistence, response to challenge and use of strategy, (iii) The AI data to guide PL. This study will highlight the effectiveness of an AI-driven personalised learning experience. The data will position AI within the Irish educational landscape, with a specific focus on the teaching of CS. These findings will benefit coding and computer science teachers by providing a clear pedagogy for the effective delivery of personalised learning strategies for computer science education. This pedagogy will help prevent students from developing a fixed mindset while helping pupils to exhibit persistence of effort, use of strategy, and a mastery response to challenges.

Keywords: computer science education, artificial intelligence, growth mindset, pedagogy

Procedia PDF Downloads 81
21324 Theoretical Study of the Structural and Elastic Properties of Semiconducting Rare Earth Chalcogenide Sm1-XEuXS under Pressure

Authors: R. Dubey, M. Sarwan, S. Singh

Abstract:

We have investigated the phase transition pressure and the associated volume collapse in the Sm1-xEuxS alloy (0 ≤ x ≤ 1), which shows a transition from discontinuous to continuous as x is reduced. The calculated results from the present approach are in good agreement with the experimental data available for the end-point members (x = 0 and x = 1). The results for the alloy counterparts are also in fair agreement with experimental data generated from Vegard's law. An improved interaction potential model has been developed which includes the Coulomb interaction, the three-body interaction, the polarizability effect, and the overlap repulsive interaction operative up to second-neighbor ions. It is found that the inclusion of the polarizability effect has improved our results.
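
For orientation, a commonly used general form of such a three-body interaction potential is sketched below in LaTeX; the exact functional form, the treatment of the polarizability term, and the parameter values used in this study may differ.

```latex
% Illustrative lattice energy: long-range Coulomb term, three-body correction
% through the force parameter f(r), and Hafemeister-Flygare type overlap
% repulsion extended up to second neighbours (polarizability / van der Waals
% contributions would enter as additional attractive terms).
\[
  U(r) = -\frac{\alpha_M Z^2 e^2}{r}
         - \frac{2\,\alpha_M Z e^2\, n\, f(r)}{r}
         + n\, b\,\beta_{ij}\exp\!\left(\frac{r_i + r_j - r}{\rho}\right)
         + \frac{n'}{2}\, b \left[\beta_{ii}\exp\!\left(\frac{2r_i - k r}{\rho}\right)
         + \beta_{jj}\exp\!\left(\frac{2r_j - k r}{\rho}\right)\right]
\]
% alpha_M: Madelung constant; Z: ionic charge; f(r): three-body force parameter;
% b, rho: hardness and range parameters; beta_ij: Pauling coefficients;
% r_i, r_j: ionic radii; n, n': numbers of first and second neighbours;
% k: geometric factor relating second-neighbour separations to r.
```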

Keywords: elastic constants, high pressure, phase transition, rare earth compound

Procedia PDF Downloads 411
21323 Revolutionizing Healthcare Facility Maintenance: A Groundbreaking AI, BIM, and IoT Integration Framework

Authors: Mina Sadat Orooje, Mohammad Mehdi Latifi, Behnam Fereydooni Eftekhari

Abstract:

The integration of cutting-edge Internet of Things (IoT) technologies with advanced Artificial Intelligence (AI) systems is revolutionizing healthcare facility management. However, the current landscape of hospital building maintenance suffers from slow, repetitive, and disjointed processes, leading to significant financial, resource, and time losses. Additionally, the potential of Building Information Modeling (BIM) in facility maintenance is hindered by a lack of data within digital models of built environments, necessitating a more streamlined data collection process. This paper presents a robust framework that harmonizes AI with BIM-IoT technology to elevate healthcare Facility Maintenance Management (FMM) and address these pressing challenges. The methodology begins with a thorough literature review and requirements analysis, providing insights into existing technological landscapes and associated obstacles. Extensive data collection and analysis efforts follow to deepen understanding of hospital infrastructure and maintenance records. Critical AI algorithms are identified to address predictive maintenance, anomaly detection, and optimization needs alongside integration strategies for BIM and IoT technologies, enabling real-time data collection and analysis. The framework outlines protocols for data processing, analysis, and decision-making. A prototype implementation is executed to showcase the framework's functionality, followed by a rigorous validation process to evaluate its efficacy and gather user feedback. Refinement and optimization steps are then undertaken based on evaluation outcomes. Emphasis is placed on the scalability of the framework in real-world scenarios and its potential applications across diverse healthcare facility contexts. Finally, the findings are meticulously documented and shared within the healthcare and facility management communities. This framework aims to significantly boost maintenance efficiency, cut costs, provide decision support, enable real-time monitoring, offer data-driven insights, and ultimately enhance patient safety and satisfaction. By tackling current challenges in healthcare facility maintenance management it paves the way for the adoption of smarter and more efficient maintenance practices in healthcare facilities.

Keywords: artificial intelligence, building information modeling, healthcare facility maintenance, internet of things integration, maintenance efficiency

Procedia PDF Downloads 44
21322 Predicting Personality and Psychological Distress Using Natural Language Processing

Authors: Jihee Jang, Seowon Yoon, Gaeun Son, Minjung Kang, Joon Yeon Choeh, Kee-Hong Choi

Abstract:

Background: Self-report multiple-choice questionnaires have been widely utilized to quantitatively measure personality and psychological constructs. Despite several strengths (e.g., brevity and utility), self-report multiple-choice questionnaires have considerable inherent limitations. With the rise of machine learning (ML) and natural language processing (NLP), researchers in the field of psychology are widely adopting NLP to assess psychological constructs and predict human behaviors. However, there is a lack of connection between the work being performed in computer science and that in psychology, due to small data sets and unvalidated modeling practices. Aims: The current article introduces the study method and procedure of phase II, which includes the interview questions for the five-factor model (FFM) of personality developed in phase I. This study aims to develop the semi-structured interview and open-ended questions for FFM-based personality assessments, specifically designed with experts in the field of clinical and personality psychology (phase 1), and to collect personality-related text data using the interview questions together with self-report measures of personality and psychological distress (phase 2). The purpose of the study includes examining the relationship between the natural language data obtained from the interview questions, the FFM personality constructs, and psychological distress, in order to demonstrate the validity of natural language-based personality prediction. Methods: The phase I (pilot) study was conducted on fifty-nine native Korean adults to acquire personality-related text data from the semi-structured interview and open-ended questions based on the FFM of personality. The interview questions were revised and finalized with feedback from an external expert committee consisting of personality and clinical psychologists. Based on the established interview questions, a total of 425 Korean adults were recruited using a convenience sampling method via an online survey. The text data collected from interviews were analyzed using natural language processing. The results of the online survey, including demographic data and depression, anxiety, and personality inventories, were analyzed together in the model to predict individuals' FFM personality traits and level of psychological distress (phase 2).
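
As a rough illustration of predicting questionnaire scores from interview text, the Python sketch below fits a TF-IDF plus ridge regression baseline on invented English snippets; the study's actual Korean-language pipeline, features and models are not specified here.

```python
# Illustrative baseline: text features predicting a self-report trait score.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Stand-ins: interview transcripts and one FFM trait score per participant.
texts = ["i enjoy meeting new people and trying new things",
         "i prefer quiet evenings and a predictable routine",
         "deadlines stress me out and i worry about small details",
         "i stay calm under pressure and rarely feel anxious"] * 25
scores = np.tile([4.5, 2.0, 3.8, 1.5], 25)   # e.g. hypothetical neuroticism scores

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), Ridge(alpha=1.0))
r2 = cross_val_score(model, texts, scores, cv=5, scoring="r2")
print("cross-validated R^2:", r2.mean())
```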

Keywords: personality prediction, psychological distress prediction, natural language processing, machine learning, the five-factor model of personality

Procedia PDF Downloads 74
21321 Suitable Site Selection of Small Dams Using Geo-Spatial Technique: A Case Study of Dadu Tehsil, Sindh

Authors: Zahid Khalil, Saad Ul Haque, Asif Khan

Abstract:

Decision making about identifying suitable sites for any project by considering different parameters is difficult; GIS combined with Multi-Criteria Analysis (MCA) can make it easier for such projects. This technology has proved to be efficient and adequate in acquiring the desired information. In this study, GIS and MCA were employed to identify suitable sites for small dams in Dadu Tehsil, Sindh. GIS software was used to create all the spatial parameters for the analysis. The parameters derived are slope, drainage density, rainfall, land use/land cover, soil groups, Curve Number (CN) and runoff index, with a spatial resolution of 30 m. The data used for deriving the above layers include the 30-meter resolution SRTM DEM, Landsat 8 imagery, rainfall from the National Centers for Environmental Prediction (NCEP), and soil data from World Harmonized Soil Data (WHSD). The land use/land cover map is derived from Landsat 8 using supervised classification. Slope, drainage network and watershed are delineated by terrain processing of the DEM. The Soil Conservation Service (SCS) method is implemented to estimate the surface runoff from the rainfall; prior to this, the SCS-CN grid is developed by integrating the soil and land use/land cover rasters. These layers, together with some technical and ecological constraints, are assigned weights on the basis of suitability criteria. The pairwise comparison method, also known as the Analytic Hierarchy Process (AHP), is used as the MCA method for assigning weights to each decision element. All the parameters and groups of parameters are integrated using weighted overlay in the GIS environment to produce suitable sites for the dams. The resultant layer is then classified into four classes, namely best suitable, suitable, moderate and less suitable. This study contributes to decision-making about suitable-site analysis for small dams using geospatial data with a minimal amount of ground data. These suitability maps can be helpful for water resource management organizations in determining feasible rainwater harvesting (RWH) structures.
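
A minimal Python sketch of the AHP weighting step is given below; the 4x4 pairwise comparison matrix and the criteria it compares are illustrative assumptions, not the judgments used in the study.

```python
# AHP criterion weights from a pairwise comparison matrix via the principal
# eigenvector, with the consistency ratio check.
import numpy as np

# Pairwise comparisons for four hypothetical criteria, e.g.
# slope, drainage density, runoff (CN), land use/land cover.
A = np.array([
    [1.0, 3.0, 2.0, 5.0],
    [1/3, 1.0, 1/2, 2.0],
    [1/2, 2.0, 1.0, 3.0],
    [1/5, 1/2, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                         # normalised criterion weights

n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)                  # consistency index
RI = 0.90                                        # Saaty's random index for n = 4
CR = CI / RI                                     # consistency ratio (< 0.1 acceptable)

print("weights:", np.round(weights, 3), "CR:", round(CR, 3))
# These weights would then drive a weighted overlay of the reclassified
# raster layers (slope, drainage density, CN, land use, etc.) in GIS.
```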

Keywords: remote sensing, GIS, AHP, RWH

Procedia PDF Downloads 374
21320 Travel Behavior Simulation of Bike-Sharing System Users in Kaoshiung City

Authors: Hong-Yi Lin, Feng-Tyan Lin

Abstract:

In a bike-sharing system (BSS), users can easily rent bikes from any station in the city for mid-range or short-range trips. A BSS can also be integrated with other types of transport, especially green transportation systems such as rail transport and buses. Since a BSS records the time and place of each pickup and return, its operational data can reflect a more authentic and dynamic picture of user behavior. Furthermore, land uses around docking stations are highly associated with the origins and destinations of BSS users. As urban researchers, what concerns us more is to take the BSS into consideration during the urban planning process and enhance the quality of urban life. This research focuses on the simulation of the travel behavior of BSS users in Kaohsiung. First, rules of user behavior were derived by analyzing operational data and land use patterns near docking stations. Then, integrated with the Monte Carlo method, these rules were embedded into a travel behavior simulation model implemented in NetLogo, an agent-based modeling tool. The simulation model allows us to foresee the rent-return behavior of the BSS in order to choose potential locations for docking stations. It can also provide insights and recommendations about planning and policies for the future BSS.
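
The Python sketch below illustrates the general shape of such a Monte Carlo rent-return simulation over docking stations; the station counts, demand rates, destination probabilities and the "ride back if the dock is full" rule are invented for illustration and are not the behaviour rules derived from the Kaohsiung data.

```python
# Toy Monte Carlo rent-return simulation over a handful of docking stations.
import numpy as np

rng = np.random.default_rng(7)

n_stations = 5
capacity = np.full(n_stations, 20)          # docks per station
bikes = np.full(n_stations, 10)             # bikes currently docked
rent_rate = np.array([3, 5, 2, 4, 1])       # mean rentals per hour (Poisson), per station
dest_prob = np.full((n_stations, n_stations), 1 / n_stations)  # where rented bikes go

unmet_rentals = 0
for hour in range(24):
    for s in range(n_stations):
        demand = rng.poisson(rent_rate[s])              # Monte Carlo rental demand
        served = min(demand, bikes[s])
        unmet_rentals += demand - served
        bikes[s] -= served
        # Each served trip returns the bike to a destination station with free docks.
        for dest in rng.choice(n_stations, size=served, p=dest_prob[s]):
            if bikes[dest] < capacity[dest]:
                bikes[dest] += 1
            else:
                bikes[s] += 1                            # no free dock: ride back (toy rule)

print("bikes per station after 24 h:", bikes, "| unmet rentals:", unmet_rentals)
```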

Keywords: agent-based model, bike-sharing system, BSS operational data, simulation

Procedia PDF Downloads 320
21319 Identification of Rainfall Trends in Qatar

Authors: Abdullah Al Mamoon, Ataur Rahman

Abstract:

Due to climate change, future rainfall will change at many locations on earth; however, the spatial and temporal patterns of this change are not easy to predict. One approach to predicting such future changes is to examine the trends in historical rainfall data for a given region and use the identified trends to make future predictions. For this, a statistical trend test is commonly applied to the historical data. This paper examines the trends of daily extreme rainfall events from 30 rain gauges located in the State of Qatar. Rainfall data covering 1962 to 2011 were used in the analysis. A combination of four non-parametric and parametric tests was applied to identify trends at the 10%, 5%, and 1% significance levels: the Mann-Kendall (MK), Spearman's Rho (SR), Linear Regression (LR) and CUSUM tests. These tests showed both positive and negative trends throughout the country. Only eight stations showed positive (upward) trends, which were, however, not statistically significant. In contrast, significant negative (downward) trends were found at the 5% and 10% significance levels at six stations. The MK, SR and LR tests exhibited very similar results. This finding has important implications for the derivation/upgrade of design rainfall for Qatar, which will affect the design and operation of future urban drainage infrastructure in the country.
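
A minimal Python sketch of the Mann-Kendall test (without tie correction) applied to an annual series is shown below; the synthetic rainfall series is illustrative only.

```python
# Mann-Kendall trend test on a synthetic annual-maximum daily rainfall series.
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Return the MK S statistic, the Z score and the two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0          # variance ignoring ties
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

rng = np.random.default_rng(3)
series = 40 + 0.2 * np.arange(50) + rng.normal(scale=8, size=50)  # 50 'years'
s, z, p = mann_kendall(series)
print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.3f}  (significant at 5% if p < 0.05)")
```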

Keywords: trends, extreme rainfall, daily rainfall, Mann-Kendall test, climate change, Qatar

Procedia PDF Downloads 551
21318 The Operating Results of the English General Music Course on the Education Platform

Authors: Shan-Ken Chine

Abstract:

This research examines a one-year run of String Music Appreciation, an international online course launched on a British open education platform. It explains how the music teaching videos are presented with three main features: music lesson explanations, instrumental playing demonstrations, and live music performances. The course is organized into four major themes and a total of 97 steps. In addition, the paper uses the testing data provided by the education platform to analyze learners' performance and to understand the operation of the course; the statistics dashboard contains three sets of test data, namely course-run measures, total statistics, and statistics by week. The paper ends with a review of the course's star rating over this one-year run. The results of this run will inform adjustments when the course starts again in the future.

Keywords: music online courses, MOOCs, ubiquitous learning, string music, general music education

Procedia PDF Downloads 30
21317 Impact of Graduates’ Quality of Education and Research on ICT Adoption at Workplace

Authors: Mohammed Kafaji

Abstract:

This paper investigates the influence of the quality of education and the quality of research provided by local educational institutions on the adoption of Information and Communication Technology (ICT) in managing business operations for companies in the Saudi market. A model was developed and tested using data collected from 138 CEOs of foreign companies in diverse business sectors. The data were analysed using multivariate approaches in standard statistical packages. The results showed that educational quality contributes little to ICT adoption, while research quality seems to play a more prominent role. These results are analysed in terms of the business environment and market constraints and further extended to the perceived effectiveness of the pedagogical approaches applied in schools and universities.

Keywords: quality of education, quality of research, mediation, domestic competition, ICT adoption

Procedia PDF Downloads 444
21316 Evaluation of the Role of Advocacy and the Quality of Care in Reducing Health Inequalities for People with Autism, Intellectual and Developmental Disabilities at Sheffield Teaching Hospitals

Authors: Jonathan Sahu, Jill Aylott

Abstract:

Individuals with Autism, Intellectual and Developmental Disabilities (AIDD) are one of the most vulnerable groups in society, hampered not only by their own limitations in understanding and interacting with the wider society, but also by societal limitations in perception and understanding. Communication to express their needs and wishes is fundamental to enabling such individuals to live and prosper in society. This research project was designed as an organisational case study, in a large secondary health care hospital within the National Health Service (NHS), to assess the quality of care provided to people with AIDD and to review the role of advocacy in reducing health inequalities in these individuals. Methods: The research methodology adopted was that of an 'insider researcher'. Data collection included both quantitative and qualitative data, i.e., a mixed-methods approach. A semi-structured interview schedule was designed and used to obtain qualitative and quantitative primary data from a wide range of interdisciplinary frontline health care workers, to assess their understanding and awareness of systems, processes and evidence-based practice in offering a quality service to people with AIDD. Secondary data were obtained from sources within the organisation, in keeping with 'case study' as the primary method, and organisational performance data were then compared against national benchmarking standards. Further data sources were accessed to help evaluate the effectiveness of the different types of advocacy present in the organisation. This was gauged by measures of user and carer experience in the form of retrospective survey analysis, incidents and complaints. Results: Secondary data demonstrate near compliance of the organisation with the current national benchmarking standard (Monitor Compliance Framework). However, primary data demonstrate poor knowledge of the Mental Capacity Act 2005 and poor knowledge of the organisational systems, processes and evidence-based practice applied for people with AIDD. In addition, frontline health care workers had poor knowledge and awareness of advocacy and advocacy schemes for this group. Conclusions: A significant amount of work needs to be undertaken to improve the quality of care delivered to individuals with AIDD. An operational strategy promoting the widespread dissemination of information may not be the best approach to delivering quality care, an optimal patient experience and patient advocacy. In addition, a more robust set of standards, with appropriate metrics, needs to be developed to assess organisational performance in a way that will stand the test of professional and public scrutiny.

Keywords: advocacy, autism, health inequalities, intellectual developmental disabilities, quality of care

Procedia PDF Downloads 211
21315 Retrospective Analysis Demonstrates No Difference in Percutaneous Native Renal Biopsy Adequacy Between Nephrologists and Radiologists in University Hospital Crosshouse

Authors: Nicole Harley, Mahmoud Eid, Abdurahman Tarmal, Vishal Dey

Abstract:

Histological sampling plays an integral role in the diagnostic process of renal diseases. Percutaneous native renal biopsy is typically performed under ultrasound guidance, with this service usually being provided by nephrologists; in some centres there is also a role for radiologists in performing renal biopsies. Previous comparative studies have demonstrated non-inferiority between the outcomes of percutaneous native renal biopsies performed by nephrologists and by radiologists. We sought to compare biopsy adequacy between nephrologists and radiologists at University Hospital Crosshouse. The online system SERPR (Scottish Electronic Renal Patient Record) contains information pertaining to patients who have undergone renal biopsies. An online search was performed to acquire a list of all patients who underwent renal biopsy between 2013 and 2020 at University Hospital Crosshouse; 355 native renal biopsies were performed in total across this 7-year period. A retrospective analysis was performed on these cases, with records and reports being assessed for the total number of glomeruli obtained per biopsy, whether the number of glomeruli obtained was adequate for diagnosis as per an internationally agreed standard, and whether a histological diagnosis was achieved. Nephrologists performed 43.9% of native renal biopsies (n=156) and radiologists performed 56.1% (n=199). The mean number of glomeruli obtained by nephrologists was 17.16+/-10.31, and the mean number obtained by radiologists was 18.38+/-10.55; a t-test demonstrated no statistically significant difference between the specialties (p-value 0.277). Native renal biopsies are required to obtain at least 8 glomeruli to be diagnostic as per internationally agreed criteria. Nephrologists met this criterion in 88.5% of native renal biopsies (n=138) and radiologists in 89.5% (n=178); t-test and chi-squared analyses demonstrated no statistically significant difference between the specialties (p-values 0.663 and 0.922, respectively). Biopsies performed by nephrologists yielded tissue that was diagnostic in 91.0% (n=142) of sampling, and biopsies performed by radiologists in 92.4% (n=184); t-test and chi-squared analyses demonstrated no statistically significant difference between the specialties (p-values 0.625 and 0.889, respectively). This project demonstrates that at University Hospital Crosshouse there is no statistical difference between radiologists and nephrologists in terms of glomeruli acquisition or samples achieving a histological diagnosis. Given the non-inferiority between specialties demonstrated by previous studies and this project, this evidence could support the restructuring of services to allow more renal biopsies to be performed by renal services and the reallocation of radiology department resources.
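
For illustration, the Python sketch below reproduces the kind of two-proportion comparison described for biopsy adequacy, using the counts quoted in the abstract (138 of 156 versus 178 of 199 biopsies with at least 8 glomeruli); the exact statistical procedure used in the project may differ.

```python
# Chi-squared comparison of biopsy adequacy proportions between specialties.
import numpy as np
from scipy.stats import chi2_contingency

#                 adequate, inadequate
table = np.array([[138, 156 - 138],    # nephrologists
                  [178, 199 - 178]])   # radiologists

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-squared = {chi2:.3f}, dof = {dof}, p = {p:.3f}")
# A p-value well above 0.05, consistent with the abstract's finding of no
# statistically significant difference in biopsy adequacy between specialties.
```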

Keywords: biopsy, medical imaging, nephrology, radiology

Procedia PDF Downloads 72
21314 Chromatographic Preparation and Performance on Zinc Ion Imprinted Monolithic Column and Its Adsorption Property

Authors: X. Han, S. Duan, C. Liu, C. Zhou, W. Zhu, L. Kong

Abstract:

The ionic imprinting technique produces a three-dimensional rigid structure with fixed pore sizes, formed by the binding interactions of ions and functional monomers using ions as the template; it has a high level of recognition for the ionic template. To prepare a monolithic column by in-situ polymerization, the template, functional monomers, cross-linking agent and initiator are dissolved in solution and injected into the column tube; the mixture then polymerizes at a certain temperature, after which the unreacted template and solvent are washed out. Monolithic columns are easy to prepare, low in consumption and cost-effective, with fast mass transfer and many chemical functionalities. However, monolithic columns have some problems in practical application: low efficiency; inaccurate quantitative analysis, because the peaks are wide and show tailing; a limited choice of polymerization systems; and a lack of theoretical foundations. Thus the optimization of components and preparation methods is an important research direction. During the preparation of ion-imprinted monolithic columns, the pore-forming agent generates the porous structure of the polymer, which influences the physical properties of the polymer; moreover, it directly determines the stability and selectivity of the polymerization reaction. The compounds generated in the pre-polymerization reaction directly determine the recognition and screening capabilities of the imprinted polymer; thus the choice of pore-forming agent is critical in the preparation of imprinted monolithic columns. This article focuses on how different pore-forming agents affect the enrichment performance of the zinc ion imprinted monolithic column for zinc ions.

Keywords: high performance liquid chromatography (HPLC), ionic imprinting, monolithic column, pore-forming agent

Procedia PDF Downloads 207
21313 Exploring the Development of Communicative Skills in English Teaching Students: A Phenomenological Study During Online Instruction

Authors: Estephanie S. López Contreras, Vicente Aranda Palacios, Daniela Flores Silva, Felipe Oliveros Olivares, Romina Riquelme Escobedo, Iñaki Westerhout Usabiaga

Abstract:

This research explored whether the context of online instruction has influenced the development of first-year English-teaching students' communication skills, namely speaking and listening. The theoretical basis finds its niche in the need to bridge the gap in knowledge about the Chilean online educational context and the development of English communicative skills. An interpretative paradigm and a phenomenological design were implemented in this study. Twenty-two first-year students and two teachers from an English teaching training program participated in the study. The students' ages ranged from 18 to 26 years, and the teachers' years of experience in the program ranged from 5 to 13. For data collection purposes, semi-structured interviews were applied to both students and teachers, with interview questions based on the initial conceptualization of the central phenomenon. Observations, field notes, and focus groups with the students are also part of the data collection process. Data analysis considered two coding cycles: the first included descriptive coding for field notes, initial coding for interviews, and the creation of a codebook; the second cycle included axial coding for both field notes and interviews. After data analysis, the findings show that students perceived online classes as instances in which active communication cannot always occur. In addition, changes made to the curricula as a consequence of the COVID-19 pandemic have affected students' speaking and listening skills.

Keywords: attitudes, communicative skills, EFL teaching training program, online instruction, and perceptions

Procedia PDF Downloads 114
21312 Collaborative Online Learning for Lecturers

Authors: Lee Bih Ni, Emily Doreen Lee, Wee Hui Yean

Abstract:

This paper was prepared to examine lecturers' perceptions of online collaborative learning, in terms of how lecturers view collaborative learning in a higher learning institution. The purpose of this study was to determine lecturers' perceptions of collaborative learning, especially how lecturers see online collaborative learning at the university. Adult education enhances a collaborative learning culture with the aim of involving learners in the learning process, making teaching and learning more effective and open at the university. This ultimately leads to students learning in ways that assist each other, and it also reduces the loneliness and isolation that adult learners might feel. The ways in which lecturers collaborate online were also determined. In this paper, the researchers collected data using questionnaire instruments. The collected data were analyzed and interpreted, and the researchers report the results according to the evidence obtained from the respondents. The results show that shaping good collaborative learning practice depends not only on the lecturer but also on the students. The concepts and patterns for achieving these targets should be clear from the beginning, as may be seen in the number of proposals submitted and in how the higher learning institution has provided ongoing training for online lecturing. The advantages of online collaborative learning indicate that lecturers should be trained effectively. The study found that lecturers are aware of online collaborative learning, and this positive attitude will encourage the higher learning institution to continue providing the required knowledge and skills.

Keywords: collaborative online learning, lecturers’ training, learning, online

Procedia PDF Downloads 447
21311 Finite Element Modeling and Nonlinear Analysis for Seismic Assessment of Off-Diagonal Steel Braced RC Frame

Authors: Keyvan Ramin

Abstract:

The geometric nonlinearity of the Off-Diagonal Bracing System (ODBS) can act as a complementary mechanism that extends the material nonlinearity of reinforced concrete. In the initial phase, finite element models are built for a flexural frame, an x-braced frame, and an ODBS-braced frame, and the different models are then investigated through various analyses. The models are verified against experimental results for the flexural and x-braced frames, and analytical assessments are performed using three-dimensional finite element modeling. Nonlinear static (pushover) analysis is used to obtain the performance level and seismic behavior, and the response modification factors are then calculated from each model's pushover curve. In the next phase, the cracks observed in the finite element models, especially in the RC members of all three systems, are evaluated; for the ODBS-braced frame, the generated cracks are assessed at various time steps. Nonlinear dynamic time history analyses are carried out on models with different numbers of stories for three records: the El Centro, Naghan, and Tabas earthquake accelerograms. Dynamic analysis is performed after scaling each accelerogram, for the flexural frame, the x-braced frame, and the ODBS-braced frame in turn. A reference base point on the RC frame is used to track the relative displacement under each record. Hysteresis curves are also assessed in the course of the study, and the equivalent viscous damping of the ODBS system is estimated following the literature. The results of each part show that the ODBS system exhibits acceptable seismic behavior, and the conclusions converge when the ODBS system is used in a reinforced concrete frame.
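
As a rough illustration of the damping estimate mentioned above, the following is a minimal sketch that computes the equivalent viscous damping ratio of a single closed hysteresis loop from force-displacement samples, using the standard relation ξ_eq = E_D / (4π E_S0). The elliptical loop and its values are hypothetical stand-ins for the hysteresis curves obtained from the time history analyses, not data from the paper.

```python
import numpy as np

def equivalent_viscous_damping(displacement, force):
    """Equivalent viscous damping ratio of one closed hysteresis loop,
    xi_eq = E_D / (4 * pi * E_S0): E_D is the energy dissipated per cycle
    (the loop area) and E_S0 is the elastic strain energy stored at the
    peak displacement of that cycle."""
    d = np.asarray(displacement, dtype=float)
    f = np.asarray(force, dtype=float)
    # Loop area via the shoelace formula (points ordered around the cycle).
    e_d = 0.5 * abs(np.dot(d, np.roll(f, -1)) - np.dot(f, np.roll(d, -1)))
    # Strain energy at the peak-displacement point of the cycle.
    i_peak = int(np.argmax(np.abs(d)))
    e_s0 = 0.5 * abs(d[i_peak] * f[i_peak])
    return e_d / (4.0 * np.pi * e_s0)

# Hypothetical elliptical loop standing in for a measured hysteresis cycle.
theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
disp = 0.05 * np.cos(theta)                          # displacement, m
frc = 250.0 * np.cos(theta) + 40.0 * np.sin(theta)   # restoring force, kN
print(f"xi_eq ~ {equivalent_viscous_damping(disp, frc):.3f}")
```

For this synthetic loop the estimate comes out at roughly 8% of critical damping; in the study itself the ratio would be evaluated on the hysteresis loops produced by the scaled accelerogram analyses.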

Keywords: FEM, seismic behaviour, pushover analysis, geometric nonlinearity, time history analysis, equivalent viscous damping, passive control, crack investigation, hysteresis curve

Procedia PDF Downloads 374
21310 Multi-Level Clustering Based Congestion Control Protocol for Cyber Physical Systems

Authors: Manpreet Kaur, Amita Rani, Sanjay Kumar

Abstract:

The Internet of Things (IoT), a cyber-physical paradigm, allows a large number of devices to connect and send sensory data over the network simultaneously. The tremendous amount of data generated leads to a very high network load and, consequently, to network congestion, which in turn causes frequent loss of useful information and depletes a significant amount of the nodes' energy. There is therefore a need to control congestion in IoT so as to prolong network lifetime and improve quality of service (QoS). Hence, we propose a two-level clustering based routing algorithm that uses congestion score and packet priority metrics to minimize network congestion. In the proposed Priority based Congestion Control (PBCC) protocol, the sensor nodes in the IoT network form clusters, which reduces the amount of traffic, and packets are prioritized to emphasize important data, while a congestion score indicates the occurrence of congestion at a particular node. Under various simulation scenarios that vary buffer size, packet transmission range, network region, and number of nodes, the proposed protocol outperforms the existing Packet Discard Network Clustering (PDNC) protocol.
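
The abstract does not give the PBCC equations, so the following is only a minimal sketch of the idea: a cluster-member node queues packets by priority, tracks a congestion score (assumed here to be simple buffer occupancy), and throttles its transmission rate toward the cluster head once the score crosses a threshold. All class names, thresholds, and the scoring rule are illustrative assumptions, not the protocol as published.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Packet:
    priority: int                      # lower value = more important (min-heap)
    payload: str = field(compare=False)

class SensorNode:
    """Cluster-member node that tracks a congestion score and forwards
    high-priority packets first, throttling when congested."""

    def __init__(self, node_id: str, buffer_size: int, threshold: float = 0.8):
        self.node_id = node_id
        self.buffer_size = buffer_size
        self.threshold = threshold
        self.buffer: list[Packet] = []

    @property
    def congestion_score(self) -> float:
        # Assumed metric: fraction of the buffer currently occupied.
        return len(self.buffer) / self.buffer_size

    def enqueue(self, packet: Packet) -> bool:
        if len(self.buffer) >= self.buffer_size:
            return False               # buffer full: packet dropped
        heapq.heappush(self.buffer, packet)
        return True

    def forward(self, cluster_head: "SensorNode") -> None:
        budget = len(self.buffer)
        if self.congestion_score >= self.threshold:
            budget = max(1, budget // 2)   # reduce transmission rate when congested
        for _ in range(budget):
            # Send the most important packet; keep it if the head rejects it.
            if self.buffer and cluster_head.enqueue(self.buffer[0]):
                heapq.heappop(self.buffer)

node = SensorNode("n1", buffer_size=8)
head = SensorNode("ch1", buffer_size=32)
for i in range(8):
    node.enqueue(Packet(priority=i % 3, payload=f"reading-{i}"))
node.forward(head)
print(node.congestion_score, head.congestion_score)
```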

Keywords: internet of things, cyber-physical systems, congestion control, priority, transmission rate

Procedia PDF Downloads 298
21309 Correlation Matrix for Automatic Identification of Meal-Taking Activity

Authors: Ghazi Bouaziz, Abderrahim Derouiche, Damien Brulin, Hélène Pigot, Eric Campo

Abstract:

Automatic ADL classification is a crucial part of ambient assisted living technologies. It makes it possible to monitor the daily life of the elderly and to detect any changes in their behavior that could be related to a health problem. Detecting ADLs is nevertheless a challenge, especially because each person has his or her own rhythm for performing them. We therefore used a correlation matrix to extract custom rules that enable the detection of ADLs, including eating activity. Data collected from three different individuals over periods of 35 to 105 days allow the extraction of personalized eating patterns. Comparing the eating activities extracted from the correlation matrices with the declarative data collected during the survey shows an accuracy of 90%.
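
As an illustration of the correlation-matrix idea, the sketch below builds a correlation matrix over hypothetical ambient-sensor activation counts and derives a simple meal-taking rule from the sensors most correlated with kitchen activity. The sensor names, the 0.7 threshold, and the rule itself are assumptions for illustration, not the personalized rules extracted in the study.

```python
import pandas as pd

# Hypothetical ambient-sensor log: one row per 10-minute window, with
# activation counts per sensor. Sensor names are illustrative only.
data = pd.DataFrame({
    "kitchen_motion":     [5, 0, 7, 1, 6, 0, 8, 0],
    "fridge_door":        [2, 0, 3, 0, 2, 0, 3, 0],
    "cupboard_contact":   [1, 0, 2, 0, 1, 0, 2, 0],
    "living_room_motion": [0, 4, 1, 5, 0, 6, 1, 4],
    "power_kettle":       [1, 0, 1, 0, 1, 0, 2, 0],
})

# Correlation matrix between sensors: pairs that fire together across
# windows are candidates for a personalized meal-taking rule.
corr = data.corr()
print(corr.round(2))

# Keep the sensors most correlated with kitchen activity (threshold assumed).
meal_sensors = corr["kitchen_motion"][corr["kitchen_motion"] > 0.7].index.tolist()
print("Sensors used in the meal-taking rule:", meal_sensors)

# Flag a window as a meal-taking episode when those sensors are all active.
data["meal_detected"] = (data[meal_sensors] > 0).all(axis=1)
print(data[["meal_detected"]])
```

In the study, rules of this kind would then be checked against the declarative survey data to estimate detection accuracy.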

Keywords: elderly monitoring, ADL identification, matrix correlation, meal-taking activity

Procedia PDF Downloads 84
21308 Digital Twin for Retail Store Security

Authors: Rishi Agarwal

Abstract:

Digital twins are emerging as a powerful technology for imitating and monitoring physical objects digitally, in real time, across sectors. A digital twin does not operate in the digital space alone; it also actuates responses in the physical space based on digital-space processing such as storage, modeling, learning, simulation, and prediction. This paper explores the application of digital twins to enhancing physical security in retail stores. The retail sector still relies on outdated physical security practices such as manual monitoring and metal detectors, which are insufficient for modern needs. The lack of real-time data and system integration leads to ineffective emergency response and weak preventative measures. As retail automation increases, new digital frameworks must manage safety without human intervention. To address this, the paper proposes an intelligent digital twin framework that collects diverse data streams from in-store sensors, surveillance systems, external sources, and customer devices; advanced analytics and simulations then enable real-time monitoring, incident prediction, automated emergency procedures, and stakeholder coordination. Overall, the digital twin improves physical security through automation, adaptability, and comprehensive data sharing. The paper also analyzes the pros and cons of implementing this technology through an Emerging Technology Analysis Canvas, which examines the technology through both narrow and wide lenses to support decision makers considering its adoption. On a broader scale, this work showcases the value of digital twins in transforming legacy systems across sectors and shows how data sharing can create a safer environment for both retail store customers and owners.
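
To make the data-flow concrete, the sketch below shows a toy digital twin object that mirrors incoming sensor events as state, scores them with a placeholder risk model, and actuates a physical-space response when the risk crosses a threshold. The event types, risk weights, and actuator are invented for illustration and are not the framework proposed in the paper.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Callable

@dataclass
class SensorEvent:
    source: str          # e.g. "door_camera", "shelf_sensor" (illustrative names)
    kind: str            # e.g. "motion", "unattended_exit", "glass_break"
    timestamp: datetime
    detail: dict = field(default_factory=dict)

class StoreDigitalTwin:
    """Toy retail-store digital twin: mirrors sensor events, scores them
    with a placeholder risk model, and actuates a physical-space response
    when the score crosses a threshold."""

    RISK_WEIGHTS = {"motion": 0.1, "unattended_exit": 0.5, "glass_break": 0.9}

    def __init__(self, actuator: Callable[[str], None], threshold: float = 0.8):
        self.state: list[SensorEvent] = []
        self.actuator = actuator
        self.threshold = threshold

    def ingest(self, event: SensorEvent) -> None:
        self.state.append(event)                 # keep the digital mirror current
        risk = self.RISK_WEIGHTS.get(event.kind, 0.0)
        if risk >= self.threshold:
            # Actuate in the physical space: lock doors, alert staff, etc.
            self.actuator(f"ALERT from {event.source}: {event.kind} (risk={risk})")

twin = StoreDigitalTwin(actuator=print)
twin.ingest(SensorEvent("front_door_camera", "motion", datetime.now()))
twin.ingest(SensorEvent("window_3", "glass_break", datetime.now()))
```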

Keywords: digital twin, retail store safety, digital twin in retail, digital twin for physical safety

Procedia PDF Downloads 62