Search results for: food frequency and biomedical data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30070

25510 Algorithm Optimization to Sort in Parallel by Decreasing the Number of the Processors in SIMD (Single Instruction Multiple Data) Systems

Authors: Ali Hosseini

Abstract:

Parallelization is a mechanism for decreasing the time needed to execute programs. Sorting is one of the important operations used in different systems, in that the proper functioning of many algorithms and operations depends on sorted data. The CRCW_SORT algorithm sorts ‘N’ elements in O(1) time on SIMD (Single Instruction Multiple Data) computers with n^2/2-n/2 processors. This article presents a mechanism that divides the input sequence around a hinge (pivot) element into two smaller sequences, reducing the number of processors needed to sort ‘N’ elements in O(1) time to n^2/8-n/4 in the best case. Under this mechanism, the best case occurs when the hinge element is the middle (median) element, and the worst case when it is the minimum. An assessment of the proposed algorithm against other methods, on the same data collection and processor counts, indicates that the proposed algorithm uses fewer processors during execution than the other methods.
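The hinge-based split above can be illustrated with a short sequential sketch (an illustration of the partitioning idea only; the actual algorithm performs the comparisons in parallel on a CRCW SIMD machine, and the function name here is ours):

```python
def hinge_sort(a):
    # Split the input around a "hinge" (pivot) element into two smaller
    # sequences and sort each independently. In the parallel setting of
    # the abstract, a half of size n/2 needs only about
    # (n/2)^2/2 - (n/2)/2 = n^2/8 - n/4 comparison processors, which is
    # consistent with the best-case figure quoted above (hinge = median).
    if len(a) <= 1:
        return list(a)
    hinge, rest = a[0], a[1:]
    smaller = [x for x in rest if x < hinge]
    larger = [x for x in rest if x >= hinge]
    return hinge_sort(smaller) + [hinge] + hinge_sort(larger)
```

When the hinge is the minimum, one side receives nearly all n-1 elements, which corresponds to the worst case described in the abstract.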

Keywords: CRCW, SIMD (Single Instruction Multiple Data) computers, parallel computers, number of processors

Procedia PDF Downloads 303
25509 Increasing the System Availability of Data Centers by Using Virtualization Technologies

Authors: Chris Ewe, Naoum Jamous, Holger Schrödl

Abstract:

Like most entrepreneurs, data center operators pursue goals such as profit maximization, improvement of the company’s reputation, or simply remaining on the market. Part of those aims is to guarantee a given quality of service. Quality characteristics are specified in a contract called the service level agreement. A central part of this agreement is the non-functional properties of an IT service. System availability is one of the most important of these properties, as will be shown in this paper. To comply with availability requirements, data center operators can use virtualization technologies. A clear model to assess the effect of virtualization functions on the parts of a data center, in relation to system availability, is still missing. This paper aims to introduce a basic model that shows these connections and to consider whether the identified effects are positive or negative. Thus, this work also points out possible disadvantages of the technology. In consequence, the paper shows the opportunities as well as the risks of data center virtualization in relation to system availability.
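For context, the availability figure an SLA constrains is usually computed from failure and repair times; the sketch below shows the standard arithmetic (a generic illustration, not the model proposed in the paper): chained components multiply their availabilities, while redundant replicas, the effect virtualization features such as failover aim for, combine as the complement of joint failure.

```python
from math import prod

def availability(mtbf_h, mttr_h):
    # Steady-state availability from mean time between failures (MTBF)
    # and mean time to repair (MTTR), both in hours.
    return mtbf_h / (mtbf_h + mttr_h)

def in_series(avails):
    # Every component must be up (e.g. host -> hypervisor -> VM):
    # availabilities multiply, so the chain is weaker than any link.
    return prod(avails)

def redundant(avails):
    # Service is up if at least one replica is up (e.g. a VM that can
    # fail over to a second host): 1 minus the joint failure probability.
    return 1 - prod(1 - a for a in avails)
```

Two hosts at 99% each give a redundant availability of 99.99%, while stacking them in series drops to about 98%, which illustrates why the same virtualization function can raise or lower system availability depending on where it sits in the dependency chain.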

Keywords: availability, cloud computing IT service, quality of service, service level agreement, virtualization

Procedia PDF Downloads 528
25508 Using Crowd-Sourced Data to Assess Safety in Developing Countries: The Case Study of Eastern Cairo, Egypt

Authors: Mahmoud Ahmed Farrag, Ali Zain Elabdeen Heikal, Mohamed Shawky Ahmed, Ahmed Osama Amer

Abstract:

Crowd-sourced data refers to data that is collected and shared by a large number of individuals or organizations, often through digital technologies such as mobile devices and social media. The shortage of crash data collection in developing countries makes it difficult to fully understand and address road safety issues in these regions. In developing countries, crowd-sourced data can be a valuable tool for improving road safety, particularly in urban areas, where the majority of road crashes occur. This study is, to the best of our knowledge, the first to develop safety performance functions using crowd-sourced data, adopting a negative binomial structure model and the Full Bayes model to investigate traffic safety for urban road networks and provide insights into the impact of roadway characteristics. Furthermore, as part of the safety management process, network screening was carried out by applying two different methods to rank the most hazardous road segments: the PCR method (adopted in the Highway Capacity Manual, HCM) and a graphical method using GIS tools, for comparison and validation. Lastly, recommendations are suggested for policymakers to ensure safer roads.
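Safety performance functions of the kind mentioned above are commonly specified with a log-linear (negative binomial mean) form; a minimal sketch, in which the coefficients and segment data are hypothetical placeholders rather than values from the study:

```python
import math

def expected_crashes(aadt, length_km, b0=-8.0, b1=0.80, b2=1.0):
    # Typical SPF mean function: E[crashes/yr] = exp(b0) * AADT^b1 * L^b2.
    # b0, b1, b2 would normally be estimated by negative binomial
    # regression on observed crash counts; these defaults are illustrative.
    return math.exp(b0) * aadt ** b1 * length_km ** b2

# Network screening: rank segments by predicted crash frequency,
# highest (most hazardous) first. Segment data are invented examples
# of (AADT, length in km).
segments = {"S1": (12000, 1.2), "S2": (4000, 0.8), "S3": (25000, 2.0)}
ranking = sorted(segments, key=lambda s: expected_crashes(*segments[s]),
                 reverse=True)
```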

Keywords: crowdsourced data, road crashes, safety performance functions, Full Bayes models, network screening

Procedia PDF Downloads 29
25507 Rural Territorial Sustainable Development: Interinstitutional Dialogue and Transition to Sustainable Livelihoods

Authors: Aico Nogueira

Abstract:

This paper examines the interinstitutional dialogues within Brazilian federal structures, which comprise the federal, state, and local levels, around new approaches and interventions aimed at promoting sustainable rural development, particularly rural development as part of a territorial approach. The work seeks to understand to what extent the various levels of the state interact with these strategies, particularly with the locally constituted powers, focusing on the importance of the transition from traditional agriculture to more sustainable agroecological systems and its effects on food security and sustainable rural development. The research analyzes as case studies the Sustainable Rural Territories Development Program (PRONAT) of the Ministry of Agrarian Development at the federal level, as well as the State of São Paulo and the Vale do Ribeira Territory, an area characterized by environmental and social vulnerability, restrictive environmental laws, and attempts to promote sustainable development. To examine the interrelationships between different levels of governance and civil society, the research draws on the neo-institutionalist, polity-centered literature and uses an adaptation of the concept of arena found in Ostrom and Hannigan, applied at different scales of the decision-making process, as well as the multilevel governance literature. Document analysis, interviews, focus groups, and direct observation techniques are also used. The main finding of this study is that how the different levels of governance understand and organize themselves for this work has a direct impact on the actions taken. Consequently, the programs formulated for this purpose are not associated with the creation of institutions capable of breaking with the traditional sectoral view that has historically prevailed in policymaking, and the transition from traditional agriculture to agroecological production systems is hampered by a sectoral orientation based on large-scale production and the strengthening of the country’s traditional land-concentration model.

Keywords: agroecology, food security, inter-institutional dialogue, rural poverty, sustainable rural development, territorial development

Procedia PDF Downloads 195
25506 Review of Different Machine Learning Algorithms

Authors: Syed Romat Ali Shah, Bilal Shoaib, Saleem Akhtar, Munib Ahmad, Shahan Sadiqui

Abstract:

Classification is a data mining technique based on Machine Learning (ML) algorithms. It is used to classify individual items in a body of information into a set of predefined classes or groups. Web mining is also a part of this family of data mining methods. The main purpose of this paper is to analyze and compare the performance of the Naïve Bayes algorithm, Decision Tree, K-Nearest Neighbor (KNN), Artificial Neural Network (ANN), and Support Vector Machine (SVM). The paper discusses these ML algorithms with their advantages and disadvantages and also defines open research issues.
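As an example of one of the five classifiers compared, here is a minimal from-scratch K-Nearest Neighbor (KNN) implementation (our sketch, not code from the paper):

```python
import math
from collections import Counter

def knn_predict(train_x, train_y, query, k=3):
    # Classify a query point by majority vote among the k training
    # examples closest to it in Euclidean distance.
    neighbors = sorted(zip(train_x, train_y),
                       key=lambda pair: math.dist(pair[0], query))
    votes = Counter(label for _, label in neighbors[:k])
    return votes.most_common(1)[0][0]
```

KNN's main advantage, typically noted in reviews such as this one, is that it needs no training phase; its main disadvantage is that every prediction scans the whole training set.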

Keywords: data mining, web mining, classification, ML algorithms

Procedia PDF Downloads 290
25505 Using Genetic Algorithms and Rough Set Based Fuzzy K-Modes to Improve Centroid Model Clustering Performance on Categorical Data

Authors: Rishabh Srivastav, Divyam Sharma

Abstract:

We propose an algorithm to cluster categorical data named ‘Genetic algorithm initialized rough set based fuzzy K-Modes for categorical data’. We propose an amalgamation of the simple K-modes algorithm, the rough and fuzzy set based K-modes, and the genetic algorithm to form a new algorithm which, we hypothesise, will provide better Centroid Model clustering results than existing standard algorithms. In the proposed algorithm, the initialization and updating of the modes is done by genetic algorithms, while the membership values are calculated using rough sets and fuzzy logic.
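The membership computation mentioned above can be sketched as follows. This shows only the plain fuzzy K-modes step (simple matching dissimilarity plus inverse-distance memberships with fuzzifier m); the authors' rough-set refinement and GA-driven mode initialization and updating are not reproduced here:

```python
def matching_dissimilarity(a, b):
    # K-modes dissimilarity for categorical records: the number of
    # attributes on which the record and the cluster mode disagree.
    return sum(x != y for x, y in zip(a, b))

def fuzzy_memberships(record, modes, m=2.0):
    # Standard fuzzy K-modes membership update: weight each cluster by
    # the inverse of its dissimilarity, raised to 1/(m-1), and normalize.
    d = [matching_dissimilarity(record, mode) for mode in modes]
    if 0 in d:  # the record coincides with a mode: crisp membership
        return [1.0 if di == 0 else 0.0 for di in d]
    w = [(1.0 / di) ** (1.0 / (m - 1)) for di in d]
    total = sum(w)
    return [wi / total for wi in w]
```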

Keywords: categorical data, fuzzy logic, genetic algorithm, K modes clustering, rough sets

Procedia PDF Downloads 240
25504 Nondestructive Prediction and Classification of Gel Strength in Ethanol-Treated Kudzu Starch Gels Using Near-Infrared Spectroscopy

Authors: John-Nelson Ekumah, Selorm Yao-Say Solomon Adade, Mingming Zhong, Yufan Sun, Qiufang Liang, Muhammad Safiullah Virk, Xorlali Nunekpeku, Nana Adwoa Nkuma Johnson, Bridget Ama Kwadzokpui, Xiaofeng Ren

Abstract:

Enhancing starch gel strength and stability is crucial. However, traditional gel property assessment methods are destructive, time-consuming, and resource-intensive. Thus, understanding ethanol treatment effects on kudzu starch gel strength and developing a rapid, nondestructive gel strength assessment method is essential for optimizing the treatment process and ensuring product quality consistency. This study investigated the effects of different ethanol concentrations on the microstructure of kudzu starch gels using a comprehensive microstructural analysis. We also developed a nondestructive method for predicting gel strength and classifying treatment levels using near-infrared (NIR) spectroscopy and advanced data analytics. Scanning electron microscopy revealed progressive network densification and pore collapse with increasing ethanol concentration, correlating with enhanced mechanical properties. NIR spectroscopy, combined with various variable selection methods (CARS, GA, and UVE) and modeling algorithms (PLS, SVM, and ELM), was employed to develop predictive models for gel strength. The UVE-SVM model demonstrated exceptional performance, with the highest R² values (Rc = 0.9786, Rp = 0.9688) and lowest error rates (RMSEC = 6.1340, RMSEP = 6.0283). Pattern recognition algorithms (PCA, LDA, and KNN) successfully classified gels based on ethanol treatment levels, achieving near-perfect accuracy. This integrated approach provided a multiscale perspective on ethanol-induced starch gel modification, from molecular interactions to macroscopic properties. Our findings demonstrate the potential of NIR spectroscopy, coupled with advanced data analysis, as a powerful tool for rapid, nondestructive quality assessment in starch gel production. This study contributes significantly to the understanding of starch modification processes and opens new avenues for research and industrial applications in food science, pharmaceuticals, and biomaterials.

Keywords: kudzu starch gel, near-infrared spectroscopy, gel strength prediction, support vector machine, pattern recognition algorithms, ethanol treatment

Procedia PDF Downloads 24
25503 Forecasting Amman Stock Market Data Using a Hybrid Method

Authors: Ahmad Awajan, Sadam Al Wadi

Abstract:

In this study, a hybrid method based on Empirical Mode Decomposition and Holt-Winters (EMD-HW) is used to forecast Amman stock market data. First, the data are decomposed by the EMD method into Intrinsic Mode Functions (IMFs) and a residual component. Then, all components are forecasted by the HW technique. Finally, the forecast values are aggregated to obtain the forecast of the stock market data. Empirical results showed that EMD-HW outperforms the individual forecasting models. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy compared with eight existing forecasting methods, based on five forecast error measures.
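The decompose/forecast/aggregate pipeline reads as follows in outline. This sketch substitutes a centred moving average for the EMD sifting step (real EMD extracts several IMFs) and uses Holt's linear smoothing for the HW step, so it illustrates the structure of the method rather than the authors' exact implementation:

```python
def holt_forecast(x, alpha=0.5, beta=0.3, h=1):
    # Holt's linear (level + trend) exponential smoothing,
    # returning the h-step-ahead forecast.
    level, trend = x[0], x[1] - x[0]
    for v in x[1:]:
        prev = level
        level = alpha * v + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
    return level + h * trend

def emd_hw_forecast(series, h=1):
    # EMD-HW idea: decompose, forecast each component separately,
    # then add the component forecasts back together.
    k = 2  # half-width of the stand-in moving-average "decomposition"
    trend = [sum(series[max(0, i - k):i + k + 1])
             / len(series[max(0, i - k):i + k + 1])
             for i in range(len(series))]
    residual = [s - t for s, t in zip(series, trend)]
    return holt_forecast(trend, h=h) + holt_forecast(residual, h=h)
```

Because each component is smoother or more stationary than the raw series, the per-component forecasts are easier, which is the source of the accuracy gain the abstract reports.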

Keywords: Holt-Winter method, empirical mode decomposition, forecasting, time series

Procedia PDF Downloads 122
25502 Swedish–Nigerian Extrusion Research: Channel for Traditional Grain Value Addition

Authors: Kalep Filli, Sophia Wassén, Annika Krona, Mats Stading

Abstract:

The food security challenge and the growing population in Sub-Saharan Africa center on its agricultural transformation, where about 70% of the population is directly involved in farming. Research input can create economic opportunities, reduce malnutrition and poverty, and generate faster, fairer growth. Africa discards $4 billion worth of grain annually due to pre- and post-harvest losses. Grains and tubers play a central role in food supply in the region, but their production has generally lagged behind because there has been no robust scientific input to meet the challenge. African grains remain chronically underutilized, to the detriment of the well-being of the people of Africa and elsewhere. The major reason for their underutilization is that they are under-researched. Any commitment by the scientific community to intervene needs creative solutions focused on innovative approaches that will support economic growth. To overcome this hurdle, co-creation activities and initiatives are necessary. One such initiative has been established between Modibbo Adama University of Technology, Yola, Nigeria and RISE (the Research Institutes of Sweden), Gothenburg, Sweden. An exchange of research expertise is in place as a channel for adding value to agricultural commodities in the region under the ‘Traditional Grain Network’ programme. Process technologies such as extrusion offer the possibility of creating products in the food and feed sectors with better storage stability, added value, lower transportation cost, and new markets. The Swedish–Nigerian initiative has focused on the development of high-protein pasta. Microscopy of the dry pasta samples shows a continuous structural framework of protein and starch matrix. The water absorption index (WAI) results showed that water was absorbed steadily and followed the master-curve pattern. The WAI values ranged between 250–300%, and in all respects the water absorption history was within a narrow range for all eight samples. The total cooking time for the eight samples in our study ranged between 5–6 minutes, with their respective dry sample diameters ranging between 1.26–1.35 mm. The water solubility index (WSI) ranged from 6.03–6.50%, a narrow range; cooking loss, which WSI measures, is considered one of the main parameters in the assessment of pasta quality. The protein content of the samples ranged between 17.33–18.60%. The firmness of the cooked pasta ranged from 0.28–0.86 N; the results show that increasing the ratio of cowpea flour and the level of pregelatinized cowpea tends to increase the firmness of the pasta. The breaking strength, an index of the toughness of the dry pasta, ranged from 12.9–16.5 MPa.

Keywords: cowpea, extrusion, gluten free, high protein, pasta, sorghum

Procedia PDF Downloads 185
25501 Building Information Modeling-Based Information Exchange to Support Facilities Management Systems

Authors: Sandra T. Matarneh, Mark Danso-Amoako, Salam Al-Bizri, Mark Gaterell

Abstract:

Today’s facilities are ever more sophisticated, and the need for available and reliable information for operation and maintenance activities is vital. The key challenge for facilities managers is to have real-time, accurate, and complete information to perform their day-to-day activities and to provide their senior management with accurate information for the decision-making process. Currently, various technology platforms, data repositories, and database systems, such as Computer-Aided Facility Management (CAFM), are used for these purposes in different facilities. In most current practice, the data is extracted from paper construction documents and re-entered manually into one of these computerized information systems. Construction Operations Building information exchange (COBie) is a non-proprietary data format that contains the non-geometric asset data captured and collected during the design and construction phases for use by owners and facility managers. Recently, software vendors have developed add-in applications to generate COBie spreadsheets automatically. However, most of these add-in applications generate only a limited amount of COBie data, so considerable time is still required to enter the remaining data manually to complete the COBie spreadsheet. Some of the data that cannot be generated by these COBie add-ins is essential for facilities managers’ day-to-day activities, such as the job sheet, which includes preventive maintenance schedules. To facilitate seamless data transfer between BIM models and facilities management systems, we developed a framework that automatically transfers data extracted directly from BIM models to an external web database, and then enables different stakeholders to access the external web database and enter the required asset data directly, generating a rich COBie spreadsheet that contains most of the asset data required for efficient facilities management operations.
The proposed framework is a part of ongoing research and will be demonstrated and validated on a typical university building. Moreover, the proposed framework supplements the existing body of knowledge in facilities management domain by providing a novel framework that facilitates seamless data transfer between BIM models and facilities management systems.

Keywords: building information modeling, BIM, facilities management systems, interoperability, information management

Procedia PDF Downloads 107
25500 Behavioral Pattern of 2G Mobile Internet Subscribers: A Study on an Operator of Bangladesh

Authors: Azfar Adib

Abstract:

Like many other countries of the world, Bangladesh has seen mobile internet play a key role in the growth of its internet subscriber base. This study has attempted to identify particular behavioral or usage patterns of 2G mobile internet subscribers who were using the service of the topmost internet service provider (as well as the top mobile operator) of Bangladesh prior to the launch of 3G services (when 2G was fully dominant). It contains a comprehensive analysis of various data regarding 2G mobile internet subscribers, obtained from the operator’s own network insights. This is accompanied by the results of a survey conducted among 40 high-frequency users of the service.

Keywords: mobile internet, Symbian, Android, iPhone

Procedia PDF Downloads 432
25499 LCA of Waste Disposal from Olive Oil Production: Anaerobic Digestion and Conventional Disposal on Soil

Authors: T. Tommasi, E. Batuecas, G. Mancini, G. Saracco, D. Fino

Abstract:

Extra virgin olive-oil (EVO) production is an important economic activity for several countries, especially in the Mediterranean area, such as Spain, Italy, Greece, and Tunisia. The two major by-products of olive oil production, solid-liquid Olive Pomace (OP) and Olive Mill Waste Waters (OMWW), are still mainly disposed of on soil, in spite of the existence of legislation that already limits this practice. The present study compares the environmental impacts associated with two different scenarios for the management of waste from olive oil production through a comparative Life Cycle Assessment (LCA). The two alternative scenarios are: (I) anaerobic digestion and (II) the current disposal on soil. The analysis was performed with SimaPro software, and the assessment of the impact categories was based on the International Life Cycle Data and Cumulative Energy Demand methods. Both scenarios are dominated by the cultivation and harvesting phase and are highly dependent on irrigation practice and the related energy demand. Results from the present study clearly show that waste disposal on soil yields the worst environmental performance in all the impact categories considered here. Important environmental benefits were identified when anaerobic digestion is instead chosen as the final treatment. It was consequently demonstrated that anaerobic digestion should be considered a feasible alternative for olive mills, to produce biogas from common olive oil residues, reducing the environmental burden and adding value to the olive oil production chain.

Keywords: anaerobic digestion, waste management, agro-food waste, biogas

Procedia PDF Downloads 139
25498 Mechanical and Biodegradability of Porous Poly-ε-Caprolactone/Polyethylene Glycol Copolymer-Reinforced Cellulose Nanofibers for Soft Tissue Engineering Applications

Authors: Mustafa Abu Ghalia, Mohammed Seddik

Abstract:

The design and development of new classes of biomaterials has gained particular interest for producing polymer scaffolds for biomedical applications. Improving the mechanical and biological properties and controlling the pore structure of a scaffold are important factors in providing an appropriate biomaterial for implantation in soft tissue repair and regeneration. In this study, poly-ε-caprolactone (PCL)/polyethylene glycol (PEG) copolymer (80/20) scaffolds incorporating CNF were made employing solvent casting and particulate leaching methods. Four mass percentages of CNF (1, 2.5, 5, and 10 wt.%) were integrated into the copolymer through a silane coupling agent. Mechanical properties were determined using tensile-tester data acquisition to investigate the effect of porosity, pore size, and CNF content. The tensile strength obtained for PCL/PEG with 5 wt.% CNF was 16 MPa, which drastically decreased to 7.1 MPa after creating a porous structure. The optimum parameters were found to be 5 wt.% for CNF, 240 μm for pore size, and 83% for porosity. Scanning electron microscopy (SEM) micrographs reveal that a consistent pore size and regular pore shape were accomplished after the addition of 5 wt.% CNF into PCL/PEG. The mass loss of PCL/PEG reinforced with 1 wt.% CNF clearly increased to double that of the PCL/PEG copolymer and three times that of the PCL/PEG scaffold with 1 wt.% CNF. In addition, all the CNF-reinforced PCL/PEG materials and scaffolds partially disintegrated under composting conditions, confirming their biodegradable behavior. This also provides a possible solution for the end of life of these biomaterials.

Keywords: PCL/PEG, cellulose nanofibers, tissue engineering, biodegradation, compost polymers

Procedia PDF Downloads 53
25497 First Order Filter Based Current-Mode Sinusoidal Oscillators Using Current Differencing Transconductance Amplifiers (CDTAs)

Authors: S. Summart, C. Saetiaw, T. Thosdeekoraphat, C. Thongsopa

Abstract:

This article presents new current-mode oscillator circuits using CDTAs, designed from a block diagram. The proposed circuits consist of two CDTAs and two grounded capacitors. The condition of oscillation and the frequency of oscillation can be adjusted electronically. The circuits have high output impedance and use only grounded capacitors without any external resistor, which makes them very appropriate for future development into an integrated circuit. The results of the PSPICE simulation correspond to the theoretical analysis.

Keywords: current-mode, quadrature oscillator, block diagram, CDTA

Procedia PDF Downloads 451
25496 The Roots of Amazonia’s Droughts and Floods: Complex Interactions of Pacific and Atlantic Sea-Surface Temperatures

Authors: Rosimeire Araújo Silva, Philip Martin Fearnside

Abstract:

Extreme droughts and floods in the Amazon have serious consequences for natural ecosystems and the human population in the region. The frequency of these events has increased in recent years, and projections of climate change predict greater frequency and intensity of these events. Understanding the links between these extreme events and different patterns of sea surface temperature in the Atlantic and Pacific Oceans is essential, both to improve the modeling of climate change and its consequences and to support adaptation efforts in the region. The relationship between sea temperatures and events in the Amazon is much more complex than is usually assumed in climatic models. Warming and cooling of different parts of the oceans, as well as the interaction between simultaneous temperature changes in different parts of each ocean and between the two oceans, have specific consequences for the Amazon, with effects on precipitation that vary in different parts of the region. Simplistic generalities, such as the association between El Niño events and droughts in the Amazon, do not capture this complexity. We investigated the variability of Sea Surface Temperature (SST) in the Tropical Pacific Ocean during the period 1950-2022, using Empirical Orthogonal Functions (EOF), spectral analysis, coherence, and wavelet phase. Two main modes of variability were identified, which explain about 53.9% and 13.3%, respectively, of the total variance of the data. The spectral, coherence, and wavelet phase analyses showed that the first mode represents warming in the central part of the Pacific Ocean (the “Central El Niño”), while the second mode represents warming in the eastern part of the Pacific (the “Eastern El Niño”). Although the 1982-1983 and 1976-1977 El Niño events were both characterized by an increase in sea surface temperatures in the Equatorial Pacific, their impacts on rainfall in the Amazon were distinct.
In the rainy season, from December to March, the sub-basins of the Japurá, Jutaí, Jatapu, Tapajós, Trombetas, and Xingu rivers showed the greatest reductions in rainfall associated with the Central El Niño (1982-1983), while the sub-basins of the Javari, Purus, Negro, and Madeira rivers had the most pronounced reductions in the year of the Eastern El Niño (1976-1977). In the transition to the dry season, in April, the greatest reductions were associated with the Eastern El Niño year for the majority of the study region, with the exception only of the sub-basins of the Madeira, Trombetas, and Xingu rivers, whose reductions were associated with the Central El Niño. In the dry season, from July to September, the sub-basins of the Japurá, Jutaí, Jatapu, Javari, Trombetas, and Madeira rivers showed the greatest reductions in rainfall associated with the Central El Niño, while the sub-basins of the Tapajós, Purus, Negro, and Xingu rivers had the most pronounced reductions in the Eastern El Niño year for this season. It is thus possible to conclude that the Central (Eastern) El Niño controlled the reductions in soil moisture in the dry (rainy) season for all sub-basins examined in this study. Extreme drought events associated with these meteorological phenomena can lead to a significant increase in the occurrence of forest fires. These fires have a devastating impact on Amazonian vegetation, resulting in the irreparable loss of biodiversity and the release of large amounts of carbon stored in the forest, contributing to the increase in the greenhouse effect and global climate change.

Keywords: sea surface temperature, variability, climate, Amazon

Procedia PDF Downloads 53
25495 Alternative Seed System for Enhanced Availability of Quality Seeds and Seed/Varietal Replacement Rate - An Experience

Authors: Basave Gowda, Lokesh K., Prasanth S. M., Bellad S. B., Radha J., Lokesh G. Y., Patil S. B., Vijayakumar D. K., Ganigar B. S., Rakesh C. Mathad

Abstract:

Quality seed plays an important role in enhancing crop productivity. It was reported and confirmed by large-scale verification research trials that by the use of quality seed alone, crop yield can be enhanced by 15 to 20 per cent. At present, quality seed production and distribution through the organised sector, comprising both public and private seed sectors, covers only 20-25% of the requirement, and the remaining quantity is met through the unorganised sector, which includes farmer-to-farmer saved seed. With the objective of developing an alternative seed system, the University of Agricultural Sciences, Raichur, in Karnataka state, has implemented a Seed Village Programme in more than 100 villages covering around 5000 farmers every year since 2009-10. In the selected seed villages, groups of 50-150 farmers were supplied the foundation seed of new varieties for 0.4 ha each at a 50% subsidy, and two to three training programmes on quality seed production were conducted in the targeted villages. The seed produced by the target groups was processed locally in the university seed processing units, and its distribution in the local villages was arranged by the seed growers themselves. Through this new, innovative, modified seed system, the university was able to replace old varieties of pigeon pea and green gram on a large scale, producing 1482, 2978, 2729, 2560, and 4581 tonnes of seed of new varieties under farmer- and scientist-participatory seed village programmes during 2009-10, 2010-11, 2011-12, 2012-13, and 2013-14, respectively. Building on this new alternative seed system model, regional seed systems involving farmers, NGOs, and voluntary organisations should be promoted on a large scale for the quick and effective replacement of old, low-yielding, disease-susceptible varieties with new high-yielding, disease-resistant ones for enhanced food production and food security.

Keywords: seed system, seed village, seed replacement, varietal replacement

Procedia PDF Downloads 423
25494 Food for Thought: Preparing the Brain to Eat New Foods through “Messy” Play

Authors: L. Bernabeo, T. Loftus

Abstract:

Many children experience phases of picky eating, food aversions, and/or avoidance. For families of children with special needs, these experiences are often exacerbated, which can lead to feelings that negatively impact a caregiver’s relationship with their child. Within the scope of speech-language pathology practice, knowledge of both emotional and feeding development is key. This paper will explore the significance of “messy play” within typical feeding development, and the challenges that may arise if a child does not have the opportunity to engage in this type of exploratory play. It will consider several contributing factors that can result in a “picky eater.” Further, research has shown that individuals with special needs, including autism, possess a neurological makeup that differs from that of a typical individual. Because autism is a disorder of relating and communicating, rooted in differences in the limbic system, an individual with special needs may respond to a typical feeding experience as if it were a traumatic event. As a result, broadening one’s dietary repertoire may seem an insurmountable challenge. This paper suggests that introducing new foods through exploratory play can help broaden and strengthen the diets, and improve the feeding experience, of individuals with autism. The DIRFloortimeⓇ methodology stresses the importance of following a child’s lead. Within this developmental model, there is a special focus on a person’s individual differences, including the unique way they process the world around them, as well as the significance of therapy occurring within the context of a strong and motivating relationship. Using this child-centered approach, we can support children in expanding their diets while simultaneously building on their cognitive and creative development through playful and respectful interactions that include exposure to foods that differ in color, texture, and smell.
Further, this paper explores the importance of exploration, self-feeding and messy play on brain development, both in the context of typically developing individuals and those with disordered development.

Keywords: development, feeding, floortime, sensory

Procedia PDF Downloads 111
25493 Cone Beam Computed Tomography: A Useful Diagnostic Tool to Determine Root Canal Morphology in a Sample of Egyptian Population

Authors: H. El-Messiry, M. El-Zainy, D. Abdelkhalek

Abstract:

Cone-beam computed tomography (CBCT) provides high-quality 3-dimensional images of dental structures because of its high spatial resolution. The study of dental morphology is important in research as it provides information about diversity within a population. Many studies have shown different shapes and numbers of root canals among different races, especially in molars. The aim of this study was to determine the morphology of the root canals of mandibular first and third molars in a sample of the Egyptian population using CBCT scanning. Fifty extracted mandibular first molars (M1) and fifty extracted mandibular third molars (M3) were collected. Thick rectangular molds were made from pink wax to hold the samples. Molars were embedded in the wax molds, aligned in rows with an arbitrary 0.5 cm space between them. The molds with the samples were submitted for CBCT scanning. The number and morphology of the root canals were assessed and classified according to Vertucci's classification. The mesial and distal roots were examined separately. Finally, the data were analyzed using Fisher's exact test. The most prevalent mesial root canal configurations in M1 were type IV (60%) and type II (40%), while M3 showed a prevalence of types I (40%) and II (40%). Distal root canal morphology showed a prevalence of type I in both M1 (66%) and M3 (86%). It can be concluded that CBCT scanning provides supplemental information about the root canal configurations of mandibular molars in a sample of the Egyptian population. This study may help clinicians in the root canal treatment of mandibular molars.
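As a hedged illustration of the statistical step, a two-sided Fisher exact test for a 2x2 contingency table can be computed with the Python standard library alone. The counts below are invented for demonstration and are not the study's data:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact test for a 2x2 table [[a, b], [c, d]].

    Returns the p-value by summing hypergeometric probabilities no larger
    than that of the observed table (margins held fixed).
    """
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d

    def hypergeom(x):
        # P(first cell = x) under fixed row/column totals
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = hypergeom(a)
    lo = max(0, col1 - row2)
    hi = min(row1, col1)
    # sum over all tables as extreme as, or more extreme than, the observed one
    return sum(hypergeom(x) for x in range(lo, hi + 1)
               if hypergeom(x) <= p_obs * (1 + 1e-9))

# Hypothetical counts: a canal type seen in 30/50 M1 vs 10/50 M3
p = fisher_exact_2x2(30, 20, 10, 40)
```

With these invented counts the difference between groups is clearly significant; the same routine applies to any 2x2 comparison of canal-type frequencies.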

Keywords: cone beam computed tomography, mandibular first molar, mandibular third molar, root canal morphology

Procedia PDF Downloads 312
25492 Performance Evaluation of a Wireless 433 MHz Link in Underwater-Freshwater Communication

Authors: Xavi Vilajosana Guillen, Emilio José Pérez Salgado

Abstract:

This document presents experimental results obtained in a realistic environment using an underwater LoRa link. It aims to analyze the behavior of electromagnetic waves underwater and determine their communication capability. The goal was to empirically validate the results of the mathematical model using a commercial low-cost, low-consumption device operating at 433 MHz. The mathematical results for wireless communication at 433 MHz underwater indicate that communication of up to 7.5 m is possible; experimentally, however, 8 m was achieved.
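The order of magnitude of such a range can be sanity-checked with the standard attenuation constant for a plane wave in a lossy dielectric. The conductivity and relative permittivity below are generic textbook assumptions for freshwater, not values measured in the paper:

```python
import math

def attenuation_db_per_m(f_hz, sigma, eps_r):
    """Exact attenuation constant of a lossy dielectric, converted to dB/m."""
    eps0 = 8.854e-12            # vacuum permittivity, F/m
    mu0 = 4 * math.pi * 1e-7    # vacuum permeability, H/m
    w = 2 * math.pi * f_hz
    eps = eps_r * eps0
    # alpha = w * sqrt(mu*eps/2) * sqrt(sqrt(1 + (sigma/(w*eps))^2) - 1), in Np/m
    alpha = w * math.sqrt(mu0 * eps / 2) * math.sqrt(
        math.sqrt(1 + (sigma / (w * eps)) ** 2) - 1)
    return alpha * 8.686        # nepers/m -> dB/m

# Assumed freshwater parameters: sigma ~ 0.01 S/m, eps_r ~ 80
loss = attenuation_db_per_m(433e6, 0.01, 80)
```

For these assumed parameters the model gives on the order of 2 dB/m, i.e., roughly 15 dB over the 8 m achieved experimentally, consistent with a short but usable link.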

Keywords: 433 MHz link, internet of things, LoRa link, underwater communication

Procedia PDF Downloads 60
25491 The Use of Complementary and Alternative Medicine for Pain Relief in the Elderly: An Investigational Analysis of Seniors Residing in an Independent/Assisted Seniors’ Living Facility

Authors: Carol Cameletti

Abstract:

The goal of this study was to perform a pilot survey to assess pain frequency and intensity in an elderly population and to assess treatment options for chronic pain that include complementary and alternative medicines (CAM). Ten participants were recruited from an independent and supportive living housing facility in Northern Ontario and asked to complete two questionnaires: 1) a self-assessment on pain, and 2) the use of CAM for pain. Results from our study show that 80% of the participants experienced pains other than regular everyday pains such as minor headaches, sprains, or toothaches. Although participants stated that, on average, the highest level of pain they had experienced within the past 24 hours scored 6.5 (0 = no pain, 10 = worst pain imaginable), the pain they experienced only moderately interfered with their daily activities. Unfortunately, participants stated that they were only able to attain minimal pain relief using treatments or medications, causing some of them to seek alternative therapies or self-help practices. The most commonly used CAMs were vitamins/minerals, herbs and supplements, and self-help practices such as meditation, prayer, visualization, and relaxation techniques. Although some participants stated that they had received complementary treatments directly from their physician, four of the nine participants said that they had not disclosed CAM use to their physician, indicating a need to open the lines of communication between healthcare providers and patients with regard to CAM use. It is our hope that the data generated from this study will serve as the platform for a pain management clinic that is client-centered, consumer-driven, and truly integrative, tailored to meet the unique needs of older adults in Greater Sudbury, Ontario.

Keywords: alternative, complementary, elderly, medicine

Procedia PDF Downloads 176
25490 Analysis of Scholarly Communication Patterns in Korean Studies

Authors: Erin Hea-Jin Kim

Abstract:

This study aims to investigate scholarly communication patterns in Korean studies, which focuses on all aspects of Korea, including history, culture, literature, politics, society, economics, religion, and so on. It is called a ‘national study’ or ‘home study’ when the subject of the study is oneself, whereas it is called an ‘area study’ when the subject of the study is others, i.e., outside of Korea. Understanding the structure of scholarly communication in Korean studies is important since the motivations, procedures, results, or outcomes of individual studies may be affected by the cooperative relationships that appear in the communication structure. To this end, we collected 1,798 articles with the (author or index) keyword ‘Korean’ published in 2018 from the Scopus database and extracted the institutions and countries of the authors using a text mining technique. A total of 96 countries, including South Korea, were identified. We then constructed a co-authorship network based on the countries identified. The indicators of social network analysis (SNA), co-occurrence counts, and cluster analysis were used to measure the activity and connectivity of participation in collaboration in Korean studies. As a result, the highest frequencies of collaboration appear in the following order: South Korea with the United States (603), South Korea with Japan (146), South Korea with China (131), South Korea with the United Kingdom (83), and China with the United States (65). This means that the most active participants are South Korea and the USA. The highest ranks in the role of mediator, measured by betweenness centrality, appear in the following order: United States (0.165), United Kingdom (0.045), China (0.043), Japan (0.037), Australia (0.026), and South Africa (0.023). These results show that these countries contribute to connectivity in Korean studies. We found two major communities in the co-authorship network.
Asian countries and America belong to one community, while the United Kingdom and European countries belong to the other. Korean studies have a long history; the field has grown since the period of Japanese colonization. However, Korean studies have never been investigated by digital content analysis. The contributions of this study are an analysis of co-authorship in Korean studies from a global perspective based on digital content, which to our knowledge has not been attempted before, and suggestions on how to analyze humanities disciplines such as history, literature, or Korean studies using text mining. The limitation of this study is that the scholarly data we collected did not cover all domestic journals, because we only gathered data from Scopus. There are thousands of domestic journals not indexed in Scopus that could be considered in terms of national studies but were not possible to collect.
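The co-authorship counting step can be sketched as follows, assuming each article has already been reduced to its set of author countries. The toy records below are invented, not the study's data:

```python
from collections import Counter
from itertools import combinations

# Each article is represented by the set of its authors' countries
articles = [
    {"South Korea", "United States"},
    {"South Korea", "United States"},
    {"South Korea", "Japan"},
    {"South Korea", "China", "United States"},
]

edge_weights = Counter()
for countries in articles:
    # every unordered country pair in an article counts as one co-authorship tie
    for pair in combinations(sorted(countries), 2):
        edge_weights[pair] += 1

top = edge_weights.most_common(1)[0]
```

The resulting weighted edge list can be fed directly into an SNA toolkit to compute betweenness centrality and detect communities, as the study describes.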

Keywords: co-authorship network, Korean studies, Koreanology, scholarly communication

Procedia PDF Downloads 149
25489 Data Security and Privacy Challenges in Cloud Computing

Authors: Amir Rashid

Abstract:

Cloud computing frameworks empower organizations to cut expenses by outsourcing computation resources on demand. At present, customers of cloud service providers have no means of verifying the confidentiality and ownership of their information and data. To address this issue, we propose a trusted cloud computing platform (TCCP). TCCP enables Infrastructure as a Service (IaaS) providers, such as Amazon EC2, to offer a closed-box execution environment that guarantees confidential execution of guest virtual machines. It also allows customers to attest to the IaaS provider and determine whether the service is secure before they launch their virtual machines. This paper proposes the TCCP for guaranteeing the confidentiality and integrity of computed data that are outsourced to IaaS providers. The TCCP provides the abstraction of a closed-box execution environment for a customer's VM, guaranteeing that no privileged administrator at the cloud provider can inspect or tamper with its data. Furthermore, before launching a VM, the TCCP allows a customer to reliably and remotely verify that the provider's backend is running a trusted TCCP. This capability extends attestation to the entire service and thus allows a customer to verify that its data operations run in a secure mode.
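The attestation step can be illustrated with a toy sketch (this is not the actual TCCP protocol): the client compares a hash-based "measurement" reported by the node against a known-good value before launching its VM. All names and values here are hypothetical:

```python
import hashlib

# Known-good measurement of the trusted node image (hypothetical value)
TRUSTED_MEASUREMENT = hashlib.sha256(b"tccp-node-image-v1").hexdigest()

def attest(reported_image: bytes) -> bool:
    """Accept the node only if its measured image matches the trusted one."""
    return hashlib.sha256(reported_image).hexdigest() == TRUSTED_MEASUREMENT

ok = attest(b"tccp-node-image-v1")    # unmodified node passes
bad = attest(b"tampered-node-image")  # modified node is rejected
```

A real deployment would anchor the measurement in trusted hardware (e.g., a TPM quote) rather than comparing raw hashes, but the accept/reject decision has this shape.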

Keywords: cloud security, IaaS, cloud data privacy and integrity, hybrid cloud

Procedia PDF Downloads 294
25488 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record

Authors: Raghavi C. Janaswamy

Abstract:

In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structural restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using graph-based open-source EHR data, the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because it closely represents real-time data and is voluminous. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the graph neural network (GNN), adopting self-supervised learning techniques with autoencoders to generate node embeddings and eventually perform node classification using those embeddings. The model predicts patient conditions ranging from common to rare. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
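The core message-passing idea behind GNN node classification can be sketched in plain Python: each node's new feature vector is the mean of its own and its neighbours' features (one GCN-style propagation step, without learned weights). The toy patient/condition graph below is hypothetical; the actual pipeline uses PyTorch Geometric as described:

```python
# Tiny undirected graph: nodes might be patients and conditions
edges = [(0, 1), (1, 2), (2, 3)]
features = {0: [1.0, 0.0], 1: [0.0, 1.0],
            2: [1.0, 1.0], 3: [0.0, 0.0]}

# Adjacency with self-loops, as in a GCN
neighbours = {n: {n} for n in features}
for a, b in edges:
    neighbours[a].add(b)
    neighbours[b].add(a)

def propagate(feats):
    """One propagation step: mean-aggregate each node's neighbourhood."""
    out = {}
    for node, nbrs in neighbours.items():
        dim = len(feats[node])
        out[node] = [sum(feats[m][d] for m in nbrs) / len(nbrs)
                     for d in range(dim)]
    return out

embeddings = propagate(features)
```

Stacking such steps (with trainable weight matrices and nonlinearities) yields the node embeddings that a classifier head then maps to condition labels.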

Keywords: electronic health record, graph neural network, heterogeneous data, prediction

Procedia PDF Downloads 81
25487 Prevalence and Factors Associated with Illicit Drug Use Among Undergraduate Students in the University of Lagos, Nigeria

Authors: Abonyi, Emmanuel Ebuka, Amina Jafaru O.

Abstract:

Background: Illicit substance use among students is a phenomenon that has been widely studied, but it remains of interest due to its high prevalence and potential consequences. It is a major mental health concern among university students, which may result in behavioral and academic problems, psychiatric disorders, and infectious diseases. Thus, this study was done to ascertain the prevalence and factors associated with the use of illicit drugs among these groups. Methods: A cross-sectional, descriptive survey was conducted among undergraduate students of the University of Lagos over three months (August to October 2021). A total of 938 undergraduate students were selected from seventeen faculties of the university. Pretested questionnaires were administered, completed, and returned. The data were analyzed using descriptive statistics and multivariate regression analysis. Results: From the data collected, it was observed that of the 938 undergraduate students of the University of Lagos who completed and returned the questionnaires, 56.3% were female and 43.7% were male. No gender differences were observed in the prevalence of use of any of the illicit substances. The results showed that the majority of the students who participated in the research were female (56.6%); there were a total of 541 second-year students (57.7%) and 397 final-year students (42.3%). Students in the 20-24 year age bracket had the highest frequency of illicit drug use, 648 (69.1%), as did students in non-health-related disciplines. The results also showed that the majority of the students reported using marijuana (31.7%), with lower lifetime use of LSD (6.3%), heroin (4.8%), cocaine (4.7%), ecstasy (4.5%), and ketamine (3.4%). Besides, the use of alcohol was below average (44.1%).
Additionally, marijuana was the substance most commonly taken by students, and most of these respondents had experienced relationship problems with their family (50.9%). From the responses obtained, the major reasons students indulge in illicit drug use were curiosity to experiment, relief of stress after rigorous academic activities, social media influence, and peer pressure. Most undergraduate students are at their most hyperactive stage in life, which makes them vulnerable to wanting to explore practically every adventure. Hence, individual factors and social media influence are identified as major contributors to the prevalence of illicit drug use among undergraduate students at the University of Lagos, Nigeria. Conclusion: Control programs are much needed among the students. They should be comprehensive and focused on students' psycho-education about substances and their related negative consequences, plus the promotion of students' life skills and integration into family- and peer-based preventive interventions.

Keywords: illicit drugs, addiction, undergraduate students, prevalence, substances

Procedia PDF Downloads 98
25486 A Proposal to Tackle Security Challenges of Distributed Systems in the Healthcare Sector

Authors: Ang Chia Hong, Julian Khoo Xubin, Burra Venkata Durga Kumar

Abstract:

Distributed systems offer many benefits to the healthcare industry. From big data analysis to business intelligence, the increased computational power and efficiency of distributed systems serve as an invaluable resource for the healthcare sector. However, as the usage of these distributed systems increases, many issues arise. The main focus of this paper is on security issues. Many security issues stem from distributed systems in the healthcare industry, particularly information security. Personal data are especially sensitive in the healthcare industry. If important information gets leaked (e.g., IC, credit card number, address), a person's identity, financial status, and safety might be compromised. This results in the responsible organization losing a great deal of money compensating those people, and even more resources are expended trying to fix the fault. Therefore, a framework for a blockchain-based healthcare data management system is proposed. In this framework, a blockchain network is used to store the encryption key of the patient's data. The actual data are encrypted, and the encrypted form, called ciphertext, is stored on a cloud storage platform. Furthermore, some issues have to be emphasized and tackled for future improvements, such as proposing a multi-user scheme, tackling authentication issues, or migrating the backend processes into the blockchain network. Due to the nature of blockchain technology, the data will be tamper-proof, and its read-only content can only be accessed by authorized users such as doctors and nurses. This guarantees the confidentiality and immutability of the patient's data.
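The tamper-evidence property described above can be sketched with a minimal hash chain: each block stores the hash of the previous one, so changing any record breaks the chain. The records and key references below are hypothetical, and a real system would keep ciphertext off-chain as the paper proposes:

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over the block's canonical JSON form."""
    return hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()

# Build a tiny chain: each block links to the hash of its predecessor
chain = []
prev = "0" * 64
for record in ({"patient": "P1", "key_ref": "k1"},
               {"patient": "P2", "key_ref": "k2"}):
    block = {"prev": prev, "record": record}
    chain.append(block)
    prev = block_hash(block)

def verify(chain):
    """Re-walk the chain; any edited block breaks a 'prev' link."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True

valid_before = verify(chain)
chain[0]["record"]["key_ref"] = "evil"  # tamper with the first block
valid_after = verify(chain)
```

In the proposed framework, only key material and hashes would live on-chain, while authorized users (doctors, nurses) fetch and decrypt the ciphertext from cloud storage.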

Keywords: distributed, healthcare, efficiency, security, blockchain, confidentiality and immutability

Procedia PDF Downloads 174
25485 Design and Implementation of a Geodatabase and WebGIS

Authors: Sajid Ali, Dietrich Schröder

Abstract:

The merging of the internet and the Web has created many disciplines, and Web GIS is one of these disciplines, dealing with geospatial data in a proficient way. Web GIS technologies have enabled easy accessing and sharing of geospatial data over the internet. However, the European Caribbean Association (Europäische Karibische Gesellschaft, EKG) lacks a single platform for easy, shared access to its data to assist its members and the wider research community. The technique presented in this paper deals with the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for competent dissemination and management of spatial data, and a Web GIS using OpenGeo Suite for fast sharing and distribution of the data over the internet. The characteristics of the required geodatabase design have been studied, and a specific methodology is given for designing the Web GIS. Finally, this Web-based geodatabase has been validated with two desktop GIS packages and a web map application, and it is discussed that the contribution has all the desired modules to expedite further research in the area as per the requirements.

Keywords: desktop GIS software, European Caribbean Association, geodatabase, OpenGeo Suite, PostgreSQL/PostGIS, WebGIS, web map application

Procedia PDF Downloads 332
25484 Integration of “FAIR” Data Principles in Longitudinal Mental Health Research in Africa: Lessons from a Landscape Analysis

Authors: Bylhah Mugotitsa, Jim Todd, Agnes Kiragga, Jay Greenfield, Evans Omondi, Lukoye Atwoli, Reinpeter Momanyi

Abstract:

The INSPIRE network aims to build an open, ethical, sustainable, and FAIR (Findable, Accessible, Interoperable, Reusable) data science platform, particularly for longitudinal mental health (MH) data. While studies have been done at the clinical and population level, there still exist limitations in data and research in low- and middle-income countries (LMICs), which pose a risk of underrepresentation of mental disorders. It is vital to examine the existing longitudinal MH data, focusing on how FAIR the datasets are. This landscape analysis aimed to provide both an overall level of evidence of the availability of longitudinal datasets and the degree of consistency in the longitudinal studies conducted. Utilizing prompts proved instrumental in streamlining the analysis process, facilitating access, crafting code snippets, and categorizing and analyzing extensive data repositories related to depression, anxiety, and psychosis in Africa. Leveraging artificial intelligence (AI), we filtered over 18,000 scientific papers spanning 1970 to 2023. This AI-driven approach enabled the identification of 228 longitudinal research papers meeting the inclusion criteria. Quality assurance revealed 10% incorrectly identified articles and 2 duplicates, underscoring the prevalence of longitudinal MH research in South Africa, with a focus on depression. From the analysis, evaluating data and metadata adherence to FAIR principles remains crucial for enhancing the accessibility and quality of MH research in Africa. While AI has the potential to enhance research processes, challenges such as privacy concerns and data security risks must be addressed. Ethical and equity considerations in data sharing and reuse are also vital. There is a need for collaborative efforts across disciplinary and national boundaries to improve the Findability and Accessibility of data. Current efforts should also focus on creating integrated data resources and tools to improve the Interoperability and Reusability of MH data.
Practical steps for researchers include careful study planning, data preservation, machine-actionable metadata, and promoting data reuse to advance science and improve equity. Metrics and recognition should be established to incentivize adherence to FAIR principles in MH research.
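The screening step can be sketched as a simple inclusion filter over paper records. The records and criteria below are simplified, invented stand-ins for the AI-assisted filtering actually used over ~18,000 papers:

```python
# Simplified paper records (title, year, study design) -- invented examples
papers = [
    {"title": "Depression trajectories in a Kenyan cohort", "year": 2019,
     "design": "longitudinal"},
    {"title": "Anxiety prevalence survey", "year": 2015,
     "design": "cross-sectional"},
    {"title": "Psychosis follow-up study in South Africa", "year": 2021,
     "design": "longitudinal"},
]

MH_TERMS = ("depression", "anxiety", "psychosis")

def meets_criteria(p):
    # inclusion: longitudinal design AND at least one mental-health term
    return (p["design"] == "longitudinal"
            and any(t in p["title"].lower() for t in MH_TERMS))

included = [p["title"] for p in papers if meets_criteria(p)]
```

In practice, the resulting shortlist would then pass through the quality-assurance step described above, catching misclassified articles and duplicates.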

Keywords: longitudinal mental health research, data sharing, fair data principles, Africa, landscape analysis

Procedia PDF Downloads 72
25483 Reclamation of Saline and Alkaline Soils through Aquaculture: A Review and Prospects for Future Research

Authors: M. Shivakumar, S. R. Somashekhar, C. V. Raju

Abstract:

Secondary salinization of agricultural lands in the command areas of the world has been a major issue in the recent past. Currently, it is estimated that 954 million hectares (mh) of saline and alkaline soil exist in the world, with thousands of hectares being added every year. Argentina, Bangladesh, and Australia are among the most affected countries. In India, out of a 142.80 mh cropped area, 56 mh is irrigated, of which more than 9 mh (about 16%) is found to be alkaline or saline. Due to the continuous use of the same land for the same agricultural activities, and the excessive use of fertilizers and water, most of these soils have become alkaline, saline, or waterlogged. These lands are of low productivity and at times totally unfit for agricultural activities. Such soils may or may not possess good physical condition, but plants may be unable to absorb water from the salty solution. Plants suffer from dehydration, lose water to the soil, and shrink, resulting in the death of the plant. This process is called plasmolysis. It is a fact that soil is an independent, organic body of nature that acquires properties in accordance with the forces that act upon it. Aquaculture is one solution for utilizing such problematic soils for food production. When impoundments are constructed over 10-15% of the affected area, the excess water along with the salts gets into the impoundments, and management of salt is easier in water than in soil. Due to the high organic input in aquaculture, such as feed and manure, and the continuous deposition of fecal matter, the pH of the soil is reduced, and over a period of time such soils can be put back to their original use. Under the National Agricultural Development Program (NADP), the project was implemented in 258 villages of Mandya District, Karnataka State, India, and it was found that these lands can be effectively utilized for fish culture, increasing proteinaceous food production many fold while conserving the soils.
The findings of this research can be adopted and upscaled in any country.

Keywords: saline and alkaline soils, aquaculture, problematic soils, reclamation

Procedia PDF Downloads 138
25482 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments.
It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
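The workload-placement idea can be sketched as a per-cloud cost comparison that trades compute price against the cost of moving the job's input data. All prices and sizes below are invented for illustration:

```python
# Hypothetical per-cloud pricing (USD): compute per hour, egress per GB
clouds = {
    "cloud_a": {"compute_per_hr": 0.90, "egress_per_gb": 0.09},
    "cloud_b": {"compute_per_hr": 1.10, "egress_per_gb": 0.05},
}

def placement_cost(job, target):
    """Compute cost of running `job` on `target`, including data movement."""
    c = clouds[target]
    cost = job["hours"] * c["compute_per_hr"]
    if job["data_location"] != target:
        # data must be transferred in; egress is billed by the source cloud
        cost += job["data_gb"] * clouds[job["data_location"]]["egress_per_gb"]
    return cost

job = {"hours": 10, "data_gb": 500, "data_location": "cloud_a"}
best = min(clouds, key=lambda t: placement_cost(job, t))
```

Even in this toy model, data gravity dominates: for a large dataset, the cheapest placement is usually the cloud where the data already resides, which is exactly the trade-off the predictive cost models above aim to capture.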

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 59
25481 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study

Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos

Abstract:

This article presents a method to analyze the use of indoor spaces based on data analytics obtained from inbuilt digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights into space occupancy, user behaviour, and comfort. These devices, originally installed to facilitate remote operations, report data through the internet, which the research uses to analyze information on real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable solution for analyzing building interior spaces without incorporating external data collection systems such as dedicated sensors. The methodology is applied to a real coliving case study: a residential building of 3,000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify the IoT devices and to assess, clean, and analyze the collected data based on the analysis framework. The information is collected remotely through the devices' different platforms. The first step is to curate the data and understand what insights each device can provide according to the objectives of the study; this generates an analysis framework that can be scaled to future building assessment, even beyond the residential sector. The method adjusts the parameters to be analyzed to the dataset available in the IoT of each building. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.
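One of the analyses described, deriving occupancy from smart-lock events, can be sketched with the standard library alone. The event log below is invented rather than taken from the study:

```python
from datetime import datetime

# Hypothetical time-ordered smart-lock log: (timestamp, user, direction)
events = [
    ("2021-03-01T08:15", "u1", "in"),
    ("2021-03-01T09:00", "u2", "in"),
    ("2021-03-01T12:30", "u1", "out"),
    ("2021-03-01T18:45", "u2", "out"),
]

def occupancy_at(log, when):
    """Count users inside the building at a given time."""
    t = datetime.fromisoformat(when)
    inside = set()
    for ts, user, direction in log:
        if datetime.fromisoformat(ts) > t:
            break  # log is time-ordered, so later events cannot matter
        (inside.add if direction == "in" else inside.discard)(user)
    return len(inside)

midday = occupancy_at(events, "2021-03-01T12:00")
```

Aggregating such counts per hour and per floor yields the occupancy profiles that feed the spatial-design analysis, without installing any additional sensors.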

Keywords: in-place devices, IoT, human-centred data-analytics, spatial design

Procedia PDF Downloads 189