Search results for: patch metrics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 796

226 Nanda Ways of Knowing, Being and Doing: Our Process of Research Engagement and Research Impacts

Authors: Steven Kelly

Abstract:

A fundamental role of the researcher is research engagement, that is, the interaction between researchers and research end-users outside of academia for the mutually beneficial transfer of knowledge, technologies, methods, or resources. Research impact, in turn, is the contribution that research makes to the economy, society, environment, or culture beyond the contribution to academic research. Ironically, traditional impact metrics in the academy are designed to focus on outputs; they dismiss the important role engagement plays in fostering a collaborative process that leads to meaningful, ethical, and useful impacts. Dr. Kelly, a Nanda (First Nations) man himself, has worked closely with the Nanda community over the past decade, ensuring cultural protocols are upheld and implemented while doing research engagement. The focus was on the process, which was essential to foster a positive research impact culture. The contributions that flowed from this process were the naming of a new species of squat lobster in the Nanda language; a poster design in collaboration with The University of Melbourne, Museums Victoria and the Bundiyarra - IrraWanga language centre; media coverage; and the formation of the “Nanda language, Nanda country project”. The Nanda language, Nanda country project is a language revitalization project focused on reconnecting Nanda people with the language and culture on Nanda Country. Such outcomes are imperative on the eve of the United Nations International Decade of Indigenous Languages. In this paper, Dr. Kelly will discuss how Nanda cultural practices informed research engagement to foster a collaborative process that, in turn, led to meaningful, ethical, and useful impacts within and outside of the academy.

Keywords: community collaboration, indigenous, nanda, research engagement, research impacts

Procedia PDF Downloads 114
225 Biodiversity Affects Bovine Tuberculosis (bTB) Risk in Ethiopian Cattle: Prospects for Infectious Disease Control

Authors: Sintayehu W. Dejene, Ignas M. A. Heitkönig, Herbert H. T. Prins, Zewdu K. Tessema, Willem F. de Boer

Abstract:

Current theories on diversity-disease relationships describe host species diversity and species identity as important factors influencing disease risk, either diluting or amplifying disease prevalence in a community. Whereas the simple term ‘diversity’ embodies a set of animal community characteristics, it is not clear how different measures of species diversity are correlated with disease risk. We, therefore, tested the effects of species richness, Pielou’s evenness, and Shannon’s diversity on bTB risk in cattle in the Afar Region and Awash National Park between November 2013 and April 2015. We also analysed the identity effect of a particular species and the effect of host habitat use overlap on bTB risk. We used the comparative intradermal tuberculin test to assess the number of bTB-infected cattle. Our results suggested a dilution effect through species evenness. We found that the identity effect of greater kudu (a maintenance host) confounded the dilution effect of species diversity on bTB risk. bTB infection was positively correlated with habitat use overlap between greater kudu and cattle. Different diversity indices have to be considered together when assessing diversity-disease relationships in order to understand the underlying causal mechanisms. We posit that unpacking diversity metrics is also relevant for formulating control strategies to manage cattle in ecosystems characterized by seasonally limited resources and intense wildlife-livestock interactions.
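The evenness-driven dilution effect hinges on how the diversity measures are computed. As a minimal illustrative sketch (not the authors' code), Shannon's diversity and Pielou's evenness can be derived from per-species abundance counts; the example communities below are hypothetical:

```python
import math

def shannon_diversity(counts):
    """Shannon's diversity index H' = -sum(p_i * ln(p_i)) over observed species."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

def pielou_evenness(counts):
    """Pielou's evenness J' = H' / ln(S), where S is species richness."""
    s = sum(1 for c in counts if c > 0)
    if s <= 1:
        return 0.0
    return shannon_diversity(counts) / math.log(s)

# Hypothetical host communities with equal richness but different evenness:
even_community = [10, 10, 10, 10]    # J' = 1 (maximally even)
skewed_community = [37, 1, 1, 1]     # J' < 1 (one dominant host)
```

Richness alone cannot separate these two communities, which is exactly why the study tests evenness and Shannon's diversity alongside it.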

Keywords: evenness, diversity, greater kudu, identity effect, maintenance hosts, multi-host disease ecology, habitat use overlap

Procedia PDF Downloads 331
224 Inferring the Ecological Quality of Seagrass Beds Using Composition and Configuration Indices

Authors: Fabrice Houngnandan, Celia Fery, Thomas Bockel, Julie Deter

Abstract:

Getting water cleaner and stopping global biodiversity loss requires indices to measure changes and evaluate the achievement of objectives. The endemic and protected seagrass species Posidonia oceanica is a biological indicator used to monitor the ecological quality of marine Mediterranean waters. One ecosystem index (EBQI), two biotic indices (PREI, Bipo), and several landscape indices, which measure the composition and configuration of the P. oceanica seagrass at the population scale, have been developed. While the former are measured at monitoring sites, the landscape indices can be calculated for the entire seabed covered by this ecosystem. The present work investigates the link between these indices and the best scale to use in order to maximize this link. We used data collected between 2014 and 2019 along the French Mediterranean coastline to calculate EBQI, PREI, and Bipo at 100 sites. From the P. oceanica seagrass distribution map, configuration and composition indices around these sites were determined for 6 different grid sizes (100 m x 100 m to 1000 m x 1000 m). Correlation analyses were first used to find the grid size presenting the strongest and most significant link between the different types of indices. Finally, several models were compared on the basis of various metrics to identify the one that best explains the nature of the link between these indices. Our results showed a strong and significant link between biotic indices, and the best correlations between biotic and landscape indices within the 600 m x 600 m grid cells. These results show that landscape indices can be used to monitor the health of seagrass beds at a large scale.
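The grid-size selection step rests on ordinary correlation analysis between index values across sites. A minimal sketch of the Pearson coefficient that such a comparison would use (illustrative data, not the study's measurements):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-site values: a biotic index vs. a landscape index computed
# at one candidate grid size; the grid size maximizing |r| would be retained.
biotic = [0.62, 0.55, 0.71, 0.48, 0.80]
landscape = [0.58, 0.50, 0.69, 0.45, 0.77]
```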

Keywords: ecological indicators, decline, conservation, submerged aquatic vegetation

Procedia PDF Downloads 131
223 Terrestrial Laser Scans to Assess Aerial LiDAR Data

Authors: J. F. Reinoso-Gordo, F. J. Ariza-López, A. Mozas-Calvache, J. L. García-Balboa, S. Eddargani

Abstract:

The quality of a DEM may depend on several factors, such as the data source, the capture method, the processing used to derive it, or the cell size of the DEM. The two most important capture methods for producing regional-sized DEMs are photogrammetry and LiDAR; DEMs covering entire countries have been obtained with these methods. The quality of these DEMs has traditionally been evaluated by national cartographic agencies through punctual sampling focused on the vertical component. For this type of evaluation there are standards such as the NMAS and the ASPRS Positional Accuracy Standards for Digital Geospatial Data. However, it seems more appropriate to carry out this evaluation with a method that takes into account the superficial nature of the DEM, so that the sampling is superficial rather than punctual. This work is part of the research project "Functional Quality of Digital Elevation Models in Engineering", in which it is necessary to control the quality of a DEM whose data source is an experimental LiDAR flight with a density of 14 points per square meter, which we call the Point Cloud Product (PCpro). The present work describes the data capture on the ground and the post-processing tasks required to obtain the point cloud used as a reference (PCref) to evaluate the quality of the PCpro. Each PCref consists of a 50 x 50 m patch obtained by registering scans from 4 different stations. The study area was the Spanish region of Navarra, which covers 10,391 km2; 30 homogeneously distributed patches were necessary to sample the entire surface. The patches were captured using a Leica BLK360 terrestrial laser scanner mounted on a pole reaching heights of up to 7 meters; the scanner was mounted in an inverted position so that the characteristic shadow circle of the direct position does not appear.
To ensure that the accuracy of the PCref is greater than that of the PCpro, the georeferencing of the PCref was carried out with real-time GNSS, with a positioning accuracy better than 4 cm; this is much better than the altimetric mean square error estimated for the PCpro (<15 cm). The DEM of interest is the one corresponding to the bare earth, so a filter was applied to eliminate vegetation and auxiliary elements such as poles, tripods, etc. After the post-processing tasks, the PCref is ready to be compared with the PCpro using different techniques: cloud to cloud, or DEM to DEM after a resampling process.
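The cloud-to-cloud comparison mentioned at the end reduces, in its simplest form, to a nearest-neighbor distance from each product point to the reference cloud. A brute-force sketch under that assumption (real workflows would use dedicated point-cloud software or a KD-tree for the 14 pts/m2 densities involved):

```python
import math

def cloud_to_cloud(pc_pro, pc_ref):
    """For each point of the product cloud, distance to its nearest
    reference point. Brute force, O(len(pc_pro) * len(pc_ref))."""
    return [min(math.dist(p, q) for q in pc_ref) for p in pc_pro]

# Tiny hypothetical clouds (x, y, z in meters):
pro = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.1)]
ref = [(0.0, 0.0, 0.05), (1.0, 0.0, 0.0), (5.0, 5.0, 5.0)]
```

Summary statistics of the returned distances (mean, RMSE, percentiles) would then characterize the PCpro error surface-wise rather than at isolated check points.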

Keywords: data quality, DEM, LiDAR, terrestrial laser scanner, accuracy

Procedia PDF Downloads 100
222 Assessing NYC's Single-Family Housing Typology for Urban Heat Vulnerability and Occupants’ Health Risk under the Climate Change Emergency

Authors: Eleni Stefania Kalapoda

Abstract:

Recurring heat waves due to the global climate change emergency pose continuous risks to human health and urban resources. Local and state decision-makers incorporate Heat Vulnerability Indices (HVIs) to quantify and map the relative impact on human health in emergencies. These maps enable government officials to identify the highest-risk districts and to concentrate emergency planning efforts and available resources accordingly (e.g., to reevaluate the location and the number of heat-relief centers). Even though the framework for conducting an HVI is unique to each municipality, its accuracy in assessing heat risk is limited. To resolve this issue, varied housing-related metrics should be included. This paper quantifies and classifies NYC’s single-detached housing typology within highly vulnerable NYC districts using detailed energy simulations and post-processing calculations. The results show that the variation in indoor heat risk depends significantly on the dwelling’s design and operation characteristics, concluding that low-ventilated dwellings are the most vulnerable. The analysis also confirmed that when building-level determinants of exposure are excluded from the assessment, the HVI fails to capture important components of heat vulnerability. Lastly, the overall vulnerability ratio of the housing units was calculated at between 0.11 and 1.6 indoor heat degrees in terms of ventilation and shading capacity, insulation degree, and other building attributes.

Keywords: heat vulnerability index, energy efficiency, urban heat, resiliency to heat, climate adaptation, climate mitigation, building energy

Procedia PDF Downloads 81
221 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication

Authors: Vedant Janapaty

Abstract:

Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Also known as the “kidneys of our planet”, they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and loss of native species are ailing our wetlands. There is a lack of uniform data collection and sparse research on correlations between satellite data and in situ measurements. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate metrics with in situ observations collected at five estuaries. Satellite images were processed to calculate 7 satellite indices (SIs) using Python, and average SI values were calculated per month for 23 years. Publicly available data from 6 sites at ELK was used to obtain 10 parameters (OPs), and average OP values were likewise calculated per month for 23 years. Linear correlations between the 7 SIs and 10 OPs were found to be inadequate (correlations of 1 to 64%). Fourier transform analysis was then performed on the 7 SIs: dominant frequencies and amplitudes were extracted, and a machine learning (ML) model was trained, validated, and tested for the 10 OPs. Better correlations were observed between SIs and OPs at certain time delays (0, 3, 4, and 6 months), and ML was performed again. The OPs saw improved R² values in the range of 0.2 to 0.93. This approach can be used to obtain periodic analyses of overall wetland health from satellite indices. It demonstrates that remote sensing can be used to develop correlations with critical in situ parameters that measure eutrophication, and can be used by practitioners to easily monitor wetland health.
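The dominant-frequency extraction step can be sketched with a naive discrete Fourier transform over a monthly time series. This is an illustrative implementation under our own assumptions (the project's actual feature extraction is not published here); the 12-month demo signal is hypothetical:

```python
import cmath
import math

def dominant_frequency(series):
    """Naive DFT: return (cycles per sample, amplitude) of the strongest
    non-DC bin of a real-valued series."""
    n = len(series)
    mean = sum(series) / n
    centered = [x - mean for x in series]
    best_k, best_amp = 1, 0.0
    for k in range(1, n // 2 + 1):
        coef = sum(centered[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                   for t in range(n))
        if abs(coef) > best_amp:
            best_k, best_amp = k, abs(coef)
    return best_k / n, best_amp

# Two years of monthly samples of a hypothetical annual (12-month) cycle:
demo = [math.sin(2 * math.pi * t / 12) for t in range(24)]
```

For a monthly index, a dominant frequency of 1/12 cycles per sample corresponds to the expected seasonal cycle; the extracted (frequency, amplitude) pairs then serve as compact ML features.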

Keywords: estuary, remote sensing, machine learning, Fourier transform

Procedia PDF Downloads 104
220 Optical Board as an Artificial Technology for a Peer Teaching Class in a Nigerian University

Authors: Azidah Abu Ziden, Adu Ifedayo Emmanuel

Abstract:

This study investigated the optical board as an artificial technology for peer teaching in a Nigerian university. A design and development research (DDR) design was adopted, which entailed the planning and testing of instructional design models used to produce the optical board. The research population comprised twenty-five (25) peer-teaching students at a Nigerian university from theatre arts, religion, and language education-related disciplines. Using a random sampling technique, this study selected eight (8) students to work on the optical board. The study also introduced a research instrument, a lecturer assessment rubric containing a 30-mark metric, for evaluating students’ teaching with the optical board. It was found that the optical board affords students the acquisition of self-employment skills through their exposure to the peer teaching course, which is a teacher training module in Nigerian universities. It is evident in this study that students were able to coordinate the design and effectively develop the optical board without the lecturer’s interference. This achievement shows that the Nigerian university curriculum has been designed with content meant to spur students to create jobs after graduation, and that effective implementation of the readily available curriculum content is enough to imbue students with the needed entrepreneurial skills. It was recommended that the Federal Government of Nigeria (FGN) discourage poor implementation of the Nigerian university curriculum and invest more in improving the readily available curriculum instead of considering a supposedly new curriculum for a regurgitated teaching and learning process.

Keywords: optical board, artificial technology, peer teaching, educational technology, Nigeria, Malaysia, university, glass, wood, electrical, improvisation

Procedia PDF Downloads 67
219 A Multi-Objective Reliable Location-Inventory Capacitated Disruption Facility Problem with Penalty Cost Solved with Efficient Metaheuristic Algorithms

Authors: Elham Taghizadeh, Mostafa Abedzadeh, Mostafa Setak

Abstract:

In a logistics network, opened facilities are expected to work continuously over a long time horizon without any failure; in real-world problems, however, facilities may face disruptions. This paper studies a reliable joint inventory-location problem to optimize the cost of facility locations, customer assignment, and inventory management decisions when facilities face failure risks and stop working. In our model, we assume that when a facility is out of work, its customers may be reassigned to other operational facilities; otherwise, they must endure high penalty costs associated with losing service. To bring the model closer to real-world problems, it is based on the p-median problem, and the facilities are considered to have limited capacities. We define a new binary variable (Z_is) to indicate that a customer is not assigned to any facility. The problem involves a bi-objective model: the first objective minimizes the sum of facility construction costs and expected inventory holding costs; the second minimizes the maximum expected customer costs under normal and failure scenarios. To solve this model, the NSGA-II and MOSS algorithms are applied to find the Pareto-archive solutions. Response Surface Methodology (RSM) is also applied to optimize the NSGA-II algorithm parameters. We compare the performance of the two algorithms with three metrics, and the results show that NSGA-II is more suitable for our model.
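The Pareto archive that NSGA-II and MOSS maintain is built on the standard dominance relation between objective vectors. A minimal sketch of that relation for a minimization problem (illustrative only, not the paper's solver):

```python
def dominates(a, b):
    """a dominates b (minimization) if a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_archive(solutions):
    """Keep only the non-dominated objective vectors."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# Hypothetical (cost objective 1, cost objective 2) vectors:
candidates = [(1, 5), (2, 2), (5, 1), (3, 3)]
```

Here (3, 3) is dominated by (2, 2), so only the three trade-off points survive in the archive; metaheuristics differ mainly in how they generate and diversify the candidates feeding this filter.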

Keywords: joint inventory-location problem, facility location, NSGAII, MOSS

Procedia PDF Downloads 525
218 Measuring the Influence of Functional Proximity on Environmental Urban Performance via IMM: Four Study Cases in Milan

Authors: Massimo Tadi, M. Hadi Mohammad Zadeh, Ozge Ogut

Abstract:

Although the way cities’ forms are structured has been studied, more effort is needed on systemic comprehension and evaluation of urban morphology through quantitative metrics able to describe the performance of a city in relation to its formal properties. More research is required in this direction in order to better describe the characteristics of urban form and their impact on the environmental performance of cities, and to increase their sustainability stewardship. With the aim of developing a better understanding of the built environment’s systemic structure, this paper presents a holistic methodology for studying the behavior of the built environment and investigates methods for measuring the effect of urban structure on environmental performance. This goal is pursued through an inquiry into the morphological components of urban systems and the complex relationships between them. In particular, this paper focuses on proximity, referring to the proximity of different land uses: a concept with which Integrated Modification Methodology (IMM) explains how land-use allocation might affect the choice of mobility in neighborhoods and, especially, encourage or discourage non-motorized mobility. This paper uses proximity to demonstrate that structural attributes can be quantifiably related to performing behavior in the city. The target is to derive a mathematical pattern from the structural elements and correlate it directly with urban performance indicators concerned with environmental sustainability. The paper presents some results of this rigorous investigation of urban proximity and its correlation with performance indicators in four different areas of the city of Milan, each characterized by different morphological features.

Keywords: built environment, ecology, sustainable indicators, sustainability, urban morphology

Procedia PDF Downloads 168
217 A Survey of Skin Cancer Detection and Classification from Skin Lesion Images Using Deep Learning

Authors: Joseph George, Anne Kotteswara Roa

Abstract:

Skin disease is one of the most common kinds of health issue faced by people nowadays. Skin cancer (SC) is one of them, and its detection relies on skin biopsy outputs and the expertise of doctors, but this consumes more time and can yield inaccurate results. At an early stage, skin cancer detection is a challenging task, and the disease easily spreads to the whole body, leading to an increase in the mortality rate; yet skin cancer is curable when it is detected at an early stage. In order to classify skin cancer correctly and accurately, the critical task is skin cancer identification and classification, which is largely based on disease features such as shape, size, color, and symmetry. Many skin diseases share similar characteristics, which makes it challenging to select important features from skin cancer dataset images. Hence, skin cancer diagnostic accuracy can be improved by an automated skin cancer detection and classification framework, which also addresses the scarcity of human experts. Recently, deep learning techniques such as the convolutional neural network (CNN), deep belief network (DBN), artificial neural network (ANN), recurrent neural network (RNN), and long short-term memory (LSTM) have been widely used for the identification and classification of skin cancers. This survey reviews different DL techniques for skin cancer identification and classification. Performance metrics such as precision, recall, accuracy, sensitivity, specificity, and F-measure are used to evaluate the effectiveness of SC identification using DL techniques. With these DL techniques, classification accuracy increases while computational complexity and time consumption are mitigated.
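The evaluation metrics the survey relies on all derive from the four confusion-matrix counts of a binary classifier. A self-contained sketch of their standard definitions (the counts below are hypothetical):

```python
def classification_metrics(tp, fp, fn, tn):
    """Standard binary-classification metrics from confusion-matrix counts:
    tp/fp = true/false positives, fn/tn = false/true negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)              # also called sensitivity
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    f_measure = 2 * precision * recall / (precision + recall)
    return {"precision": precision, "recall": recall,
            "specificity": specificity, "accuracy": accuracy,
            "f_measure": f_measure}

# Hypothetical test-set outcome for a lesion classifier:
metrics = classification_metrics(tp=8, fp=2, fn=2, tn=8)
```

Reporting several of these together matters because, on the imbalanced datasets typical of skin-lesion benchmarks, accuracy alone can look high while recall on the malignant class stays poor.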

Keywords: skin cancer, deep learning, performance measures, accuracy, datasets

Procedia PDF Downloads 128
216 ALEF: An Enhanced Approach to Arabic-English Bilingual Translation

Authors: Abdul Muqsit Abbasi, Ibrahim Chhipa, Asad Anwer, Saad Farooq, Hassan Berry, Sonu Kumar, Sundar Ali, Muhammad Owais Mahmood, Areeb Ur Rehman, Bahram Baloch

Abstract:

Accurate translation between structurally diverse languages, such as Arabic and English, presents a critical challenge in natural language processing due to significant linguistic and cultural differences. This paper investigates the effectiveness of Facebook’s mBART model, fine-tuned specifically for sequence-to-sequence (seq2seq) translation tasks between Arabic and English, and enhanced through advanced refinement techniques. Our approach leverages the Alef Dataset, a meticulously curated parallel corpus spanning various domains to capture the linguistic richness, nuances, and contextual accuracy essential for high-quality translation. We further refine the model’s output using advanced language models such as GPT-3.5 and GPT-4, which improve fluency and coherence and correct grammatical errors in translated texts. The fine-tuned model demonstrates substantial improvements, achieving a BLEU score of 38.97, a METEOR score of 58.11, and a TER score of 56.33, surpassing widely used systems such as Google Translate. These results underscore the potential of mBART, combined with refinement strategies, to bridge the translation gap between Arabic and English, providing a reliable, context-aware machine translation solution that is robust across diverse linguistic contexts.
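For intuition about the reported scores, here is a deliberately simplified sentence-level BLEU variant using only clipped unigram precision and the brevity penalty; real evaluations (including, presumably, this paper's) use corpus-level BLEU over n-grams up to order 4, e.g. via sacreBLEU:

```python
import math
from collections import Counter

def bleu1(candidate, reference):
    """Simplified BLEU: clipped unigram precision times the brevity penalty.
    Illustrative only; not a substitute for corpus-level 4-gram BLEU."""
    cand, ref = candidate.split(), reference.split()
    ref_counts = Counter(ref)
    clipped = sum(min(c, ref_counts[w]) for w, c in Counter(cand).items())
    precision = clipped / len(cand)
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / len(cand))
    return bp * precision
```

Clipping is what prevents a degenerate candidate like "the the the" from scoring well against a reference containing "the" once.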

Keywords: natural language processing, machine translation, fine-tuning, Arabic-English translation, transformer models, seq2seq translation, translation evaluation metrics, cross-linguistic communication

Procedia PDF Downloads 7
215 Influence of Travel Time Reliability on Elderly Drivers Crash Severity

Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig

Abstract:

Although older drivers (defined as those aged 65 and above) are less involved in speeding, alcohol use, and night driving, they are more vulnerable to severe crashes, with frailty and medical complications as the major contributing factors. Several studies have evaluated the factors contributing to crash severity. However, few studies have established the impact of travel time reliability (TTR) on road safety. In particular, the impact of TTR on senior adults, who face several challenges including hearing difficulties, declining processing skills, and cognitive problems in driving, is not well established. Therefore, this study focuses on determining the possible impacts of TTR on traffic safety with a focus on elderly drivers. Historical travel speed data from freeway links in the study area were used to calculate travel time and the associated TTR metrics, that is, the planning time index, the buffer index, the standard deviation of the travel time, and the probability of congestion. Four years of information on crashes occurring on these freeway links was acquired. A binary logit model estimated using the Markov Chain Monte Carlo (MCMC) sampling technique was used to evaluate variables that could be influencing elderly crash severity. Preliminary results of the analysis suggest that TTR is statistically significant in affecting the severity of a crash involving an elderly driver. The results suggest that a one-unit increase in the probability of congestion reduces the likelihood of a severe elderly crash by nearly 22%. These findings will enhance the understanding of TTR and its impact on elderly crash severity.
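The planning time index and buffer index named above are conventionally defined from the 95th-percentile travel time. A sketch of those two metrics under that common definition (the percentile method and sample data are our assumptions, not necessarily the study's exact procedure):

```python
def travel_time_reliability(times, free_flow):
    """Planning time index and buffer index from observed link travel times.
    Uses the nearest-rank 95th percentile; times and free_flow share units."""
    ordered = sorted(times)
    idx = max(0, int(round(0.95 * len(ordered))) - 1)
    t95 = ordered[idx]                      # 95th-percentile travel time
    mean = sum(ordered) / len(ordered)
    planning_time_index = t95 / free_flow   # extra time vs. free flow
    buffer_index = (t95 - mean) / mean      # extra buffer vs. average trip
    return planning_time_index, buffer_index
```

Usage: with hypothetical travel times of 1 to 100 minutes and a 1-minute free-flow time, the link is wildly unreliable and both indices are large.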

Keywords: highway safety, travel time reliability, elderly drivers, traffic modeling

Procedia PDF Downloads 493
214 Tractography Analysis of the Evolutionary Origin of Schizophrenia

Authors: Asmaa Tahiri, Mouktafi Amine

Abstract:

A substantial body of traditional medical research has been devoted to managing and treating mental disorders. At present, to the best of our knowledge, it is believed that a fundamental understanding of the underlying causes of the majority of psychological disorders still needs to be explored to inform early diagnosis, symptom management, and treatment. The emerging field of evolutionary psychology is a promising prospect for addressing the origin of mental disorders, potentially leading to more effective treatments. Schizophrenia, as a topical mental disorder, has been linked to the evolutionary adaptation of the human brain, represented in the brain connectivity and asymmetry directly linked to humans’ higher cognition, in contrast to other primates, our closest living representation of the structure and connectivity of our earliest common African ancestors. As proposed in the evolutionary psychology literature, the pathophysiology of schizophrenia is expressed in, and directly linked to, altered connectivity between the Hippocampal Formation (HF) and the Dorsolateral Prefrontal Cortex (DLPFC). This research paper presents the results of tractography analysis on multiple open-access Diffusion Weighted Imaging (DWI) datasets of healthy subjects, subjects affected by schizophrenia, and primates, to illustrate the relevance of the connectivity of the aforementioned brain regions and the underlying evolutionary changes in the human brain. Deterministic fiber tracking and streamline analysis were used to generate connectivity matrices from the DWI datasets, which were overlaid to compute distances and highlight disconnectivity patterns in conjunction with other fiber tracking metrics: Fractional Anisotropy (FA), Mean Diffusivity (MD), and Radial Diffusivity (RD).
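The three tensor metrics named at the end have standard closed forms in terms of the diffusion-tensor eigenvalues. A minimal sketch of those formulas (not the authors' pipeline, which would compute them voxel-wise from fitted tensors):

```python
import math

def tensor_metrics(l1, l2, l3):
    """FA, MD, RD from the three diffusion-tensor eigenvalues (l1 >= l2 >= l3)."""
    md = (l1 + l2 + l3) / 3            # mean diffusivity
    rd = (l2 + l3) / 2                 # radial diffusivity
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = math.sqrt(1.5 * num / den)    # fractional anisotropy, in [0, 1]
    return fa, md, rd
```

Isotropic diffusion (equal eigenvalues) yields FA = 0, while diffusion confined to a single axis, as in a coherent fiber bundle, drives FA toward 1; this is why FA tracks white-matter integrity along the reconstructed streamlines.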

Keywords: tractography, evolutionary psychology, schizophrenia, brain connectivity

Procedia PDF Downloads 71
213 Ultrasound-Assisted Extraction of Bioactive Compounds from Cocoa Shell and Their Encapsulation in Gum Arabic and Maltodextrin: A Technology to Produce Functional Food Ingredients

Authors: Saeid Jafari, Khursheed Ahmad Sheikh, Randy W. Worobo, Kitipong Assatarakul

Abstract:

In this study, the extraction of cocoa shell powder (CSP) was optimized, and the optimized extracts were spray-dried for encapsulation purposes. Temperature (45-65 ◦C), extraction time (30-60 min), and ethanol concentration (60-100%) were the extraction parameters. The response surface methodology analysis revealed that the model was significant (p ≤ 0.05) in interactions between all variables (total phenolic compound, total flavonoid content, and antioxidant activity as measured by the 2,2-diphenyl-1-picrylhydrazyl (DPPH) and ferric reducing antioxidant power (FRAP) assays), with the lack-of-fit test for the model being insignificant (p > 0.05). Temperature (55 ◦C), time (45 min), and ethanol concentration (60%) were found to be the optimal extraction conditions. For spray-drying encapsulation, differences in some quality metrics (e.g., water solubility, water activity) were insignificant (p > 0.05). The microcapsules were found to be spherical in shape using a scanning electron microscope. Thermogravimetric and differential thermogravimetric measurements of the microcapsules revealed nearly identical results. The gum arabic + maltodextrin microcapsule (GMM) showed potential antibacterial (zone of inhibition: 11.50 mm; lower minimum inhibitory concentration: 1.50 mg/mL) and antioxidant (DPPH: 1063 mM trolox/100 g dry wt.) activities (p ≤ 0.05). In conclusion, the microcapsules in this study, particularly GMM, are promising antioxidant and antibacterial agents to be fortified as functional food ingredients for the production of nutraceutical foods with health-promoting properties.

Keywords: functional foods, cocoa shell powder, antioxidant activity, encapsulation, extraction

Procedia PDF Downloads 57
212 Leveraging Mobile Apps for Citizen-Centric Urban Planning: Insights from Tajawob Implementation

Authors: Alae El Fahsi

Abstract:

This study explores the ‘Tajawob’ app's role in urban development, demonstrating how mobile applications can empower citizens and facilitate urban planning. Tajawob serves as a digital platform for community feedback, engagement, and participatory governance, addressing urban challenges through innovative tech solutions. This research synthesizes data from a variety of sources, including user feedback, engagement metrics, and interviews with city officials, to assess the app’s impact on citizen participation in urban development in Morocco. By integrating advanced data analytics and user experience design, Tajawob has bridged the communication gap between citizens and government officials, fostering a more collaborative and transparent urban planning process. The findings reveal a significant increase in civic engagement, with users actively contributing to urban management decisions, thereby enhancing the responsiveness and inclusivity of urban governance. Challenges such as digital literacy, infrastructure limitations, and privacy concerns are also discussed, providing a comprehensive overview of the obstacles and opportunities presented by mobile app-based citizen engagement platforms. The study concludes with strategic recommendations for scaling the Tajawob model to other contexts, emphasizing the importance of adaptive technology solutions in meeting the evolving needs of urban populations. This research contributes to the burgeoning field of smart city innovations, offering key insights into the role of digital tools in facilitating more democratic and participatory urban environments.

Keywords: smart cities, digital governance, urban planning, strategic design

Procedia PDF Downloads 58
211 Buffer Allocation and Traffic Shaping Policies Implemented in Routers Based on a New Adaptive Intelligent Multi Agent Approach

Authors: M. Taheri Tehrani, H. Ajorloo

Abstract:

In this paper, an intelligent multi-agent framework is developed for each router, in which agents have two vital functionalities, traffic shaping and buffer allocation, and are positioned at the ports of the routers. With the traffic shaping functionality, agents shape the forwarded traffic by dynamic, real-time allocation of the token generation rate in a Token Bucket algorithm; with the buffer allocation functionality, agents share their buffer capacity with each other based on their needs and the conditions of the network. This dynamic and intelligent framework gives some ports the opportunity to work better under bursty and busier conditions. The agents work intelligently based on a Reinforcement Learning (RL) algorithm and consider effective parameters in their decision process. As RL is limited in how many parameters it can consider in its decision process due to the volume of calculations, we utilize our novel method, which applies Principal Component Analysis (PCA) to the RL and gives the algorithm the high-dimensional ability to consider as many parameters as needed in its decision process. When this implementation is compared to our previous work, where traffic shaping was done without any sharing or dynamic allocation of buffer size for each port, lower packet drop in the whole network, and specifically at the source routers, can be seen. These methods are implemented in our previously proposed intelligent simulation environment to better compare the performance metrics. The results obtained from this simulation environment show an efficient and dynamic utilization of resources in terms of bandwidth and the buffer capacities preallocated to each port.
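The Token Bucket shaper whose token-generation rate the agents tune can be sketched in a few lines; in the paper's framework the `rate` below is what the RL agents adjust dynamically, while this standalone version (our sketch, not the authors' code) keeps it fixed:

```python
class TokenBucket:
    """Token bucket shaper: tokens accrue at `rate` per second up to
    `capacity`; a packet consuming `size` tokens is forwarded only if
    enough tokens are available, otherwise it is held or dropped."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity      # start with a full bucket
        self.timestamp = 0.0

    def allow(self, size, now):
        # Refill proportionally to the elapsed time, capped at capacity.
        elapsed = now - self.timestamp
        self.timestamp = now
        self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
        if self.tokens >= size:
            self.tokens -= size
            return True
        return False
```

Usage: a bucket with `rate=10` and `capacity=10` admits a 10-token burst immediately, refuses the next packet, and admits a 5-token packet again after 0.5 s of refill, which is exactly the burst-tolerant behavior the per-port agents exploit.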

Keywords: principal component analysis, reinforcement learning, buffer allocation, multi-agent systems

Procedia PDF Downloads 518
210 Cooperative Cross Layer Topology for Concurrent Transmission Scheduling Scheme in Broadband Wireless Networks

Authors: Gunasekaran Raja, Ramkumar Jayaraman

Abstract:

In this paper, we consider a CCL-N (Cooperative Cross Layer Network) topology based on the cross-layer (both centralized and distributed) environment to form network communities. Various performance metrics related to IEEE 802.16 networks are discussed in designing the CCL-N topology. In the CCL-N topology, nodes are classified as master nodes (Master Base Station [MBS]) and serving nodes (Relay Station [RS]), and node communities are organized based on standard networking terminology. Based on the CCL-N topology, simulation analyses for both transparent and non-transparent relays are tabulated and throughput efficiency is calculated. The weighted load-balancing problem plays a challenging role in IEEE 802.16 networks. A CoTS (Concurrent Transmission Scheduling) scheme is formulated in terms of three aspects: transmission mechanisms based on identical communities, different communities, and identical node communities. The CoTS scheme helps in identifying the weighted load-balancing problem. The analytical results show that the modularity value is inversely proportional to the error value, and the modularity value plays a key role in solving the CoTS problem based on hop count. The transmission mechanism for identical node communities has no impact, since the modularity value is the same for all network groups. This paper therefore discusses the three community aspects based on the modularity value, which helps in solving the weighted load-balancing and CoTS problems.
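The modularity value driving the analysis can be computed from a community partition in the standard Newman form; a minimal sketch on a toy undirected topology (the six-node example graph is illustrative only, not the paper's CCL-N simulation):

```python
def modularity(adj, communities):
    """Newman modularity: Q = (1/2m) * sum_ij [A_ij - k_i*k_j/(2m)] * delta(c_i, c_j)."""
    n = len(adj)
    degree = [sum(row) for row in adj]
    two_m = sum(degree)  # 2m: each undirected edge is counted twice
    q = 0.0
    for i in range(n):
        for j in range(n):
            if communities[i] == communities[j]:
                q += adj[i][j] - degree[i] * degree[j] / two_m
    return q / two_m

# Two tight communities {0,1,2} and {3,4,5} joined by a single relay link.
adj = [[0, 1, 1, 0, 0, 0],
       [1, 0, 1, 0, 0, 0],
       [1, 1, 0, 1, 0, 0],
       [0, 0, 1, 0, 1, 1],
       [0, 0, 0, 1, 0, 1],
       [0, 0, 0, 1, 1, 0]]
good = modularity(adj, [0, 0, 0, 1, 1, 1])  # partition matching the real structure
bad = modularity(adj, [0, 1, 0, 1, 0, 1])   # arbitrary split scores lower
```

A partition that matches the actual community structure yields the higher Q, which is why modularity serves as a quality signal when grouping nodes into communities.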

Keywords: cross layer network topology, concurrent scheduling, modularity value, network communities and weighted load balancing

Procedia PDF Downloads 265
209 Comparison of Different Machine Learning Algorithms for Solubility Prediction

Authors: Muhammet Baldan, Emel Timuçin

Abstract:

Molecular solubility prediction plays a crucial role in fields such as drug discovery, environmental science, and material science. In this study, we compare the performance of five machine learning algorithms for predicting molecular solubility using the AqSolDB dataset: linear regression, support vector machines (SVM), random forests, gradient boosting machines (GBM), and neural networks. The dataset consists of 9,981 data points with their corresponding solubility values. MACCS keys (166 bits), RDKit properties (20 properties), and structural properties (3) are extracted as features for every SMILES representation in the dataset, so a total of 189 features is used for training and testing for every molecule. Each algorithm is trained on a subset of the dataset and evaluated using accuracy scores. Additionally, the computational time for training and testing is recorded to assess the efficiency of each algorithm. Our results demonstrate that the random forest model outperformed the other algorithms in predictive accuracy, achieving an accuracy score of 0.93. Gradient boosting machines and neural networks also exhibit strong performance, closely followed by support vector machines. Linear regression, while simpler in nature, demonstrates competitive performance but with slightly higher errors compared to the ensemble methods. Overall, this study provides valuable insight into the performance of machine learning algorithms for molecular solubility prediction, highlighting the importance of algorithm selection in achieving accurate and efficient predictions in practical applications.
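The random-forest pipeline described here can be sketched with scikit-learn; this is a minimal sketch on synthetic binary fingerprint-like features standing in for the 189 MACCS/RDKit/structural descriptors, with a hypothetical solubility-class label (the real study extracts features from AqSolDB SMILES strings):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for the 189 features per molecule (MACCS keys are binary bits).
X = rng.integers(0, 2, size=(1000, 189)).astype(float)
# Hypothetical binary label (e.g. soluble vs. insoluble) tied to a few feature bits.
y = (X[:, 0] + X[:, 1] + X[:, 2] > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
```

Swapping `RandomForestClassifier` for `GradientBoostingClassifier`, `SVC`, or a linear model reproduces the kind of head-to-head comparison the study performs.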

Keywords: random forest, machine learning, comparison, feature extraction

Procedia PDF Downloads 40
208 Assessment of Mountain Hydrological Processes in the Gumera Catchment, Ethiopia

Authors: Tewele Gebretsadkan Haile

Abstract:

Mountain terrain is essential to regional water resources, regulating the hydrological processes that supply downstream water. Nevertheless, the scarcity of observed earth data in complex topography poses challenges for water resources regulation, which is why satellite products are used in this study. This study evaluates hydrological processes in the mountainous Gumera catchment, Ethiopia (area 1,289 km²), using the HBV-light model with satellite precipitation products (CHIRPS) over the period 1996 to 2010. The catchment is dominated by cultivation, and its elevation ranges from 1,788 to 3,606 m above sea level. Three meteorological stations were used for downscaling the satellite data, and one stream-flow gauge was used for calibration and validation. The total annual water balance showed 1,410 mm of precipitation and 828 mm of simulated surface runoff, compared with 1,042 mm of observed stream flow, with an estimated actual evapotranspiration of 586 mm and a potential evapotranspiration of 1,495 mm. Temperatures range from 9°C in winter to 21°C. The catchment contributes 74% of total runoff as quick runoff and 26% as lower groundwater storage, which sustains stream flow during low-flow periods. Model uncertainty was measured using several metrics: the coefficient of determination, model efficiency, efficiency for log(Q), and flow-weighted efficiency were 0.76, 0.74, 0.66, and 0.70, respectively. The results highlight that the HBV model captures the mountain hydrology well; the dominance of quick runoff is attributed to the traditional agricultural system and the steep topography, and adaptation measures for water resource management are recommended.
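The model efficiency and log(Q) efficiency reported here follow the standard Nash-Sutcliffe form; a minimal sketch with illustrative daily flows (the numbers below are invented for demonstration, not the Gumera record):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - sum((obs-sim)^2) / sum((obs-mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def log_nse(obs, sim):
    """Efficiency on log-transformed flows, emphasizing low-flow (baseflow) periods."""
    return nse(np.log(obs), np.log(sim))

# Illustrative daily flows (m3/s), not observed Gumera data.
obs = np.array([5.0, 12.0, 30.0, 18.0, 8.0, 4.0])
sim = np.array([6.0, 10.0, 27.0, 20.0, 9.0, 3.5])
eff, eff_log = nse(obs, sim), log_nse(obs, sim)
```

A value of 1 indicates a perfect fit and values near 0 indicate the model is no better than the observed mean, which is how the reported 0.74 and 0.66 should be read.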

Keywords: mountain hydrology, CHIRPS, Gumera, HBV model

Procedia PDF Downloads 11
207 Quantification of the Gumera Catchment's Mountain Hydrological Processes in Ethiopia

Authors: Tewele Gebretsadkan Haile

Abstract:

Mountain terrain is essential to regional water resources, regulating the hydrological processes that supply downstream water. Nevertheless, the scarcity of observed earth data in complex topography poses challenges for water resources regulation, which is why satellite products are used in this study. This study evaluates hydrological processes in the mountainous Gumera catchment, Ethiopia (area 1,289 km²), using the HBV-light model with satellite precipitation products (CHIRPS) over the period 1996 to 2010. The catchment is dominated by cultivation, and its elevation ranges from 1,788 to 3,606 m above sea level. Three meteorological stations were used for downscaling the satellite data, and one stream-flow gauge was used for calibration and validation. The total annual water balance showed 1,410 mm of precipitation and 828 mm of simulated surface runoff, compared with 1,042 mm of observed stream flow, with an estimated actual evapotranspiration of 586 mm and a potential evapotranspiration of 1,495 mm. Temperatures range from 9°C in winter to 21°C. The catchment contributes 74% of total runoff as quick runoff and 26% as lower groundwater storage, which sustains stream flow during low-flow periods. Model uncertainty was measured using several metrics: the coefficient of determination, model efficiency, efficiency for log(Q), and flow-weighted efficiency were 0.76, 0.74, 0.66, and 0.70, respectively. The results highlight that the HBV model captures the mountain hydrology well; the dominance of quick runoff is attributed to the traditional agricultural system and the steep topography, and adaptation measures for water resource management are recommended.

Keywords: mountain hydrology, CHIRPS, HBV model, Gumera

Procedia PDF Downloads 10
206 A Comparative Time-Series Analysis and Deep Learning Projection of Innate Radon Gas Risk in Canadian and Swedish Residential Buildings

Authors: Selim M. Khan, Dustin D. Pearson, Tryggve Rönnqvist, Markus E. Nielsen, Joshua M. Taron, Aaron A. Goodarzi

Abstract:

Accumulation of radioactive radon gas in indoor air poses a serious risk to human health by increasing the lifetime risk of lung cancer and is classified by IARC as a category one carcinogen. Radon exposure risks are a function of geologic, geographic, design, and human behavioural variables and can change over time. Using time series and deep machine learning modelling, we analyzed long-term radon test outcomes as a function of building metrics from 25,489 Canadian and 38,596 Swedish residential properties constructed between 1945 and 2020. While Canadian and Swedish properties built between 1970 and 1980 are comparable (96–103 Bq/m³), innate radon risks subsequently diverge, rising in Canada and falling in Sweden, such that 21st Century Canadian houses show 467% greater average radon (131 Bq/m³) relative to Swedish equivalents (28 Bq/m³). These trends are consistent across housing types and regions within each country. The introduction of energy efficiency measures within Canadian and Swedish building codes coincided with opposing radon level trajectories in each nation. Deep machine learning modelling predicts that, without intervention, average Canadian residential radon levels will increase to 176 Bq/m³ by 2050, emphasizing the importance and urgency of future building code intervention to achieve systemic radon reduction in Canada.

Keywords: radon health risk, time-series, deep machine learning, lung cancer, Canada, Sweden

Procedia PDF Downloads 85
205 Predicting Emerging Agricultural Investment Opportunities: The Potential of Structural Evolution Index

Authors: Kwaku Damoah

Abstract:

The agricultural sector is characterized by continuous transformation, driven by factors such as demographic shifts, evolving consumer preferences, climate change, and migration trends. This dynamic environment presents complex challenges for key stakeholders including farmers, governments, and investors, who must navigate these changes to achieve optimal investment returns. To effectively predict market trends and uncover promising investment opportunities, a systematic, data-driven approach is essential. This paper introduces the Structural Evolution Index (SEI), a machine learning-based methodology. SEI is specifically designed to analyse long-term trends and forecast the potential of emerging agricultural products for investment. Versatile in application, it evaluates various agricultural metrics such as production, yield, trade, land use, and consumption, providing a comprehensive view of the evolution within agricultural markets. By harnessing data from the UN Food and Agricultural Organisation (FAOSTAT), this study demonstrates the SEI's capabilities through Comparative Exploratory Analysis and evaluation of international trade in agricultural products, focusing on Malaysia and Singapore. The SEI methodology reveals intricate patterns and transitions within the agricultural sector, enabling stakeholders to strategically identify and capitalize on emerging markets. This predictive framework is a powerful tool for decision-makers, offering crucial insights that help anticipate market shifts and align investments with anticipated returns.

Keywords: agricultural investment, algorithm, comparative exploratory analytics, machine learning, market trends, predictive analytics, structural evolution index

Procedia PDF Downloads 63
204 A Hebbian Neural Network Model of the Stroop Effect

Authors: Vadim Kulikov

Abstract:

The classical Stroop effect is the phenomenon that it takes more time to name the ink color of a printed word if the word denotes a conflicting color than if it denotes the same color. Over the last 80 years, many variations of the experiment have revealed various mechanisms behind semantic, attentional, behavioral and perceptual processing. The Stroop task is known to exhibit asymmetry: reading the words out loud is hardly dependent on the ink color, but naming the ink color is significantly influenced by incongruent words. This asymmetry is reversed if, instead of naming the color, one has to point at a corresponding color patch. Other debated aspects are the notion of automaticity and how much of the effect is due to semantic versus response-stage interference. Is automaticity a continuous or an all-or-none phenomenon? Many models and theories in the literature tackle these questions and will be discussed in the presentation; none of them, however, seems to capture all the findings at once. A computational model is proposed, based on the philosophical idea developed by the author that the mind operates as a collection of different information-processing modalities, such as sensory and descriptive modalities, which produce emergent phenomena through mutual interaction and coherence. This is the framework theory, where ‘framework’ attempts to generalize the concepts of modality, perspective and ‘point of view’. The architecture of the computational model consists of blocks of neurons, each block corresponding to one framework. In the simplest case there are four: visual color processing, text reading, speech production and attention selection modalities. In experiments where button pressing or pointing is required, a corresponding block is added. In the beginning, the weights of the neural connections are mostly set to zero.
The network is trained using Hebbian learning to establish connections (corresponding to ‘coherence’ in framework theory) between these different modalities. The amount of data fed into the network is meant to mimic the amount of practice a human encounters; in particular, it is assumed that converting written text into spoken words is a more practiced skill than converting visually perceived colors into spoken color names. After training, the network performs the Stroop task. The RTs are measured in a canonical way, as these are continuous-time recurrent neural networks (CTRNNs). The above-described aspects of the Stroop phenomenon, along with many others, are replicated. The model is similar to some existing connectionist models but, as will be discussed in the presentation, has several advantages: it predicts more data, and its architecture is simpler and biologically more plausible.
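The Hebbian training step between two modality blocks can be sketched as a simple outer-product weight update; a toy two-block version, with block sizes, patterns, and learning rate chosen for illustration (not the paper's CTRNN implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

n_text, n_speech = 4, 4               # toy 'text reading' and 'speech production' blocks
W = np.zeros((n_speech, n_text))      # connections start at zero, as in the model
eta = 0.1                             # learning rate (illustrative value)

# One-hot patterns: presenting written word i co-activates spoken word i.
for _ in range(200):                  # more exposures model a more practiced skill
    i = rng.integers(n_text)
    text = np.eye(n_text)[i]          # presynaptic activity
    speech = np.eye(n_speech)[i]      # postsynaptic activity
    W += eta * np.outer(speech, text)  # Hebbian rule: dW = eta * post * pre^T

# After training, a written word drives its own spoken form most strongly.
response = W @ np.eye(n_text)[2]
```

Training the color-to-speech pathway with fewer exposures than the text-to-speech pathway yields weaker weights on that route, which is one way the reading/naming asymmetry can emerge from practice alone.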

Keywords: connectionism, Hebbian learning, artificial neural networks, philosophy of mind, Stroop

Procedia PDF Downloads 264
203 Performance Analysis of Search Medical Imaging Service on Cloud Storage Using Decision Trees

Authors: González A. Julio, Ramírez L. Leonardo, Puerta A. Gabriel

Abstract:

Telemedicine services use a large amount of data, most of which are diagnostic images in Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7) formats. Metadata is generated from each related image to support its identification. This study presents the use of decision trees for the optimization of information search processes for diagnostic images hosted on a cloud server. To analyze server performance, the following quality-of-service (QoS) metrics are evaluated: delay, bandwidth, jitter, latency and throughput, in five test scenarios for a total of 26 experiments during the uploading and downloading of DICOM images hosted by the telemedicine group server of the Universidad Militar Nueva Granada, Bogotá, Colombia. By applying decision trees as a data mining technique and comparing them with sequential search, it was possible to evaluate the search times for diagnostic images on the server. The results show that by using the metadata in decision trees, search times are substantially improved, computational resources are optimized and the request management of the telemedicine image service is improved. Based on the experiments carried out, search efficiency increased by 45% relative to sequential search, since false positives are avoided in the management and acquisition of the requested information when downloading a diagnostic image. It is concluded that, for diagnostic image services in telemedicine, the decision tree technique guarantees accessibility and robustness in the acquisition and manipulation of medical images, improving diagnoses and medical procedures for patients.
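The metadata-driven lookup can be sketched with a scikit-learn decision tree trained on encoded metadata fields; the field names, encodings, and partition targets below are hypothetical stand-ins for the DICOM/HL7 metadata the service actually indexes:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 500

# Hypothetical encoded metadata per stored study: modality, body part, series size.
X = np.column_stack([
    rng.integers(0, 4, n),    # modality code (e.g. CT/MR/US/XR)
    rng.integers(0, 6, n),    # body-part code
    rng.integers(1, 300, n),  # number of images in the series
])
# Target: which storage partition holds the study, determined here by modality
# and body part, so the tree can route a query without scanning everything.
y = X[:, 0] * 6 + X[:, 1]

tree = DecisionTreeClassifier(random_state=0).fit(X, y)
# A query's metadata is routed straight to a candidate partition, replacing a
# sequential scan over every stored study with a few comparisons down the tree.
pred = tree.predict([[1, 3, 40]])[0]
```

The depth of the tree, rather than the number of stored studies, bounds the lookup cost, which is the source of the speedup over sequential search.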

Keywords: cloud storage, decision trees, diagnostic image, search, telemedicine

Procedia PDF Downloads 204
202 Virtual Team Management in Companies and Organizations

Authors: Asghar Zamani, Mostafa Falahmorad

Abstract:

Virtualization is established to combine and use the unique capabilities of employees to increase productivity and agility to provide services regardless of location. Adapting to fast and continuous change and getting maximum access to human resources are reasons why virtualization is happening. The distance problem is solved by information. Flexibility is the most important feature of virtualization, and information will be the main focus of virtualized companies. In this research, we used the Covid-19 opportunity window to assess the productivity of the companies that had been going through more virtualized management before the Covid-19 in comparison with those that just started planning on developing infrastructures on virtual management after the crises of pandemic occurred. The research process includes financial (profitability and customer satisfaction) and behavioral (organizational culture and reluctance to change) metrics assessment. In addition to financial and CRM KPIs, a questionnaire is devised to assess how manager and employees’ attitude has been changing towards the migration to virtualization. The sample companies and questions are selected by asking from experts in the IT industry of Iran. In this article, the conclusion is that companies open to virtualization based on accurate strategic planning or willing to pay to train their employees for virtualization before the pandemic are more agile in adapting to change and moving forward in recession. The prospective companies in this research, not only could compensate for the short period loss from the first shock of the Covid-19, but they could also foresee new needs of their customer sooner than other competitors, resulting in the need to employ new staff for executing the emerging demands. Findings were aligned with the literature review. 
Results can be a wake-up call for business owners especially in developing countries to be more resilient toward modern management styles instead of continuing with traditional ones.

Keywords: virtual management, virtual organization, competitive advantage, KPI, profit

Procedia PDF Downloads 83
201 Tractography Analysis and the Evolutionary Origin of Schizophrenia

Authors: Mouktafi Amine, Tahiri Asmaa

Abstract:

A substantial body of traditional medical research has been devoted to managing and treating mental disorders. At present, to our best knowledge, a fundamental understanding of the underlying causes of the majority of psychological disorders still needs to be explored further to inform early diagnosis, symptom management and treatment. The emerging field of evolutionary psychology is a promising prospect for addressing the origin of mental disorders, potentially leading to more effective treatments. Schizophrenia, as a topical mental disorder, has been linked to the evolutionary adaptation of the human brain, expressed in the brain connectivity and asymmetry that underlie humans' higher cognition; other primates, by contrast, are our closest living representation of the structure and connectivity of our earliest common African ancestors. As proposed in the evolutionary psychology literature, the pathophysiology of schizophrenia is expressed in, and directly linked to, altered connectivity between the Hippocampal Formation (HF) and the Dorsolateral Prefrontal Cortex (DLPFC). This paper presents the results of tractography analysis applied to multiple open-access Diffusion Weighted Imaging (DWI) datasets of healthy subjects, schizophrenia-affected subjects and primates, to illustrate the relevance of the aforementioned brain regions' connectivity and the underlying evolutionary changes in the human brain. Deterministic fiber tracking and streamline analysis were used to generate connectivity matrices from the DWI datasets, which were overlaid to compute distances and highlight disconnectivity patterns in conjunction with other fiber tracking metrics: Fractional Anisotropy (FA), Mean Diffusivity (MD) and Radial Diffusivity (RD).
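The FA, MD and RD metrics cited here all derive from the eigenvalues of each voxel's diffusion tensor via standard formulas; a minimal sketch with textbook-scale illustrative eigenvalues (not values from the datasets used in the study):

```python
import numpy as np

def dti_metrics(evals):
    """FA, MD, RD from diffusion-tensor eigenvalues (l1 >= l2 >= l3)."""
    l1, l2, l3 = evals
    md = (l1 + l2 + l3) / 3.0                    # mean diffusivity
    rd = (l2 + l3) / 2.0                         # radial diffusivity
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = np.sqrt(1.5 * num / den)                # fractional anisotropy in [0, 1]
    return fa, md, rd

# Illustrative eigenvalues (mm^2/s): a coherent white-matter-like voxel versus an
# isotropic one, showing how FA separates organized tracts from free diffusion.
fa_wm, md_wm, rd_wm = dti_metrics((1.7e-3, 0.3e-3, 0.3e-3))
fa_iso, md_iso, rd_iso = dti_metrics((1.0e-3, 1.0e-3, 1.0e-3))
```

High FA marks coherent fiber bundles along tracked streamlines, while MD and RD capture overall and perpendicular diffusivity, which is why the three are reported together alongside the connectivity matrices.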

Keywords: tractography, diffusion weighted imaging, schizophrenia, evolutionary psychology

Procedia PDF Downloads 49
200 Chloride Ion Channels Play a Role in Mediating Immune Response during Pseudomonas aeruginosa Infection

Authors: Hani M. Alothaid, Louise Robson, Richmond Muimo

Abstract:

Cystic fibrosis (CF) is a disease that affects respiratory function; in the EU it affects about 1 in 2,500 live births, with an average 40-year life expectancy. The disease is caused by mutations within the gene encoding the CFTR (Cystic Fibrosis Transmembrane conductance Regulator) chloride channel, leading to dysregulation of epithelial fluid transport and chronic lung inflammation, suggesting functional alterations of immune cells. In airways, CFTR has been found to form a functional complex with S100A10 and AnxA2 in a cAMP/PKA-dependent manner, and the AnxA2-S100A10/CFTR multiprotein complex is also regulated by calcineurin. The aims of this study were i) to investigate whether chloride ion (Cl−) channels are activated by Pseudomonas aeruginosa lipopolysaccharide (LPS from PA), ii) to determine whether this activation is regulated by the cAMP/PKA/calcineurin pathway, and iii) to investigate the role of LPS-activated Cl− channels in the release of pro-inflammatory cytokines by immune cells. Human peripheral blood monocytes were used in the study. Whole-cell patch records showed that LPS from PA can activate Cl− channels, including CFTR and the outwardly-rectifying Cl− channel (ORCC), and this activation appears to require an intact PKA/calcineurin signalling pathway. The outward conductance (Gout) in the presence of LPS was significantly inhibited by diisothiocyanatostilbene-disulfonic acid (DIDS), an ORCC blocker (p<0.001), and was further suppressed by CFTR(inh)-172, a specific inhibitor of CFTR channels (p<0.001). Monocytes pre-incubated with a PKA or calcineurin inhibitor before stimulation with LPS from PA exhibited DIDS- and CFTR(inh)-172-insensitive currents, whereas activation of both ORCC and CFTR was observed in response to monocyte exposure to LPS alone. Additionally, ELISA showed that CFTR and ORCC play a role in mediating the release of pro-inflammatory cytokines such as IL-1β upon exposure of monocytes to LPS, and this secretion was significantly reduced when CFTR and ORCC were inhibited. Cl− may nevertheless play a role in IL-1β release independent of cAMP/PKA/calcineurin signalling, since IL-1β secretion was enhanced even when the cAMP/PKA/calcineurin pathway was inhibited. In conclusion, our data confirm that LPS from PA activates Cl− channels in human peripheral blood monocytes and that Cl− channels are involved in IL-1β release in monocytes upon exposure to LPS; however, PKA and calcineurin do not seem to influence this Cl−-dependent cytokine release.

Keywords: cystic fibrosis, CFTR, Annexin A2, S100A10, PP2B, PKA, outwardly-rectifying Cl− channel, Pseudomonas aeruginosa

Procedia PDF Downloads 177
199 EcoMush: Mapping Sustainable Mushroom Production in Bangladesh

Authors: A. A. Sadia, A. Emdad, E. Hossain

Abstract:

The increasing importance of mushrooms as a source of nutrition and health benefits, and even as a potential cancer treatment, has raised awareness of the impact of climate-sensitive variables on their cultivation. Factors like temperature, relative humidity, air quality, and substrate composition play pivotal roles in shaping mushroom growth, especially in Bangladesh. Oyster mushrooms, a commonly cultivated variety in this region, are particularly vulnerable to climate fluctuations. This research explores the climatic dynamics affecting oyster mushroom cultivation, presents an approach to address these challenges, and provides tangible solutions to fortify the agro-economy, ensure food security, and promote the sustainability of this crucial food source. Using climate and production data, this study evaluates the performance of three clustering algorithms (KMeans, OPTICS, and BIRCH) based on various quality metrics. While each algorithm demonstrates specific strengths, the findings provide insight into their effectiveness for this specific dataset. The results pinpoint an optimal temperature range of 13°C-22°C, an unfavorable temperature threshold of 28°C and above, and an ideal relative humidity range of 75-85%, along with the suitable production regions in three different seasons: Kharif-1, Kharif-2, and Robi. Additionally, a user-friendly web application was developed to support mushroom farmers in making well-informed decisions about their cultivation practices. This platform offers valuable insight into the most advantageous periods for oyster mushroom farming, with the overarching goal of enhancing the efficiency and profitability of mushroom farming.
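The three-way clustering comparison can be sketched with scikit-learn; this minimal sketch runs KMeans, OPTICS, and BIRCH on synthetic (temperature, humidity) readings and scores each with the silhouette coefficient as one example quality metric (the data and regime centers are invented, not the study's records):

```python
import numpy as np
from sklearn.cluster import OPTICS, Birch, KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Synthetic (temperature degC, relative humidity %) readings with three regimes,
# standing in for the study's climate/production dataset.
X, _ = make_blobs(n_samples=300,
                  centers=[(15, 80), (25, 60), (30, 90)],
                  cluster_std=1.5, random_state=0)

scores = {}
for name, model in [("KMeans", KMeans(n_clusters=3, n_init=10, random_state=0)),
                    ("OPTICS", OPTICS(min_samples=10)),
                    ("BIRCH", Birch(n_clusters=3))]:
    labels = model.fit_predict(X)
    mask = labels != -1                 # OPTICS marks noise points with label -1
    scores[name] = silhouette_score(X[mask], labels[mask])
```

A higher silhouette score indicates tighter, better-separated clusters; comparing the three scores (alongside other quality metrics) is the kind of head-to-head evaluation the study reports.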

Keywords: climate variability, mushroom cultivation, clustering techniques, food security, sustainability, web-application

Procedia PDF Downloads 68
198 Diabetes Mellitus and Blood Glucose Variability Increases the 30-day Readmission Rate after Kidney Transplantation

Authors: Harini Chakkera

Abstract:

Background: Inpatient hyperglycemia is an established independent risk factor for hospital readmission in several patient cohorts, but it has not been studied after kidney transplantation. Nearly one-third of patients who have undergone a kidney transplant reportedly experience 30-day readmission. Methods: Data on first-time solitary kidney transplantations were retrieved for September 2015 to December 2018. Records were linked to the electronic health record to determine a diagnosis of diabetes mellitus and to extract glucometric and insulin therapy data. Univariate logistic regression analysis and the XGBoost algorithm were used to predict 30-day readmission. We report the average performance of the models on the testing sets of five bootstrapped partitions of the data to ensure statistical significance. Results: The cohort included 1,036 patients who received kidney transplantation, of whom 224 (22%) experienced 30-day readmission. The machine learning algorithm was able to predict 30-day readmission with an average AUC of 77.3% (95% CI 75.3-79.3%). We observed statistically significant differences in the presence of pretransplant diabetes, inpatient hyperglycemia, inpatient hypoglycemia, and minimum and maximum glucose values among those with higher 30-day readmission rates. The XGBoost model identified the index admission length of stay, the presence of hyper- and hypoglycemia, and recipient and donor BMI values as the most predictive risk factors for 30-day readmission. Additionally, significant variation in the therapeutic management of blood glucose by providers was observed. Conclusions: Suboptimal glucose metrics during hospitalization after kidney transplantation are associated with an increased risk of 30-day hospital readmission. Optimizing hospital blood glucose management, a modifiable factor, after kidney transplantation may reduce the risk of 30-day readmission.
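A gradient-boosted readmission classifier of the kind described can be sketched as follows; scikit-learn's GradientBoostingClassifier stands in for XGBoost here, and the feature distributions and outcome rule are synthetic, merely echoing the predictors the abstract names (length of stay, glucose extremes, BMIs):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1036  # cohort size from the study; all feature values below are synthetic

# Hypothetical predictors echoing the abstract's most predictive risk factors.
X = np.column_stack([
    rng.normal(7, 3, n),     # index admission length of stay (days)
    rng.normal(180, 40, n),  # maximum inpatient glucose (mg/dL)
    rng.normal(90, 15, n),   # minimum inpatient glucose (mg/dL)
    rng.normal(28, 5, n),    # recipient BMI
    rng.normal(27, 4, n),    # donor BMI
])
# Synthetic outcome loosely tied to stay length and hyperglycemia, ~22% positive
# to mirror the observed readmission rate.
risk = 0.05 * (X[:, 0] - 7) + 0.01 * (X[:, 1] - 180) + rng.normal(0, 0.5, n)
y = (risk > np.quantile(risk, 0.78)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
```

Repeating this fit-and-score loop over several bootstrapped partitions and averaging the AUCs mirrors the evaluation protocol the study describes.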

Keywords: kidney, transplant, diabetes, insulin

Procedia PDF Downloads 90
197 Remote Sensing through Deep Neural Networks for Satellite Image Classification

Authors: Teja Sai Puligadda

Abstract:

Detailed satellite images can serve an important role in geographic study. The quantitative and qualitative information provided by satellite and remote sensing images minimizes the complexity and time of the work. Data and images are captured at regular intervals by satellite remote sensing systems, and the amount of data collected is often enormous and expands rapidly as technology develops. Satellite image classification encompasses interpreting remote sensing images, geographic data mining, and studying distinct vegetation types such as agricultural land and forests. One of the biggest challenges data scientists face while classifying satellite images is finding the classification algorithm best suited to the available data, one able to classify images with the utmost accuracy. Because the sheer volume of data makes categorizing satellite images difficult, many researchers are turning to deep learning algorithms. Since the CNN algorithm gives high accuracy in image recognition problems and automatically detects important features without any human supervision, and the ANN algorithm stores information across the entire network (Abhishek Gupta, 2020), these two deep learning algorithms were used for satellite image classification. This project focuses on remote sensing through deep neural networks, i.e., ANN and CNN, with the DeepSat (SAT-4) airborne dataset for classifying images. Thus, in this project the algorithms ANN and CNN are implemented, evaluated and compared, and their performance is analyzed through evaluation metrics such as accuracy and loss. Additionally, the neural network algorithm that gives the lowest bias and lowest variance in solving multi-class satellite image classification is identified.
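The core CNN operation the abstract relies on (filters convolved over image bands, then a non-linearity and pooling) can be illustrated in plain NumPy; the hand-set edge filter below stands in for weights a real CNN would learn, and the toy patch is invented rather than taken from SAT-4:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a single-band image with one filter."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    # Non-linearity applied after each convolution.
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    # Downsample by keeping the maximum in each size x size block.
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

# Toy 6x6 single-band patch with a dark-to-bright vertical boundary, loosely like
# one band of a SAT-4 tile; the filter responds to vertical edges.
img = np.zeros((6, 6))
img[:, 3:] = 1.0
edge_filter = np.array([[-1.0, 1.0], [-1.0, 1.0]])
feature_map = max_pool(relu(conv2d(img, edge_filter)))
```

A trained CNN stacks many such filter/activation/pooling stages and learns the filter weights from labelled tiles, which is what removes the need for hand-engineered features.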

Keywords: artificial neural network, convolutional neural network, remote sensing, accuracy, loss

Procedia PDF Downloads 159