Search results for: metrics
224 Performance Evaluation of Parallel Surface Modeling and Generation on Actual and Virtual Multicore Systems
Authors: Nyeng P. Gyang
Abstract:
Although past, current, and future trends suggest that multicore and cloud computing systems are increasingly ubiquitous, this class of parallel systems remains underutilized in general, and barely used for research on parallel Delaunay triangulation for parallel surface modeling and generation in particular. The performances of actual (physical) and virtual (cloud) multicore machines were evaluated while executing algorithms that implement various parallelization strategies for the incremental insertion technique of the Delaunay triangulation algorithm. T-tests were run on the collected data to determine whether differences in various performance metrics (including execution time, speedup, and efficiency) were statistically significant. Results show that the actual machine is approximately twice as fast as the virtual machine at executing the same programs for the various parallelization strategies. Results, which also furnish the scalability behaviors of the various parallelization strategies, show that some of the performance differences between these systems across different runs of the algorithms were statistically significant. A few pseudo-superlinear speedup results computed from the raw data are not true superlinear speedup values. These pseudo-superlinear values, which arise from one particular way of computing speedups, disappear and give way to asymmetric speedups, the kind of speedups that actually occur in the experiments performed.
Keywords: cloud computing systems, multicore systems, parallel Delaunay triangulation, parallel surface modeling and generation
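As a hedged illustration of the kind of analysis this abstract describes, the sketch below computes speedup and efficiency from parallel run times and applies a two-sample t-test to the physical-versus-virtual comparison; all timing values, the core count, and the serial baseline are invented placeholders, not data from the study.

```python
# Sketch: comparing per-run execution times of the same parallel program on a
# physical and a virtual multicore machine. Timing values are placeholders.
import numpy as np
from scipy import stats

serial_time = 120.0                                    # serial baseline (s), assumed
physical = np.array([31.0, 30.2, 32.1, 29.8, 30.5])    # parallel runtimes, physical machine
virtual = np.array([61.3, 59.8, 63.0, 60.4, 62.2])     # parallel runtimes, virtual machine
cores = 4

for name, runs in [("physical", physical), ("virtual", virtual)]:
    speedup = serial_time / runs.mean()   # S = T_serial / T_parallel
    efficiency = speedup / cores          # E = S / p
    print(f"{name}: speedup={speedup:.2f}, efficiency={efficiency:.2f}")

# Welch's t-test: is the runtime difference statistically significant?
t, p = stats.ttest_ind(physical, virtual, equal_var=False)
print(f"t={t:.2f}, p={p:.4f}")
```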
Procedia PDF Downloads 206
223 From Comfort to Safety: Assessing the Influence of Car Seat Design on Driver Reaction and Performance
Authors: Sabariah Mohd Yusoff, Qamaruddin Adzeem Muhamad Murad
Abstract:
This study investigates the impact of car seat design on driver response time, addressing a critical gap in understanding how ergonomic features influence both performance and safety. Controlled driving experiments were conducted with fourteen participants (11 male, 3 female) across three locations chosen for their varying traffic conditions to account for differences in driver alertness. Participants interacted with various seat designs while performing driving tasks, and objective metrics such as braking and steering response times were meticulously recorded. Advanced statistical methods, including regression analysis and t-tests, were employed to identify design factors that significantly affect driver response times. Subjective feedback was gathered through detailed questionnaires (focused on driving experience and knowledge of response time) and in-depth interviews. This qualitative data was analyzed thematically to provide insights into driver comfort and usability preferences. The study aims to identify key seat design features that impact driver response time and to gain a deeper understanding of driver preferences for comfort and usability. The findings are expected to inform evidence-based guidelines for optimizing car seat design, ultimately enhancing driver performance and safety. The research offers valuable implications for automotive manufacturers and designers, contributing to the development of seats that improve driver response time and overall driving safety.
Keywords: car seat design, driver response time, cognitive driving, ergonomics optimization
Procedia PDF Downloads 24
222 Effect of Dual Wavelength Light Exposure on Regeneration of Dugesia dorotocephala
Authors: Zayedali Shaikh
Abstract:
UV damage brings with it a litany of deformities that can range from mild lesions and discoloration to cataracts and blindness. Pluripotent stem cells in planaria and human skin can be used to treat wounds and skin damage, with the primary limitation being inadequate growth factors. Photobiomodulation therapy in the form of low-intensity red light therapy has been shown to aid the healing of skin displaying symptoms of UV damage, such as burns and lesions, and to stimulate the proliferation of stem cells in recellularizing tissue. This paper puts forth an alternative means of treating the effects of UV damage using the freshwater planarian model system Dugesia dorotocephala, known for its regenerative abilities and abundance of pluripotent stem cells, which allow the rapid growth and repair of missing or damaged structures. Our work consisted of exposing planaria to different types of light: red light, blue light, white light, darkness, red and blue light together, UV light, and finally, red and UV light together. The primary focus of this research was on the red and UV lights, with the six other conditions acting as controls against which to compare our findings. Through computer-assisted morphological analysis, the results show no significant difference in the regeneration rates of planaria treated with simultaneous exposure to red and UV light versus planaria kept in darkness (p > .05), a representation of their preferred natural habitat. Our research suggests the viability of red-light therapy in actively combating UV damage and expediting the growth of epidermal stem cells by acting as an additional growth factor.
Keywords: regenerative medicine, stem cells, planaria, photobiomodulation
Procedia PDF Downloads 77
221 Using Environmental Life Cycle Assessment to Design Sustainable Packaging
Authors: Timothy Francis Grant
Abstract:
There are conflicting purposes at play in the design of sustainable packaging, including material reduction, recycling compatibility, use of secondary content, and the performance of the package in protecting and delivering the product. Life Cycle Assessment (LCA) is able to evaluate these different strategies against environmental metrics such as climate change, land and water use, and marine litter pollution. However, LCA has traditionally been too time-consuming and expensive to be used effectively in the packaging design process. To make LCA practical for packaging technologists and designers, a simplified tool is needed that makes LCA possible for non-specialists. The Packaging Quick Evaluation Tool (PIQET) is a web-based solution for undertaking LCA of new and existing packaging designs, considering the global supply chain and impacts from cradle to grave. PIQET is based on a pre-calculated LCA database covering the materials and processes involved in the packaging lifecycle, including both virgin materials and recycled content, the conversion of materials into packaging, and the transportation of packaging to product filling. In addition, PIQET assesses the impacts once the package is filled, looking at storage, transport, and product loss through the supply chain. When applied to consumer packaging, lightweight packages that are not recyclable have lower impacts than more recyclable packages of higher mass. It is also apparent that for many products, the impacts of product failure and product loss matter more environmentally than packaging material efficiency.
Keywords: climate change, life cycle assessment, marine litter, packaging sustainability
Procedia PDF Downloads 133
220 Identification of Indices to Quantify Gentrification
Authors: Sophy Ann Xavier, Lakshmi A
Abstract:
Gentrification is the process of altering a neighborhood's character through the influx of wealthier people and establishments. This idea has subsequently been expanded to encompass brand-new, high-status construction projects that involve regenerating brownfield sites or demolishing and rebuilding residential neighborhoods. Gentrification worsens inequality in ways that go beyond socioeconomic position: the elderly, members of racial and ethnic minorities, people with disabilities, and those with mental health conditions all suffer disproportionately when displaced. Cities must cultivate openness, diversity, and inclusion in their collaborations, as well as cooperation on objectives and results. The papers compiled in this issue concentrate on the new gentrification discussions, the rising residential allure of central cities, and the indices used to measure this process across its various varieties. The study makes an effort to fill a research gap in the area of gentrification studies: the absence of a set of indices for measuring gentrification in a specific area. Studies on gentrification that contain maps of historical change highlight trends that aid the production of displacement risk maps, which can guide future interventions by allowing residents and policymakers to extrapolate into the future. Additionally, these maps give locals a glimpse into the future of their communities and serve as a political call to action in areas where residents are expected to be displaced. This study intends to pinpoint metrics and approaches for measuring gentrification that can then be applied to create a spatiotemporal map of a region, along with tactics for its inclusive planning. An understanding of the various approaches will enable planners and policymakers to select the best approach and create appropriate plans.
Keywords: gentrification, indices, methods, quantification
Procedia PDF Downloads 76
219 Machine Learning Classification of Fused Sentinel-1 and Sentinel-2 Image Data Towards Mapping Fruit Plantations in Highly Heterogeneous Landscapes
Authors: Yingisani Chabalala, Elhadi Adam, Khalid Adem Ali
Abstract:
Mapping smallholder fruit plantations using optical data is challenging due to morphological landscape heterogeneity and crop types with overlapping spectral signatures. Furthermore, cloud cover limits the use of optical sensing, especially in subtropical climates where it is persistent. This research assessed the effectiveness of Sentinel-1 (S1) and Sentinel-2 (S2) data for mapping fruit trees and co-existing land-use types by using support vector machine (SVM) and random forest (RF) classifiers independently. These classifiers were also applied to fused data from the two sensors. Feature ranks were extracted using the RF mean decrease accuracy (MDA) and forward variable selection (FVS) to identify optimal spectral windows for classifying fruit trees. Based on RF MDA and FVS, the SVM classifier achieved relatively high classification accuracy on the fused satellite data, with overall accuracy (OA) = 91.6% and kappa coefficient = 0.91. Applying SVM to S1, S2, the S2 selected variables, and the S1S2 fusion independently produced OA = 27.64% with kappa = 0.13; OA = 87% with kappa = 0.87; OA = 69.33% with kappa = 0.69; and OA = 87.01% with kappa = 0.87, respectively. Results also indicated that the optimal spectral bands for fruit tree mapping are green (B3) and SWIR_2 (B10) for S2, and the vertical-horizontal (VH) polarization band for S1. Including the textural metrics from the VV channel improved the discrimination of crops and co-existing land-use cover types. The fusion approach proved robust and well suited for accurate smallholder fruit plantation mapping.
Keywords: smallholder agriculture, fruit trees, data fusion, precision agriculture
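The sketch below illustrates, under stated assumptions, one way such a workflow could look in scikit-learn: RF-based feature ranking followed by SVM classification scored with overall accuracy and Cohen's kappa. The random feature matrix stands in for the fused S1/S2 band stack, and scikit-learn's impurity-based importances are used only as a stand-in for the paper's MDA ranking.

```python
# Sketch: fused-feature classification with RF ranking and an SVM, reporting
# OA and Cohen's kappa. Random data stands in for the Sentinel feature stack.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))      # stand-in for fused S1+S2 band values
y = rng.integers(0, 4, size=500)    # stand-in for fruit-tree/land-use classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
# impurity-based importances rank the bands (a proxy for MDA ranking here)
ranked = np.argsort(rf.feature_importances_)[::-1]
print("top-ranked features:", ranked[:3])

svm = SVC(kernel="rbf").fit(X_tr, y_tr)
pred = svm.predict(X_te)
print("OA:", accuracy_score(y_te, pred))
print("kappa:", cohen_kappa_score(y_te, pred))
```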
Procedia PDF Downloads 56
218 Load Balancing Technique for Energy Efficiency in Cloud Computing
Authors: Rani Danavath, V. B. Narsimha
Abstract:
Cloud computing is emerging as a new paradigm of large-scale distributed computing. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models. Load balancing is one of the main challenges in cloud computing: the dynamic workload must be distributed across multiple nodes to ensure that no single node is overloaded. It helps in the optimal utilization of resources, enhancing the performance of the system. A further goal of load balancing is to minimize resource consumption and the carbon emission rate, which is a direct need of cloud computing. This motivates energy consumption and carbon emission as new metrics for energy-efficient load balancing techniques in cloud computing. Existing load balancing techniques mainly focus on reducing overhead and service response time and on improving performance, but none of them have considered energy consumption and carbon emission. Therefore, our proposed work moves toward energy efficiency: we introduce an energy-efficiency load balancing technique that can improve the performance of cloud computing by balancing the workload across all the nodes in the cloud with minimum resource utilization, in turn reducing energy consumption and carbon emission to an extent that helps achieve green computing.
Keywords: cloud computing, distributed computing, energy efficiency, green computing, load balancing, energy consumption, carbon emission
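As a rough, hedged sketch of the idea (not the authors' algorithm), the snippet below greedily places each task on the node with the lowest marginal energy-and-carbon cost, subject to capacity; all node parameters and the task trace are invented.

```python
# Sketch of an energy-aware greedy load balancer: each task goes to the node
# whose projected energy cost (weighted by a per-node carbon emission factor)
# is lowest, without overloading any node. All numbers are illustrative.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    capacity: float            # max load units
    watts_per_unit: float      # marginal power draw per load unit
    emission_factor: float     # kg CO2 per kWh of the node's energy source
    load: float = 0.0

def place(task_load: float, nodes: list[Node]) -> Node:
    candidates = [n for n in nodes if n.load + task_load <= n.capacity]
    # choose the node with the cheapest marginal energy/carbon cost
    best = min(candidates, key=lambda n: task_load * n.watts_per_unit * n.emission_factor)
    best.load += task_load
    return best

nodes = [Node("n1", 100, 1.2, 0.5), Node("n2", 100, 0.9, 0.7), Node("n3", 80, 1.0, 0.3)]
for load in [10, 25, 5, 40, 15]:
    print(load, "->", place(load, nodes).name)
```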
Procedia PDF Downloads 449
217 Nanda Ways of Knowing, Being and Doing: Our Process of Research Engagement and Research Impacts
Authors: Steven Kelly
Abstract:
A fundamental role of the researcher is research engagement, that is, the interaction between researchers and research end-users outside of academia for the mutually beneficial transfer of knowledge, technologies, methods, or resources; research impact is the contribution that research makes to the economy, society, environment, or culture beyond its contribution to academic research. Ironically, traditional impact metrics in the academy are designed to focus on the outputs, dismissing the important role engagement plays in fostering a collaborative process that leads to meaningful, ethical, and useful impacts. Dr. Kelly, a Nanda (First Nations) man himself, has worked closely with the Nanda community over the past decade, ensuring cultural protocols are upheld and implemented while doing research engagement. The focus was on the process, which was essential to foster a positive research impact culture. The contributions that flowed from this process were the naming of a new species of squat lobster in the Nanda language, a poster design in collaboration with The University of Melbourne, Museums Victoria and the Bundiyarra-Irra Wangga language centre, media coverage, and the formation of the "Nanda language, Nanda country" project, a language revitalization project focused on reconnecting Nanda people with the language and culture on Nanda Country. Such outcomes are imperative on the eve of the United Nations International Decade of Indigenous Languages. In this paper, Dr. Kelly will discuss how Nanda cultural practices informed research engagement to foster a collaborative process that, in turn, led to meaningful, ethical, and useful impacts within and outside of the academy.
Keywords: community collaboration, indigenous, Nanda, research engagement, research impacts
Procedia PDF Downloads 114
216 Biodiversity Affects Bovine Tuberculosis (bTB) Risk in Ethiopian Cattle: Prospects for Infectious Disease Control
Authors: Sintayehu W. Dejene, Ignas M. A. Heitkönig, Herbert H. T. Prins, Zewdu K. Tessema, Willem F. de Boer
Abstract:
Current theories on diversity-disease relationships describe host species diversity and species identity as important factors influencing disease risk, either diluting or amplifying disease prevalence in a community. Whereas the simple term 'diversity' embodies a set of animal community characteristics, it is not clear how different measures of species diversity are correlated with disease risk. We therefore tested the effects of species richness, Pielou's evenness, and Shannon's diversity on bTB risk in cattle in the Afar Region and Awash National Park between November 2013 and April 2015. We also analysed the identity effect of particular species and the effect of host habitat use overlap on bTB risk. We used the comparative intradermal tuberculin test to assess the number of bTB-infected cattle. Our results suggested a dilution effect through species evenness. We found that the identity effect of greater kudu, a maintenance host, confounded the dilution effect of species diversity on bTB risk. bTB infection was positively correlated with habitat use overlap between greater kudu and cattle. Different diversity indices therefore have to be considered together when assessing diversity-disease relationships, in order to understand the underlying causal mechanisms. We posit that unpacking diversity metrics is also relevant for formulating control strategies to manage cattle in ecosystems characterized by seasonally limited resources and intense wildlife-livestock interactions.
Keywords: evenness, diversity, greater kudu, identity effect, maintenance hosts, multi-host disease ecology, habitat use overlap
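For reference, the three diversity measures this abstract contrasts can be computed from community abundance counts with the standard formulas below; the counts are illustrative, not study data.

```python
# Sketch: species richness S, Shannon's diversity H', and Pielou's evenness
# J = H'/ln(S), computed from host-community abundance counts.
import numpy as np

def diversity(counts):
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()                   # relative abundances
    richness = p.size                        # species richness S
    shannon = -(p * np.log(p)).sum()         # Shannon's H'
    evenness = shannon / np.log(richness)    # Pielou's J
    return richness, shannon, evenness

print(diversity([120, 40, 15, 5]))   # uneven community -> J well below 1
print(diversity([45, 45, 45, 45]))   # perfectly even community -> J = 1
```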
Procedia PDF Downloads 332
215 Inferring the Ecological Quality of Seagrass Beds Using Composition and Configuration Indices
Authors: Fabrice Houngnandan, Celia Fery, Thomas Bockel, Julie Deter
Abstract:
Making water cleaner and stopping global biodiversity loss require indices to measure change and to evaluate the achievement of objectives. The endemic and protected seagrass species Posidonia oceanica is a biological indicator used to monitor the ecological quality of Mediterranean marine waters. One ecosystem index (EBQI), two biotic indices (PREI, BiPo), and several landscape indices, which measure the composition and configuration of P. oceanica seagrass at the population scale, have been developed. While the former are measured at monitoring sites, the landscape indices can be calculated for the entire seabed covered by this ecosystem. The present work investigates the link between these indices and the spatial scale that maximizes it. We used data collected between 2014 and 2019 along the French Mediterranean coastline to calculate EBQI, PREI, and BiPo at 100 sites. From the P. oceanica seagrass distribution map, configuration and composition indices around these sites were determined for six grid sizes (100 m x 100 m to 1000 m x 1000 m). Correlation analyses were first used to find the grid size showing the strongest and most significant link between the different types of indices. Finally, several models were compared on the basis of various metrics to identify the one that best explains the nature of the link. Our results showed a strong and significant link between the biotic indices, with the best correlations between biotic and landscape indices within the 600 m x 600 m grid cells. These results show that landscape indices can be used to monitor the health of seagrass beds at a large scale.
Keywords: ecological indicators, decline, conservation, submerged aquatic vegetation
Procedia PDF Downloads 131
214 Assessing NYC's Single-Family Housing Typology for Urban Heat Vulnerability and Occupants' Health Risk under the Climate Change Emergency
Authors: Eleni Stefania Kalapoda
Abstract:
Recurring heat waves due to the global climate change emergency pose continuous risks to human health and urban resources. Local and state decision-makers incorporate Heat Vulnerability Indices (HVIs) to quantify and map the relative impact on human health in emergencies. These maps enable government officials to identify the highest-risk districts and to concentrate emergency planning efforts and available resources accordingly (e.g., to reevaluate the location and number of heat-relief centers). Even though the framework for conducting an HVI is unique to each municipality, its accuracy in assessing heat risk is limited; to resolve this issue, varied housing-related metrics should be included. This paper quantifies and classifies NYC's single-family detached housing typology within highly vulnerable NYC districts using detailed energy simulations and post-processing calculations. The results show that variation in indoor heat risk depends significantly on a dwelling's design and operation characteristics, with low-ventilated dwellings being the most vulnerable. They also confirm that when building-level determinants of exposure are excluded from the assessment, the HVI fails to capture important components of heat vulnerability. Lastly, the overall vulnerability ratio of the housing units ranged from 0.11 to 1.6 indoor heat degrees, depending on ventilation and shading capacity, insulation degree, and other building attributes.
Keywords: heat vulnerability index, energy efficiency, urban heat, resiliency to heat, climate adaptation, climate mitigation, building energy
Procedia PDF Downloads 81
213 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication
Authors: Vedant Janapaty
Abstract:
Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Also known as the 'kidneys of our planet,' they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and the loss of native species are ailing our wetlands. There is a lack of uniform data collection and sparse research on correlations between satellite data and in situ measurements. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate metrics with in situ observations collected at five estuaries. Satellite images were processed in Python to calculate seven band-derived spectral indices (SIs), and average SI values were calculated per month for 23 years. Publicly available data from six sites at ELK were used to obtain ten in situ parameters (OPs), whose average values were likewise calculated per month for 23 years. Linear correlations between the seven SIs and ten OPs were computed and found to be inadequate (correlations of 1 to 64%). Fourier transform analysis was then performed on the seven SIs: dominant frequencies and amplitudes were extracted, and a machine learning (ML) model was trained, validated, and tested for the ten OPs. Better correlations were observed between SIs and OPs at certain time delays (0, 3, 4, and 6 months), and the ML modeling was repeated. The OPs then saw improved R² values in the range of 0.2 to 0.93. This approach can be used to obtain periodic analyses of overall wetland health from satellite indices. It shows that remote sensing can be used to develop correlations with critical in situ parameters that measure eutrophication, and can be used by practitioners to easily monitor wetland health.
Keywords: estuary, remote sensing, machine learning, Fourier transform
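A minimal sketch of the described pipeline, with synthetic data standing in for the SI and OP series: dominant Fourier frequencies and amplitudes are extracted from a monthly series and used as features for a regression model. The random forest regressor is an assumption; the abstract does not name the ML model used.

```python
# Sketch: FFT-derived features from a monthly satellite-index series feed a
# regressor predicting an in situ water-quality parameter. Data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
months = np.arange(23 * 12)                      # 23 years of monthly values

def fft_features(series, k=3):
    """Return the k dominant frequencies and their amplitudes."""
    spec = np.fft.rfft(series - series.mean())
    freqs = np.fft.rfftfreq(series.size)         # cycles per month
    top = np.argsort(np.abs(spec))[::-1][:k]
    return np.concatenate([freqs[top], np.abs(spec[top])])

# 50 synthetic "sites": SI series with an annual cycle, OP tied to its amplitude
X, y = [], []
for _ in range(50):
    amp = rng.uniform(0.5, 2.0)
    si = amp * np.sin(2 * np.pi * months / 12) + rng.normal(0, 0.2, months.size)
    X.append(fft_features(si))
    y.append(amp + rng.normal(0, 0.05))          # stand-in in situ parameter

model = RandomForestRegressor(random_state=0).fit(X[:40], y[:40])
print("R^2 on held-out sites:", model.score(X[40:], y[40:]))
```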
Procedia PDF Downloads 104
212 Optical Board as an Artificial Technology for a Peer Teaching Class in a Nigerian University
Authors: Azidah Abu Ziden, Adu Ifedayo Emmanuel
Abstract:
This study investigated the optical board as an artificial technology for peer teaching in a Nigerian university. A design and development research (DDR) design was adopted, entailing the planning and testing of the instructional design models used to produce the optical board. The research population comprised twenty-five (25) peer-teaching students at a Nigerian university from theatre arts, religion, and language education-related disciplines. Using a random sampling technique, the study selected eight (8) students to work on the optical board. The study also introduced a research instrument, a lecturer assessment rubric containing a 30-mark metric, for evaluating students' teaching with the optical board. The study found that the optical board affords students the acquisition of self-employment skills through their exposure to the peer teaching course, which is a teacher training module in Nigerian universities. It is evident from this study that students were able to coordinate their design and effectively develop the optical board without the lecturer's interference. This kind of achievement shows that the Nigerian university curriculum has been designed with content meant to spur students to create jobs after graduation, and that effective implementation of the readily available curriculum content is enough to imbue students with the needed entrepreneurial skills. It was recommended that the Federal Government of Nigeria (FGN) discourage poor implementation of the Nigerian university curriculum and invest more in the betterment of the readily available curriculum instead of considering a synonymously acclaimed new curriculum for a regurgitated teaching and learning process.
Keywords: optical board, artificial technology, peer teaching, educational technology, Nigeria, Malaysia, university, glass, wood, electrical, improvisation
Procedia PDF Downloads 68
211 Empowering Transformers for Evidence-Based Medicine
Authors: Jinan Fiaidhi, Hashmath Shaik
Abstract:
Breaking the barrier to practicing evidence-based medicine relies on effective methods for rapidly identifying relevant evidence in the body of biomedical literature. An important challenge confronting medical practitioners is the long time needed to browse, filter, summarize, and compile information from different medical resources. Deep learning can help solve this through automatic question answering (Q&A) and transformers. However, Q&A and transformer technologies are not trained to answer the clinical queries used in evidence-based practice, nor can they respond to structured clinical questioning protocols like PICO (Patient/Problem, Intervention, Comparison, and Outcome). This article describes the use of deep learning techniques for Q&A, based on transformer models like BERT and GPT, to answer PICO clinical questions for evidence-based practice drawn from sound medical research resources like PubMed. We report acceptable clinical answers that are supported by findings from PubMed. Our transformer methods reach an acceptable state-of-the-art performance based on a two-stage bootstrapping process: filtering relevant articles, followed by identifying articles that support the requested outcome expressed by the PICO question. Moreover, we report experiments that empower our bootstrapping techniques with patched attention to the most important keywords in the clinical case and the PICO question. Our bootstrapping patched with attention shows the relevancy of the collected evidence based on entropy metrics.
Keywords: automatic question answering, PICO questions, evidence-based medicine, generative models, LLM transformers
Procedia PDF Downloads 47
210 A Multi-Objective Reliable Location-Inventory Capacitated Disruption Facility Problem with Penalty Cost Solved with Efficient Metaheuristic Algorithms
Authors: Elham Taghizadeh, Mostafa Abedzadeh, Mostafa Setak
Abstract:
A logistics network is expected to have its opened facilities working continuously over a long time horizon without any failure, but in real-world problems facilities may face disruptions. This paper studies a reliable joint inventory-location problem that optimizes the cost of facility locations, customer assignment, and inventory management decisions when facilities face failure risks and stop working. In our model, we assume that when a facility is out of work, its customers may be reassigned to other operational facilities; otherwise, they must endure high penalty costs associated with losing service. To bring the model closer to real-world problems, it is formulated based on the p-median problem, and the facilities are considered to have limited capacities. We define a new binary variable (Z_is) to indicate that a customer is not assigned to any facility. The problem involves a bi-objective model: the first objective minimizes the sum of facility construction costs and expected inventory holding costs, while the second minimizes the maximum expected customer costs under normal and failure scenarios. To solve this model, the NSGAII and MOSS algorithms were applied to find the Pareto-archive solutions, and Response Surface Methodology (RSM) was applied to optimize the NSGAII algorithm parameters. We compare the performance of the two algorithms with three metrics, and the results show that NSGAII is more suitable for our model.
Keywords: joint inventory-location problem, facility location, NSGAII, MOSS
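The Pareto-archive notion underlying NSGA-II and MOSS can be shown with a small hedged sketch: keep only solutions that no other solution dominates on the two objectives (total cost, maximum expected customer cost). The candidate values are invented.

```python
# Sketch: Pareto-archive filtering for a bi-objective minimization problem.
def dominates(a, b):
    """a dominates b if it is no worse in both objectives and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_archive(solutions):
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# (facility+inventory cost, max expected customer cost) - illustrative values
candidates = [(100, 9.0), (120, 6.5), (90, 12.0), (110, 6.0), (130, 6.4)]
print(pareto_archive(candidates))   # -> non-dominated trade-off solutions
```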
Procedia PDF Downloads 525
209 Measuring the Influence of Functional Proximity on Environmental Urban Performance via IMM: Four Study Cases in Milan
Authors: Massimo Tadi, M. Hadi Mohammad Zadeh, Ozge Ogut
Abstract:
Although the structuring of city form has been studied, more effort is needed on systemic comprehension and evaluation of urban morphology through quantitative metrics able to describe the performance of a city in relation to its formal properties. More research is required in this direction in order to better describe urban form characteristics and their impact on the environmental performance of cities, and to increase their sustainability stewardship. With the aim of developing a better understanding of the built environment's systemic structure, this paper presents a holistic methodology for studying the behavior of the built environment and investigates methods for measuring the effect of urban structure on environmental performance. This goal is pursued through an inquiry into the morphological components of urban systems and the complex relationships between them. In particular, the paper focuses on proximity, here the proximity of different land uses, a concept with which the Integrated Modification Methodology (IMM) explains how land-use allocation might affect the choice of mobility in neighborhoods and, especially, encourage or discourage non-motorized mobility. The paper uses proximity to demonstrate that structural attributes can be quantifiably related to the performing behavior of the city. The target is to derive a mathematical pattern from the structural elements and correlate it directly with urban performance indicators concerned with environmental sustainability. The paper presents results from this rigorous investigation of urban proximity and its correlation with performance indicators in four different areas of the city of Milan, each characterized by different morphological features.
Keywords: built environment, ecology, sustainable indicators, sustainability, urban morphology
Procedia PDF Downloads 168
208 A Survey of Skin Cancer Detection and Classification from Skin Lesion Images Using Deep Learning
Authors: Joseph George, Anne Kotteswara Roa
Abstract:
Skin disease is one of the most common kinds of health issues faced by people nowadays. Skin cancer (SC) is one of them, and its detection relies on skin biopsy outputs and the expertise of doctors, but this consumes more time and can yield inaccurate results. At an early stage, skin cancer detection is a challenging task; the disease easily spreads to the whole body and leads to an increase in the mortality rate, yet it is curable when detected early. In order to classify skin cancer correctly and accurately, the critical task is identification and classification based on disease features such as shape, size, color, and symmetry. Many skin diseases share similar characteristics, which makes selecting important features from skin cancer dataset images a challenging issue. Hence, diagnostic accuracy can be improved by an automated skin cancer detection and classification framework, thereby handling the scarcity of human experts. Recently, deep learning techniques such as convolutional neural networks (CNN), deep belief networks (DBN), artificial neural networks (ANN), recurrent neural networks (RNN), and long short-term memory (LSTM) have been widely used for the identification and classification of skin cancers. This survey reviews different DL techniques for skin cancer identification and classification. Performance metrics such as precision, recall, accuracy, sensitivity, specificity, and F-measure are used to evaluate the effectiveness of SC identification using DL techniques. By using these DL techniques, classification accuracy increases along with the mitigation of computational complexity and time consumption.
Keywords: skin cancer, deep learning, performance measures, accuracy, datasets
Procedia PDF Downloads 129
207 ALEF: An Enhanced Approach to Arabic-English Bilingual Translation
Authors: Abdul Muqsit Abbasi, Ibrahim Chhipa, Asad Anwer, Saad Farooq, Hassan Berry, Sonu Kumar, Sundar Ali, Muhammad Owais Mahmood, Areeb Ur Rehman, Bahram Baloch
Abstract:
Accurate translation between structurally diverse languages, such as Arabic and English, presents a critical challenge in natural language processing due to significant linguistic and cultural differences. This paper investigates the effectiveness of Facebook's mBART model, fine-tuned specifically for sequence-to-sequence (seq2seq) translation tasks between Arabic and English and enhanced through advanced refinement techniques. Our approach leverages the Alef Dataset, a meticulously curated parallel corpus spanning various domains to capture the linguistic richness, nuances, and contextual accuracy essential for high-quality translation. We further refine the model's output using advanced language models such as GPT-3.5 and GPT-4, which improve fluency and coherence and correct grammatical errors in translated texts. The fine-tuned model demonstrates substantial improvements, achieving a BLEU score of 38.97, a METEOR score of 58.11, and a TER score of 56.33, surpassing widely used systems such as Google Translate. These results underscore the potential of mBART, combined with refinement strategies, to bridge the translation gap between Arabic and English, providing a reliable, context-aware machine translation solution that is robust across diverse linguistic contexts.
Keywords: natural language processing, machine translation, fine-tuning, Arabic-English translation, transformer models, seq2seq translation, translation evaluation metrics, cross-linguistic communication
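As a hedged illustration of how such scores are produced (not the authors' evaluation code), the snippet below computes corpus-level BLEU and TER with the sacrebleu package on invented sentence pairs.

```python
# Sketch: corpus-level BLEU and TER scoring of machine translation output.
# Hypotheses and references here are invented English placeholders.
import sacrebleu

hypotheses = ["the cat sat on the mat", "he went to the market yesterday"]
# one reference stream, aligned sentence-by-sentence with the hypotheses
references = [["the cat sat on the mat", "he went to the market yesterday"]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
ter = sacrebleu.corpus_ter(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}, TER = {ter.score:.2f}")
```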
Procedia PDF Downloads 18
206 Influence of Travel Time Reliability on Elderly Drivers' Crash Severity
Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig
Abstract:
Although older drivers (defined as those aged 65 and above) are less involved in speeding, alcohol use, and night driving, they are more vulnerable to severe crashes; the major contributing factors include frailty and medical complications. Several studies have evaluated the factors contributing to crash severity. However, few studies have established the impact of travel time reliability (TTR) on road safety. In particular, the impact of TTR on senior adults, who face several challenges including hearing difficulties, declining processing skills, and cognitive problems while driving, is not well established. Therefore, this study focuses on determining possible impacts of TTR on traffic safety, with a focus on elderly drivers. Historical travel speed data from freeway links in the study area were used to calculate travel time and the associated TTR metrics, that is, the planning time index, the buffer index, the standard deviation of travel time, and the probability of congestion. Four years of information on crashes occurring on these freeway links was acquired. A binary logit model estimated using the Markov chain Monte Carlo (MCMC) sampling technique was used to evaluate variables that could influence elderly crash severity. Preliminary results suggest that TTR is statistically significant in affecting the severity of a crash involving an elderly driver: a one-unit increase in the probability of congestion reduces the likelihood of a severe elderly crash by nearly 22%. These findings will enhance the understanding of TTR and its impact on elderly crash severity.
Keywords: highway safety, travel time reliability, elderly drivers, traffic modeling
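A minimal sketch of the named TTR metrics, computed from a series of link travel times; the travel-time values, the free-flow time, and the 1.3x free-flow congestion threshold are all assumptions for illustration.

```python
# Sketch: standard travel-time reliability metrics from link travel times (min).
import numpy as np

tt = np.array([12.0, 12.5, 13.0, 14.2, 12.8, 19.5, 13.1, 22.0, 12.6, 13.4])
free_flow = 11.0                             # assumed free-flow travel time
congestion_threshold = 1.3 * free_flow       # assumed congestion definition

mean_tt = tt.mean()
p95 = np.percentile(tt, 95)
planning_time_index = p95 / free_flow        # PTI = 95th percentile / free-flow
buffer_index = (p95 - mean_tt) / mean_tt     # BI = extra buffer a traveler needs
prob_congestion = (tt > congestion_threshold).mean()

print(f"PTI={planning_time_index:.2f}, BI={buffer_index:.2f}, "
      f"sd={tt.std(ddof=1):.2f}, P(congestion)={prob_congestion:.2f}")
```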
Procedia PDF Downloads 494
205 Tractography Analysis of the Evolutionary Origin of Schizophrenia
Authors: Asmaa Tahiri, Mouktafi Amine
Abstract:
A substantial amount of traditional medical research has been put toward managing and treating mental disorders. At the present time, to the best of our knowledge, fundamental understanding of the underlying causes of most psychological disorders still needs further exploration to inform early diagnosis, symptom management, and treatment. The emerging field of evolutionary psychology is a promising prospect for addressing the origin of mental disorders, potentially leading to more effective treatments. Schizophrenia, as a topical mental disorder, has been linked to the evolutionary adaptation of the human brain, reflected in the brain connectivity and asymmetry directly linked to humans' higher cognition, in contrast to other primates, which are our closest living representation of the structure and connectivity of our earliest common African ancestors. As proposed in the evolutionary psychology literature, the pathophysiology of schizophrenia is expressed in, and directly linked to, altered connectivity between the hippocampal formation (HF) and the dorsolateral prefrontal cortex (DLPFC). This paper presents the results of tractography analysis using multiple open-access diffusion-weighted imaging (DWI) datasets of healthy subjects, subjects affected by schizophrenia, and primates to illustrate the relevance of the aforementioned regions' connectivity and the underlying evolutionary changes in the human brain. Deterministic fiber tracking and streamline analysis were used to generate connectivity matrices from the DWI datasets, which were overlaid to compute distances and highlight disconnectivity patterns, in conjunction with other fiber-tracking metrics: fractional anisotropy (FA), mean diffusivity (MD), and radial diffusivity (RD).
Keywords: tractography, evolutionary psychology, schizophrenia, brain connectivity
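For context, the three fiber-tracking metrics named above are standard functions of the diffusion tensor's eigenvalues; the sketch below applies the textbook formulas to illustrative eigenvalues.

```python
# Sketch: FA, MD, and RD from the diffusion tensor eigenvalues l1 >= l2 >= l3.
import numpy as np

def dti_metrics(l1, l2, l3):
    md = (l1 + l2 + l3) / 3                        # mean diffusivity
    rd = (l2 + l3) / 2                             # radial diffusivity
    fa = np.sqrt(1.5 * ((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2)
                 / (l1 ** 2 + l2 ** 2 + l3 ** 2))  # fractional anisotropy
    return fa, md, rd

# illustrative eigenvalues (mm^2/s) for a coherent white-matter-like tensor
print(dti_metrics(1.7e-3, 0.4e-3, 0.3e-3))
```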
Procedia PDF Downloads 71
204 Ultrasound-Assisted Extraction of Bioactive Compounds from Cocoa Shell and Their Encapsulation in Gum Arabic and Maltodextrin: A Technology to Produce Functional Food Ingredients
Authors: Saeid Jafari, Khursheed Ahmad Sheikh, Randy W. Worobo, Kitipong Assatarakul
Abstract:
In this study, the extraction of cocoa shell powder (CSP) was optimized, and the optimized extracts were spray-dried for encapsulation. Temperature (45-65 °C), extraction time (30-60 min), and ethanol concentration (60-100%) were the extraction parameters. The response surface methodology analysis revealed that the model was significant (p ≤ 0.05) in interactions between all variables (total phenolic compound, total flavonoid content, and antioxidant activity as measured by the 2,2-diphenyl-1-picrylhydrazyl (DPPH) and ferric reducing antioxidant power (FRAP) assays), with the lack-of-fit test for the model being insignificant (p > 0.05). Temperature (55 °C), time (45 min), and ethanol concentration (60%) were found to be the optimal extraction conditions. For spray-drying encapsulation, some quality metrics (e.g., water solubility, water activity) were insignificant (p > 0.05). The microcapsules were found to be spherical in shape using a scanning electron microscope. Thermogravimetric and differential thermogravimetric measurements of the microcapsules revealed nearly identical results. The gum arabic + maltodextrin microcapsule (GMM) showed potential antibacterial (zone of inhibition: 11.50 mm; lower minimum inhibitory concentration: 1.50 mg/mL) and antioxidant (DPPH: 1063 mM Trolox/100 g dry wt.) activities (p ≤ 0.05). In conclusion, the microcapsules in this study, particularly GMM, are promising antioxidant and antibacterial agents to be fortified as functional food ingredients for the production of nutraceutical foods with health-promoting properties.
Keywords: functional foods, cocoa shell powder, antioxidant activity, encapsulation, extraction
Procedia PDF Downloads 57
203 Leveraging Mobile Apps for Citizen-Centric Urban Planning: Insights from Tajawob Implementation
Authors: Alae El Fahsi
Abstract:
This study explores the 'Tajawob' app's role in urban development, demonstrating how mobile applications can empower citizens and facilitate urban planning. Tajawob serves as a digital platform for community feedback, engagement, and participatory governance, addressing urban challenges through innovative tech solutions. This research synthesizes data from a variety of sources, including user feedback, engagement metrics, and interviews with city officials, to assess the app's impact on citizen participation in urban development in Morocco. By integrating advanced data analytics and user experience design, Tajawob has bridged the communication gap between citizens and government officials, fostering a more collaborative and transparent urban planning process. The findings reveal a significant increase in civic engagement, with users actively contributing to urban management decisions, thereby enhancing the responsiveness and inclusivity of urban governance. Challenges such as digital literacy, infrastructure limitations, and privacy concerns are also discussed, providing a comprehensive overview of the obstacles and opportunities presented by mobile app-based citizen engagement platforms. The study concludes with strategic recommendations for scaling the Tajawob model to other contexts, emphasizing the importance of adaptive technology solutions in meeting the evolving needs of urban populations. This research contributes to the burgeoning field of smart city innovations, offering key insights into the role of digital tools in facilitating more democratic and participatory urban environments.
Keywords: smart cities, digital governance, urban planning, strategic design
Procedia PDF Downloads 59
202 Buffer Allocation and Traffic Shaping Policies Implemented in Routers Based on a New Adaptive Intelligent Multi-Agent Approach
Authors: M. Taheri Tehrani, H. Ajorloo
Abstract:
In this paper, an intelligent multi-agent framework is developed for each router, in which agents have two vital functionalities, traffic shaping and buffer allocation, and are positioned at the ports of the routers. With the traffic-shaping functionality, agents shape the forwarded traffic by dynamic, real-time allocation of the token generation rate in a token bucket algorithm; with the buffer-allocation functionality, agents share their buffer capacity with each other based on their needs and the conditions of the network. This dynamic and intelligent framework gives some ports the opportunity to work better under bursty and busier conditions. The agents work intelligently based on a reinforcement learning (RL) algorithm and consider effective parameters in their decision process. As RL is limited in how many parameters it can consider in its decision process, due to the volume of calculations, we utilize our novel method, which invokes principal component analysis (PCA) on the RL and gives the algorithm a high-dimensional ability to consider as many parameters as needed in its decision process. When this implementation is compared to our previous work, where traffic shaping was done without any sharing or dynamic allocation of buffer size for each port, lower packet drop in the whole network, specifically in the source routers, can be seen. These methods are implemented in our previously proposed intelligent simulation environment to better compare the performance metrics. The results obtained from this simulation environment show efficient and dynamic utilization of resources in terms of bandwidth and the buffer capacities pre-allocated to each port.
Keywords: principal component analysis, reinforcement learning, buffer allocation, multi-agent systems
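A minimal sketch of the token bucket shaper the agents control; in the described framework the RL agent would adjust the token generation rate at run time, whereas here the rate is fixed and the packet trace is invented.

```python
# Sketch: a token bucket shaper. `rate` is the knob the agents would tune.
import time

class TokenBucket:
    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # token generation rate (tokens/s)
        self.capacity = capacity    # burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, packet_size: float) -> bool:
        now = time.monotonic()
        # refill tokens proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_size:
            self.tokens -= packet_size
            return True             # packet conforms, forward it
        return False                # packet delayed or dropped

bucket = TokenBucket(rate=1000.0, capacity=1500.0)   # bytes/s, bytes
for size in [500, 500, 800, 200]:
    print(size, bucket.allow(size))
```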
Procedia PDF Downloads 518
201 Cooperative Cross Layer Topology for Concurrent Transmission Scheduling Scheme in Broadband Wireless Networks
Authors: Gunasekaran Raja, Ramkumar Jayaraman
Abstract:
In this paper, we consider a CCL-N (Cooperative Cross-Layer Network) topology based on the cross-layer (both centralized and distributed) environment to form network communities. Various performance metrics related to IEEE 802.16 networks are discussed in order to design the CCL-N topology. In the CCL-N topology, nodes are classified as master nodes (Master Base Station [MBS]) and serving nodes (Relay Station [RS]), and node communities are organized based on networking terminologies. Based on the CCL-N topology, various simulation analyses for both transparent and non-transparent relays are tabulated, and throughput efficiency is calculated. The weighted load-balancing problem plays a challenging role in IEEE 802.16 networks. The CoTS (Concurrent Transmission Scheduling) scheme is formulated in terms of three aspects: transmission mechanisms based on identical communities, different communities, and identical node communities. The CoTS scheme helps in identifying the weighted load-balancing problem. Based on the analytical results, the modularity value is inversely proportional to the error value, and it plays a key role in solving the CoTS problem based on hop count. The transmission mechanism for an identical node community has no impact, since the modularity value is the same for all the network groups. In this paper, these three aspects of communities based on the modularity value, which help in solving the problem of weighted load balancing and CoTS, are discussed.
Keywords: cross layer network topology, concurrent scheduling, modularity value, network communities, weighted load balancing
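As a hedged illustration, the snippet below computes the modularity of a node-community partition with networkx, the quantity the CoTS analysis keys on; the MBS/RS topology and the two communities are invented stand-ins.

```python
# Sketch: modularity of a partition of an MBS/RS topology into communities.
import networkx as nx
from networkx.algorithms.community import modularity

G = nx.Graph()
G.add_edges_from([("MBS1", "RS1"), ("MBS1", "RS2"), ("RS1", "RS2"),
                  ("MBS2", "RS3"), ("MBS2", "RS4"), ("RS3", "RS4"),
                  ("RS2", "RS3")])                   # one inter-community link

communities = [{"MBS1", "RS1", "RS2"}, {"MBS2", "RS3", "RS4"}]
print("modularity:", modularity(G, communities))    # higher = tighter communities
```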
Procedia PDF Downloads 265
200 Comparison of Different Machine Learning Algorithms for Solubility Prediction
Authors: Muhammet Baldan, Emel Timuçin
Abstract:
Molecular solubility prediction plays a crucial role in various fields, such as drug discovery, environmental science, and material science. In this study, we compare the performance of five machine learning algorithms for predicting molecular solubility using the AqSolDB dataset: linear regression, support vector machines (SVM), random forests, gradient boosting machines (GBM), and neural networks. The dataset consists of 9,981 data points with their corresponding solubility values. MACCS keys (166 bits), RDKit properties (20 properties), and structural properties (3) are extracted for every SMILES representation in the dataset, giving a total of 189 features used for training and testing for every molecule. Each algorithm is trained on a subset of the dataset and evaluated using accuracy scores. Additionally, computational time for training and testing is recorded to assess the efficiency of each algorithm. Our results demonstrate that the random forest model outperformed the other algorithms in terms of predictive accuracy, achieving an accuracy score of 0.93. Gradient boosting machines and neural networks also exhibit strong performance, closely followed by support vector machines. Linear regression, while simpler in nature, demonstrates competitive performance but with slightly higher errors compared to ensemble methods. Overall, this study provides valuable insights into the performance of machine learning algorithms for molecular solubility prediction, highlighting the importance of algorithm selection in achieving accurate and efficient predictions in practical applications.
Keywords: random forest, machine learning, comparison, feature extraction
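A minimal sketch of the described featurization, assuming RDKit and scikit-learn are available: MACCS keys generated from SMILES feed a random forest. The four molecules and their solubility labels are invented; RDKit's MACCS vector is 167 bits, of which 166 are meaningful keys, matching the feature count quoted above.

```python
# Sketch: MACCS fingerprints from SMILES as features for a solubility model.
# Molecules and logS labels are toy stand-ins for the AqSolDB data.
import numpy as np
from rdkit import Chem
from rdkit.Chem import MACCSkeys
from sklearn.ensemble import RandomForestRegressor

smiles = ["CCO", "c1ccccc1", "CC(=O)O", "CCCCCCCC"]
logS = [0.0, -1.6, 0.1, -5.0]          # invented solubility values

def featurize(smi):
    mol = Chem.MolFromSmiles(smi)
    return np.array(MACCSkeys.GenMACCSKeys(mol))   # 167-bit MACCS fingerprint

X = np.vstack([featurize(s) for s in smiles])
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, logS)
print(model.predict(X[:1]))            # predicted solubility for the first molecule
```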
Procedia PDF Downloads 41
199 Assessment of Mountain Hydrological Processes in the Gumera Catchment, Ethiopia
Authors: Tewele Gebretsadkan Haile
Abstract:
Mountain terrains are essential to regional water resources, regulating the hydrological processes that feed downstream water supplies. Nevertheless, limited observed earth data in complex topography poses challenges for water resources regulation, which is why satellite products are used in this study. This study evaluates hydrological processes in the mountain catchment of Gumera, Ethiopia, using the HBV-light model with the CHIRPS satellite precipitation product over 1996 to 2010. The catchment covers 1,289 km², is dominated by cultivation, and ranges in elevation from 1,788 to 3,606 m above sea level. Three meteorological stations were used for downscaling the satellite data, and one stream-flow record for calibration and validation. The total annual water balance showed 1,410 mm of precipitation and 828 mm of simulated surface runoff, compared to 1,042 mm of observed stream flow, with an estimated 586 mm of actual evapotranspiration against 1,495 mm of potential evapotranspiration. Temperatures range from 9 °C in winter to 21 °C. The catchment contributes 74% of total runoff as quick runoff and 26% as lower groundwater storage, which sustains stream flow during low-flow periods. Model uncertainty was measured using several metrics: coefficient of determination 0.76, model efficiency 0.74, efficiency for log(Q) 0.66, and flow-weighted efficiency 0.70. The results highlight that the HBV model captures the mountain hydrology, and the dominance of quick runoff, driven by the traditional agricultural system and the slope of the topography, indicates that adaptation measures for water resource management are recommended.
Keywords: mountain hydrology, CHIRPS, Gumera, HBV model
Procedia PDF Downloads 13
197 A Comparative Time-Series Analysis and Deep Learning Projection of Innate Radon Gas Risk in Canadian and Swedish Residential Buildings
Authors: Selim M. Khan, Dustin D. Pearson, Tryggve Rönnqvist, Markus E. Nielsen, Joshua M. Taron, Aaron A. Goodarzi
Abstract:
Accumulation of radioactive radon gas in indoor air poses a serious risk to human health by increasing the lifetime risk of lung cancer, and radon is classified by the IARC as a category one carcinogen. Radon exposure risks are a function of geologic, geographic, design, and human behavioural variables and can change over time. Using time-series and deep machine learning modelling, we analyzed long-term radon test outcomes as a function of building metrics from 25,489 Canadian and 38,596 Swedish residential properties constructed between 1945 and 2020. While Canadian and Swedish properties built between 1970 and 1980 are comparable (96–103 Bq/m³), innate radon risks subsequently diverge, rising in Canada and falling in Sweden, such that 21st-century Canadian houses show 467% greater average radon (131 Bq/m³) relative to Swedish equivalents (28 Bq/m³). These trends are consistent across housing types and regions within each country. The introduction of energy efficiency measures within Canadian and Swedish building codes coincided with opposing radon level trajectories in each nation. Deep machine learning modelling predicts that, without intervention, average Canadian residential radon levels will increase to 176 Bq/m³ by 2050, emphasizing the importance and urgency of future building code intervention to achieve systemic radon reduction in Canada.
Keywords: radon health risk, time-series, deep machine learning, lung cancer, Canada, Sweden
Procedia PDF Downloads 85
196 Predicting Emerging Agricultural Investment Opportunities: The Potential of Structural Evolution Index
Authors: Kwaku Damoah
Abstract:
The agricultural sector is characterized by continuous transformation, driven by factors such as demographic shifts, evolving consumer preferences, climate change, and migration trends. This dynamic environment presents complex challenges for key stakeholders, including farmers, governments, and investors, who must navigate these changes to achieve optimal investment returns. To effectively predict market trends and uncover promising investment opportunities, a systematic, data-driven approach is essential. This paper introduces the Structural Evolution Index (SEI), a machine learning-based methodology specifically designed to analyse long-term trends and forecast the potential of emerging agricultural products for investment. Versatile in application, it evaluates various agricultural metrics such as production, yield, trade, land use, and consumption, providing a comprehensive view of the evolution within agricultural markets. Harnessing data from the UN Food and Agriculture Organization (FAOSTAT), this study demonstrates the SEI's capabilities through comparative exploratory analysis and an evaluation of international trade in agricultural products, focusing on Malaysia and Singapore. The SEI methodology reveals intricate patterns and transitions within the agricultural sector, enabling stakeholders to strategically identify and capitalize on emerging markets. This predictive framework is a powerful tool for decision-makers, offering crucial insights that help anticipate market shifts and align investments with anticipated returns.
Keywords: agricultural investment, algorithm, comparative exploratory analytics, machine learning, market trends, predictive analytics, structural evolution index
Procedia PDF Downloads 63
195 Performance Analysis of Search Medical Imaging Service on Cloud Storage Using Decision Trees
Authors: González A. Julio, Ramírez L. Leonardo, Puerta A. Gabriel
Abstract:
Telemedicine services use a large amount of data, most of which consists of diagnostic images in the Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7) formats. Metadata is generated from each related image to support its identification. This study presents the use of decision trees to optimize the search process for diagnostic images hosted on a cloud server. To analyze server performance, the following quality-of-service (QoS) metrics are evaluated: delay, bandwidth, jitter, latency, and throughput, across five test scenarios for a total of 26 experiments during the upload and download of DICOM images hosted by the telemedicine group server of the Universidad Militar Nueva Granada, Bogotá, Colombia. By applying decision trees as a data mining technique and comparing them with sequential search, it was possible to evaluate the search times for diagnostic images on the server. The results show that using the metadata in decision trees substantially improves search times, optimizes computational resources, and improves request management for the telemedicine image service. Based on the experiments carried out, search efficiency increased by 45% relative to sequential search, given that false positives in the management and acquisition of diagnostic images are avoided when downloading. It is concluded that, for diagnostic image services in telemedicine, the decision tree technique guarantees accessibility and robustness in the acquisition and manipulation of medical images, improving diagnoses and medical procedures for patients.
Keywords: cloud storage, decision trees, diagnostic image, search, telemedicine
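As a hedged sketch of the idea (not the study's implementation), the snippet below trains a decision tree over invented DICOM-style metadata to predict which storage partition holds an image, so a query narrows the search space instead of scanning records sequentially.

```python
# Sketch: a decision tree over image metadata routes a search query to the
# likely storage partition. Fields, records, and partitions are invented.
from sklearn.tree import DecisionTreeClassifier
from sklearn.preprocessing import OrdinalEncoder

# (modality, body_part, year) -> partition where such studies are stored
records = [("CT", "head", 2016), ("MR", "knee", 2017), ("CT", "chest", 2016),
           ("US", "abdomen", 2018), ("MR", "head", 2017), ("CT", "knee", 2018)]
partitions = [0, 1, 0, 2, 1, 2]

enc = OrdinalEncoder()
X = enc.fit_transform([(m, b, str(y)) for m, b, y in records])
tree = DecisionTreeClassifier().fit(X, partitions)

query = enc.transform([("CT", "chest", "2016")])
print("search partition:", tree.predict(query)[0])   # narrows the search space
```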
Procedia PDF Downloads 204