Search results for: cohesion metrics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 842

242 A Qualitative Evaluation of a Civic Curriculum to Increase Global Citizenship Competences in University Students in the Netherlands

Authors: Park Eri, Sklad Marcin, Tsirogianni Stavroula

Abstract:

In a world of increasing exchange, movement of population groups, and interconnectedness, there are plenty of opportunities for mutual cultural enrichment. In everyday life, however, relations among different cultural groups often do not go smoothly, resulting in discrimination, inequality and violence. The increasing differentiation of roles, values and worldviews raises many tensions and dilemmas for states and people, especially in western liberal societies, about issues of acceptance, fairness, justice, autonomy, plurality, freedom, equality and cohesion. Cultural diversity requires a deeper understanding of the roots, meaning and consequences of group differences. We argue that a psychology from the standpoint of the subject needs to be developed further according to new societal needs. This means that within a globalised society, issues regarding the construction of the other as another have become of utmost importance. In constructing the other, human beings construct their ideal and possible worlds, and meanings about their lives and their significance, by drawing on a set of cultural norms, beliefs and values embedded in the different contexts in which they find themselves. In this article, we describe a series of exercises developed in collaboration with university students in the Netherlands and piloted with second-year undergraduate psychology students. These exercises aimed at making tangible and obvious how students apply different moral principles and norms to regulate relationships, and how these are linked to hegemonic ideological forces. The exercises took the form of thought experiments comprising eight moral dilemmas, inspired by moral foundations theory, that touched on different moral principles. The moral dilemmas were built on each other in incremental steps, from a very tangible, hands-on level to more challenging and demanding ones that require stepping into pre-existing networks of knowledge and discourses. After the execution of every dilemma, a discussion followed that focused on building links between the theme of the exercise and participants' own life experiences. In this paper, we provide an evaluation of the methodology used through a discursive analysis of the discussion between the students and the teacher.

Keywords: citizenship, moral dilemmas, social justice, education

Procedia PDF Downloads 293
241 Assessing NYC's Single-Family Housing Typology for Urban Heat Vulnerability and Occupants' Health Risk under the Climate Change Emergency

Authors: Eleni Stefania Kalapoda

Abstract:

Recurring heat waves due to the global climate change emergency pose continuous risks to human health and urban resources. Local and state decision-makers incorporate Heat Vulnerability Indices (HVIs) to quantify and map the relative impact on human health in emergencies. These maps enable government officials to identify the highest-risk districts and to concentrate emergency planning efforts and available resources accordingly (e.g., to reevaluate the location and number of heat-relief centers). Even though the framework for conducting an HVI is unique to each municipality, its accuracy in assessing heat risk is limited. To resolve this issue, varied housing-related metrics should be included. This paper quantifies and classifies NYC's single-family detached housing typology within highly vulnerable NYC districts using detailed energy simulations and post-processing calculations. The results show that the variation in indoor heat risk depends significantly on a dwelling's design and operation characteristics, concluding that poorly ventilated dwellings are the most vulnerable ones. The study also confirmed that when building-level determinants of exposure are excluded from the assessment, an HVI fails to capture important components of heat vulnerability. Lastly, the overall vulnerability ratio of the housing units was calculated at between 0.11 and 1.6 indoor heat degrees in terms of ventilation and shading capacity, insulation degree, and other building attributes.

Keywords: heat vulnerability index, energy efficiency, urban heat, resiliency to heat, climate adaptation, climate mitigation, building energy

Procedia PDF Downloads 56
240 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication

Authors: Vedant Janapaty

Abstract:

Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Also known as the "kidneys of our planet", they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and loss of native species are ailing our wetlands. There is a lack of uniform data collection and sparse research on correlations between satellite data and in situ measurements. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate derived metrics with in situ observations collected at five estuaries. Satellite images were processed to calculate seven satellite indices (SIs) using Python, and average SI values were calculated per month over 23 years. Publicly available data from six sites at ELK were used to obtain ten observed parameters (OPs), whose average values were likewise calculated per month over 23 years. Linear correlations between the seven SIs and ten OPs were computed and found to be inadequate (correlations of 1 to 64%). Fourier transform analysis was then performed on the seven SIs; dominant frequencies and amplitudes were extracted, and a machine learning (ML) model was trained, validated, and tested for the ten OPs. Better correlations were observed between SIs and OPs at certain time delays (0-, 3-, 4-, and 6-month delays), and ML was performed again; the OPs saw improved R² values in the range of 0.2 to 0.93. This approach can be used to obtain periodic analyses of overall wetland health from satellite indices. It shows that remote sensing can be used to develop correlations with critical in situ parameters that measure eutrophication, and can be used by practitioners to easily monitor wetland health.
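
A minimal sketch, under assumed data, of the two steps described above: extracting the dominant frequency of a monthly satellite-index series with the FFT, then regressing a lagged in situ parameter on the index. The synthetic series, the 3-month lag, and the random forest regressor are illustrative assumptions, not the project's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 23 * 12                                    # 23 years of monthly averages
t = np.arange(n)
si = np.sin(2 * np.pi * t / 12) + 0.2 * rng.normal(size=n)  # seasonal SI
op = np.roll(si, 3) + 0.1 * rng.normal(size=n)              # OP lags SI by 3 months

# Step 1: dominant frequency and amplitude of the SI series
spec = np.fft.rfft(si - si.mean())
freqs = np.fft.rfftfreq(n, d=1.0)              # cycles per month
dominant = np.argmax(np.abs(spec))
print(f"dominant period ~ {1 / freqs[dominant]:.1f} months")

# Step 2: lagged regression of OP on SI (3-month delay), last 24 months held out
lag = 3
X, y = si[:-lag, None], op[lag:]
model = RandomForestRegressor(random_state=0).fit(X[:-24], y[:-24])
print("test R^2:", r2_score(y[-24:], model.predict(X[-24:])))
```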

Keywords: estuary, remote sensing, machine learning, Fourier transform

Procedia PDF Downloads 79
239 Optical Board as an Artificial Technology for a Peer Teaching Class in a Nigerian University

Authors: Azidah Abu Ziden, Adu Ifedayo Emmanuel

Abstract:

This study investigated the optical board as an artificial technology for peer teaching in a Nigerian university. A design and development research (DDR) design was adopted, which entailed the planning and testing of the instructional design models used to produce the optical board. The research population involved twenty-five (25) peer-teaching students at a Nigerian university from theatre arts, religion, and language education-related disciplines. Using a random sampling technique, the study selected eight (8) students to work on the optical board. The study also introduced a research instrument, a lecturer assessment rubric containing a 30-mark metric, for evaluating students' teaching with the optical board. It was discovered that the optical board affords students the acquisition of self-employment skills through their exposure to the peer teaching course, which is a teacher training module in Nigerian universities. It is evident in this study that students were able to coordinate their design and effectively develop the optical board without the lecturer's interference. This kind of achievement shows that the Nigerian university curriculum has been designed with content meant to spur students to create jobs after graduation, and effective implementation of the readily available curriculum content is enough to imbue students with the needed entrepreneurial skills. It was recommended that the Federal Government of Nigeria (FGN) discourage poor implementation of the Nigerian university curriculum and invest more in the betterment of the readily available curriculum instead of considering a so-called new curriculum for a regurgitated teaching and learning process.

Keywords: optical board, artificial technology, peer teaching, educational technology, Nigeria, Malaysia, university, glass, wood, electrical, improvisation

Procedia PDF Downloads 46
238 Empowering Transformers for Evidence-Based Medicine

Authors: Jinan Fiaidhi, Hashmath Shaik

Abstract:

Breaking the barrier to practicing evidence-based medicine relies on effective methods for rapidly identifying relevant evidence from the body of biomedical literature. An important challenge confronting medical practitioners is the long time needed to browse, filter, summarize and compile information from different medical resources. Deep learning can help in solving this through automatic question answering (Q&A) and transformers. However, Q&A and transformer technologies are not trained to answer clinical queries that can be used for evidence-based practice, nor can they respond to structured clinical questioning protocols like PICO (Patient/Problem, Intervention, Comparison and Outcome). This article describes the use of deep learning techniques for Q&A, based on transformer models like BERT and GPT, to answer PICO clinical questions for evidence-based practice, extracted from sound medical research resources like PubMed. We report acceptable clinical answers that are supported by findings from PubMed. Our transformer methods reach acceptable state-of-the-art performance based on a two-stage bootstrapping process involving filtering relevant articles, followed by identifying articles that support the requested outcome expressed by the PICO question. Moreover, we also report experiments empowering our bootstrapping techniques with patched attention to the most important keywords in the clinical case and the PICO question. Our bootstrapping patched with attention shows the relevancy of the collected evidence based on entropy metrics.
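
A loose stand-in for the two-stage idea described above, using the Hugging Face transformers question-answering pipeline: stage 1 filters abstracts by keyword overlap with the PICO question, stage 2 extracts an answer span from the survivors. The filter rule, the model checkpoint, and the example abstracts are illustrative assumptions, not the authors' BERT/GPT bootstrapping system.

```python
from transformers import pipeline

pico_question = ("In adults with type 2 diabetes (P), does metformin (I) "
                 "versus placebo (C) reduce HbA1c (O)?")
abstracts = [
    "Metformin lowered HbA1c by 1.1% versus placebo in adults with type 2 diabetes.",
    "This trial studied statin therapy for hyperlipidemia in older adults.",
]

# Stage 1: crude relevance filter by keyword overlap (bootstrapping step 1)
keywords = {"metformin", "placebo", "hba1c", "diabetes"}
relevant = [a for a in abstracts
            if len(keywords & set(a.lower().replace(",", "").split())) >= 2]

# Stage 2: extractive QA over the surviving abstracts (bootstrapping step 2)
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")
for context in relevant:
    result = qa(question=pico_question, context=context)
    print(result["answer"], f"(score={result['score']:.2f})")
```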

Keywords: automatic question answering, PICO questions, evidence-based medicine, generative models, LLM transformers

Procedia PDF Downloads 14
237 A Multi-Objective Reliable Location-Inventory Capacitated Disruption Facility Problem with Penalty Cost Solved with Efficient Metaheuristic Algorithms

Authors: Elham Taghizadeh, Mostafa Abedzadeh, Mostafa Setak

Abstract:

A logistics network is expected to have facilities that, once opened, work continuously over a long time horizon without any failure; but in real-world problems, facilities may face disruptions. This paper studies a reliable joint inventory-location problem to optimize the cost of facility locations, customer assignment, and inventory management decisions when facilities face failure risks and stop working. In our model, we assume that when a facility is out of work, its customers may be reassigned to other operational facilities; otherwise, they must endure high penalty costs associated with losing service. To bring the model closer to real-world problems, it is formulated based on the p-median problem, and the facilities are considered to have limited capacities. We define a new binary variable (Z_is) to indicate that a customer is not assigned to any facility. The problem is a bi-objective model: the first objective minimizes the sum of facility construction costs and expected inventory holding costs; the second minimizes the maximum expected customer cost under normal and failure scenarios. To solve this model, the NSGA-II and MOSS algorithms were applied to find the Pareto-archive solutions, and Response Surface Methodology (RSM) was applied to optimize the NSGA-II algorithm parameters. We compare the performance of the two algorithms on three metrics, and the results show that NSGA-II is more suitable for our model.
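
As a rough illustration of the bi-objective structure described above, the sketch below evaluates the two objectives for a given opening and assignment decision. The cost numbers, the notation, and the simplification of expected costs to deterministic ones are assumptions for illustration, not the paper's full stochastic model.

```python
import numpy as np

fixed = np.array([100.0, 120.0])        # facility opening costs
hold = np.array([10.0, 8.0])            # expected inventory holding costs
dist = np.array([[4.0, 9.0],            # customer-to-facility assignment costs
                 [7.0, 3.0],
                 [5.0, 6.0]])
penalty = 50.0                          # cost of leaving a customer unserved

def objectives(open_mask, assign):
    """assign[i] = facility index for customer i, or -1 if unassigned (Z_is = 1)."""
    f1 = np.sum((fixed + hold) * open_mask)          # opening + holding cost
    cust = np.where(assign >= 0,
                    dist[np.arange(len(assign)), np.clip(assign, 0, None)],
                    penalty)
    f2 = cust.max()                                  # worst-case customer cost
    return f1, f2

print(objectives(np.array([1, 1]), np.array([0, 1, 0])))   # (238.0, 5.0)
```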

Keywords: joint inventory-location problem, facility location, NSGAII, MOSS

Procedia PDF Downloads 499
236 Global Capitalism and Commodification of Breastfeeding: An Investigation of Its Impact on the “Traditional” African Conception of Family Life and Motherhood

Authors: Mosito Jonas Seabela

Abstract:

Breastfeeding in public has become a contentious issue in contemporary society. Mothers are often subjected to unfair discrimination and harassment for simply responding to their maternal instinct to breastfeed their infants. The unwillingness of society to accept public breastfeeding as a natural, non-sexual act is partly influenced by the imposition of a pornified and hypersexualised Western culture, which was imported to Africa through colonisation, enforced by the apartheid regime, and is now perpetuated by Western media. The imposition of the modern nuclear family on Africans, and the coerced aspiration to subscribe to bourgeois values, has eroded the moral standing of the traditional African family and its cultural values. Western-centric perceptions of African women have altered the experience of motherhood for many, commodifying the practice of breastfeeding. As a result, the use of bottles and infant formulas is often perceived as the preferred method, while breastfeeding in public is viewed as primitive, immoral, and unacceptable. This normative study seeks to answer the question of what ought to be done to preserve the dignity of African motherhood and protect mothers' right to breastfeed in public. The African philosophy of Ubuntu is employed to advocate for the right to breastfeed in public. This moral philosophy posits that the western perception of a person seeks to isolate people from their environment and culture, thereby undermining the process of acquiring humanity, which fosters social cohesion. The Ubuntu philosophy embodies the aphorism "umuntu ngumuntu nga bantu", meaning "a person is a person through other persons", signifying people's interconnectedness and interdependence. The application of the key principles of Ubuntu, such as survival, the spirit of solidarity, compassion, respect, and dignity, can improve human interaction and unite the public to support the government's efforts to increase exclusive breastfeeding rates and reduce infant mortality rates. The author proposes a doctrine called "Ubuntu Lactivism" as a means to advocate for breastfeeding rights in fulfilment of African traditional values.

Keywords: ubuntu, breastfeeding, Afrocentric, colonization, culture, motherhood, imperialism, objectification

Procedia PDF Downloads 50
235 Measuring the Influence of Functional Proximity on Environmental Urban Performance via IMM: Four Study Cases in Milan

Authors: Massimo Tadi, M. Hadi Mohammad Zadeh, Ozge Ogut

Abstract:

Although how cities' forms are structured has been studied, more effort is needed on systemic comprehension and evaluation of urban morphology through quantitative metrics able to describe the performance of a city in relation to its formal properties. More research is required in this direction in order to better describe urban form characteristics and their impact on the environmental performance of cities, and to increase their sustainability stewardship. With the aim of developing a better understanding of the built environment's systemic structure, this paper presents a holistic methodology for studying the behavior of the built environment and investigates methods for measuring the effect of urban structure on environmental performance. This goal is pursued through an inquiry into the morphological components of urban systems and the complex relationships between them. In particular, this paper focuses on proximity, referring to the proximity of different land uses, a concept with which the Integrated Modification Methodology (IMM) explains how land-use allocation might affect the choice of mobility in neighborhoods and, especially, encourage or discourage non-motorized mobility. This paper uses proximity to demonstrate that structural attributes can be quantifiably related to performance behavior in the city. The target is to devise a mathematical pattern from the structural elements and correlate it directly with urban performance indicators concerned with environmental sustainability. The paper presents some results of this rigorous investigation of urban proximity and its correlation with performance indicators in four different areas of the city of Milan, each characterized by different morphological features.

Keywords: built environment, ecology, sustainable indicators, sustainability, urban morphology

Procedia PDF Downloads 143
234 A Survey of Skin Cancer Detection and Classification from Skin Lesion Images Using Deep Learning

Authors: Joseph George, Anne Kotteswara Roa

Abstract:

Skin disease is one of the most common kinds of health issues faced by people nowadays. Skin cancer (SC) is one of them, and its detection relies on skin biopsy outputs and the expertise of doctors, but this consumes more time and can yield inaccurate results. At an early stage, skin cancer detection is a challenging task, and the disease easily spreads to the whole body, leading to an increase in the mortality rate; yet skin cancer is curable when it is detected at an early stage. In order to classify skin cancer correctly and accurately, the critical task is skin cancer identification and classification, based on disease features such as shape, size, color, and symmetry. Many skin diseases share similar characteristics, which makes it challenging to select important features from skin cancer dataset images. Hence, skin cancer diagnostic accuracy is improved by an automated skin cancer detection and classification framework, which also addresses the scarcity of human experts. Recently, deep learning techniques like the convolutional neural network (CNN), deep belief network (DBN), artificial neural network (ANN), recurrent neural network (RNN), and long short-term memory (LSTM) have been widely used for the identification and classification of skin cancers. This survey reviews different DL techniques for skin cancer identification and classification. Performance metrics such as precision, recall, accuracy, sensitivity, specificity, and F-measure are used to evaluate the effectiveness of SC identification using DL techniques. With these DL techniques, classification accuracy increases along with the mitigation of computational complexity and time consumption.

Keywords: skin cancer, deep learning, performance measures, accuracy, datasets

Procedia PDF Downloads 100
233 Influence of Travel Time Reliability on Elderly Drivers' Crash Severity

Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig

Abstract:

Although older drivers (defined as those aged 65 and above) are less involved in speeding, alcohol use, and night driving, they are more vulnerable to severe crashes. The major contributing factors for severe crashes include frailty and medical complications. Several studies have evaluated the factors contributing to crash severity. However, few studies have established the impact of travel time reliability (TTR) on road safety. In particular, the impact of TTR on senior adults, who face several challenges including hearing difficulties, declining processing skills, and cognitive problems while driving, is not well established. Therefore, this study focuses on determining the possible impacts of TTR on traffic safety with a focus on elderly drivers. Historical travel speed data from freeway links in the study area were used to calculate travel time and the associated TTR metrics, that is, the planning time index, the buffer index, the standard deviation of travel time, and the probability of congestion. Four years of information on crashes occurring on these freeway links was acquired. A binary logit model estimated using the Markov Chain Monte Carlo (MCMC) sampling technique was used to evaluate variables that could influence elderly crash severity. Preliminary results of the analysis suggest that TTR is statistically significant in affecting the severity of a crash involving an elderly driver. The results suggest that a one-unit increase in the probability of congestion reduces the likelihood of a severe elderly crash by nearly 22%. These findings will enhance the understanding of TTR and its impact on elderly crash severity.
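
A small sketch of the TTR metrics named above, using common FHWA-style definitions, which are an assumption here (the study may use variants): planning time index as the 95th-percentile travel time over free-flow travel time, buffer index as the relative buffer above the mean, the travel time standard deviation, and a threshold-based probability of congestion. The travel times and the congestion threshold are illustrative.

```python
import numpy as np

travel_times = np.array([10.2, 11.0, 10.5, 14.8, 10.1, 18.3, 10.9, 12.4])  # minutes
free_flow = 10.0                           # free-flow travel time (minutes)
congested_if_over = 1.3 * free_flow        # illustrative congestion threshold

t95 = np.percentile(travel_times, 95)
planning_time_index = t95 / free_flow                      # PTI = T95 / Tff
buffer_index = (t95 - travel_times.mean()) / travel_times.mean()
tt_std = travel_times.std(ddof=1)                          # std of travel time
p_congestion = np.mean(travel_times > congested_if_over)   # share of congested obs

print(planning_time_index, buffer_index, tt_std, p_congestion)
```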

Keywords: highway safety, travel time reliability, elderly drivers, traffic modeling

Procedia PDF Downloads 465
232 Tractography Analysis of the Evolutionary Origin of Schizophrenia

Authors: Asmaa Tahiri, Mouktafi Amine

Abstract:

A substantial amount of traditional medical research has been devoted to managing and treating mental disorders. At present, to the best of our knowledge, it is believed that a fundamental understanding of the underlying causes of the majority of psychological disorders needs to be explored further to inform early diagnosis, symptom management, and treatment. The emerging field of evolutionary psychology is a promising prospect for addressing the origin of mental disorders, potentially leading to more effective treatments. Schizophrenia, as a topical mental disorder, has been linked to the evolutionary adaptation of the human brain, represented in the brain connectivity and asymmetry directly linked to humans' higher cognition, in contrast to other primates, which are our closest living representation of the structure and connectivity of our earliest common African ancestors. As proposed in the evolutionary psychology literature, the pathophysiology of schizophrenia is expressed in, and directly linked to, altered connectivity between the Hippocampal Formation (HF) and the Dorsolateral Prefrontal Cortex (DLPFC). This paper presents the results of tractography analysis using multiple open-access Diffusion Weighted Imaging (DWI) datasets of healthy subjects, schizophrenia-affected subjects, and primates to illustrate the relevance of the aforementioned brain regions' connectivity and the underlying evolutionary changes in the human brain. Deterministic fiber tracking and streamline analysis were used to generate connectivity matrices from the DWI datasets, which were overlaid to compute distances and highlight disconnectivity patterns, in conjunction with other fiber tracking metrics: Fractional Anisotropy (FA), Mean Diffusivity (MD) and Radial Diffusivity (RD).
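
A minimal sketch of the three named diffusion-tensor metrics computed from a voxel's tensor eigenvalues, using the standard DTI formulas; the example eigenvalues are illustrative, not taken from the paper's datasets.

```python
import numpy as np

lam = np.array([1.7e-3, 0.4e-3, 0.3e-3])   # tensor eigenvalues for one voxel (mm^2/s)

md = lam.mean()                            # Mean Diffusivity: (l1 + l2 + l3) / 3
rd = lam[1:].mean()                        # Radial Diffusivity: mean of minor eigenvalues
fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))  # Fractional Anisotropy

print(f"FA={fa:.3f}, MD={md:.2e}, RD={rd:.2e}")
```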

Keywords: tractography, evolutionary psychology, schizophrenia, brain connectivity

Procedia PDF Downloads 37
231 Ultrasound-Assisted Extraction of Bioactive Compounds from Cocoa Shell and Their Encapsulation in Gum Arabic and Maltodextrin: A Technology to Produce Functional Food Ingredients

Authors: Saeid Jafari, Khursheed Ahmad Sheikh, Randy W. Worobo, Kitipong Assatarakul

Abstract:

In this study, the extraction of cocoa shell powder (CSP) was optimized, and the optimized extracts were spray-dried for encapsulation purposes. Temperature (45-65 °C), extraction time (30-60 min), and ethanol concentration (60-100%) were the extraction parameters. The response surface methodology analysis revealed that the model was significant (p ≤ 0.05) for interactions between all variables (total phenolic compound, total flavonoid content, and antioxidant activity as measured by the 2,2-diphenyl-1-picrylhydrazyl (DPPH) and ferric reducing antioxidant power (FRAP) assays), with the lack-of-fit test for the model being insignificant (p > 0.05). Temperature (55 °C), time (45 min), and ethanol concentration (60%) were found to be the optimal extraction conditions. For spray-drying encapsulation, some quality metrics (e.g., water solubility, water activity) were insignificant (p > 0.05). The microcapsules were found to be spherical in shape under a scanning electron microscope. Thermogravimetric and differential thermogravimetric measurements of the microcapsules revealed nearly identical results. The gum arabic + maltodextrin microcapsule (GMM) showed potential antibacterial (zone of inhibition: 11.50 mm; lower minimum inhibitory concentration: 1.50 mg/mL) and antioxidant (DPPH: 1063 mM trolox/100 g dry wt.) activities (p ≤ 0.05). In conclusion, the microcapsules in this study, particularly GMM, are promising antioxidant and antibacterial agents to be used as functional food ingredients for the production of nutraceutical foods with health-promoting properties.

Keywords: functional foods, cocoa shell powder, antioxidant activity, encapsulation, extraction

Procedia PDF Downloads 35
230 Leveraging Mobile Apps for Citizen-Centric Urban Planning: Insights from Tajawob Implementation

Authors: Alae El Fahsi

Abstract:

This study explores the ‘Tajawob’ app's role in urban development, demonstrating how mobile applications can empower citizens and facilitate urban planning. Tajawob serves as a digital platform for community feedback, engagement, and participatory governance, addressing urban challenges through innovative tech solutions. This research synthesizes data from a variety of sources, including user feedback, engagement metrics, and interviews with city officials, to assess the app’s impact on citizen participation in urban development in Morocco. By integrating advanced data analytics and user experience design, Tajawob has bridged the communication gap between citizens and government officials, fostering a more collaborative and transparent urban planning process. The findings reveal a significant increase in civic engagement, with users actively contributing to urban management decisions, thereby enhancing the responsiveness and inclusivity of urban governance. Challenges such as digital literacy, infrastructure limitations, and privacy concerns are also discussed, providing a comprehensive overview of the obstacles and opportunities presented by mobile app-based citizen engagement platforms. The study concludes with strategic recommendations for scaling the Tajawob model to other contexts, emphasizing the importance of adaptive technology solutions in meeting the evolving needs of urban populations. This research contributes to the burgeoning field of smart city innovations, offering key insights into the role of digital tools in facilitating more democratic and participatory urban environments.

Keywords: smart cities, digital governance, urban planning, strategic design

Procedia PDF Downloads 30
229 Buffer Allocation and Traffic Shaping Policies Implemented in Routers Based on a New Adaptive Intelligent Multi-Agent Approach

Authors: M. Taheri Tehrani, H. Ajorloo

Abstract:

In this paper, an intelligent multi-agent framework is developed for each router, in which agents have two vital functionalities, traffic shaping and buffer allocation, and are positioned at the ports of the routers. With the traffic-shaping functionality, agents shape forwarded traffic by dynamic, real-time allocation of the token generation rate in a token bucket algorithm; with the buffer-allocation functionality, agents share their buffer capacity with each other based on their needs and the conditions of the network. This dynamic and intelligent framework gives some ports the opportunity to work better under bursty and busier conditions. The agents work intelligently based on a Reinforcement Learning (RL) algorithm and consider effective parameters in their decision process. As RL is limited in considering many parameters in its decision process due to the volume of calculations, we utilize our novel method, which invokes Principal Component Analysis (PCA) on the RL and gives the algorithm the high-dimensional ability to consider as many parameters as needed in its decision process. When this implementation is compared to our previous work, where traffic shaping was done without any sharing or dynamic allocation of buffer size for each port, lower packet drop in the whole network, specifically at the source routers, can be seen. These methods are implemented in our previously proposed intelligent simulation environment to better compare the performance metrics. The results obtained from this simulation environment show efficient and dynamic utilization of resources in terms of bandwidth and the buffer capacities pre-allocated to each port.
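
A minimal token bucket sketch (the textbook version, not the paper's RL-tuned variant): tokens accrue at `rate` per second up to `capacity`, and a packet is forwarded only if enough tokens are available. The RL agent described above would adjust `rate` at run time; the byte-denominated parameters here are illustrative.

```python
import time

class TokenBucket:
    def __init__(self, rate, capacity):
        self.rate, self.capacity = rate, capacity
        self.tokens, self.last = capacity, time.monotonic()

    def allow(self, packet_size):
        now = time.monotonic()
        # replenish tokens for the elapsed time, capped at bucket capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_size:
            self.tokens -= packet_size
            return True
        return False            # packet is queued or dropped

bucket = TokenBucket(rate=1000.0, capacity=1500.0)   # bytes/s, bytes
print(bucket.allow(1200), bucket.allow(1200))        # True, then likely False
```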

Keywords: principal component analysis, reinforcement learning, buffer allocation, multi-agent systems

Procedia PDF Downloads 482
228 Cooperative Cross Layer Topology for Concurrent Transmission Scheduling Scheme in Broadband Wireless Networks

Authors: Gunasekaran Raja, Ramkumar Jayaraman

Abstract:

In this paper, we consider a CCL-N (Cooperative Cross-Layer Network) topology based on the cross-layer (both centralized and distributed) environment to form network communities. Various performance metrics related to IEEE 802.16 networks are discussed in designing the CCL-N topology. In the CCL-N topology, nodes are classified as master nodes (Master Base Station [MBS]) and serving nodes (Relay Station [RS]), and node communities are organized based on networking terminology. Based on the CCL-N topology, simulation analyses for both transparent and non-transparent relays are tabulated and throughput efficiency is calculated. The weighted load-balancing problem plays a challenging role in IEEE 802.16 networks. The CoTS (Concurrent Transmission Scheduling) scheme is formulated in terms of three aspects of the transmission mechanism: identical communities, different communities, and identical node communities. The CoTS scheme helps in identifying the weighted load-balancing problem. Based on the analytical results, the modularity value is inversely proportional to the error value, and it plays a key role in solving the CoTS problem based on hop count. The transmission mechanism for identical node communities has no impact, since the modularity value is the same for all network groups. In this paper, these three aspects of communities, based on the modularity value, which help in solving the weighted load-balancing and CoTS problems, are discussed.
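
A short sketch of the modularity value referenced above, computed for an assumed toy partition with NetworkX's community utilities; the paper's own communities come from its CCL-N topology, not from this example graph.

```python
import networkx as nx
from networkx.algorithms.community import modularity

G = nx.Graph([(0, 1), (1, 2), (0, 2),      # community A: a triangle
              (3, 4), (4, 5), (3, 5),      # community B: a triangle
              (2, 3)])                     # single inter-community link
partition = [{0, 1, 2}, {3, 4, 5}]
print(f"modularity = {modularity(G, partition):.3f}")   # ~0.357 for this split
```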

Keywords: cross layer network topology, concurrent scheduling, modularity value, network communities and weighted load balancing

Procedia PDF Downloads 233
227 Comparison of Different Machine Learning Algorithms for Solubility Prediction

Authors: Muhammet Baldan, Emel Timuçin

Abstract:

Molecular solubility prediction plays a crucial role in various fields, such as drug discovery, environmental science, and materials science. In this study, we compare the performance of five machine learning algorithms (linear regression, support vector machines (SVM), random forests, gradient boosting machines (GBM), and neural networks) for predicting molecular solubility using the AqSolDB dataset. The dataset consists of 9,981 data points with their corresponding solubility values. MACCS keys (166 bits), RDKit properties (20 properties), and structural properties (3) are extracted for every SMILES representation in the dataset, giving a total of 189 features used in training and testing for every molecule. Each algorithm is trained on a subset of the dataset and evaluated using accuracy scores. Additionally, computational time for training and testing is recorded to assess the efficiency of each algorithm. Our results demonstrate that the random forest model outperformed the other algorithms in terms of predictive accuracy, achieving a 0.93 accuracy score. Gradient boosting machines and neural networks also exhibit strong performance, closely followed by support vector machines. Linear regression, while simpler in nature, demonstrates competitive performance but with slightly higher errors compared to the ensemble methods. Overall, this study provides valuable insights into the performance of machine learning algorithms for molecular solubility prediction, highlighting the importance of algorithm selection in achieving accurate and efficient predictions in practical applications.
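
A hedged sketch of the feature-extraction step: MACCS keys for a SMILES string via RDKit, concatenated with a few descriptors, feeding a random forest. The toy labels and the descriptor choice are assumptions (AqSolDB itself provides measured solubility values); note that RDKit's MACCS vector has 167 positions, with bit 0 unused.

```python
import numpy as np
from rdkit import Chem
from rdkit.Chem import MACCSkeys, Descriptors
from sklearn.ensemble import RandomForestClassifier

def featurize(smiles):
    mol = Chem.MolFromSmiles(smiles)
    maccs = np.array(list(MACCSkeys.GenMACCSKeys(mol)))   # 167-position bit vector
    desc = [Descriptors.MolWt(mol), Descriptors.MolLogP(mol), Descriptors.TPSA(mol)]
    return np.concatenate([maccs, desc])

smiles = ["CCO", "c1ccccc1", "CC(=O)O", "CCCCCCCC"]
y = [1, 0, 1, 0]                        # toy "soluble / not soluble" labels
X = np.array([featurize(s) for s in smiles])

clf = RandomForestClassifier(random_state=0).fit(X, y)
print(clf.predict([featurize("CCN")]))
```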

Keywords: random forest, machine learning, comparison, feature extraction

Procedia PDF Downloads 0
226 A Comparative Time-Series Analysis and Deep Learning Projection of Innate Radon Gas Risk in Canadian and Swedish Residential Buildings

Authors: Selim M. Khan, Dustin D. Pearson, Tryggve Rönnqvist, Markus E. Nielsen, Joshua M. Taron, Aaron A. Goodarzi

Abstract:

Accumulation of radioactive radon gas in indoor air poses a serious risk to human health by increasing the lifetime risk of lung cancer, and it is classified by the IARC as a category one carcinogen. Radon exposure risks are a function of geologic, geographic, design, and human behavioural variables and can change over time. Using time-series and deep machine learning modelling, we analyzed long-term radon test outcomes as a function of building metrics from 25,489 Canadian and 38,596 Swedish residential properties constructed between 1945 and 2020. While Canadian and Swedish properties built between 1970 and 1980 are comparable (96-103 Bq/m³), innate radon risks subsequently diverge, rising in Canada and falling in Sweden, such that 21st-century Canadian houses show 467% greater average radon (131 Bq/m³) relative to Swedish equivalents (28 Bq/m³). These trends are consistent across housing types and regions within each country. The introduction of energy efficiency measures within Canadian and Swedish building codes coincided with opposing radon level trajectories in each nation. Deep machine learning modelling predicts that, without intervention, average Canadian residential radon levels will increase to 176 Bq/m³ by 2050, emphasizing the importance and urgency of future building code intervention to achieve systemic radon reduction in Canada.

Keywords: radon health risk, time-series, deep machine learning, lung cancer, Canada, Sweden

Procedia PDF Downloads 63
225 Predicting Emerging Agricultural Investment Opportunities: The Potential of Structural Evolution Index

Authors: Kwaku Damoah

Abstract:

The agricultural sector is characterized by continuous transformation, driven by factors such as demographic shifts, evolving consumer preferences, climate change, and migration trends. This dynamic environment presents complex challenges for key stakeholders including farmers, governments, and investors, who must navigate these changes to achieve optimal investment returns. To effectively predict market trends and uncover promising investment opportunities, a systematic, data-driven approach is essential. This paper introduces the Structural Evolution Index (SEI), a machine learning-based methodology. SEI is specifically designed to analyse long-term trends and forecast the potential of emerging agricultural products for investment. Versatile in application, it evaluates various agricultural metrics such as production, yield, trade, land use, and consumption, providing a comprehensive view of the evolution within agricultural markets. By harnessing data from the UN Food and Agricultural Organisation (FAOSTAT), this study demonstrates the SEI's capabilities through Comparative Exploratory Analysis and evaluation of international trade in agricultural products, focusing on Malaysia and Singapore. The SEI methodology reveals intricate patterns and transitions within the agricultural sector, enabling stakeholders to strategically identify and capitalize on emerging markets. This predictive framework is a powerful tool for decision-makers, offering crucial insights that help anticipate market shifts and align investments with anticipated returns.

Keywords: agricultural investment, algorithm, comparative exploratory analytics, machine learning, market trends, predictive analytics, structural evolution index

Procedia PDF Downloads 36
224 Performance Analysis of Search Medical Imaging Service on Cloud Storage Using Decision Trees

Authors: González A. Julio, Ramírez L. Leonardo, Puerta A. Gabriel

Abstract:

Telemedicine services use a large amount of data, most of which consists of diagnostic images in Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7) formats. Metadata is generated from each related image to support its identification. This study presents the use of decision trees to optimize information search processes for diagnostic images hosted on a cloud server. To analyze server performance, the following quality of service (QoS) metrics are evaluated: delay, bandwidth, jitter, latency and throughput, in five test scenarios for a total of 26 experiments during the upload and download of DICOM images hosted by the telemedicine group server of the Universidad Militar Nueva Granada, Bogotá, Colombia. By applying decision trees as a data mining technique and comparing them with sequential search, it was possible to evaluate the search times of diagnostic images on the server. The results show that using metadata in decision trees substantially improves search times, optimizes computational resources and improves request management for the telemedicine image service. Based on the experiments carried out, search efficiency increased by 45% relative to sequential search, given that false positives are avoided in the management and acquisition of the information when downloading a diagnostic image. It is concluded that, for diagnostic image services in telemedicine, the decision tree technique guarantees accessibility and robustness in the acquisition and manipulation of medical images, improving diagnoses and medical procedures for patients.
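
A small sketch of the idea of routing image-search requests with a decision tree trained on DICOM-style metadata instead of scanning records sequentially. The metadata fields, shard labels, and encoding are assumptions for illustration, not the study's actual schema.

```python
from sklearn.tree import DecisionTreeClassifier
from sklearn.preprocessing import OrdinalEncoder

# metadata: [modality, body_part, study_year]; target: storage shard to query first
records = [["CT", "chest", 2017], ["MR", "brain", 2018],
           ["CT", "abdomen", 2017], ["US", "abdomen", 2018],
           ["MR", "brain", 2017], ["CT", "chest", 2018]]
shards = ["A", "B", "A", "C", "B", "A"]

enc = OrdinalEncoder()
cat = enc.fit_transform([[r[0], r[1]] for r in records])   # encode string fields
X = [[*row, r[2]] for row, r in zip(cat.tolist(), records)]

tree = DecisionTreeClassifier(random_state=0).fit(X, shards)
query = enc.transform([["MR", "brain"]]).tolist()[0] + [2019]
print("search shard first:", tree.predict([query])[0])
```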

Keywords: cloud storage, decision trees, diagnostic image, search, telemedicine

Procedia PDF Downloads 182
223 Virtual Team Management in Companies and Organizations

Authors: Asghar Zamani, Mostafa Falahmorad

Abstract:

Virtualization is established to combine and use the unique capabilities of employees to increase productivity and agility in providing services regardless of location. Adapting to fast, continuous change and getting maximum access to human resources are the reasons virtualization is happening. The distance problem is solved by information. Flexibility is the most important feature of virtualization, and information will be the main focus of virtualized companies. In this research, we used the window of opportunity offered by Covid-19 to compare the productivity of companies that had moved toward more virtualized management before Covid-19 with that of companies that only started planning to develop virtual-management infrastructure after the crisis of the pandemic occurred. The research process includes assessment of financial (profitability and customer satisfaction) and behavioral (organizational culture and reluctance to change) metrics. In addition to financial and CRM KPIs, a questionnaire was devised to assess how managers' and employees' attitudes have been changing towards the migration to virtualization. The sample companies and questions were selected by consulting experts in the IT industry of Iran. The conclusion of this article is that companies open to virtualization based on accurate strategic planning, or willing to pay to train their employees for virtualization before the pandemic, are more agile in adapting to change and moving forward in a recession. The prospective companies in this research could not only compensate for the short-term loss from the first shock of Covid-19, but could also foresee the new needs of their customers sooner than their competitors, resulting in the need to employ new staff to execute the emerging demands. The findings were aligned with the literature review. The results can be a wake-up call for business owners, especially in developing countries, to be more resilient toward modern management styles instead of continuing with traditional ones.

Keywords: virtual management, virtual organization, competitive advantage, KPI, profit

Procedia PDF Downloads 61
222 Tractography Analysis and the Evolutionary Origin of Schizophrenia

Authors: Mouktafi Amine, Tahiri Asmaa

Abstract:

A substantial amount of traditional medical research has been devoted to managing and treating mental disorders. At present, to the best of our knowledge, it is believed that a fundamental understanding of the underlying causes of the majority of psychological disorders needs to be explored further to inform early diagnosis, symptom management, and treatment. The emerging field of evolutionary psychology is a promising prospect for addressing the origin of mental disorders, potentially leading to more effective treatments. Schizophrenia, as a topical mental disorder, has been linked to the evolutionary adaptation of the human brain, represented in the brain connectivity and asymmetry directly linked to humans' higher cognition, in contrast to other primates, which are our closest living representation of the structure and connectivity of our earliest common African ancestors. As proposed in the evolutionary psychology literature, the pathophysiology of schizophrenia is expressed in, and directly linked to, altered connectivity between the Hippocampal Formation (HF) and the Dorsolateral Prefrontal Cortex (DLPFC). This paper presents the results of tractography analysis using multiple open-access Diffusion Weighted Imaging (DWI) datasets of healthy subjects, schizophrenia-affected subjects, and primates to illustrate the relevance of the aforementioned brain regions' connectivity and the underlying evolutionary changes in the human brain. Deterministic fiber tracking and streamline analysis were used to generate connectivity matrices from the DWI datasets, which were overlaid to compute distances and highlight disconnectivity patterns, in conjunction with other fiber tracking metrics: Fractional Anisotropy (FA), Mean Diffusivity (MD) and Radial Diffusivity (RD).

Keywords: tractography, diffusion weighted imaging, schizophrenia, evolutionary psychology

Procedia PDF Downloads 18
221 The Sociocultural, Economic, and Environmental Contestations of Agbogbloshie: A Critical Review

Authors: Khiddir Iddris, Martin Oteng – Ababio, Andreas Bürkert, Christoph Scherrer, Katharina Hemmler

Abstract:

Agbogbloshie, as an informal settlement and economy where the e-waste sector thrives, has become a global hub of complex urban contestations involving sociocultural, economic, and environmental dimensions, owing to the implications that e-waste and informal economic patterns have for livelihoods, urbanisation, development and sustainability. Multi-author collaborations have produced an ever-growing body of literature on Agbogbloshie and the informal e-waste economy. There is, however, a dearth of assessment of Agbogbloshie as an urban informal settlement's intricate nexus of socioecological contestations. We address this gap by systematising the contextual knowledge from the literature, navigating the complex terrain of Agbogbloshie's challenges and employing a multidimensional lens to unravel the sociocultural intricacies, economic dynamics, and environmental complexities shaping its identity. A systematic critical review approach was adopted, with a pragmatic consolidation of content analysis and controversy mapping, grounded in the concept of 'sustainable rurbanism', which highlighted core themes and identified contrasting viewpoints. An analytical framework is presented. Five categories (geohistorical, sociocultural, economic, environmental and future trends) are proposed as an approach to systematising the literature. The review finds that the sociocultural dimension unveils a mosaic of cultural amalgamation, communal identity, and tensions impacting community cohesion. The analysis of economic intricacies reveals the prevalence of informal economies sustaining livelihoods yet entrenching economic disparities and marginalisation. Environmental scrutiny exposes the grim realities of e-waste disposal, pollution, and land use conflicts. The findings suggest that there is high resilience within the community and potential for sustainable trajectories. Theoretical and conceptual synergy is limited. This review provides a comprehensive exploration, offering insights and directions for future research, policy formulation, and community-driven interventions aimed at fostering sustainable transformations in Agbogbloshie and analogous urban contexts.

Keywords: Agbogbloshie, economic complexities, environmental challenges, resilience, sociocultural dynamics, sustainability, urban informal settlement

Procedia PDF Downloads 41
220 EcoMush: Mapping Sustainable Mushroom Production in Bangladesh

Authors: A. A. Sadia, A. Emdad, E. Hossain

Abstract:

The increasing importance of mushrooms as a source of nutrition and health benefits, and even as a potential cancer treatment, has raised awareness of the impact of climate-sensitive variables on their cultivation. Factors like temperature, relative humidity, air quality, and substrate composition play pivotal roles in shaping mushroom growth, especially in Bangladesh. Oyster mushrooms, a commonly cultivated variety in this region, are particularly vulnerable to climate fluctuations. This research explores the climatic dynamics affecting oyster mushroom cultivation, presents an approach to address these challenges, and provides tangible solutions to fortify the agro-economy, ensure food security, and promote the sustainability of this crucial food source. Using climate and production data, this study evaluates the performance of three clustering algorithms (KMeans, OPTICS, and BIRCH) based on various quality metrics. While each algorithm demonstrates specific strengths, the findings provide insights into their effectiveness for this specific dataset. The results yield essential information, pinpointing the optimal temperature range of 13°C-22°C, the unfavorable temperature threshold of 28°C and above, and the ideal relative humidity range of 75-85%, along with the suitable production regions in three different seasons: Kharif-1, Kharif-2, and Robi. Additionally, a user-friendly web application was developed to support mushroom farmers in making well-informed decisions about their cultivation practices. This platform offers valuable insights into the most advantageous periods for oyster mushroom farming, with the overarching goal of enhancing the efficiency and profitability of mushroom farming.
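
A brief sketch comparing the three named clustering algorithms on toy climate-like features (temperature, relative humidity); the synthetic data and the use of silhouette score as the quality metric are assumptions, since the study evaluates several metrics on its own dataset.

```python
import numpy as np
from sklearn.cluster import KMeans, OPTICS, Birch
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(1)
X = np.vstack([rng.normal([17, 80], [2, 3], (50, 2)),    # favorable conditions
               rng.normal([29, 60], [2, 3], (50, 2))])   # unfavorable conditions

for name, model in [("KMeans", KMeans(n_clusters=2, n_init=10, random_state=0)),
                    ("OPTICS", OPTICS(min_samples=10)),
                    ("BIRCH", Birch(n_clusters=2))]:
    labels = model.fit_predict(X)
    mask = labels != -1                                  # OPTICS labels noise as -1
    score = (silhouette_score(X[mask], labels[mask])
             if len(set(labels[mask])) > 1 else float("nan"))
    print(f"{name}: clusters={len(set(labels[mask]))}, silhouette={score:.2f}")
```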

Keywords: climate variability, mushroom cultivation, clustering techniques, food security, sustainability, web-application

Procedia PDF Downloads 40
219 Diabetes Mellitus and Blood Glucose Variability Increase the 30-Day Readmission Rate after Kidney Transplantation

Authors: Harini Chakkera

Abstract:

Background: Inpatient hyperglycemia is an established independent risk factor for hospital readmission among several patient cohorts. This has not been studied after kidney transplantation. Nearly one-third of patients who have undergone a kidney transplant reportedly experience 30-day readmission. Methods: Data on first-time solitary kidney transplantations performed between September 2015 and December 2018 were retrieved. Information was linked to the electronic health record to determine a diagnosis of diabetes mellitus and extract glucometric and insulin therapy data. Univariate logistic regression analysis and the XGBoost algorithm were used to predict 30-day readmission. We report the average performance of the models on the testing set over five bootstrapped partitions of the data to ensure statistical significance. Results: The cohort included 1036 patients who received kidney transplantation, of whom 224 (22%) experienced 30-day readmission. The machine learning algorithm was able to predict 30-day readmission with an average AUC of 77.3% (95% CI 75.3-79.3%). We observed statistically significant differences in the presence of pretransplant diabetes, inpatient hyperglycemia, inpatient hypoglycemia, and minimum and maximum glucose values among those with higher 30-day readmission rates. The XGBoost model identified the index admission length of stay, the presence of hyper- and hypoglycemia, and recipient and donor BMI values as the most predictive risk factors for 30-day readmission. Additionally, significant variations in the therapeutic management of blood glucose by providers were observed. Conclusions: Suboptimal glucose metrics during hospitalization after kidney transplantation are associated with an increased risk of 30-day hospital readmission. Optimizing hospital blood glucose management, a modifiable factor, after kidney transplantation may reduce the risk of 30-day readmission.
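
A compact sketch of the evaluation protocol described above: fitting XGBoost and averaging AUC over five bootstrapped partitions. Synthetic features stand in for the glucometric and demographic variables named in the abstract; the hyperparameters are illustrative assumptions.

```python
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1036
X = rng.normal(size=(n, 6))           # e.g. LOS, min/max glucose, BMIs, diabetes flag
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 1.2).astype(int)  # ~22% positive

aucs = []
for seed in range(5):                 # five bootstrapped partitions
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=seed)
    model = XGBClassifier(n_estimators=200, max_depth=3).fit(Xtr, ytr)
    aucs.append(roc_auc_score(yte, model.predict_proba(Xte)[:, 1]))
print(f"mean AUC = {np.mean(aucs):.3f}")
```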

Keywords: kidney, transplant, diabetes, insulin

Procedia PDF Downloads 54
218 Remote Sensing through Deep Neural Networks for Satellite Image Classification

Authors: Teja Sai Puligadda

Abstract:

Detailed satellite images can serve an important role in geographic study. Quantitative and qualitative information provided by satellite and remote sensing images minimizes the complexity of work and time. Data and images are captured at regular intervals by satellite remote sensing systems, and the amount of data collected is often enormous, expanding rapidly as technology develops. Interpreting remote sensing images, geographic data mining, and researching distinct vegetation types such as agricultural land and forests are all part of satellite image categorization. One of the biggest challenges data scientists face while classifying satellite images is finding the most suitable classification algorithms among those available that can classify images with the utmost accuracy. In order to categorize satellite images, which is difficult due to the sheer volume of data, many academics are turning to deep learning algorithms. As the CNN algorithm gives high accuracy in image recognition problems and automatically detects important features without any human supervision, and the ANN algorithm stores information across the entire network (Abhishek Gupta, 2020), these two deep learning algorithms have been used for satellite image classification. This project focuses on remote sensing through deep neural networks, i.e., ANN and CNN, with the DeepSat (SAT-4) airborne dataset for classifying images. The ANN and CNN algorithms are implemented, evaluated and compared, and their performance is analyzed through evaluation metrics such as accuracy and loss. Additionally, the neural network algorithm that gives the lowest bias and lowest variance in solving multi-class satellite image classification is identified.
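
A minimal sketch of a CNN sized for SAT-4's inputs (28x28 pixels, 4 spectral bands, 4 land-cover classes); the layer sizes are illustrative assumptions, not the project's architecture, and random arrays stand in for the actual DeepSat data.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input((28, 28, 4)),                 # SAT-4: 28x28 images, 4 bands
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(4, activation="softmax"),     # 4 land-cover classes
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

X = np.random.rand(32, 28, 28, 4).astype("float32")      # stand-in images
y = keras.utils.to_categorical(np.random.randint(0, 4, 32), 4)
model.fit(X, y, epochs=1, verbose=0)
print(model.evaluate(X, y, verbose=0))                   # [loss, accuracy]
```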

Keywords: artificial neural network, convolutional neural network, remote sensing, accuracy, loss

Procedia PDF Downloads 131
217 Understanding Cyber Kill Chains: Optimal Allocation of Monitoring Resources Using Cooperative Game Theory

Authors: Roy. H. A. Lindelauf

Abstract:

Cyberattacks are complex processes consisting of multiple interwoven tasks conducted by a set of agents. Interdictions of and defenses against such attacks often rely on cyber kill chain (CKC) models. A CKC is a framework that tries to capture the actions taken by a cyber attacker. There exists a growing body of literature on CKCs. Most of this work either a) describes the CKC with respect to one or more specific cyberattacks or b) discusses the tools and technologies used by the attacker at each stage of the CKC. Defenders, facing scarce resources, have to decide where to allocate their resources given the CKC and partial knowledge of the tools and techniques attackers use. In this presentation, CKCs are analyzed through the lens of covert projects, i.e., interrelated tasks that have to be conducted by agents (human and/or computer) with the aim of going undetected. Various aspects of covert project models have been studied abundantly in the operations research and game theory domains; think of resource-limited interdiction actions that maximally delay the completion time of a weapons project, for instance. This presentation investigates both cooperative and non-cooperative game-theoretic covert project models and elucidates their relation to CKC modelling. To view a CKC as a covert project, each step in the CKC is broken down into tasks, and there are players, each of whom is capable of executing a subset of the tasks. Additionally, task interdependencies are represented by a schedule. Using multi-glove cooperative games, it is shown how a defender can optimize the allocation of his scarce resources (what, where and how to monitor) against an attacker scheduling a CKC. This study presents and compares several cooperative game-theoretic solution concepts as metrics for assigning resources to the monitoring of agents.
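
A hedged illustration of one cooperative solution concept, the Shapley value, on a classic glove game (not the paper's multi-glove model): the value could rank agents by how pivotal they are, and hence where monitoring pays off. The players and characteristic function are illustrative assumptions.

```python
from itertools import permutations

players = ["L1", "L2", "R1"]                 # two left-glove agents, one right-glove agent

def v(coalition):
    """Worth = number of complete glove pairs the coalition can form."""
    left = sum(p.startswith("L") for p in coalition)
    right = sum(p.startswith("R") for p in coalition)
    return min(left, right)

shapley = {p: 0.0 for p in players}
orders = list(permutations(players))
for order in orders:
    seen = set()
    for p in order:
        shapley[p] += v(seen | {p}) - v(seen)   # marginal contribution
        seen.add(p)
for p in shapley:
    shapley[p] /= len(orders)
print(shapley)    # R1 is pivotal most often, so it gets the largest share (2/3)
```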

Keywords: cyber defense, cyber kill chain, game theory, information warfare techniques

Procedia PDF Downloads 120
216 Shear Strength Characteristics of Sand Mixed with Particulate Rubber

Authors: Firas Daghistani, Hossam Abuel Naga

Abstract:

Waste tyres are a global problem with a negative effect on the environment: approximately one billion waste tyres are discarded worldwide yearly. Waste tyres are discarded in stockpiles, where they harm the environment in many ways. Finding applications for these materials can help in reducing this global problem. One such application is recycling these waste materials and using them in geotechnical engineering. Recycled waste tyre particulates can be mixed with sand to form a lightweight material with varying shear strength characteristics. Contradictory results are found in the literature on the inclusion of particulate rubber in sand: some experiments found that the inclusion of particulate rubber can increase the shear strength of the mixture, while others stated that it decreases the shear strength. This research further investigates whether the inclusion of particulate rubber in sand increases or decreases the shear strength characteristics of the mixture. For the experiment, a series of direct shear tests were performed on a poorly graded sand with a mean particle size of 0.32 mm mixed with recycled poorly graded particulate rubber with a mean particle size of 0.51 mm. The shear tests were performed at four normal stresses (30, 55, 105, and 200 kPa) at a shear rate of 1 mm/minute. Different percentages of particulate rubber content were used in the mixture, i.e., 10%, 20%, 30% and 50% of sand dry weight, at three density states, namely loose, slightly dense, and dense. The size ratio of the mixture, which is the mean particle size of the particulate rubber divided by the mean particle size of the sand, was 1.59. The results identified multiple parameters that can influence the shear strength of the mixture: normal stress, particulate rubber content, mixture gradation, mixture size ratio, and the mixture's density. The inclusion of particulate rubber in sand decreased the internal friction angle and increased the apparent cohesion. Overall, the inclusion of particulate rubber did not have a significant influence on the shear strength of the mixture. For all dense states at the low normal stresses of 30 and 55 kPa, the inclusion of particulate rubber showed a slight increase in shear strength, peaking at 20% rubber content of the sand's dry weight. On the other hand, at the high normal stresses of 105 and 200 kPa, there was a slight decrease in shear strength.
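
A small sketch of how the friction angle and apparent cohesion mentioned above are typically recovered from direct shear results: fitting the Mohr-Coulomb envelope tau = c + sigma·tan(phi) over the four normal stresses. The peak shear stress values below are illustrative, not the paper's measurements.

```python
import numpy as np

sigma_n = np.array([30.0, 55.0, 105.0, 200.0])     # normal stresses (kPa)
tau_f = np.array([25.0, 41.0, 72.0, 131.0])        # peak shear stresses (kPa)

slope, intercept = np.polyfit(sigma_n, tau_f, 1)   # linear Mohr-Coulomb fit
phi = np.degrees(np.arctan(slope))                 # internal friction angle
print(f"phi = {phi:.1f} deg, apparent cohesion c = {intercept:.1f} kPa")
```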

Keywords: shear strength, direct shear, sand-rubber mixture, waste material, granular material

Procedia PDF Downloads 110
215 An Extensive Review of Drought Indices

Authors: Shamsulhaq Amin

Abstract:

Drought can arise from several hydrometeorological phenomena that result in insufficient precipitation, soil moisture, and surface and groundwater flow, leading to conditions that are considerably drier than the usual water content or availability. Drought is often assessed using indices associated with meteorological, agricultural, and hydrological phenomena. In order to effectively handle drought disasters, it is essential to accurately determine the kind, intensity, and extent of the drought through drought characterization. This information is critical for managing the drought before, during, and after the rehabilitation process. Over a hundred drought indices have been created in the literature to evaluate drought disasters, encompassing a range of factors and variables. Some models utilise solely hydrometeorological drivers, others employ remote sensing technology, and some incorporate a combination of both. Comprehending the entire notion of drought and taking into account drought indices along with their calculation processes are crucial for researchers in this discipline. Examining several drought metrics across different studies requires additional time and concentration. Hence, it is crucial to conduct a thorough examination of the approaches used in drought indices in order to identify the most straightforward approach and avoid discrepancies across scientific studies. For practical application in the real world, categorizing indices by their use in meteorological, agricultural, and hydrological phenomena might help researchers maximize their efficiency. Users can explore different indices side by side, allowing them to compare convenience of use and evaluate the benefits and drawbacks of each. Moreover, certain indices exhibit interdependence; understanding their connections assists in making informed decisions about their suitability in various scenarios. This study provides a comprehensive assessment of various drought indices, analysing their types and computation methodologies in a detailed and systematic manner.

Keywords: drought classification, drought severity, drought indices, agricultural, hydrological

Procedia PDF Downloads 16
214 ADP Approach to Evaluate the Blood Supply Network of Ontario

Authors: Usama Abdulwahab, Mohammed Wahab

Abstract:

This paper presents the application of the uncapacitated facility location problem (UFLP) and the 1-median problem to support decision making in blood supply chain networks. A plethora of factors make blood supply chain networks a complex yet vital problem for the regional blood bank: rapidly increasing demand; the criticality of the product; strict storage and handling requirements; and the vastness of the theater of operations. As in the UFLP, facilities can be opened at any of m predefined locations with given fixed costs, and clients have to be allocated to the open facilities. In classical location models, the allocation cost is the distance between a client and an open facility; in this model, the costs are the allocation cost, transportation costs, and inventory costs. In order to address this problem, the median algorithm is used to analyze inventory, evaluate supply chain status, monitor performance metrics at different levels of granularity, and detect potential problems and opportunities for improvement. Euclidean distance data for several Ontario cities (demand nodes) are used to test the developed algorithm. Sitation software, a Lagrangian relaxation algorithm, and branch-and-bound heuristics are used to solve this model. Computational experiments confirm the efficiency of the proposed approach. Compared to existing modeling and solution methods, the median algorithm approach not only provides a more general modeling framework but also leads to efficient solution times in general.
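
A minimal 1-median sketch under assumed coordinates: pick the candidate site minimizing the total demand-weighted Euclidean distance to all demand nodes. The city coordinates and demands are illustrative, not the study's Ontario data.

```python
import math

# demand node -> ((x, y) coordinates, demand weight)
demand_nodes = {"A": ((0, 0), 30), "B": ((4, 0), 10),
                "C": ((0, 3), 20), "D": ((5, 4), 15)}
candidates = {"A": (0, 0), "B": (4, 0), "C": (0, 3), "D": (5, 4)}

def weighted_cost(site):
    return sum(w * math.dist(site, xy) for xy, w in demand_nodes.values())

best = min(candidates, key=lambda name: weighted_cost(candidates[name]))
print("1-median located at", best, f"(cost={weighted_cost(candidates[best]):.1f})")
```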

Keywords: approximate dynamic programming, facility location, perishable product, inventory model, blood platelet, P-median problem

Procedia PDF Downloads 485
213 Early Gastric Cancer Prediction from Diet and Epidemiological Data Using Machine Learning in Mizoram Population

Authors: Brindha Senthil Kumar, Payel Chakraborty, Senthil Kumar Nachimuthu, Arindam Maitra, Prem Nath

Abstract:

Gastric cancer is predominantly caused by demographic and diet factors, as compared to other cancer types. The aim of this study is to predict early gastric cancer (EGC) from diet and lifestyle factors using supervised machine learning algorithms. For this study, 160 healthy individuals and 80 cases who had been followed for 3 years (2016-2019) at Civil Hospital, Aizawl, Mizoram, were selected. A dataset containing 11 features that are core risk factors for gastric cancer was extracted. Supervised machine learning algorithms (Logistic Regression, Naive Bayes, Support Vector Machine (SVM), Multilayer Perceptron, and Random Forest) were used to analyze the dataset using Python Jupyter Notebook version 3. The classification results were evaluated using the metrics minimum false positives, Brier score, accuracy, precision, recall, F1 score, and the Receiver Operating Characteristic (ROC) curve. Data analysis showed Naive Bayes: 88, 0.11; Random Forest: 83, 0.16; SVM: 77, 0.22; Logistic Regression: 75, 0.25; and Multilayer Perceptron: 72, 0.27, with respect to accuracy (in percent) and Brier score. The Naive Bayes algorithm outperforms the others, with very low false positive rates, a low Brier score, and good accuracy. The Naive Bayes classification results in predicting EGC are very satisfactory using only diet and lifestyle factors, which will be very helpful for physicians in educating patients and the public; thereby, mortality from gastric cancer can be reduced or avoided with this knowledge mining work.
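
A brief sketch of the evaluation described above, with synthetic risk-factor data standing in for the Mizoram cohort: Gaussian Naive Bayes scored with accuracy and the Brier score. The feature distribution and label rule are assumptions for illustration.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, brier_score_loss

rng = np.random.default_rng(7)
X = rng.normal(size=(240, 11))                       # 11 diet/lifestyle features
y = (X[:, :3].sum(axis=1) + rng.normal(size=240) > 1).astype(int)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
nb = GaussianNB().fit(Xtr, ytr)
proba = nb.predict_proba(Xte)[:, 1]                  # predicted case probability
print("accuracy:", accuracy_score(yte, nb.predict(Xte)))
print("brier score:", brier_score_loss(yte, proba))  # lower is better calibrated
```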

Keywords: early gastric cancer, machine learning, diet, lifestyle characteristics

Procedia PDF Downloads 130