Search results for: real volume
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7737

5157 Evaluation by University Students of a Video Game to Sensitize Young People about Mental Health Problems

Authors: Adolfo Cangas, Noelia Navarro

Abstract:

The current study presents the assessment made by university students of a video game entitled Stigma-Stop, in which the characters present different mental disorders. The objective is for players to have more realistic information about mental disorders and to empathize with the characters, and thus to reduce stigma. The sample consisted of 169 university students studying degrees related to education, social care, and welfare (i.e., Social Education, Psychology, Early Childhood Education, Special Education, and Social Work). The participants rated the video game positively, especially in relation to its utility, with a somewhat lower score awarded to its degree of entertainment. They detected the disorders and pointed out that on many occasions they had felt the same way (particularly in the case of depression, less so for agoraphobia and bipolar disorder, and even less so for schizophrenia), and most students recommended the use of the video game. They emphasized that Stigma-Stop offers intervention strategies, provides information regarding the symptomatology, and sensitizes players against stigma.

Keywords: schizophrenia, social stigma, students, mental health

Procedia PDF Downloads 283
5156 Strategic Tools for Entrepreneurship: Model Proposal for Manufacturing Companies

Authors: Chiara Mansanta, Daniela Sani

Abstract:

This paper presents the further development of the application of a standard methodology to boost innovation in real case studies of manufacturing companies. The proposed methodology provides a viable solution for manufacturing companies that have to evaluate new business ideas. The study underlined the concept of entrepreneurship and how a manager can use it to promote innovation within their companies. Starting from a literature study on entrepreneurship, this paper examines the role of the manager in supporting a company’s development. The empirical part of the study is based on two manufacturing companies that used the proposed methodology to favour entrepreneurship through an alternative approach. The research demonstrated the need for companies to have a structured and well-defined methodology to achieve their goals. The purpose of this article is to understand the significance of business models within companies and to explore how they affect business strategy and innovation management. The idea is to use business models to support entrepreneurs in their decision-making processes, reducing risks and avoiding errors.

Keywords: entrepreneurship, manufacturing companies, solution validation, strategic management

Procedia PDF Downloads 95
5155 Large-Scale Electroencephalogram Biometrics through Contrastive Learning

Authors: Mostafa ‘Neo’ Mohsenvand, Mohammad Rasool Izadi, Pattie Maes

Abstract:

EEG-based biometrics (user identification) has been explored on small datasets of no more than 157 subjects. Here we show that the accuracy of modern supervised methods falls rapidly as the number of users increases to a few thousand. Moreover, supervised methods require a large amount of labeled data for training, which limits their application in real-world scenarios where acquiring training data should not take more than a few minutes. We show that, by using contrastive learning for pre-training, it is possible to maintain high accuracy on a dataset of 2130 subjects while using only a fraction of the labels. We compare 5 different self-supervised tasks for pre-training the encoder, where our proposed method achieves an accuracy of 96.4%, improving on the baseline supervised models by 22.75% and on the competing self-supervised model by 3.93%. We also study the effects of the length of the signal and the number of channels on the accuracy of the user-identification models. Our results reveal that signals from temporal and frontal channels contain more identifying features compared to other channels.
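
As a rough illustration of the contrastive pre-training idea described above (a minimal sketch in NumPy, not the authors' implementation; the encoder, batch size, and temperature are placeholder assumptions), the following computes an InfoNCE-style loss between embeddings of two augmented views of the same EEG segments:

    import numpy as np

    def info_nce_loss(z1, z2, temperature=0.1):
        """Contrastive (InfoNCE) loss between two sets of embeddings.
        z1, z2: (batch, dim) embeddings of two augmented views of the same
        EEG segments; row i of z1 and z2 form a positive pair, all other
        rows act as negatives."""
        z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)   # L2-normalise so the dot
        z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)   # product is a cosine similarity
        logits = z1 @ z2.T / temperature
        logits -= logits.max(axis=1, keepdims=True)           # numerical stability
        log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
        return -np.mean(np.diag(log_prob))                    # positive pairs sit on the diagonal

    # toy usage with random "embeddings" of 8 EEG segments
    rng = np.random.default_rng(0)
    print(info_nce_loss(rng.normal(size=(8, 64)), rng.normal(size=(8, 64))))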

Keywords: brainprint, contrastive learning, electroencephalogram, self-supervised learning, user identification

Procedia PDF Downloads 157
5154 Interactive Solutions for the Multi-Objective Capacitated Transportation Problem with Mixed Constraints under Fuzziness

Authors: Aquil Ahmed, Srikant Gupta, Irfan Ali

Abstract:

In this paper, we study a multi-objective capacitated transportation problem (MOCTP) with mixed constraints. The paper comprises the modelling and optimisation of an MOCTP in a fuzzy environment in which some goals are fractional and some are linear. In real-life applications of fuzzy goal programming (FGP) with multiple objectives, it is difficult for the decision maker(s) to determine the goal value of each objective precisely, as the goal values are imprecise or uncertain. We also develop a linearization of the fractional goals for solving the MOCTP. In this paper, the imprecision of the parameters is handled through fuzzy set theory by modelling these parameters as trapezoidal fuzzy numbers, and an α-cut approach is used to obtain their crisp values. Numerical examples are used to illustrate the method for solving the MOCTP.
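
For readers unfamiliar with the α-cut step mentioned above, the short helper below (an illustrative sketch, not the authors' formulation) shows how a crisp interval is obtained from a trapezoidal fuzzy number (a, b, c, d) at a chosen α level:

    def alpha_cut_trapezoidal(a, b, c, d, alpha):
        """Crisp interval of the trapezoidal fuzzy number (a, b, c, d) at level alpha.
        Membership rises linearly from a to b, equals 1 between b and c, and falls
        linearly from c to d, so the alpha-cut is [a + alpha*(b-a), d - alpha*(d-c)]."""
        assert 0.0 <= alpha <= 1.0
        return a + alpha * (b - a), d - alpha * (d - c)

    # example: a fuzzy transportation cost (10, 12, 15, 18) cut at alpha = 0.6
    print(alpha_cut_trapezoidal(10, 12, 15, 18, 0.6))   # (11.2, 16.2)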

Keywords: capacitated transportation problem, multi-objective linear programming, multi-objective fractional programming, fuzzy goal programming, fuzzy sets, trapezoidal fuzzy number

Procedia PDF Downloads 434
5153 The Application of Cellulose-Based Halloysite-Carbon Adsorbent to Remove Chloroxylenol from Water

Authors: Laura Frydel

Abstract:

Chloroxylenol is a common ingredient in disinfectants. Because this compound is used in large amounts, it is increasingly detected in rivers, sewage, and also in human body fluids. In recent years, there have been concerns about the potentially harmful effects of chloroxylenol on human health and the environment. This paper presents the synthesis, a brief characterization, and the use of a halloysite-carbon adsorbent for the removal of chloroxylenol from water. The template in the halloysite-carbon adsorbent was acid-treated bleached halloysite, and the carbon precursor was cellulose dissolved in zinc(II) chloride, which in turn was dissolved in 37% hydrochloric acid. The FTIR spectra before and after the adsorption process made it possible to determine the functional groups and bonds present in the halloysite-carbon composite and the binding mechanism between the adsorbent and the adsorbate. The morphologies of the bleached halloysite sample and the halloysite-carbon adsorbent sample were characterized by scanning electron microscopy (SEM) with surface analysis by energy-dispersive X-ray spectrometry (EDS). The specific surface area, total pore volume, and mesopore and micropore volumes were determined using an ASAP 2020 volumetric adsorption analyzer. Total carbon and total organic carbon were determined for the halloysite-carbon adsorbent. The halloysite-carbon adsorbent was used to remove chloroxylenol from water, and the degree of removal was about 90%. Adsorption studies show that the halloysite-carbon composite can be used as an effective adsorbent for removing chloroxylenol from water.

Keywords: adsorption, cellulose, chloroxylenol, halloysite

Procedia PDF Downloads 191
5152 Accurate Cortical Reconstruction in Narrow Sulci with Zero-Non-Zero Distance (ZNZD) Vector Field

Authors: Somojit Saha, Rohit K. Chatterjee, Sarit K. Das, Avijit Kar

Abstract:

A new force field is designed for the propagation of a parametric contour into deep, narrow cortical folds in the application of knowledge-based reconstruction of the cerebral cortex from MR images of the brain. The design of this force field is strongly inspired by the Generalized Gradient Vector Flow (GGVF) model but differs markedly in how it manipulates image information to determine the direction of propagation of the contour. While GGVF uses the edge map as its main driving force, the newly designed force field uses the map of distances between zero-valued pixels and their nearest non-zero valued pixel as its main driving force. Hence, it is called the Zero-Non-Zero Distance (ZNZD) force field. The objective of this force field is the forceful propagation of the contour, beyond spurious convergence caused by the partial volume effect (PVE), into narrow sulcal folds. Being a function of the corresponding non-zero pixel value, the force field has an inherent ability to determine the spuriousness of an edge automatically. It is effectively applied, along with some morphological processing, in cortical reconstruction to overcome the hindrance of PVE in narrow sulci where the conventional GGVF fails.
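
The core quantity behind the ZNZD field is, for every zero-valued pixel, the distance to its nearest non-zero pixel. A minimal sketch of that distance map (using SciPy's Euclidean distance transform; an illustration, not the authors' implementation) is given below:

    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def znzd_map(image):
        """For each zero-valued pixel, the Euclidean distance to the nearest
        non-zero pixel; non-zero pixels themselves get a distance of 0."""
        zero_mask = (image == 0)
        # distance_transform_edt measures, for each True element, the distance
        # to the nearest False element, i.e. to the nearest non-zero pixel here
        return distance_transform_edt(zero_mask)

    # toy binary slice: a single bright pixel surrounded by background zeros
    img = np.zeros((7, 7), dtype=np.uint8)
    img[3, 3] = 255
    print(znzd_map(img))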

Keywords: deformable model, external force field, partial volume effect, cortical reconstruction, MR image of brain

Procedia PDF Downloads 397
5151 NENU2PHAR: PHA-Based Materials from Micro-Algae for High-Volume Consumer Products

Authors: Enrique Moliner, Alba Lafarga, Isaac Herraiz, Evelina Castellana, Mihaela Mirea

Abstract:

NENU2PHAR (GA 887474) is an EU-funded project aimed at the development of polyhydroxyalkanoates (PHAs) from micro-algae. These biobased and biodegradable polymers are being tested and validated in different high-volume market applications, including food packaging, cosmetic packaging, 3D printing filaments, agro-textiles, and medical devices, counting on the support of key players like Danone, BEL Group, Sofradim, or IFG. To date, the project has produced PHAs from micro-algae with a cumulative yield of around 17%, i.e., 1 kg of PHAs produced from 5.8 kg of micro-algae biomass, which in turn captures 11 kg of CO₂ while growing. These algae-based plastics can therefore offer the same environmental benefits as current bio-based plastics (reduction of greenhouse gas emissions and fossil resource depletion), using a third-generation biomass feedstock that avoids competition with food and the environmental impacts of agricultural practices. The project also addresses other sustainability aspects, such as the ecodesign and life cycle assessment of the targeted plastic products, considering not only the use of the biobased plastics but also many other ecodesign strategies. This paper presents the main progress and results achieved to date in the project.

Keywords: NENU2PHAR, Polyhydroxyalkanoates, micro-algae, biopolymer, ecodesign, life cycle assessment

Procedia PDF Downloads 91
5150 Investigation of Changes in the Physical Properties of Poplar Wood along the Radial and Longitudinal Axes in the Chaaloos Zone

Authors: Afshin Veisi

Abstract:

In this study, the physical properties of poplar wood (Populus sp.) were analyzed in the longitudinal and radial directions of the stem. Three Populus alba trees were cut in the Chaloos zone, and from each tree, three discs were selected at 130 cm, at half of the tree height, and below the crown. Test samples from pith to bark (heartwood to sapwood) were prepared from these discs for measuring the properties involved, such as wet, dry, and critical specific gravity, porosity, and volume shrinkage and swelling, based on the ASTM standard, and the data in the radial and longitudinal directions of the trunk were statistically analyzed. The variations of wet, dry, and critical specific gravity showed, in the radial direction, an irregular increase, an increase, and an increase, respectively, and in the longitudinal direction, an irregular decrease, an irregular increase, and an increase, respectively. The variations of moisture content and porosity showed, in the radial direction, an irregular increase and a decrease, respectively, and in the longitudinal direction from bottom to top, an irregular decrease and stability, respectively. Volume shrinkage and swelling varied irregularly in the radial direction and decreased regularly along the longitudinal axis.
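
As a reminder of how the shrinkage and swelling figures reported above are typically computed, the snippet below evaluates volumetric shrinkage and swelling from green and oven-dry volumes using the usual ASTM-style definitions; the sample volumes are invented for illustration and are not the study's data:

    def volumetric_shrinkage(v_green, v_oven_dry):
        """Volumetric shrinkage from the green to the oven-dry state, in percent."""
        return 100.0 * (v_green - v_oven_dry) / v_green

    def volumetric_swelling(v_green, v_oven_dry):
        """Volumetric swelling from the oven-dry back to the green state, in percent."""
        return 100.0 * (v_green - v_oven_dry) / v_oven_dry

    # hypothetical sample volumes in cm^3
    print(volumetric_shrinkage(12.5, 11.1))   # ~11.2 %
    print(volumetric_swelling(12.5, 11.1))    # ~12.6 %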

Keywords: poplar wood, physical properties, shrinkage, swelling, critical specific gravity, wet specific gravity, dry specific gravity

Procedia PDF Downloads 278
5149 The Role of Artificial Intelligence in Concrete Constructions

Authors: Ardalan Tofighi Soleimandarabi

Abstract:

Artificial intelligence has revolutionized the concrete construction industry and improved processes by increasing efficiency, accuracy, and sustainability. This article examines the applications of artificial intelligence in predicting the compressive strength of concrete, optimizing mixing plans, and improving structural health monitoring systems. Artificial intelligence-based models, such as artificial neural networks (ANN) and combined machine learning techniques, have shown better performance than traditional methods in predicting concrete properties. In addition, artificial intelligence systems have made it possible to improve quality control and real-time monitoring of structures, which helps in preventive maintenance and increases the life of infrastructure. Also, the use of artificial intelligence plays an effective role in sustainable construction by optimizing material consumption and reducing waste. Although the implementation of artificial intelligence is associated with challenges such as high initial costs and the need for specialized training, it will create a smarter, more sustainable, and more affordable future for concrete structures.

Keywords: artificial intelligence, concrete construction, compressive strength prediction, structural health monitoring, stability

Procedia PDF Downloads 15
5148 Evaluation of Diagnosis Performance Based on Pairwise Model Construction and Filtered Data

Authors: Hyun-Woo Cho

Abstract:

It is important to utilize timely and intelligent production monitoring and diagnosis of industrial processes with regard to quality and safety issues. Compared with the monitoring task, fault diagnosis is the task of finding the process variables responsible for causing a specific fault in the process. It can help process operators investigate and eliminate root causes more effectively and efficiently. This work focused on the active use of a nonlinear statistical technique combined with a preprocessing method in order to implement practical real-time fault identification schemes for data-rich cases. To compare its performance to existing identification schemes, a case study on a benchmark process was performed in several scenarios. The results showed that the proposed fault identification scheme produced more reliable diagnosis results than linear methods. In addition, the use of the filtering step improved the identification results for complicated processes with massive data sets.

Keywords: diagnosis, filtering, nonlinear statistical techniques, process monitoring

Procedia PDF Downloads 244
5147 Strategic Role of Fintechs in Evolving Financial Functions and Enhancing Corporate Resilience amid Economic Crises

Authors: Ghizlane Barzi, Zineb Bamousse

Abstract:

In an increasingly volatile global economic context characterized by recurring crises, the financial function of companies is called upon to play a strategic role not only in resource management but also in organizational resilience. The emergence of financial technologies (fintech) offers innovative tools capable of transforming this function by enhancing the efficiency of financial processes and increasing companies' ability to adapt and overcome economic shocks. However, despite the rapid rise of fintechs and their growing adoption by companies, there remain uncertainties regarding the real impact of these innovations on the financial resilience of organizations. Indeed, how do fintech-driven innovations transform the financial function, and to what extent does this transformation contribute to strengthening the financial resilience of companies in the face of contemporary crises? This research aims to explore these questions by examining the interrelationships between the financial function, fintech innovations, and corporate resilience, in order to identify optimization levers that could be adopted for better financial risk management.

Keywords: finance, financial function, fintech, resilience, innovation

Procedia PDF Downloads 26
5130 Efficacy of Tranexamic Acid on Blood Loss after Primary Total Hip Replacement: A Case-Control Study in 154 Patients

Authors: Fedili Benamar, Belloulou Mohamed Lamine, Ouahes Hassane, Ghattas Samir

Abstract:

Introduction: Perioperative blood loss is a frequent cause of complications in total hip replacement (THR). The present prospective study assessed the efficacy of tranexamic acid (Exacyl®) in reducing blood loss in primary THR. Hypothesis: Tranexamic acid reduces blood loss in THR. Material and method: This is a prospective randomized study on the effectiveness of Exacyl (tranexamic acid) in total hip replacement surgery performed with a standardized technique between 2019 and September 2022. It involved 154 patients, of whom 84 received a single injection of Exacyl (group 1) at a dosage of 10 mg/kg over 20 minutes during the perioperative period. All patients received postoperative thromboprophylaxis with enoxaparin 0.4 ml subcutaneously, and all patients were admitted to the post-interventional intensive care unit for 24 hours for monitoring and pain management as per the service protocol. Results: Of the 154 patients, 84 received a single injection of Exacyl (group 1) and 70 did not receive Exacyl perioperatively (group 2). The average age was 57 +/- 15 years. The distribution by gender was nearly equal, with 56% male and 44% female. The distribution according to the ASA score was as follows: 20.2% ASA1, 82.3% ASA2, and 17.5% ASA3. There was a significant difference in the average volume of intraoperative and postoperative bleeding during the first 48 hours: the average bleeding volume for group 1 (Exacyl) was 614 ml +/- 228, while the average bleeding volume for group 2 was 729 ml +/- 300, with a chi-square statistic of 6.35 and a p-value < 0.01, which is highly significant. The ANOVA test showed an F-statistic of 7.11 and a p-value of 0.008, and a Bartlett test revealed a chi-square of 6.35 and a p-value < 0.01. In group 1 (patients who received Exacyl), 73% had bleeding of less than 750 ml (group A) and 26% had bleeding exceeding 750 ml (group B). In group 2 (patients who did not receive Exacyl perioperatively), 52% had bleeding of less than 750 ml (group A) and 47% had bleeding exceeding 750 ml (group B). Thus, the use of Exacyl reduced perioperative bleeding and specifically decreased the risk of severe bleeding exceeding 750 ml by 43%, with a relative risk (RR) of 1.37 and a p-value < 0.01. The transfusion rate was 1.19% in group 1 (Exacyl), whereas it was 10% in group 2 (no Exacyl); the use of Exacyl therefore resulted in a reduction in perioperative blood transfusion, with an RR of 0.1 and a p-value of 0.02. Conclusions: The use of Exacyl significantly reduced perioperative bleeding in this type of surgery.

Keywords: tranexamic acid, blood loss, anesthesia, total hip replacement, surgery

Procedia PDF Downloads 77
5145 What the Future Holds for Social Media Data Analysis

Authors: P. Wlodarczak, J. Soar, M. Ally

Abstract:

The dramatic rise in the use of Social Media (SM) platforms such as Facebook and Twitter provides access to an unprecedented amount of user data. Users may post reviews of products and services they bought, write about their interests, share ideas, or give their opinions and views on political issues. Organisations have a growing interest in analysing SM data to detect new trends, obtain user opinions on their products and services, or find out about their online reputations. A recent research trend in SM analysis is making predictions based on sentiment analysis of SM. Often, indicators of historic SM data are represented as time series and correlated with a variety of real-world phenomena like the outcome of elections, the development of financial indicators, box office revenue, and disease outbreaks. This paper examines the current state of research in the area of SM mining and predictive analysis and gives an overview of the analysis methods using opinion mining and machine learning techniques.

Keywords: social media, text mining, knowledge discovery, predictive analysis, machine learning

Procedia PDF Downloads 423
5144 Against the Philosophical-Scientific Racial Project of Biologizing Race

Authors: Anthony F. Peressini

Abstract:

The concept of race has recently come prominently back into discussion in the context of medicine and medical science, along with renewed effort to biologize racial concepts. This paper argues that this renewed effort to biologize race by way of medicine and population genetics fails on its own terms, and more importantly, that the philosophical project of biologizing race ought to be recognized for what it is—a retrograde racial project—and abandoned. There is clear agreement that standard racial categories and concepts cannot be grounded in the old way of racial naturalism, which understands race as a real, interest-independent biological/metaphysical category in which its members share “physical, moral, intellectual, and cultural characteristics.” But equally clear is the very real and pervasive presence of racial concepts in individual and collective consciousness and behavior, and so it remains a pressing area in which to seek deeper understanding. Recent philosophical work has endeavored to reconcile these two observations by developing a “thin” conception of race, grounded in scientific concepts but without the moral and metaphysical content. Such “thin,” science-based analyses take the “commonsense” or “folk” sense of race as it functions in contemporary society as the starting point for their philosophic-scientific projects to biologize racial concepts. A “philosophic-scientific analysis” is a special case of the cornerstone of analytic philosophy: a conceptual analysis. That is, a rendering of a concept into the more perspicuous concepts that constitute it. Thus a philosophic-scientific account of a concept is an attempt to work out an analysis of a concept that makes use of empirical science's insights to ground, legitimate and explicate the target concept in terms of clearer concepts informed by empirical results. The focus in this paper is on three recent philosophic-scientific cases for retaining “race” that all share this general analytic schema, but that make use of “medical necessity,” population genetics, and human genetic clustering, respectively. After arguing that each of these three approaches suffers from internal difficulties, the paper considers the general analytic schema employed by such biologizations of race. While such endeavors are inevitably prefaced with the disclaimer that the theory to follow is non-essentialist and non-racialist, the case will be made that such efforts are not neutral scientific or philosophical projects but rather are what sociologists call a racial project, that is, one of many competing efforts that conjoin a representation of what race means to specific efforts to determine social and institutional arrangements of power, resources, authority, etc. Accordingly, philosophic-scientific biologizations of race, since they begin from and condition their analyses on “folk” conceptions, cannot pretend to be “prior to” other disciplinary insights, nor to transcend the social-political dynamics involved in formulating theories of race. As a result, such traditional philosophical efforts can be seen to be disciplinarily parochial and to address only a caricature of a large and important human problem—thereby further contributing to the unfortunate isolation of philosophical thinking about race from other disciplines.

Keywords: population genetics, ontology of race, race-based medicine, racial formation theory, racial projects, racism, social construction

Procedia PDF Downloads 273
5143 Effect of the Ratio, Weight, and Treatment of Loofah Fiber on the Mechanical Properties of the Composite: Loofah Fiber Resin

Authors: F. Siahmed, A. Lounis, L. Faghi

Abstract:

The aim of this work is to study the mechanical properties of composites based on natural fiber. This material has attracted the attention of the scientific community for its mechanical properties, its moderate cost, and its merits as regards the protection of the environment. In this study, loofah, which belongs to the family of natural fibers, was used for its significant mechanical properties. The fiber has a porous structure, which facilitates the impregnation of the resin through these pores. The matrix used in this study is an unsaturated polyester; this resin was chosen for its long-term resistance. The work involves the chemical treatment of the loofah fibers with a NaOH solution (5%); the fabrication of the resin/loofah fiber composite; the preparation of samples for testing; tensile and bending tests; and the observation of fracture surfaces by scanning electron microscopy. The results obtained show that the values of Young's modulus and tensile strength are high and open up real prospects. The improvement in mechanical properties was obtained for the two-layer composite with 7.5% fiber (by weight).

Keywords: loofah fiber, mechanical properties, composite, loofah fiber resin

Procedia PDF Downloads 447
5142 Optimal Reactive Power Dispatch under Various Contingency Conditions Using Whale Optimization Algorithm

Authors: Khaled Ben Oualid Medani, Samir Sayah

Abstract:

The Optimal Reactive Power Dispatch (ORPD) problem is usually solved and analysed under normal operating conditions. However, network collapses appear under contingency conditions. In this paper, ORPD under several contingencies is addressed using the proposed Whale Optimization Algorithm (WOA). To ensure the viability of the power system in contingency conditions, several critical cases are simulated in order to prevent and prepare the power system to face such situations. The simulations are carried out on the IEEE 30-bus test system for the solution of the ORPD problem, in which the control of bus voltages, the tap positions of transformers, and reactive power sources are involved. Moreover, another method, namely Particle Swarm Optimization with Time Varying Acceleration Coefficient (PSO-TVAC), has been compared with the proposed technique. Simulation results indicate that the proposed WOA gives a remarkable solution in terms of effectiveness in the case of outages.
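
To make the WOA search mechanism concrete, the following minimal sketch implements a generic whale optimization loop on a toy box-constrained objective. It only illustrates the encircling and bubble-net spiral updates; the actual ORPD formulation (load flow, voltage, tap, and VAR constraints) is not reproduced here:

    import numpy as np

    def woa_minimize(f, dim, bounds, n_whales=20, iters=100, seed=0):
        """Minimal Whale Optimization Algorithm for a box-constrained objective f."""
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        X = rng.uniform(lo, hi, size=(n_whales, dim))        # whale positions
        best = min(X, key=f).copy()                           # best solution so far
        for t in range(iters):
            a = 2.0 - 2.0 * t / iters                         # decreases linearly 2 -> 0
            for i in range(n_whales):
                A = 2 * a * rng.random() - a
                C = 2 * rng.random()
                if rng.random() < 0.5:
                    if abs(A) < 1:                            # encircle the best whale
                        X[i] = best - A * np.abs(C * best - X[i])
                    else:                                     # explore around a random whale
                        rand = X[rng.integers(n_whales)]
                        X[i] = rand - A * np.abs(C * rand - X[i])
                else:                                         # bubble-net spiral update
                    l = rng.uniform(-1, 1)
                    X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
                X[i] = np.clip(X[i], lo, hi)
            cand = min(X, key=f)
            if f(cand) < f(best):
                best = cand.copy()
        return best, f(best)

    # toy usage: a sphere function standing in for the real power loss objective
    print(woa_minimize(lambda x: float(np.sum(x ** 2)), dim=5, bounds=(-10, 10)))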

Keywords: optimal reactive power dispatch, power system analysis, real power loss minimization, contingency condition, metaheuristic technique, whale optimization algorithm

Procedia PDF Downloads 121
5141 User Intention Generation with Large Language Models Using Chain-of-Thought Prompting

Authors: Gangmin Li, Fan Yang

Abstract:

Personalized recommendation is crucial for any recommendation system. One of the techniques for personalized recommendation is to identify the user's intention. Traditional user intention identification uses the user’s selections when facing multiple items. This modeling relies primarily on historical behaviour data, resulting in challenges such as cold start, unintended choices, and failure to capture intention when items are new. Motivated by recent advancements in Large Language Models (LLMs) like ChatGPT, we present an approach for user intention identification that embraces LLMs with Chain-of-Thought (CoT) prompting. We use the initial user profile as input to the LLM and design a collection of prompts to align the LLM's responses across various recommendation tasks encompassing rating prediction, search and browse history, user clarification, etc. Our tests on real-world datasets demonstrate improvements in recommendation through explicit user intention identification and through merging that intention into a user model.
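
As a rough illustration of the prompting strategy described above (not the authors' actual prompts), the sketch below assembles a chain-of-thought prompt from a user profile and recent behaviour; the call_llm function is a hypothetical placeholder standing in for whatever LLM API is used:

    def build_intention_prompt(profile, recent_actions):
        """Compose a chain-of-thought prompt that asks the LLM to reason step by
        step from a user profile and recent behaviour to an explicit intention."""
        actions = "\n".join(f"- {a}" for a in recent_actions)
        return (
            "You are a recommendation assistant.\n"
            f"User profile: {profile}\n"
            f"Recent behaviour:\n{actions}\n"
            "Let's think step by step: first summarise the user's interests, "
            "then infer what the user is trying to accomplish right now, and "
            "finally state the user's intention in one sentence."
        )

    def call_llm(prompt):
        raise NotImplementedError("placeholder for a real chat-completion API call")

    prompt = build_intention_prompt(
        profile="35-year-old amateur photographer, budget-conscious",
        recent_actions=["searched 'mirrorless camera under $800'",
                        "rated a travel tripod 5 stars",
                        "browsed lens-cleaning kits"],
    )
    print(prompt)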

Keywords: personalized recommendation, generative user modelling, user intention identification, large language models, chain-of-thought prompting

Procedia PDF Downloads 55
5140 Deep Learning-Based Classification of 3D CT Scans with Real Clinical Data: Impact of Image Format

Authors: Maryam Fallahpoor, Biswajeet Pradhan

Abstract:

Background: Artificial intelligence (AI) serves as a valuable tool in mitigating the scarcity of human resources required for the evaluation and categorization of vast quantities of medical imaging data. When AI operates with optimal precision, it minimizes the demand for human interpretations and, thereby, reduces the burden on radiologists. Among various AI approaches, deep learning (DL) stands out as it obviates the need for feature extraction, a process that can impede classification, especially with intricate datasets. The advent of DL models has ushered in a new era in medical imaging, particularly in the context of COVID-19 detection. Traditional 2D imaging techniques exhibit limitations when applied to volumetric data, such as Computed Tomography (CT) scans. Medical images predominantly exist in one of two formats: neuroimaging informatics technology initiative (NIfTI) and digital imaging and communications in medicine (DICOM). Purpose: This study aims to employ DL for the classification of COVID-19-infected pulmonary patients and normal cases based on 3D CT scans while investigating the impact of image format. Material and Methods: The dataset used for model training and testing consisted of 1245 patients from IranMehr Hospital. All scans shared a matrix size of 512 × 512, although they exhibited varying slice numbers. Consequently, after loading the DICOM CT scans, image resampling and interpolation were performed to standardize the slice count. All images underwent cropping and resampling, resulting in uniform dimensions of 128 × 128 × 60. Resolution uniformity was achieved through resampling to 1 mm × 1 mm × 1 mm, and image intensities were confined to the range of (−1000, 400) Hounsfield units (HU). For classification purposes, positive pulmonary COVID-19 involvement was designated as 1, while normal images were assigned a value of 0. Subsequently, a U-net-based lung segmentation module was applied to obtain 3D segmented lung regions. The pre-processing stage included normalization, zero-centering, and shuffling. Four distinct 3D CNN models (ResNet152, ResNet50, DensNet169, and DensNet201) were employed in this study. Results: The findings revealed that the segmentation technique yielded superior results for DICOM images, which could be attributed to the potential loss of information during the conversion of original DICOM images to NIFTI format. Notably, ResNet152 and ResNet50 exhibited the highest accuracy at 90.0%, and the same models achieved the best F1 score at 87%. ResNet152 also secured the highest Area under the Curve (AUC) at 0.932. Regarding sensitivity and specificity, DensNet201 achieved the highest values at 93% and 96%, respectively. Conclusion: This study underscores the capacity of deep learning to classify COVID-19 pulmonary involvement using real 3D hospital data. The results underscore the significance of employing DICOM format 3D CT images alongside appropriate pre-processing techniques when training DL models for COVID-19 detection. This approach enhances the accuracy and reliability of diagnostic systems for COVID-19 detection.
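
The preprocessing steps listed above (resampling to a fixed grid, clipping Hounsfield units, normalising and zero-centering) can be sketched roughly as follows; the target shape and HU window are taken from the abstract, while the helper itself is an illustrative sketch rather than the authors' pipeline:

    import numpy as np
    from scipy.ndimage import zoom

    def preprocess_ct(volume, target_shape=(128, 128, 60), hu_range=(-1000, 400)):
        """Resample a CT volume to a fixed shape, clip HU values, then
        normalise to [0, 1] and zero-center."""
        factors = [t / s for t, s in zip(target_shape, volume.shape)]
        vol = zoom(volume.astype(np.float32), factors, order=1)    # linear-interpolation resampling
        vol = np.clip(vol, *hu_range)                               # confine intensities to the HU window
        vol = (vol - hu_range[0]) / (hu_range[1] - hu_range[0])     # scale to [0, 1]
        return vol - vol.mean()                                     # zero-center

    # toy volume standing in for one loaded DICOM series (512 x 512 x variable slices)
    dummy = np.random.randint(-1024, 1500, size=(512, 512, 73)).astype(np.int16)
    print(preprocess_ct(dummy).shape)   # (128, 128, 60)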

Keywords: deep learning, COVID-19 detection, NIFTI format, DICOM format

Procedia PDF Downloads 88
5139 Understanding the Importance of Participation in the City Planning Process and Its Influencing Factors

Authors: Louis Nwachi

Abstract:

Urban planning systems in most countries still rely on expert-driven, top-down technocratic plan-making processes rather than a public and people-led process. This paper set out to evaluate the need for public participation in the plan-making process and to highlight the factors that affect public participation in the plan-making process. In doing this, it adopted a qualitative approach based on document review and interviews taken from real-world phenomena. A case study strategy using the Metropolitan Area of Abuja, the capital of Nigeria, as the study sample was used in carrying out the research. The research finds that participation is an important tool in the plan-making process and that public engagement in the process contributes to the identification of key urban issues that are unique to the specific local areas, thereby contributing to the establishment of priorities and, in turn, to the mobilization of resources to meet the identified needs. It also finds that the development of a participation model by city authorities encourages public engagement and helps to develop trust between those in authority and the different key stakeholder groups involved in the plan-making process.

Keywords: plan-making, participation, urban planning, city

Procedia PDF Downloads 101
5138 Securing Health Monitoring in Internet of Things with Blockchain-Based Proxy Re-Encryption

Authors: Jerlin George, R. Chitra

Abstract:

Devices with sensors that can monitor temperature, heart rate, and other vital signs and link to the internet, known as the Internet of Things (IoT), have completely transformed the way we manage health. By providing real-time health data, these sensors improve diagnostics and treatment outcomes. Security and privacy matter when IoT comes into play in healthcare, and cyberattacks on centralized database systems are also a problem. To address these challenges, the study uses blockchain technology coupled with proxy re-encryption to secure health data. The ThingSpeak IoT cloud analyzes the collected data and turns it into blockchain transactions, which are safely kept on the DriveHQ cloud. Transparency and data integrity are ensured by the blockchain, and secure data sharing among authorized users is made possible by proxy re-encryption. The result is a health monitoring system that preserves the accuracy and confidentiality of data while reducing the security risks of IoT-driven healthcare applications.

Keywords: internet of things, healthcare, sensors, electronic health records, blockchain, proxy re-encryption, data privacy, data security

Procedia PDF Downloads 18
5137 Investigation of the Effect of Velocity Inlet and Carrying Fluid on the Flow inside the Coronary Artery

Authors: Mohammadreza Nezamirad, Nasim Sabetpour, Azadeh Yazdi, Amirmasoud Hamedi

Abstract:

In this study, OpenFOAM 4.4.2 was used to investigate the flow inside a coronary artery of the heart. This is the first step of our future project, which is to include conjugate heat transfer of the heart with the three main coronary arteries. Three different velocities were used as inlet boundary conditions to see the effect of increasing velocity on the velocity, pressure, and wall shear of the coronary artery. Also, three different fluids, namely the University of Wisconsin solution, gelatin, and blood, were used to investigate the effect of different carrying fluids on the flow inside the coronary artery. A code based on the Reynolds-Averaged Navier-Stokes (RANS) equations was written and implemented with the real boundary conditions calculated from MRI images. In order to improve the accuracy of the current numerical scheme, a hex-dominant mesh is utilized. When the inlet velocity increases to 0.5 m/s, the velocity, wall shear stress, and pressure increase at the narrower parts.

Keywords: CFD, simulation, OpenFOAM, heart

Procedia PDF Downloads 149
5136 Pricing Techniques to Mitigate Recurring Congestion on Interstate Facilities Using Dynamic Feedback Assignment

Authors: Hatem Abou-Senna

Abstract:

Interstate 4 (I-4) is a primary east-west transportation corridor between Tampa and Daytona, serving commuter, commercial, and recreational traffic. I-4 is known to have severe recurring congestion during peak hours. The congestion spans about 11 miles in the evening peak period in the central corridor area, as it is the only non-tolled limited-access facility connecting the Orlando Central Business District (CBD) and the tourist attractions area (Walt Disney World). Florida officials had been skeptical of tolling I-4 prior to the recent legislation, and the public, through the media, had been complaining about the excessive toll facilities in Central Florida. In search of a plausible mitigation of the congestion on the I-4 corridor, this research was implemented to evaluate the effectiveness of different toll pricing alternatives that might divert traffic from I-4 to the toll facilities during the peak period. The network is composed of two main diverging limited-access highways, the freeway (I-4) and the toll road SR 417, in addition to two east-west parallel toll roads, SR 408 and SR 528, intersecting the above-mentioned highways at both ends. I-4 and toll road SR 408 are the routes most frequently used by commuters. SR 417 is a relatively uncongested toll road that is 15 miles longer than I-4 and carries $5 in tolls, compared to no monetary cost on I-4 for the same trip. The results of the calibrated Orlando PARAMICS network showed that the percentage of route diversion varies from one route to another and depends primarily on the travel cost between specific origin-destination (O-D) pairs. Most drivers going from Disney (O1) or Lake Buena Vista (O2) to Lake Mary (D1) were found to have a high propensity towards using I-4, even when eliminating tolls and/or providing real-time information. However, diversion from I-4 to SR 417 for these O-D pairs occurred only in the cases of an incident or lane closure on I-4, due to the increase in delay and travel costs, and when information was provided to travelers. Furthermore, drivers that diverted from I-4 to SR 417 and SR 528 did not gain significant travel-time savings. This was attributed to the limited extra capacity of the alternative routes in the peak period and the longer traveling distance. When the remaining origin-destination pairs were analyzed, average travel-time savings on I-4 ranged between 10 and 16%, amounting to 10 minutes at the most, with a 10% increase in the network average speed. The propensity of diversion on the network increased significantly when eliminating tolls on SR 417 and SR 528 while doubling the tolls on SR 408, along with the incident and lane closure scenarios on I-4 and with real-time information provided. The toll roads were found to be a viable alternative to I-4 for these specific O-D pairs, depending on the users' perception of the toll cost, which was reflected in their specific travel times. However, on the macroscopic level, it was concluded that route diversion through toll reduction or elimination on surrounding toll roads would have only a minimal impact on reducing I-4 congestion during the peak period.

Keywords: congestion pricing, dynamic feedback assignment, microsimulation, paramics, route diversion

Procedia PDF Downloads 178
5135 The Temporal Dimension of Narratives: A Construct of Qualitative Time

Authors: Ani Thomas

Abstract:

Every narrative is a temporal construct, and every narrative creates a qualitative experience of time for the viewer. The paper argues for the concept of a qualified time that emerges from the interaction between the narrative and the audience. The paper also challenges the conventional understanding of narrative time as either story time, real time, or discourse time. Looking at narratives through the medium of cinema, the study examines how narratives create and manipulate duration, or durée, the qualitative experience of time as theorized by Henri Bergson. The paper further analyzes how cinema, and by extension narrative, is nothing but durée, and how the filmmaker, as an artist of durée, shapes and manipulates the perception and emotions of the viewer through the manipulation and construction of durée. The paper draws on cinematic works to examine the techniques filmmakers use, for example, editing, sound, and compositional and production techniques, to create various modes of durée that challenge, amplify, or unsettle the viewer’s sense of time. Bringing together the viewer’s durée and exploring its interaction with the narrative construct, the paper explores the emergence of a new qualitative time, the narrative durée, that defines the audience's experience.

Keywords: cinema, time, Bergson, durée

Procedia PDF Downloads 148
5134 A Genetic Algorithm Based Permutation and Non-Permutation Scheduling Heuristics for Finite Capacity Material Requirement Planning Problem

Authors: Watchara Songserm, Teeradej Wuttipornpun

Abstract:

This paper presents genetic algorithm based permutation and non-permutation scheduling heuristics (GAPNP) to solve a multi-stage finite capacity material requirement planning (FCMRP) problem in an automotive assembly flow shop with unrelated parallel machines. In the algorithm, the sequences of orders are iteratively improved by the GA characteristics, whereas the required operations are scheduled based on the presented permutation and non-permutation heuristics. Finally, a linear programming model is applied to minimize the total cost. The presented GAPNP algorithm is evaluated using real datasets from automotive companies. The required parameters for GAPNP are carefully tuned to obtain a common parameter setting for all case studies. The results show that GAPNP significantly outperforms the benchmark algorithm by about 30% on average.
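
A bare-bones version of the GA layer is sketched below: order-based crossover and swap mutation over a sequence of order indices, evaluated by a user-supplied cost function. It illustrates only the permutation-improvement idea; the non-permutation scheduling heuristics and the linear-programming cost step of GAPNP are not reproduced:

    import random

    def ga_order_sequence(costs, n_orders, pop_size=30, generations=200, seed=1):
        """Evolve a permutation of order indices that minimises costs(sequence)."""
        rng = random.Random(seed)
        pop = [rng.sample(range(n_orders), n_orders) for _ in range(pop_size)]

        def crossover(p1, p2):
            # order crossover (OX): copy a slice from p1, fill the rest in p2's order
            a, b = sorted(rng.sample(range(n_orders), 2))
            child = [None] * n_orders
            child[a:b] = p1[a:b]
            rest = [g for g in p2 if g not in child]
            return [rest.pop(0) if g is None else g for g in child]

        def mutate(seq):
            i, j = rng.sample(range(n_orders), 2)   # swap two orders
            seq[i], seq[j] = seq[j], seq[i]

        for _ in range(generations):
            pop.sort(key=costs)
            survivors = pop[: pop_size // 2]
            children = []
            while len(survivors) + len(children) < pop_size:
                child = crossover(*rng.sample(survivors, 2))
                if rng.random() < 0.2:
                    mutate(child)
                children.append(child)
            pop = survivors + children
        return min(pop, key=costs)

    # toy cost function (minimised by placing high-index orders first)
    print(ga_order_sequence(lambda s: sum(i * g for i, g in enumerate(s)), n_orders=6))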

Keywords: capacitated MRP, genetic algorithm, linear programming, automotive industries, flow shop, application in industry

Procedia PDF Downloads 490
5133 Augmenting History: Case Study Measuring Motivation of Students Using Augmented Reality Apps in History Classes

Authors: Kevin. S. Badni

Abstract:

Due to the rapid advances in the use of information technology and students’ familiarity with technology, learning styles in higher education are being reshaped. One of the technology developments that has gained considerable attention in recent years is Augmented Reality (AR), where technology is used to combine overlays of digital data on physical real-world settings. While AR is being heavily promoted for entertainment by mobile phone manufacturers, it has had little adoption in higher education due to the required upfront investment that an instructor needs to undertake in creating relevant AR applications. This paper discusses a case study that uses a low upfront development approach and examines the impact on generation-Z students’ motivation whilst studying design history over a four-semester period. Even though the upfront investment in creating the AR support was minimal, the results showed a noticeable increase in student motivation. The approach used in this paper can be easily transferred to other disciplines and other areas of design education.

Keywords: augmented reality, history, motivation, technology

Procedia PDF Downloads 166
5132 Seismic Fragility Curves for Shallow Circular Tunnels under Different Soil Conditions

Authors: Siti Khadijah Che Osmi, Syed Mohd Ahmad

Abstract:

This paper presents a methodology to develop fragility curves for shallow tunnels so as to describe a relationship between seismic hazard and tunnel vulnerability. Emphasis is given to the influence of the surrounding soil material properties because the dynamic behaviour of the tunnel mostly depends on them. Four soil profiles with ground properties ranging from stiff to soft soils are selected. A 3D nonlinear time history analysis is used to evaluate the seismic response of the tunnel when subjected to five real earthquake ground motions of different intensities. The derived curves show the future probabilistic performance of the tunnels based on the predicted level of damage states corresponding to the peak ground acceleration. A comparison of the obtained results with the previous literature is provided to validate the reliability of the proposed fragility curves. The results show the significant role of soil properties and input motions in evaluating the seismic performance and response of shallow tunnels.

Keywords: fragility analysis, seismic performance, tunnel lining, vulnerability

Procedia PDF Downloads 315
5131 Seismic Performance of Reinforced Concrete Frame Structure Based on Plastic Rotation

Authors: Kahil Amar, Meziani Faroudja, Khelil Nacim

Abstract:

The principal objective of this study is the evaluation of the seismic performance of reinforced concrete frame structures, taking into account behavior laws that reflect the real behavior of materials, using the CASTEM2000 software. The finite element model used is based on the modified Takeda model, with Timoshenko elements for columns and beams. This model is validated against the Vecchio experimental reinforced concrete (RC) frame model. Then, a study focused on the behavior of an RC frame with three levels and three stories in order to visualize the positioning of the plastic hinges (plastic rotation), determined from the curvature distribution along the elements. The results obtained show that the beams of the 1st and 2nd levels developed very large plastic rotations, as these rotations exceed the values corresponding to CP (Collapse Prevention, with θCP = 0.02 rad), in contrast to those developed at the 3rd level, which lie between IO and LS (Immediate Occupancy and Life Safety, with θIO = 0.005 rad and θLS = 0.01 rad, respectively); the beams of the first and second levels therefore suffer very significant damage.
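
Using the rotation limits quoted above (θIO = 0.005 rad, θLS = 0.01 rad, θCP = 0.02 rad), a computed plastic hinge rotation can be placed on the performance scale with a check as simple as the following illustrative sketch:

    def performance_level(plastic_rotation, io=0.005, ls=0.01, cp=0.02):
        """Classify a plastic hinge rotation (rad) against the IO / LS / CP limits."""
        if plastic_rotation <= io:
            return "within Immediate Occupancy (IO)"
        if plastic_rotation <= ls:
            return "between IO and Life Safety (LS)"
        if plastic_rotation <= cp:
            return "between LS and Collapse Prevention (CP)"
        return "exceeds Collapse Prevention (CP)"

    for theta in (0.004, 0.008, 0.025):   # example hinge rotations in rad
        print(theta, "->", performance_level(theta))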

Keywords: seismic performance, performance level, pushover analysis, plastic rotation, plastic hinge

Procedia PDF Downloads 130
5130 Selection of Intensity Measure in Probabilistic Seismic Risk Assessment of a Turkish Railway Bridge

Authors: M. F. Yilmaz, B. Ö. Çağlayan

Abstract:

The fragility curve is an effective and commonly used tool to determine the earthquake performance of structural and non-structural components, and it is also used to characterize the nonlinear behavior of bridges. There are many historical bridges in the Turkish railway network, and the earthquake performance of these bridges needs to be investigated. To derive a fragility curve, intensity measures (IMs) and engineering demand parameters (EDPs) need to be determined, and the relation between IMs and EDPs needs to be derived. In this study, a typical simply supported, riveted steel girder railway bridge is studied. Fragility curves of this bridge are derived using a two-parameter lognormal distribution. Time history analyses are performed for 60 selected real earthquake records to determine the relation between IMs and EDPs. Moreover, the efficiency, practicality, and sufficiency of three different IMs are discussed. PGA, Sa(0.2s), and Sa(1s), the most commonly used IMs for fragility curves in the literature, are taken into consideration in terms of efficiency, practicality, and sufficiency.
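
The two-parameter lognormal form referred to above can be written as P(DS ≥ ds | IM) = Φ(ln(IM/θ)/β), with median θ and lognormal standard deviation β. A minimal sketch follows; the parameter values are illustrative, not the fitted values for this bridge:

    import numpy as np
    from scipy.stats import norm

    def fragility(im, median, beta):
        """Two-parameter lognormal fragility curve: probability of reaching or
        exceeding a damage state given intensity measure value(s) im."""
        return norm.cdf(np.log(np.asarray(im) / median) / beta)

    # hypothetical parameters for one damage state, with IM = PGA in g
    pga = np.array([0.1, 0.2, 0.4, 0.8])
    print(fragility(pga, median=0.35, beta=0.5))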

Keywords: railway bridges, earthquake performance, fragility analyses, selection of intensity measures

Procedia PDF Downloads 357
5129 Effect of Plastic Deformation on the Carbide-Free Bainite Transformation in Medium C-Si Steel

Authors: Mufath Zorgani, Carlos Garcia-Mateo, Mohammad Jahazi

Abstract:

In this study, the influence of pre-strained austenite on the extent of the isothermal bainite transformation in a medium-carbon, high-silicon steel was investigated. Different amounts of deformation were applied to the austenite at 600°C right before quenching to the region where the isothermal bainitic transformation is activated. Four different temperatures, 325, 350, 375, and 400°C, with the same holding time of 1800 s at each temperature, were selected to investigate the extent of the isothermal bainitic transformation. The results showed that deformation-free austenite transforms to a higher volume fraction of carbide-free bainite when the isothermal transformation temperature is reduced from 400 to 325°C, and that the introduction of plastic deformation in the austenite prior to the formation of bainite invariably delays the transformation for the same isothermal treatment. On the other hand, when the isothermal transformation temperature and the amount of deformation increase, the volume fraction and the plate thickness of the bainite decrease and the amount of retained austenite increases. The retained austenite is mostly blocky in shape due to the smaller amount of transformed bainite. Moreover, plate-like bainite cannot be resolved when the deformation amount reaches 30% and the isothermal transformation temperatures are 375 and 400°C. The amount of retained austenite and the fraction of it that transforms to martensite during the final cooling stage play a significant role in the variation of the hardness level for the different thermomechanical regimes.

Keywords: ausforming, carbide free bainite, dilatometry, microstructure

Procedia PDF Downloads 128
5128 Automatic Detection of Proliferative Cells in Immunohistochemical Images of Meningioma Using Fuzzy C-Means Clustering and HSV Color Space

Authors: Vahid Anari, Mina Bakhshi

Abstract:

Visual search and identification of immunohistochemically stained meningioma tissue is performed manually in pathology laboratories to detect and diagnose the cancer type of meningioma. This task is very tedious and time-consuming. Moreover, because of the complex nature of the cells, it remains a challenging task to segment cells from their background and analyze them automatically. In this paper, we develop and test a computerized scheme that can automatically identify cells in microscopic images of meningioma and classify them into positive (proliferative) and negative (normal) cells. A dataset of 150 images is used to test the scheme. The scheme uses the fuzzy C-means algorithm as a color clustering method based on the perceptually uniform hue, saturation, value (HSV) color space. Since the cells are distinguishable by the human eye, the accuracy and stability of the algorithm are quantitatively compared through application to a wide variety of real images.
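
A compact illustration of the clustering step follows: fuzzy C-means run on per-pixel hue/saturation/value features. It is a self-contained sketch with synthetic pixel data, not the authors' code, and the two colour groups are only assumed to approximate positively stained versus counterstained nuclei:

    import numpy as np

    def fuzzy_c_means(X, n_clusters=2, m=2.0, max_iter=100, tol=1e-5, seed=0):
        """Basic fuzzy C-means: X is (n_samples, n_features); returns the cluster
        centers and the membership matrix of shape (n_samples, n_clusters)."""
        rng = np.random.default_rng(seed)
        U = rng.random((X.shape[0], n_clusters))
        U /= U.sum(axis=1, keepdims=True)                     # memberships sum to 1
        for _ in range(max_iter):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]    # weighted cluster centers
            dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            new_U = 1.0 / dist ** (2.0 / (m - 1.0))           # inverse-distance memberships
            new_U /= new_U.sum(axis=1, keepdims=True)
            if np.abs(new_U - U).max() < tol:
                return centers, new_U
            U = new_U
        return centers, U

    # synthetic per-pixel HSV features: one brownish and one bluish pixel population
    rng = np.random.default_rng(1)
    hsv_pixels = np.vstack([rng.normal([0.08, 0.7, 0.5], 0.02, (200, 3)),
                            rng.normal([0.60, 0.6, 0.6], 0.02, (200, 3))])
    centers, U = fuzzy_c_means(hsv_pixels)
    labels = U.argmax(axis=1)        # hard assignment of pixels to the two clusters
    print(centers.round(2), np.bincount(labels))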

Keywords: positive cell, color segmentation, HSV color space, immunohistochemistry, meningioma, thresholding, fuzzy c-means

Procedia PDF Downloads 210