Search results for: real rewards
3116 Structural Analysis on the Composition of Video Game Virtual Spaces
Authors: Qin Luofeng, Shen Siqi
Abstract:
In the 58 years since the first video game came into being, the video game industry has undergone an explosive evolution. Video games exert great influence on society and have become, to some extent, a reflection of public life. Video game virtual spaces are places where activities take place, much like real spaces, which is why some architects pay attention to video games. However, compared with research on the appearance of games, there is a lack of comprehensive theory on the construction of video game virtual spaces. The method of this paper is first to collect literature and conduct theoretical research on virtual space in video games, and then to draw analogies with views on spatial phenomena from literary and film theory. Finally, this paper proposes a three-layer framework for the construction of video game virtual spaces: "algorithmic space – narrative space – player space", corresponding to the exterior, expressive, and affective parts of the game space. Each sub-space is illustrated with numerous examples from published video games. We hope this work can promote the interactive development of video games and architecture.
Keywords: video game, virtual space, narrativity, social space, emotional connection
Procedia PDF Downloads 266
3115 Uncertainty Estimation in Neural Networks through Transfer Learning
Authors: Ashish James, Anusha James
Abstract:
The impressive predictive performance of deep learning techniques on a wide range of tasks has led to their widespread use. Estimating the confidence of these predictions is paramount for improving the safety and reliability of such systems. However, the uncertainty estimates provided by neural networks (NNs) tend to be overconfident and unreasonable. Ensembles of NNs typically produce good predictions, but their uncertainty estimates tend to be inconsistent. Inspired by these observations, this paper presents a framework that quantitatively estimates uncertainties by leveraging advances in transfer learning, requiring only slight modifications to existing training pipelines. The algorithm is intended for deployment in real-world problems that already enjoy good predictive performance, by reusing the pretrained models. The idea is to capture the behavior of the trained NN on the base task and augment it with uncertainty estimates from a supplementary network. A series of experiments with known and unknown distributions shows that the proposed approach produces well-calibrated uncertainty estimates with high-quality predictions.
Keywords: uncertainty estimation, neural networks, transfer learning, regression
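A minimal sketch of the supplementary-network idea, assuming a heteroscedastic variance head trained with a Gaussian negative log-likelihood; the layer sizes and the base model's interface are illustrative assumptions, not the authors' published architecture.

```python
# Illustrative sketch only: a frozen, pretrained regressor is augmented with a small
# head that predicts a per-sample variance (uncertainty). All names are assumptions.
import torch
import torch.nn as nn

class UncertaintyHead(nn.Module):
    """Supplementary network predicting a positive variance from base-model features."""
    def __init__(self, feature_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(feature_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, features):
        # softplus keeps the predicted variance strictly positive
        return nn.functional.softplus(self.net(features)) + 1e-6

def gaussian_nll(mean, var, target):
    # Penalises both over- and under-confident variance estimates
    return (0.5 * torch.log(var) + 0.5 * (target - mean) ** 2 / var).mean()

# Hypothetical training step: the pretrained base model is frozen, only the head learns.
# with torch.no_grad():
#     features, mean = base_model(x, return_features=True)   # assumed interface
# var = head(features)
# loss = gaussian_nll(mean, var, y)
```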
Procedia PDF Downloads 134
3114 Deployed Confidence: The Testing in Production
Authors: Shreya Asthana
Abstract:
Testers only know that a feature they tested on staging works in production once the release goes live. Sometimes something breaks in production, and testers find out through a bug raised by an end user. Panic sets in when staging test results do not reflect current production behavior, and testers begin to doubt their skills when a user finally reports the bug. Testers can deploy their confidence on release day by testing in production. Testing in production improves test result accuracy because it runs on real data, and execution is somewhat faster than on staging due to the elimination of bad data. Feature flagging, canary releases, and data cleanup help to achieve this technique. This paper explains the steps to achieve production testing before making a feature live, and how to modify an IT company's testing procedure so that testers can provide a bug-free experience to end users. The study is valuable because many people still believe testing should be done only in staging and not in production, and it is high time to move beyond that old mindset. At the end of the day, what matters is whether the features work in production.
Keywords: bug free production, new testing mindset, testing strategy, testing approach
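A minimal sketch of the feature-flag/canary idea mentioned above; the flag store, rollout percentages, and user bucketing are illustrative assumptions rather than a specific tool's API.

```python
# Deterministic percentage-based feature flag: a small slice of real production
# traffic (and the testers) exercise the new feature before the full rollout.
import hashlib

FLAGS = {"new_checkout": {"enabled": True, "rollout_percent": 5}}  # canary at 5%

def is_enabled(flag_name: str, user_id: str) -> bool:
    flag = FLAGS.get(flag_name)
    if not flag or not flag["enabled"]:
        return False
    # Hash-based bucketing so the same user always sees the same variant
    bucket = int(hashlib.sha256(f"{flag_name}:{user_id}".encode()).hexdigest(), 16) % 100
    return bucket < flag["rollout_percent"]

if is_enabled("new_checkout", "tester-42"):
    print("run the production test against the flagged feature")
```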
Procedia PDF Downloads 75
3113 Network Connectivity Knowledge Graph Using Dwave Quantum Hybrid Solvers
Authors: Nivedha Rajaram
Abstract:
Hybrid quantum solvers have recently received prime focus in industrial problem-solving applications. D-Wave quantum computers are one such class of systems, built on the quantum annealing mechanism. Discrete Quadratic Models (DQM) are a hybrid quantum computing model class supplied by the D-Wave Ocean SDK, a real-time software platform for hybrid quantum solvers. These hybrid quantum modellers can be employed to solve classic problems. One such problem considered in this paper is finding a network connectivity knowledge hub in a huge network of systems. Using this quantum solver, we try to find the prime system hub, which acts as the supreme connection point for the set of connected computers in a large network. This paper establishes an innovative problem approach to generate a connectivity system hub plot for a set of systems using D-Wave Ocean SDK hybrid quantum solvers.
Keywords: quantum computing, hybrid quantum solver, D-Wave annealing, network knowledge graph
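A toy sketch of a Discrete Quadratic Model built with the D-Wave Ocean SDK. The paper does not publish its objective function, so using the negative node degree as the bias for a single "hub" variable is purely an illustrative assumption; sampling also requires a configured D-Wave Leap account and API token.

```python
# Toy DQM: one discrete variable whose cases are candidate hub nodes; lower energy
# is assigned to nodes with more connections. The random graph stands in for the
# real system network, and the degree-based objective is an assumption.
import networkx as nx
import dimod
from dwave.system import LeapHybridDQMSampler

graph = nx.erdos_renyi_graph(n=50, p=0.1, seed=7)
nodes = list(graph.nodes)

dqm = dimod.DiscreteQuadraticModel()
dqm.add_variable(len(nodes), label="hub")                 # one case per candidate hub
dqm.set_linear("hub", [-graph.degree(n) for n in nodes])  # favour well-connected nodes

sampler = LeapHybridDQMSampler()                          # needs a Leap API token
result = sampler.sample_dqm(dqm, label="network-hub-demo")
print("candidate hub:", nodes[result.first.sample["hub"]])
```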
Procedia PDF Downloads 125
3112 The Solvent Extraction of Uranium, Plutonium and Thorium from Aqueous Solution by 1-Hydroxyhexadecylidene-1,1-Diphosphonic Acid
Authors: M. Bouhoun Ali, A. Y. Badjah Hadj Ahmed, M. Attou, A. Elias, M. A. Didi
Abstract:
In this paper, the solvent extraction of uranium(VI), plutonium(IV) and thorium(IV) from aqueous solutions using 1-hydroxyhexadecylidene-1,1-diphosphonic acid (HHDPA) in treated kerosene has been investigated. The HHDPA was previously synthesized and characterized by FT-IR, 1H NMR, 31P NMR spectroscopy and elemental analysis. The effects of contact time, initial pH, initial metal concentration, aqueous/organic phase ratio, extractant concentration and temperature on the extraction process have been studied. Empirical modelling was performed using a 2⁵ full factorial design, and regression equations for metal extraction were determined from the data. The conventional log-log analysis of the extraction data reveals that the ratios of extractant to extracted U(VI), Pu(IV) and Th(IV) are 1:1, 1:2 and 1:2, respectively. Thermodynamic parameters showed that the extraction process was exothermic and spontaneous. The obtained optimal parameters were applied to real effluents containing uranium(VI), plutonium(IV) and thorium(IV) ions.
Keywords: solvent extraction, uranium, plutonium, thorium, 1-hydroxyhexadecylidene-1-1-diphosphonic acid, aqueous solution
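A generic form of the log-log (slope) analysis mentioned above, written as an assumption since the paper does not state its exact equilibrium expression: plotting the distribution ratio against the extractant concentration on log scales gives a slope equal to the HHDPA-to-metal stoichiometric ratio.

```latex
% D is the metal distribution ratio, K_ex the extraction constant, and n the
% HHDPA:metal ratio inferred from the slope of the log-log plot.
\log D = \log K_{\mathrm{ex}} + n \,\log [\mathrm{HHDPA}]_{\mathrm{org}}
```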
Procedia PDF Downloads 287
3111 Barred from Each Other: Why Normative Husbands Remain Married to Incarcerated Wives
Authors: Tomer Einat, Sharon Rabinovitz, Inbal Harel-Aviram
Abstract:
This study explores men’s motivations and justifications for remaining married to their criminal, imprisoned wives. Using semi-structured interviews and content analysis, data were collected and analyzed from eight men who maintain stable marriage relationships with their incarcerated wives. Participants are normative men who describe incarceration as a challenge that enhances mutual responsibility and commitment. They exaggerate the extent to which their partners resemble archetypal romantic ideals. They use motivational accounts to explain the woman’s criminal conduct, which is perceived as irrelevant to her real identity. Physical separation and lack of physical intimacy are perceived as the major difficulties in maintaining the marriage. The lengths of imprisonment and of the marriage were found to be related to the decision whether to continue or terminate the relationship. Women inmates’ partners experience difficulties and use coping strategies very similar to those cited by other normative spouses facing lengthy separation.
Keywords: female inmates, marriage, normative spouses, romantic accounts
Procedia PDF Downloads 461
3110 Emerging Threats and Adaptive Defenses: Navigating the Future of Cybersecurity in a Hyperconnected World
Authors: Olasunkanmi Jame Ayodeji, Adebayo Adeyinka Victor
Abstract:
In a hyperconnected world, cybersecurity faces a continuous evolution of threats that challenge traditional defence mechanisms. This paper explores emerging cybersecurity threats such as malware, ransomware, phishing, social engineering, and Internet of Things (IoT) vulnerabilities. It delves into the inadequacies of existing cybersecurity defences in addressing these evolving risks and advocates for adaptive defence mechanisms that leverage AI, machine learning, and zero-trust architectures. The paper proposes collaborative approaches, including public-private partnerships and information sharing, as essential to building a robust defence strategy against future cyber threats. The need for continuous monitoring, real-time incident response, and adaptive resilience strategies is highlighted to fortify digital infrastructures in the face of escalating global cyber risks.
Keywords: cybersecurity, hyperconnectivity, malware, adaptive defences, zero-trust architecture, internet of things vulnerabilities
Procedia PDF Downloads 19
3109 Breast Cancer Mortality and Comorbidities in Portugal: A Predictive Model Built with Real World Data
Authors: Cecília M. Antão, Paulo Jorge Nogueira
Abstract:
Breast cancer (BC) is the leading cause of cancer mortality among Portuguese women. This retrospective observational study aimed to identify comorbidities associated with female BC patients admitted to Portuguese public hospitals (2010-2018), to investigate the effect of comorbidities on the BC mortality rate, and to build a predictive model using logistic regression. Results showed that BC mortality in Portugal decreased over this period and reached 4.37% in 2018. Adjusted odds ratios indicated that secondary malignant neoplasms of the liver, of bone and bone marrow, congestive heart failure, and diabetes were associated with an increased chance of dying from breast cancer. Although the Lisbon district (the most populated area) accounted for the largest percentage of BC patients, the logistic regression model showed that, besides the patient’s age, residence in the Bragança, Castelo Branco, or Porto districts was directly associated with an increase in the mortality rate.
Keywords: breast cancer, comorbidities, logistic regression, adjusted odds ratio
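A minimal sketch of the modelling step described above, computing adjusted odds ratios from a fitted logistic regression; the synthetic data frame and column names are assumptions, not the study's hospital records.

```python
# Logistic regression on a binary mortality outcome, then exponentiated coefficients
# give adjusted odds ratios with 95% confidence intervals.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({                       # stand-in data: one row per admitted patient
    "died": rng.integers(0, 2, 500),
    "age": rng.integers(30, 90, 500),
    "diabetes": rng.integers(0, 2, 500),
    "heart_failure": rng.integers(0, 2, 500),
})

X = sm.add_constant(df[["age", "diabetes", "heart_failure"]])
model = sm.Logit(df["died"], X).fit(disp=False)

odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "2.5%": np.exp(model.conf_int()[0]),
    "97.5%": np.exp(model.conf_int()[1]),
})
print(odds_ratios)
```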
Procedia PDF Downloads 86
3108 Evaluation of University Students of a Video Game to Sensitize Young People about Mental Health Problems
Authors: Adolfo Cangas, Noelia Navarro
Abstract:
The current study presents the assessment by university students of a video game entitled Stigma-Stop, in which the characters present different mental disorders. The objective is for players to gain more realistic information about mental disorders and empathize with them, thereby reducing stigma. The sample consisted of 169 university students studying degrees related to education, social care and welfare (i.e., Social Education, Psychology, Early Childhood Education, Special Education, and Social Work). The participants valued the video game positively, especially in relation to its utility, with the score awarded for entertainment being somewhat lower. They recognized the disorders and pointed out that on many occasions they had felt the same (particularly in the case of depression, less so for agoraphobia and bipolar disorder, and even less in the case of schizophrenia), and most students recommend the use of the video game. They emphasize that Stigma-Stop offers intervention strategies, provides information regarding symptomatology, and sensitizes players against stigma.
Keywords: schizophrenia, social stigma, students, mental health
Procedia PDF Downloads 281
3107 Strategic Tools for Entrepreneurship: Model Proposal for Manufacturing Companies
Authors: Chiara Mansanta, Daniela Sani
Abstract:
This paper presents the further development of the application of a standard methodology to boost innovation inside real case studies of manufacturing companies. The proposed methodology provides a viable solution for manufacturing companies that have to evaluate new business ideas. The study underlines the concept of entrepreneurship and how a manager can use it to promote innovation inside their company. Starting from a literature study on entrepreneurship, this paper examines the role of the manager in supporting a company’s development. The empirical part of the study is based on two manufacturing companies that used the proposed methodology to favour entrepreneurship through an alternative approach. The research demonstrated the need for companies to have a structured and well-defined methodology to achieve their goals. The purpose of this article is to understand the significance of business models inside companies and explore how they affect business strategy and innovation management. The idea is to use business models to support entrepreneurs in their decision-making processes, reducing risks and avoiding errors.
Keywords: entrepreneurship, manufacturing companies, solution validation, strategic management
Procedia PDF Downloads 94
3106 Large-Scale Electroencephalogram Biometrics through Contrastive Learning
Authors: Mostafa ‘Neo’ Mohsenvand, Mohammad Rasool Izadi, Pattie Maes
Abstract:
EEG-based biometrics (user identification) has been explored on small datasets of no more than 157 subjects. Here we show that the accuracy of modern supervised methods falls rapidly as the number of users increases to a few thousand. Moreover, supervised methods require a large amount of labeled data for training which limits their applications in real-world scenarios where acquiring data for training should not take more than a few minutes. We show that using contrastive learning for pre-training, it is possible to maintain high accuracy on a dataset of 2130 subjects while only using a fraction of labels. We compare 5 different self-supervised tasks for pre-training of the encoder where our proposed method achieves the accuracy of 96.4%, improving the baseline supervised models by 22.75% and the competing self-supervised model by 3.93%. We also study the effects of the length of the signal and the number of channels on the accuracy of the user-identification models. Our results reveal that signals from temporal and frontal channels contain more identifying features compared to other channels.
Keywords: brainprint, contrastive learning, electroencephalogram, self-supervised learning, user identification
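A sketch of the contrastive pre-training idea in NT-Xent/InfoNCE style, assuming two augmented views of each EEG window are embedded and pulled together; the encoder, temperature and augmentations are placeholders, not the authors' exact recipe.

```python
# Contrastive loss over paired embeddings: row i in z1 and row i in z2 come from
# two augmentations of the same EEG window and form the positive pair.
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature: float = 0.1):
    """z1, z2: (N, D) embeddings of two views of the same N windows."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                       # (2N, D)
    sim = z @ z.t() / temperature                        # cosine similarity logits
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float("-inf"))                # exclude self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

# After pre-training the encoder with this loss, a small classifier can be fit on
# the frozen embeddings using only a fraction of the identity labels.
loss = nt_xent(torch.randn(32, 128), torch.randn(32, 128))
print(loss.item())
```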
Procedia PDF Downloads 155
3105 Interactive Solutions for the Multi-Objective Capacitated Transportation Problem with Mixed Constraints under Fuzziness
Authors: Aquil Ahmed, Srikant Gupta, Irfan Ali
Abstract:
In this paper, we study a multi-objective capacitated transportation problem (MOCTP) with mixed constraints. The paper comprises the modelling and optimisation of an MOCTP in a fuzzy environment in which some goals are fractional and some are linear. In real-life applications of fuzzy goal programming (FGP) with multiple objectives, it is difficult for the decision maker(s) to determine the goal value of each objective precisely, as the goal values are imprecise or uncertain. We also develop a linearization of the fractional goals for solving the MOCTP. Imprecision in the parameters is handled using fuzzy set theory by treating these parameters as trapezoidal fuzzy numbers, and the α-cut approach is used to obtain their crisp values. Numerical examples are used to illustrate the method for solving the MOCTP.
Keywords: capacitated transportation problem, multi objective linear programming, multi-objective fractional programming, fuzzy goal programming, fuzzy sets, trapezoidal fuzzy number
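For reference, the standard α-cut of a trapezoidal fuzzy number, which is the operation used above to obtain crisp interval values from the fuzzy parameters (the specific parameter values of the paper are not reproduced here):

```latex
% alpha-cut of a trapezoidal fuzzy number \tilde{A} = (a, b, c, d),
% with a <= b <= c <= d and alpha in [0, 1].
[\tilde{A}]_{\alpha} = \big[\, a + \alpha\,(b - a),\; d - \alpha\,(d - c) \,\big]
```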
Procedia PDF Downloads 434
3104 The Role of Artificial Intelligence in Concrete Constructions
Authors: Ardalan Tofighi Soleimandarabi
Abstract:
Artificial intelligence has revolutionized the concrete construction industry and improved processes by increasing efficiency, accuracy, and sustainability. This article examines the applications of artificial intelligence in predicting the compressive strength of concrete, optimizing mixing plans, and improving structural health monitoring systems. Artificial intelligence-based models, such as artificial neural networks (ANN) and combined machine learning techniques, have shown better performance than traditional methods in predicting concrete properties. In addition, artificial intelligence systems have made it possible to improve quality control and real-time monitoring of structures, which helps in preventive maintenance and increases the life of infrastructure. Also, the use of artificial intelligence plays an effective role in sustainable construction by optimizing material consumption and reducing waste. Although the implementation of artificial intelligence is associated with challenges such as high initial costs and the need for specialized training, it will create a smarter, more sustainable, and more affordable future for concrete structures.
Keywords: artificial intelligence, concrete construction, compressive strength prediction, structural health monitoring, stability
Procedia PDF Downloads 14
3103 Evaluation of Diagnosis Performance Based on Pairwise Model Construction and Filtered Data
Authors: Hyun-Woo Cho
Abstract:
Timely and intelligent production monitoring and diagnosis of industrial processes are quite important in terms of quality and safety. Compared with the monitoring task, fault diagnosis is the task of finding the process variables responsible for causing a specific fault in the process. It can help process operators investigate and eliminate root causes more effectively and efficiently. This work focused on combining a nonlinear statistical technique with a preprocessing method in order to implement practical real-time fault identification schemes for data-rich cases. To compare its performance with existing identification schemes, a case study on a benchmark process was performed under several scenarios. The results showed that the proposed fault identification scheme produced more reliable diagnosis results than linear methods. In addition, the filtering step improved the identification results for complicated processes with massive data sets.
Keywords: diagnosis, filtering, nonlinear statistical techniques, process monitoring
Procedia PDF Downloads 241
3102 Strategic Role of Fintechs in Evolving Financial Functions and Enhancing Corporate Resilience amid Economic Crises
Authors: Ghizlane Barzi, Zineb Bamousse
Abstract:
In an increasingly volatile global economic context characterized by recurring crises, the financial function of companies is called upon to play a strategic role not only in resource management but also in organizational resilience. The emergence of financial technologies (fintech) offers innovative tools capable of transforming this function by enhancing the efficiency of financial processes and increasing companies' ability to adapt and overcome economic shocks. However, despite the rapid rise of fintechs and their growing adoption by companies, there remain uncertainties regarding the real impact of these innovations on the financial resilience of organizations. Indeed, how do fintech-driven innovations transform the financial function, and to what extent does this transformation contribute to strengthening the financial resilience of companies in the face of contemporary crises? This research aims to explore these questions by examining the interrelationships between the financial function, fintech innovations, and corporate resilience, in order to identify optimization levers that could be adopted for better financial risk management.
Keywords: finance, financial function, fintech, resilience, innovation
Procedia PDF Downloads 25
3101 What the Future Holds for Social Media Data Analysis
Authors: P. Wlodarczak, J. Soar, M. Ally
Abstract:
The dramatic rise in the use of Social Media (SM) platforms such as Facebook and Twitter provides access to an unprecedented amount of user data. Users may post reviews of products and services they bought, write about their interests, share ideas or give their opinions and views on political issues. There is a growing interest among organisations in the analysis of SM data for detecting new trends, obtaining user opinions on their products and services or finding out about their online reputations. A recent research trend in SM analysis is making predictions based on sentiment analysis of SM. Often, indicators of historic SM data are represented as time series and correlated with a variety of real-world phenomena such as the outcome of elections, the development of financial indicators, box office revenue and disease outbreaks. This paper examines the current state of research in the area of SM mining and predictive analysis and gives an overview of the analysis methods using opinion mining and machine learning techniques.
Keywords: social media, text mining, knowledge discovery, predictive analysis, machine learning
Procedia PDF Downloads 422
3100 Against the Philosophical-Scientific Racial Project of Biologizing Race
Authors: Anthony F. Peressini
Abstract:
The concept of race has recently come prominently back into discussion in the context of medicine and medical science, along with renewed effort to biologize racial concepts. This paper argues that this renewed effort to biologize race by way of medicine and population genetics fails on its own terms, and more importantly, that the philosophical project of biologizing race ought to be recognized for what it is—a retrograde racial project—and abandoned. There is clear agreement that standard racial categories and concepts cannot be grounded in the old way of racial naturalism, which understands race as a real, interest-independent biological/metaphysical category in which its members share “physical, moral, intellectual, and cultural characteristics.” But equally clear is the very real and pervasive presence of racial concepts in individual and collective consciousness and behavior, and so it remains a pressing area in which to seek deeper understanding. Recent philosophical work has endeavored to reconcile these two observations by developing a “thin” conception of race, grounded in scientific concepts but without the moral and metaphysical content. Such “thin,” science-based analyses take the “commonsense” or “folk” sense of race as it functions in contemporary society as the starting point for their philosophic-scientific projects to biologize racial concepts. A “philosophic-scientific analysis” is a special case of the cornerstone of analytic philosophy: a conceptual analysis. That is, a rendering of a concept into the more perspicuous concepts that constitute it. Thus a philosophic-scientific account of a concept is an attempt to work out an analysis of a concept that makes use of empirical science's insights to ground, legitimate and explicate the target concept in terms of clearer concepts informed by empirical results. The focus in this paper is on three recent philosophic-scientific cases for retaining “race” that all share this general analytic schema, but that make use of “medical necessity,” population genetics, and human genetic clustering, respectively. After arguing that each of these three approaches suffers from internal difficulties, the paper considers the general analytic schema employed by such biologizations of race. While such endeavors are inevitably prefaced with the disclaimer that the theory to follow is non-essentialist and non-racialist, the case will be made that such efforts are not neutral scientific or philosophical projects but rather are what sociologists call a racial project, that is, one of many competing efforts that conjoin a representation of what race means to specific efforts to determine social and institutional arrangements of power, resources, authority, etc. Accordingly, philosophic-scientific biologizations of race, since they begin from and condition their analyses on “folk” conceptions, cannot pretend to be “prior to” other disciplinary insights, nor to transcend the social-political dynamics involved in formulating theories of race. As a result, such traditional philosophical efforts can be seen to be disciplinarily parochial and to address only a caricature of a large and important human problem—and thereby further contributing to the unfortunate isolation of philosophical thinking about race from other disciplines.
Keywords: population genetics, ontology of race, race-based medicine, racial formation theory, racial projects, racism, social construction
Procedia PDF Downloads 273
3099 Effect of the Ratio, Weight, Treatment of Loofah Fiber on the Mechanical Properties of the Composite: Loofah Fiber Resin
Authors: F. Siahmed, A. Lounis, L. Faghi
Abstract:
The aim of this work is to study the mechanical properties of composites based on natural fiber. This material has attracted the attention of the scientific community for its mechanical properties, its moderate cost and its environmental credentials. In this study, loofah, a member of the natural fiber family, has been used for its significant mechanical properties. The fiber has a porous structure, which facilitates the impregnation of the resin through the pores. The matrix used in this study is an unsaturated polyester resin, chosen for its long-term resistance. The work involves: the chemical treatment of the loofah fibers with NaOH solution (5%); the fabrication of the resin/loofah fiber composite; the preparation of samples for testing; tensile and bending tests; and the observation of fracture surfaces by scanning electron microscopy. The results show that the values of Young's modulus and tensile strength are high and open up real prospects. The improvement in mechanical properties was obtained for the two-layer composite with 7.5% fiber (by weight).
Keywords: loofah fiber, mechanical properties, composite, loofah fiber resin
Procedia PDF Downloads 445
3098 Optimal Reactive Power Dispatch under Various Contingency Conditions Using Whale Optimization Algorithm
Authors: Khaled Ben Oualid Medani, Samir Sayah
Abstract:
The Optimal Reactive Power Dispatch (ORPD) problem has usually been solved and analysed under normal operating conditions. However, network collapses appear under contingency conditions. In this paper, ORPD under several contingencies is solved using the proposed Whale Optimization Algorithm (WOA). To ensure the viability of the power system under contingency conditions, several critical cases are simulated in order to prevent such situations and prepare the power system to face them. The results are obtained on the IEEE 30-bus test system for the ORPD problem, in which the control of bus voltages, transformer tap positions and reactive power sources is involved. Moreover, another method, namely Particle Swarm Optimization with Time Varying Acceleration Coefficient (PSO-TVAC), is compared with the proposed technique. Simulation results indicate that the proposed WOA gives remarkable solutions in terms of effectiveness in the case of outages.
Keywords: optimal reactive power dispatch, power system analysis, real power loss minimization, contingency condition, metaheuristic technique, whale optimization algorithm
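A bare-bones sketch of the Whale Optimization Algorithm's search loop for a generic continuous objective; the actual ORPD objective (real power losses subject to voltage, tap and VAR limits) is not reproduced here, so a simple sphere function stands in as a placeholder.

```python
# Minimal WOA: encircling, random-search and spiral (bubble-net) updates.
import numpy as np

def woa(objective, dim, bounds, n_whales=30, iters=200, b=1.0, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (n_whales, dim))
    best = X[np.apply_along_axis(objective, 1, X).argmin()].copy()

    for t in range(iters):
        a = 2 - 2 * t / iters                          # linearly decreasing coefficient
        for i in range(n_whales):
            A = 2 * a * rng.random(dim) - a
            C = 2 * rng.random(dim)
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1):              # exploit: encircle the best whale
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                                  # explore: move toward a random whale
                    rand = X[rng.integers(n_whales)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                                      # spiral update around the best whale
                l = rng.uniform(-1, 1, dim)
                X[i] = np.abs(best - X[i]) * np.exp(b * l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lo, hi)
        fitness = np.apply_along_axis(objective, 1, X)
        if fitness.min() < objective(best):
            best = X[fitness.argmin()].copy()
    return best, objective(best)

print(woa(lambda x: np.sum(x ** 2), dim=5, bounds=(-10, 10)))
```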
Procedia PDF Downloads 120
3097 User Intention Generation with Large Language Models Using Chain-of-Thought Prompting
Authors: Gangmin Li, Fan Yang
Abstract:
Personalized recommendation is crucial for any recommendation system. One technique for personalization is to identify the user’s intention. Traditional user intention identification relies on the user’s selections when facing multiple items. This modeling relies primarily on historical behaviour data, resulting in challenges such as cold start, unintended choices, and failure to capture intention when items are new. Motivated by recent advancements in Large Language Models (LLMs) such as ChatGPT, we present an approach for user intention identification that embraces LLMs with Chain-of-Thought (CoT) prompting. We use the initial user profile as input to the LLM and design a collection of prompts to align the LLM's responses across various recommendation tasks, including rating prediction, search and browse history, and user clarification. Our tests on real-world datasets demonstrate improvements in recommendation achieved by explicit user intention identification and by merging that intention into the user model.
Keywords: personalized recommendation, generative user modelling, user intention identification, large language models, chain-of-thought prompting
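A sketch of what a chain-of-thought intention prompt might look like; the prompt wording, profile fields and the call_llm() helper are illustrative placeholders rather than the authors' pipeline or any specific LLM API.

```python
# Builds a CoT-style prompt from a user profile and recent actions, then (hypothetically)
# sends it to an LLM; the extracted intention would be merged into the user model.
def build_intent_prompt(profile: dict, recent_actions: list) -> str:
    return (
        "You are analysing a shopper's behaviour to infer their current intention.\n"
        f"Profile: {profile}\n"
        f"Recent actions: {'; '.join(recent_actions)}\n"
        "Let's think step by step: summarise what the user has been doing, list plausible "
        "goals, then pick the single most likely intention.\n"
        "End with a line 'INTENTION: <short phrase>'."
    )

def call_llm(prompt: str) -> str:
    raise NotImplementedError("placeholder for whichever LLM endpoint is used")

prompt = build_intent_prompt(
    {"age_group": "25-34", "favourite_category": "running shoes"},
    ["searched 'trail shoes'", "read two waterproof-shoe reviews", "rated a rain jacket 5 stars"],
)
# response = call_llm(prompt)   # the 'INTENTION:' line then feeds the ranking model
print(prompt)
```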
Procedia PDF Downloads 52
3096 Deep Learning-Based Classification of 3D CT Scans with Real Clinical Data; Impact of Image format
Authors: Maryam Fallahpoor, Biswajeet Pradhan
Abstract:
Background: Artificial intelligence (AI) serves as a valuable tool in mitigating the scarcity of human resources required for the evaluation and categorization of vast quantities of medical imaging data. When AI operates with optimal precision, it minimizes the demand for human interpretations and, thereby, reduces the burden on radiologists. Among various AI approaches, deep learning (DL) stands out as it obviates the need for feature extraction, a process that can impede classification, especially with intricate datasets. The advent of DL models has ushered in a new era in medical imaging, particularly in the context of COVID-19 detection. Traditional 2D imaging techniques exhibit limitations when applied to volumetric data, such as Computed Tomography (CT) scans. Medical images predominantly exist in one of two formats: neuroimaging informatics technology initiative (NIfTI) and digital imaging and communications in medicine (DICOM). Purpose: This study aims to employ DL for the classification of COVID-19-infected pulmonary patients and normal cases based on 3D CT scans while investigating the impact of image format. Material and Methods: The dataset used for model training and testing consisted of 1245 patients from IranMehr Hospital. All scans shared a matrix size of 512 × 512, although they exhibited varying slice numbers. Consequently, after loading the DICOM CT scans, image resampling and interpolation were performed to standardize the slice count. All images underwent cropping and resampling, resulting in uniform dimensions of 128 × 128 × 60. Resolution uniformity was achieved through resampling to 1 mm × 1 mm × 1 mm, and image intensities were confined to the range of (−1000, 400) Hounsfield units (HU). For classification purposes, positive pulmonary COVID-19 involvement was designated as 1, while normal images were assigned a value of 0. Subsequently, a U-net-based lung segmentation module was applied to obtain 3D segmented lung regions. The pre-processing stage included normalization, zero-centering, and shuffling. Four distinct 3D CNN models (ResNet152, ResNet50, DensNet169, and DensNet201) were employed in this study. Results: The findings revealed that the segmentation technique yielded superior results for DICOM images, which could be attributed to the potential loss of information during the conversion of original DICOM images to NIfTI format. Notably, ResNet152 and ResNet50 exhibited the highest accuracy at 90.0%, and the same models achieved the best F1 score at 87%. ResNet152 also secured the highest Area under the Curve (AUC) at 0.932. Regarding sensitivity and specificity, DensNet201 achieved the highest values at 93% and 96%, respectively. Conclusion: This study underscores the capacity of deep learning to classify COVID-19 pulmonary involvement using real 3D hospital data. The results underscore the significance of employing DICOM format 3D CT images alongside appropriate pre-processing techniques when training DL models for COVID-19 detection. This approach enhances the accuracy and reliability of diagnostic systems for COVID-19 detection.
Keywords: deep learning, COVID-19 detection, NIfTI format, DICOM format
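A minimal sketch of the intensity pre-processing steps described above (resampling to a fixed slice count, clipping to the (−1000, 400) HU window, scaling and zero-centering); the volume loading, target shape ordering and interpolation order are assumptions, not the authors' exact code.

```python
# Resample a CT volume to a fixed grid, clip to the lung-relevant HU window,
# scale to [0, 1] and zero-center, matching the paper's stated ranges.
import numpy as np
from scipy import ndimage

HU_MIN, HU_MAX = -1000.0, 400.0
TARGET_SHAPE = (60, 128, 128)                 # slices, height, width

def preprocess(volume_hu: np.ndarray) -> np.ndarray:
    factors = [t / s for t, s in zip(TARGET_SHAPE, volume_hu.shape)]
    vol = ndimage.zoom(volume_hu, factors, order=1)      # trilinear-style resampling
    vol = np.clip(vol, HU_MIN, HU_MAX)
    vol = (vol - HU_MIN) / (HU_MAX - HU_MIN)             # scale to [0, 1]
    return (vol - vol.mean()).astype(np.float32)         # zero-center

dummy = np.random.uniform(-1200, 600, size=(80, 512, 512))   # stand-in for one scan
print(preprocess(dummy).shape)                # (60, 128, 128)
```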
Procedia PDF Downloads 85
3095 Understanding the Importance of Participation in the City Planning Process and Its Influencing Factors
Authors: Louis Nwachi
Abstract:
Urban planning systems in most countries still rely on expert-driven, top-down technocratic plan-making processes rather than a public and people-led process. This paper set out to evaluate the need for public participation in the plan-making process and to highlight the factors that affect public participation in the plan-making process. In doing this, it adopted a qualitative approach based on document review and interviews taken from real-world phenomena. A case study strategy using the Metropolitan Area of Abuja, the capital of Nigeria, as the study sample was used in carrying out the research. The research finds that participation is an important tool in the plan-making process and that public engagement in the process contributes to the identification of key urban issues that are unique to the specific local areas, thereby contributing to the establishment of priorities and, in turn, to the mobilization of resources to meet the identified needs. It also finds that the development of a participation model by city authorities encourages public engagement and helps to develop trust between those in authority and the different key stakeholder groups involved in the plan-making process.
Keywords: plan-making, participation, urban planning, city
Procedia PDF Downloads 101
3094 Securing Health Monitoring in Internet of Things with Blockchain-Based Proxy Re-Encryption
Authors: Jerlin George, R. Chitra
Abstract:
Devices with sensors that can monitor temperature, heart rate, and other vital signs and link to the internet, known as the Internet of Things (IoT), have completely transformed the way we manage health. By providing real-time health data, these sensors improve diagnostics and treatment outcomes. Security and privacy matter when IoT comes into play in healthcare, and cyberattacks on centralized database systems are also a problem. To solve these challenges, the study uses blockchain technology coupled with proxy re-encryption to secure health data. The ThingSpeak IoT cloud analyzes the collected data and turns it into blockchain transactions, which are safely kept on the DriveHQ cloud. Transparency and data integrity are ensured by the blockchain, and secure data sharing among authorized users is made possible by proxy re-encryption. The result is a health monitoring system that preserves the accuracy and confidentiality of data while reducing the safety risks of IoT-driven healthcare applications.
Keywords: internet of things, healthcare, sensors, electronic health records, blockchain, proxy re-encryption, data privacy, data security
Procedia PDF Downloads 13
3093 Investigation the Effect of Velocity Inlet and Carrying Fluid on the Flow inside Coronary Artery
Authors: Mohammadreza Nezamirad, Nasim Sabetpour, Azadeh Yazdi, Amirmasoud Hamedi
Abstract:
In this study, OpenFOAM 4.4.2 was used to investigate flow inside a coronary artery of the heart. This is the first step of our future project, which is to include conjugate heat transfer of the heart with the three main coronary arteries. Three different velocities were used as inlet boundary conditions to see the effect of a velocity increase on the velocity, pressure, and wall shear in the coronary artery. Also, three different fluids, namely the University of Wisconsin solution, gelatin, and blood, were used to investigate the effect of the carrying fluid on the flow inside the coronary artery. A code based on the Reynolds-Averaged Navier-Stokes (RANS) equations was written and run with realistic boundary conditions calculated from MRI images. In order to improve the accuracy of the current numerical scheme, a hex-dominant mesh is utilized. When the inlet velocity increases to 0.5 m/s, velocity, wall shear stress, and pressure increase at the narrower parts.
Keywords: CFD, simulation, OpenFOAM, heart
Procedia PDF Downloads 146
3092 Pricing Techniques to Mitigate Recurring Congestion on Interstate Facilities Using Dynamic Feedback Assignment
Authors: Hatem Abou-Senna
Abstract:
Interstate 4 (I-4) is a primary east-west transportation corridor between the cities of Tampa and Daytona, serving commuters, commercial and recreational traffic. I-4 is known to have severe recurring congestion during peak hours. The congestion spans about 11 miles in the evening peak period in the central corridor area, as it is the only non-tolled limited-access facility connecting the Orlando Central Business District (CBD) and the tourist attractions area (Walt Disney World). Florida officials had been skeptical of tolling I-4 prior to the recent legislation, and the public, through the media, had been complaining about the excessive toll facilities in Central Florida. So, in search of a plausible mitigation of the congestion on the I-4 corridor, this research evaluates the effectiveness of different toll pricing alternatives that might divert traffic from I-4 to the toll facilities during the peak period. The network is composed of two main diverging limited-access highways, the freeway (I-4) and toll road SR 417, in addition to two east-west parallel toll roads, SR 408 and SR 528, intersecting the above-mentioned highways at both ends. I-4 and toll road SR 408 are the routes most frequently used by commuters. SR 417 is a relatively uncongested toll road, 15 miles longer than I-4 and carrying $5 in tolls compared to no monetary cost on I-4 for the same trip. The results of the calibrated Orlando PARAMICS network showed that percentages of route diversion vary from one route to another and depend primarily on the travel cost between specific origin-destination (O-D) pairs. Most drivers going from Disney (O1) or Lake Buena Vista (O2) to Lake Mary (D1) were found to have a high propensity towards using I-4, even when eliminating tolls and/or providing real-time information. However, a diversion from I-4 to SR 417 for these O-D pairs occurred only in the incident and lane-closure cases on I-4, due to the increase in delay and travel costs, and when information was provided to travelers. Furthermore, drivers that diverted from I-4 to SR 417 and SR 528 did not gain significant travel-time savings. This was attributed to the limited extra capacity of the alternative routes in the peak period and the longer traveling distance. When the remaining origin-destination pairs were analyzed, average travel time savings on I-4 ranged between 10 and 16%, amounting to 10 minutes at the most, with a 10% increase in the network average speed. The propensity of diversion on the network increased significantly when eliminating tolls on SR 417 and SR 528 while doubling the tolls on SR 408, along with the incident and lane-closure scenarios on I-4 and with real-time information provided. The toll roads were found to be a viable alternative to I-4 for these specific O-D pairs, depending on the user perception of the toll cost, which was reflected in their specific travel times. However, on the macroscopic level, it was concluded that route diversion through toll reduction or elimination on surrounding toll roads would only have a minimal impact on reducing I-4 congestion during the peak period.
Keywords: congestion pricing, dynamic feedback assignment, microsimulation, paramics, route diversion
Procedia PDF Downloads 178
3091 The Temporal Dimension of Narratives: A Construct of Qualitative Time
Authors: Ani Thomas
Abstract:
Every narrative is a temporal construct, and every narrative creates a qualitative experience of time for the viewer. The paper argues for the concept of a qualified time that emerges from the interaction between the narrative and the audience, and it challenges the conventional understanding of narrative time as either story time, real time or discourse time. Looking at narratives through the medium of cinema, the study examines how narratives create and manipulate duration, or durée, the qualitative experience of time as theorized by Henri Bergson. The paper further analyzes how cinema, and by extension narratives, are nothing but durée, and the filmmaker the artist of durée, who shapes the perception and emotions of the viewer through the manipulation and construction of durée. The paper draws on cinematic works to demonstrate how filmmakers use, for example, editing, sound, and compositional and production choices to create various modes of durée that challenge, amplify or unsettle the viewer's sense of time. Bringing together the viewer's durée and exploring its interaction with the narrative construct, the paper explores the emergence of a new qualitative time, the narrative durée, that defines the audience's experience.
Keywords: cinema, time, Bergson, durée
Procedia PDF Downloads 146
3090 A Genetic Algorithm Based Permutation and Non-Permutation Scheduling Heuristics for Finite Capacity Material Requirement Planning Problem
Authors: Watchara Songserm, Teeradej Wuttipornpun
Abstract:
This paper presents genetic algorithm based permutation and non-permutation scheduling heuristics (GAPNP) to solve a multi-stage finite capacity material requirement planning (FCMRP) problem in an automotive assembly flow shop with unrelated parallel machines. In the algorithm, the sequences of orders are iteratively improved by the GA, whereas the required operations are scheduled based on the presented permutation and non-permutation heuristics. Finally, a linear programming model is applied to minimize the total cost. The presented GAPNP algorithm is evaluated using real datasets from automotive companies. The required parameters for GAPNP are carefully tuned to obtain a common parameter setting for all case studies. The results show that GAPNP significantly outperforms the benchmark algorithm by about 30% on average.
Keywords: capacitated MRP, genetic algorithm, linear programming, automotive industries, flow shop, application in industry
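A compact sketch of the GA layer that evolves order sequences with order crossover and swap mutation; the cost() placeholder stands in for the permutation/non-permutation scheduling and LP costing steps, whose details and data are not reproduced here.

```python
# Genetic algorithm over order permutations: elitism, order crossover, swap mutation.
import random

def cost(sequence):
    # Placeholder objective: in the paper this would schedule the orders and return total cost.
    return sum(i * order for i, order in enumerate(sequence))

def order_crossover(p1, p2):
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]                                  # copy a slice from parent 1
    fill = [g for g in p2 if g not in child]              # keep parent 2's relative order
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def ga(orders, pop_size=40, generations=200, mut_rate=0.2):
    pop = [random.sample(orders, len(orders)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        nxt = pop[:5]                                     # elitism
        while len(nxt) < pop_size:
            p1, p2 = random.sample(pop[:20], 2)           # truncation-style selection
            child = order_crossover(p1, p2)
            if random.random() < mut_rate:                # swap mutation
                i, j = random.sample(range(len(child)), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        pop = nxt
    return min(pop, key=cost)

print(ga(list(range(10))))
```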
Procedia PDF Downloads 488
3089 Augmenting History: Case Study Measuring Motivation of Students Using Augmented Reality Apps in History Classes
Authors: Kevin. S. Badni
Abstract:
Due to the rapid advances in the use of information technology and students’ familiarity with technology, learning styles in higher education are being reshaped. One of the technology developments that has gained considerable attention in recent years is Augmented Reality (AR), where technology is used to combine overlays of digital data on physical real-world settings. While AR is being heavily promoted for entertainment by mobile phone manufacturers, it has had little adoption in higher education due to the required upfront investment that an instructor needs to undertake in creating relevant AR applications. This paper discusses a case study that uses a low upfront development approach and examines the impact on generation-Z students’ motivation whilst studying design history over a four-semester period. Even though the upfront investment in creating the AR support was minimal, the results showed a noticeable increase in student motivation. The approach used in this paper can be easily transferred to other disciplines and other areas of design education.
Keywords: augmented reality, history, motivation, technology
Procedia PDF Downloads 163
3088 Seismic Fragility Curves for Shallow Circular Tunnels under Different Soil Conditions
Authors: Siti Khadijah Che Osmi, Syed Mohd Ahmad
Abstract:
This paper presents a methodology to develop fragility curves for shallow tunnels that describe the relationship between seismic hazard and tunnel vulnerability. Emphasis is given to the influence of the surrounding soil material properties, because the dynamic behaviour of the tunnel mostly depends on them. Four ground profiles, ranging from stiff to soft soils, are selected. A 3D nonlinear time history analysis is used to evaluate the seismic response of the tunnel when subjected to five real earthquake ground motions of different intensities. The derived curves give the probabilistic future performance of the tunnels based on the predicted level of damage states corresponding to the peak ground acceleration. A comparison of the obtained results with the previous literature is provided to validate the reliability of the proposed fragility curves. Results show the significant role of soil properties and input motions in evaluating the seismic performance and response of shallow tunnels.
Keywords: fragility analysis, seismic performance, tunnel lining, vulnerability
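For context, the lognormal form commonly used to express seismic fragility curves; the median capacity and dispersion for each damage state would be fitted from the nonlinear time-history results and are not taken from the paper.

```latex
% Probability of reaching or exceeding damage state ds_i given the peak ground
% acceleration; theta_i is the median PGA capacity and beta_i the lognormal dispersion.
P\big(DS \ge ds_i \mid \mathrm{PGA}\big)
  = \Phi\!\left(\frac{\ln(\mathrm{PGA}/\theta_i)}{\beta_i}\right)
```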
Procedia PDF Downloads 313
3087 Seismic Performance of Reinforced Concrete Frame Structure Based on Plastic Rotation
Authors: Kahil Amar, Meziani Faroudja, Khelil Nacim
Abstract:
The principal objective of this study is the evaluation of the seismic performance of reinforced concrete frame structures, taking into account behavior laws that reflect the real behavior of materials, using the CASTEM2000 software. The finite element model used is based on the modified Takeda model, with Timoshenko elements for columns and beams. This model is validated against the Vecchio experimental reinforced concrete (RC) frame. A study then focuses on the behavior of a three-level, three-story RC frame in order to visualize the positioning of the plastic hinges (plastic rotations), determined from the curvature distribution along the elements. The results obtained show that the beams of the 1st and 2nd levels developed very large plastic rotations, exceeding the value corresponding to CP (Collapse Prevention, θCP = 0.02 rad), whereas those developed at the 3rd level lie between IO and LS (Immediate Occupancy and Life Safety, θIO = 0.005 rad and θLS = 0.01 rad, respectively), so the beams of the first and second levels sustain very significant damage.
Keywords: seismic performance, performance level, pushover analysis, plastic rotation, plastic hinge
Procedia PDF Downloads 128