Search results for: real GDP
3118 Large-Scale Electroencephalogram Biometrics through Contrastive Learning
Authors: Mostafa ‘Neo’ Mohsenvand, Mohammad Rasool Izadi, Pattie Maes
Abstract:
EEG-based biometrics (user identification) has been explored on small datasets of no more than 157 subjects. Here we show that the accuracy of modern supervised methods falls rapidly as the number of users increases to a few thousand. Moreover, supervised methods require a large amount of labeled data for training, which limits their applications in real-world scenarios where acquiring data for training should not take more than a few minutes. We show that using contrastive learning for pre-training, it is possible to maintain high accuracy on a dataset of 2130 subjects while only using a fraction of labels. We compare 5 different self-supervised tasks for pre-training of the encoder, where our proposed method achieves an accuracy of 96.4%, improving the baseline supervised models by 22.75% and the competing self-supervised model by 3.93%. We also study the effects of the length of the signal and the number of channels on the accuracy of the user-identification models. Our results reveal that signals from temporal and frontal channels contain more identifying features compared to other channels.
Keywords: brainprint, contrastive learning, electroencephalogram, self-supervised learning, user identification
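To make the contrastive pre-training idea above concrete, here is a minimal sketch of an NT-Xent style contrastive loss computed on paired embeddings; the batch size, temperature, and embedding dimension are illustrative assumptions rather than the authors' settings, and the random arrays merely stand in for encoder outputs on two augmented views of the same EEG recording.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.1):
    """Minimal NT-Xent contrastive loss for two batches of embeddings z1, z2
    of shape (batch, dim); z1[i] and z2[i] are assumed to be two augmented
    views (e.g. channel or time crops) of the same EEG recording."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    z = np.concatenate([z1, z2], axis=0)            # (2N, dim)
    sim = z @ z.T / temperature                     # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                  # exclude self-pairs
    n = z1.shape[0]
    positives = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), positives].mean()

# Toy usage with random "embeddings" standing in for encoder outputs
rng = np.random.default_rng(0)
z1 = rng.normal(size=(8, 32))
z2 = z1 + 0.05 * rng.normal(size=(8, 32))           # slightly perturbed views
print(round(nt_xent_loss(z1, z2), 3))
```

In an actual pipeline, minimizing such a loss pulls embeddings of augmented views of the same recording together and pushes other recordings apart, after which a small labeled subset can fine-tune an identification head.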
Procedia PDF Downloads 157
3117 Interactive Solutions for the Multi-Objective Capacitated Transportation Problem with Mixed Constraints under Fuzziness
Authors: Aquil Ahmed, Srikant Gupta, Irfan Ali
Abstract:
In this paper, we study a multi-objective capacitated transportation problem (MOCTP) with mixed constraints. The paper covers the modelling and optimisation of an MOCTP in a fuzzy environment in which some goals are fractional and some are linear. In real-life applications of the fuzzy goal programming (FGP) problem with multiple objectives, it is difficult for the decision maker(s) to determine the goal value of each objective precisely, as the goal values are imprecise or uncertain. We also develop a linearization of the fractional goals for solving the MOCTP. Imprecision in the parameters is handled through fuzzy set theory by modelling these parameters as trapezoidal fuzzy numbers, and an α-cut approach is used to obtain their crisp values. Numerical examples illustrate the method for solving the MOCTP.
Keywords: capacitated transportation problem, multi-objective linear programming, multi-objective fractional programming, fuzzy goal programming, fuzzy sets, trapezoidal fuzzy number
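As a small illustration of the α-cut step mentioned in the abstract, the sketch below converts a trapezoidal fuzzy number (a, b, c, d) into the crisp interval obtained at a chosen α level; the numeric values are made-up examples, not parameters from the paper.

```python
def alpha_cut_trapezoidal(a, b, c, d, alpha):
    """Alpha-cut of a trapezoidal fuzzy number (a, b, c, d) with membership 1
    on [b, c] and linear sides on [a, b] and [c, d]. Returns the crisp
    interval (lower, upper) at level alpha in (0, 1]."""
    lower = a + alpha * (b - a)   # move up the left slope
    upper = d - alpha * (d - c)   # move down the right slope
    return lower, upper

# Example: a fuzzy supply "around 40-50 units" with spreads 35 and 60
print(alpha_cut_trapezoidal(35, 40, 50, 60, alpha=0.7))  # (38.5, 53.0)
```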
Procedia PDF Downloads 434
3116 The Role of Artificial Intelligence in Concrete Constructions
Authors: Ardalan Tofighi Soleimandarabi
Abstract:
Artificial intelligence has revolutionized the concrete construction industry and improved processes by increasing efficiency, accuracy, and sustainability. This article examines the applications of artificial intelligence in predicting the compressive strength of concrete, optimizing mixing plans, and improving structural health monitoring systems. Artificial intelligence-based models, such as artificial neural networks (ANN) and combined machine learning techniques, have shown better performance than traditional methods in predicting concrete properties. In addition, artificial intelligence systems have made it possible to improve quality control and real-time monitoring of structures, which helps in preventive maintenance and increases the life of infrastructure. Also, the use of artificial intelligence plays an effective role in sustainable construction by optimizing material consumption and reducing waste. Although the implementation of artificial intelligence is associated with challenges such as high initial costs and the need for specialized training, it will create a smarter, more sustainable, and more affordable future for concrete structures.
Keywords: artificial intelligence, concrete construction, compressive strength prediction, structural health monitoring, stability
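A hedged sketch of the kind of ANN-based compressive strength prediction the abstract refers to, using scikit-learn on synthetic mix-design data; the feature set, the generated data, and the network size are illustrative assumptions rather than anything reported above.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 500
# Synthetic mix-design features: cement, water, aggregate (kg/m3), curing age (days)
X = np.column_stack([
    rng.uniform(250, 450, n),      # cement content
    rng.uniform(140, 200, n),      # water content
    rng.uniform(900, 1200, n),     # aggregate content
    rng.uniform(3, 90, n),         # curing age
])
# Toy "ground truth": strength grows with the cement/water ratio and log(age)
y = 12 * (X[:, 0] / X[:, 1]) + 6 * np.log(X[:, 3]) + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
model.fit(X_tr, y_tr)
print("R^2 on held-out mixes:", round(model.score(X_te, y_te), 3))
```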
Procedia PDF Downloads 15
3115 Evaluation of Diagnosis Performance Based on Pairwise Model Construction and Filtered Data
Authors: Hyun-Woo Cho
Abstract:
Timely and intelligent production monitoring and diagnosis of industrial processes are quite important in terms of quality and safety. Compared with the monitoring task, fault diagnosis is the task of finding the process variables responsible for causing a specific fault in the process. It can help process operators investigate and eliminate root causes more effectively and efficiently. This work focused on combining a nonlinear statistical technique with a preprocessing method in order to implement practical real-time fault identification schemes for data-rich cases. To compare its performance with existing identification schemes, a case study on a benchmark process was performed under several scenarios. The results showed that the proposed fault identification scheme produced more reliable diagnosis results than linear methods. In addition, the filtering step improved the identification results for complicated processes with massive data sets.
Keywords: diagnosis, filtering, nonlinear statistical techniques, process monitoring
Procedia PDF Downloads 244
3114 Strategic Role of Fintechs in Evolving Financial Functions and Enhancing Corporate Resilience amid Economic Crises
Authors: Ghizlane Barzi, Zineb Bamousse
Abstract:
In an increasingly volatile global economic context characterized by recurring crises, the financial function of companies is called upon to play a strategic role not only in resource management but also in organizational resilience. The emergence of financial technologies (fintech) offers innovative tools capable of transforming this function by enhancing the efficiency of financial processes and increasing companies' ability to adapt to and overcome economic shocks. However, despite the rapid rise of fintechs and their growing adoption by companies, there remain uncertainties regarding the real impact of these innovations on the financial resilience of organizations. Indeed, how do fintech-driven innovations transform the financial function, and to what extent does this transformation contribute to strengthening the financial resilience of companies in the face of contemporary crises? This research aims to explore these questions by examining the interrelationships between the financial function, fintech innovations, and corporate resilience, in order to identify optimization levers that could be adopted for better financial risk management.
Keywords: finance, financial function, fintech, resilience, innovation
Procedia PDF Downloads 26
3113 What the Future Holds for Social Media Data Analysis
Authors: P. Wlodarczak, J. Soar, M. Ally
Abstract:
The dramatic rise in the use of Social Media (SM) platforms such as Facebook and Twitter provides access to an unprecedented amount of user data. Users may post reviews on products and services they bought, write about their interests, share ideas or give their opinions and views on political issues. There is growing interest from organisations in the analysis of SM data for detecting new trends, obtaining user opinions on their products and services, or finding out about their online reputations. A recent research trend in SM analysis is making predictions based on sentiment analysis of SM. Often, indicators of historic SM data are represented as time series and correlated with a variety of real-world phenomena such as the outcome of elections, the development of financial indicators, box office revenue, and disease outbreaks. This paper examines the current state of research in the area of SM mining and predictive analysis and gives an overview of the analysis methods using opinion mining and machine learning techniques.
Keywords: social media, text mining, knowledge discovery, predictive analysis, machine learning
Procedia PDF Downloads 423
3112 Against the Philosophical-Scientific Racial Project of Biologizing Race
Authors: Anthony F. Peressini
Abstract:
The concept of race has recently come prominently back into discussion in the context of medicine and medical science, along with a renewed effort to biologize racial concepts. This paper argues that this renewed effort to biologize race by way of medicine and population genetics fails on its own terms, and more importantly, that the philosophical project of biologizing race ought to be recognized for what it is—a retrograde racial project—and abandoned. There is clear agreement that standard racial categories and concepts cannot be grounded in the old way of racial naturalism, which understands race as a real, interest-independent biological/metaphysical category in which its members share “physical, moral, intellectual, and cultural characteristics.” But equally clear is the very real and pervasive presence of racial concepts in individual and collective consciousness and behavior, and so it remains a pressing area in which to seek deeper understanding. Recent philosophical work has endeavored to reconcile these two observations by developing a “thin” conception of race, grounded in scientific concepts but without the moral and metaphysical content. Such “thin,” science-based analyses take the “commonsense” or “folk” sense of race as it functions in contemporary society as the starting point for their philosophic-scientific projects to biologize racial concepts. A “philosophic-scientific analysis” is a special case of the cornerstone of analytic philosophy: a conceptual analysis. That is, a rendering of a concept into the more perspicuous concepts that constitute it. Thus a philosophic-scientific account of a concept is an attempt to work out an analysis of a concept that makes use of empirical science's insights to ground, legitimate and explicate the target concept in terms of clearer concepts informed by empirical results. The focus in this paper is on three recent philosophic-scientific cases for retaining “race” that all share this general analytic schema, but that make use of “medical necessity,” population genetics, and human genetic clustering, respectively. After arguing that each of these three approaches suffers from internal difficulties, the paper considers the general analytic schema employed by such biologizations of race. While such endeavors are inevitably prefaced with the disclaimer that the theory to follow is non-essentialist and non-racialist, the case will be made that such efforts are not neutral scientific or philosophical projects but rather are what sociologists call a racial project, that is, one of many competing efforts that conjoin a representation of what race means to specific efforts to determine social and institutional arrangements of power, resources, authority, etc. Accordingly, philosophic-scientific biologizations of race, since they begin from and condition their analyses on “folk” conceptions, cannot pretend to be “prior to” other disciplinary insights, nor to transcend the social-political dynamics involved in formulating theories of race. As a result, such traditional philosophical efforts can be seen to be disciplinarily parochial and to address only a caricature of a large and important human problem—and thereby further contributing to the unfortunate isolation of philosophical thinking about race from other disciplines.
Keywords: population genetics, ontology of race, race-based medicine, racial formation theory, racial projects, racism, social construction
Procedia PDF Downloads 273
3111 Effect of the Ratio, Weight, Treatment of Loofah Fiber on the Mechanical Properties of the Composite: Loofah Fiber Resin
Authors: F. Siahmed, A. Lounis, L. Faghi
Abstract:
The aim of this work is to study the mechanical properties of composites based on natural fiber. This material has attracted the attention of the scientific community for its mechanical properties, its moderate cost, and its environmental credentials. In this study, loofah, a member of the natural fiber family, was used for its significant mechanical properties. The fiber has a porous structure, which facilitates the impregnation of the resin through these pores. The matrix used in this study is an unsaturated polyester resin, chosen for its long-term resistance. The work involves: the chemical treatment of the loofah fibers with a NaOH solution (5%); the fabrication of the loofah fiber/resin composite; the preparation of samples for testing; tensile and bending tests; and the observation of fracture surfaces by scanning electron microscopy. The results obtained show that the values of Young's modulus and tensile strength are high and open up real prospects. The improvement in mechanical properties was obtained for the two-layer composite with 7.5% fiber (by weight).
Keywords: loofah fiber, mechanical properties, composite, loofah fiber resin
Procedia PDF Downloads 447
3110 Optimal Reactive Power Dispatch under Various Contingency Conditions Using Whale Optimization Algorithm
Authors: Khaled Ben Oualid Medani, Samir Sayah
Abstract:
The Optimal Reactive Power Dispatch (ORPD) problem has usually been solved and analysed under normal conditions. However, network collapses appear under contingency conditions. In this paper, ORPD under several contingencies is solved using the proposed Whale Optimization Algorithm (WOA). To ensure the viability of the power system under contingency conditions, several critical cases are simulated in order to prepare the power system to face such situations. The results are obtained on the IEEE 30-bus test system for the solution of the ORPD problem, in which the control of bus voltages, transformer tap positions, and reactive power sources is involved. Moreover, another method, namely Particle Swarm Optimization with Time Varying Acceleration Coefficient (PSO-TVAC), is compared with the proposed technique. Simulation results indicate that the proposed WOA gives a remarkable solution in terms of effectiveness in the case of outages.
Keywords: optimal reactive power dispatch, power system analysis, real power loss minimization, contingency condition, metaheuristic technique, whale optimization algorithm
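For readers unfamiliar with WOA, the sketch below shows the algorithm's core update rules (encircling the best whale, the spiral bubble-net attack, and random exploration) on a simple benchmark function; it is a generic illustration, not the authors' ORPD formulation, and the population size, iteration count, and spiral constant b = 1 are assumed defaults.

```python
import numpy as np

def sphere(x):
    """Benchmark objective standing in for the ORPD cost (e.g. real power loss)."""
    return float(np.sum(x ** 2))

def whale_optimization(obj, dim=5, n_whales=20, iters=200, lb=-10.0, ub=10.0, b=1.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n_whales, dim))
    best = min(X, key=obj).copy()
    best_f = obj(best)
    for t in range(iters):
        a = 2.0 - 2.0 * t / iters                  # control parameter decreasing from 2 to 0
        for i in range(n_whales):
            A = 2.0 * a * rng.random() - a
            C = 2.0 * rng.random()
            if rng.random() < 0.5:
                if abs(A) < 1:                     # exploitation: encircle the best solution
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                              # exploration: move relative to a random whale
                    rand = X[rng.integers(n_whales)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                                  # spiral bubble-net attack around the best
                l = rng.uniform(-1, 1)
                X[i] = np.abs(best - X[i]) * np.exp(b * l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lb, ub)
            f = obj(X[i])
            if f < best_f:
                best, best_f = X[i].copy(), f
    return best, best_f

best, value = whale_optimization(sphere)
print("best objective found:", round(value, 8))
```

In an ORPD setting, the decision vector would hold generator voltage set points, transformer tap positions, and shunt reactive sources, with the objective evaluated by a power-flow run.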
Procedia PDF Downloads 121
3109 User Intention Generation with Large Language Models Using Chain-of-Thought Prompting
Authors: Gangmin Li, Fan Yang
Abstract:
Personalized recommendation is crucial for any recommendation system. One of the techniques for personalized recommendation is to identify the user's intention. Traditional user intention identification uses the user's selection when facing multiple items. This modeling relies primarily on historical behaviour data, resulting in challenges such as the cold start, unintended choices, and failure to capture intention when items are new. Motivated by recent advancements in Large Language Models (LLMs) like ChatGPT, we present an approach for user intention identification that embraces LLMs with Chain-of-Thought (CoT) prompting. We use the initial user profile as input to the LLM and design a collection of prompts to align the LLM's response across various recommendation tasks encompassing rating prediction, search and browse history, user clarification, etc. Our tests on real-world datasets demonstrate improvements in recommendation when intention is identified explicitly and merged into a user model.
Keywords: personalized recommendation, generative user modelling, user intention identification, large language models, chain-of-thought prompting
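The prompt-design idea can be illustrated with a small sketch that assembles a chain-of-thought style prompt from a user profile and recent behaviour; the template wording, the profile fields, and the call_llm placeholder are hypothetical stand-ins, not the authors' prompts or any specific API.

```python
def build_intention_prompt(profile, history):
    """Assemble a chain-of-thought style prompt from a user profile and
    recent behaviour; the wording and fields here are illustrative
    assumptions, not the authors' actual prompt templates."""
    lines = [
        "You are a recommender assistant. Infer the user's current intention.",
        f"User profile: {profile}",
        f"Recent behaviour (ratings, searches, clicks): {history}",
        "Let's think step by step:",
        "1. Summarise what the user seems to care about.",
        "2. Note any recent shift in interests.",
        "3. State the most likely current intention in one sentence.",
    ]
    return "\n".join(lines)

def call_llm(prompt):
    """Placeholder for an LLM call (e.g. a chat-completion endpoint); it
    returns a canned answer so the sketch runs without network access."""
    return "Intention: the user is looking for budget-friendly trail-running shoes."

prompt = build_intention_prompt(
    profile={"age_range": "25-34", "interests": ["running", "hiking"]},
    history=["searched 'trail shoes under $100'", "rated a hiking pack 5/5"],
)
print(call_llm(prompt))
```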
Procedia PDF Downloads 55
3108 Deep Learning-Based Classification of 3D CT Scans with Real Clinical Data: Impact of Image Format
Authors: Maryam Fallahpoor, Biswajeet Pradhan
Abstract:
Background: Artificial intelligence (AI) serves as a valuable tool in mitigating the scarcity of human resources required for the evaluation and categorization of vast quantities of medical imaging data. When AI operates with optimal precision, it minimizes the demand for human interpretations and, thereby, reduces the burden on radiologists. Among various AI approaches, deep learning (DL) stands out as it obviates the need for feature extraction, a process that can impede classification, especially with intricate datasets. The advent of DL models has ushered in a new era in medical imaging, particularly in the context of COVID-19 detection. Traditional 2D imaging techniques exhibit limitations when applied to volumetric data, such as Computed Tomography (CT) scans. Medical images predominantly exist in one of two formats: neuroimaging informatics technology initiative (NIfTI) and digital imaging and communications in medicine (DICOM). Purpose: This study aims to employ DL for the classification of COVID-19-infected pulmonary patients and normal cases based on 3D CT scans while investigating the impact of image format. Material and Methods: The dataset used for model training and testing consisted of 1245 patients from IranMehr Hospital. All scans shared a matrix size of 512 × 512, although they exhibited varying slice numbers. Consequently, after loading the DICOM CT scans, image resampling and interpolation were performed to standardize the slice count. All images underwent cropping and resampling, resulting in uniform dimensions of 128 × 128 × 60. Resolution uniformity was achieved through resampling to 1 mm × 1 mm × 1 mm, and image intensities were confined to the range of (−1000, 400) Hounsfield units (HU). For classification purposes, positive pulmonary COVID-19 involvement was designated as 1, while normal images were assigned a value of 0. Subsequently, a U-Net-based lung segmentation module was applied to obtain 3D segmented lung regions. The pre-processing stage included normalization, zero-centering, and shuffling. Four distinct 3D CNN models (ResNet152, ResNet50, DenseNet169, and DenseNet201) were employed in this study. Results: The findings revealed that the segmentation technique yielded superior results for DICOM images, which could be attributed to the potential loss of information during the conversion of original DICOM images to NIfTI format. Notably, ResNet152 and ResNet50 exhibited the highest accuracy at 90.0%, and the same models achieved the best F1 score at 87%. ResNet152 also secured the highest Area under the Curve (AUC) at 0.932. Regarding sensitivity and specificity, DenseNet201 achieved the highest values at 93% and 96%, respectively. Conclusion: This study underscores the capacity of deep learning to classify COVID-19 pulmonary involvement using real 3D hospital data. The results underscore the significance of employing DICOM format 3D CT images alongside appropriate pre-processing techniques when training DL models for COVID-19 detection. This approach enhances the accuracy and reliability of diagnostic systems for COVID-19 detection.
Keywords: deep learning, COVID-19 detection, NIfTI format, DICOM format
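The pre-processing steps listed in the abstract (HU windowing to (−1000, 400), resampling to 128 × 128 × 60, normalization, and zero-centering) can be sketched roughly as follows; the synthetic input volume and the interpolation choice are assumptions for illustration only, and a real pipeline would also apply the U-Net lung segmentation described above.

```python
import numpy as np
from scipy.ndimage import zoom

def preprocess_ct(volume_hu, target_shape=(128, 128, 60), hu_window=(-1000, 400)):
    """Clip to a Hounsfield-unit window, resample to a fixed grid, scale to
    [0, 1], and zero-center -- roughly the pipeline described above (the
    interpolation order and synthetic input below are assumptions)."""
    vol = np.clip(volume_hu.astype(np.float32), *hu_window)
    factors = [t / s for t, s in zip(target_shape, vol.shape)]
    vol = zoom(vol, factors, order=1)                            # linear resampling
    vol = (vol - hu_window[0]) / (hu_window[1] - hu_window[0])   # scale to [0, 1]
    return vol - vol.mean()                                      # zero-center

# Synthetic stand-in for one CT series (kept smaller than a real 512 x 512
# series so the demo runs quickly); values imitate HU intensities
fake_scan = np.random.default_rng(1).integers(-1024, 1500, size=(256, 256, 40))
x = preprocess_ct(fake_scan)
print(x.shape, round(float(x.mean()), 4))
```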
Procedia PDF Downloads 88
3107 Understanding the Importance of Participation in the City Planning Process and Its Influencing Factors
Authors: Louis Nwachi
Abstract:
Urban planning systems in most countries still rely on expert-driven, top-down technocratic plan-making processes rather than a public and people-led process. This paper sets out to evaluate the need for public participation in the plan-making process and to highlight the factors that affect that participation. In doing this, it adopted a qualitative approach based on document review and interviews taken from real-world phenomena. A case study strategy using the Metropolitan Area of Abuja, the capital of Nigeria, as the study sample was used in carrying out the research. The research finds that participation is an important tool in the plan-making process and that public engagement in the process contributes to the identification of key urban issues that are unique to the specific local areas, thereby contributing to the establishment of priorities and, in turn, to the mobilization of resources to meet the identified needs. It also finds that the development of a participation model by city authorities encourages public engagement and helps to develop trust between those in authority and the different key stakeholder groups involved in the plan-making process.
Keywords: plan-making, participation, urban planning, city
Procedia PDF Downloads 101
3106 Securing Health Monitoring in Internet of Things with Blockchain-Based Proxy Re-Encryption
Authors: Jerlin George, R. Chitra
Abstract:
Devices with sensors that can monitor your temperature, heart rate, and other vital signs and link to the internet, known as the Internet of Things (IoT), have completely transformed the way we manage health. Providing real-time health data, these sensors improve diagnostics and treatment outcomes. Security and privacy matter when IoT comes into play in healthcare. Cyberattacks on centralized database systems are also a problem. To solve these challenges, the study uses blockchain technology coupled with proxy re-encryption to secure health data. The ThingSpeak IoT cloud analyzes the collected data and turns it into blockchain transactions, which are safely kept on the DriveHQ cloud. Transparency and data integrity are ensured by the blockchain, and secure data sharing among authorized users is made possible by proxy re-encryption. This results in a health monitoring system that preserves the accuracy and confidentiality of data while reducing the safety risks of IoT-driven healthcare applications.
Keywords: internet of things, healthcare, sensors, electronic health records, blockchain, proxy re-encryption, data privacy, data security
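As a toy illustration of the block-chaining idea only (not the proxy re-encryption scheme itself, nor the ThingSpeak/DriveHQ integration), the sketch below hashes each sensor reading together with the previous block's hash so that tampering with any reading breaks the chain; the readings are made-up values.

```python
import hashlib
import json
import time

def make_block(reading, prev_hash):
    """Wrap one sensor reading in a block whose hash covers the previous
    block's hash, so modifying any earlier block invalidates the chain."""
    block = {"timestamp": time.time(), "reading": reading, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

def verify(chain):
    """Recompute every hash and check each link back to the genesis value."""
    prev = "0" * 64
    for b in chain:
        body = {k: v for k, v in b.items() if k != "hash"}
        if b["prev_hash"] != prev:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != b["hash"]:
            return False
        prev = b["hash"]
    return True

chain = [make_block({"heart_rate": 72, "temp_c": 36.8}, prev_hash="0" * 64)]
chain.append(make_block({"heart_rate": 75, "temp_c": 36.9}, chain[-1]["hash"]))
print(verify(chain))  # True
```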
Procedia PDF Downloads 18
3105 Investigation of the Effect of Velocity Inlet and Carrying Fluid on the Flow inside the Coronary Artery
Authors: Mohammadreza Nezamirad, Nasim Sabetpour, Azadeh Yazdi, Amirmasoud Hamedi
Abstract:
In this study, OpenFOAM 4.4.2 was used to investigate the flow inside the coronary artery of the heart. This is the first step of our future project, which is to include conjugate heat transfer of the heart with the three main coronary arteries. Three different velocities were used as inlet boundary conditions to see the effect of the velocity increase on the velocity, pressure, and wall shear of the coronary artery. Also, three different fluids, namely the University of Wisconsin solution, gelatin, and blood, were used to investigate the effect of different fluids on the flow inside the coronary artery. A code based on the Reynolds-Averaged Navier-Stokes (RANS) equations was written and implemented with real boundary conditions calculated from MRI images. In order to improve the accuracy of the current numerical scheme, a hex-dominant mesh is utilized. When the inlet velocity increases to 0.5 m/s, the velocity, wall shear stress, and pressure increase at the narrower parts.
Keywords: CFD, simulation, OpenFOAM, heart
Procedia PDF Downloads 149
3104 Pricing Techniques to Mitigate Recurring Congestion on Interstate Facilities Using Dynamic Feedback Assignment
Authors: Hatem Abou-Senna
Abstract:
Interstate 4 (I-4) is a primary east-west transportation corridor between the cities of Tampa and Daytona, serving commuter, commercial, and recreational traffic. I-4 is known to have severe recurring congestion during peak hours. The congestion spans about 11 miles in the evening peak period in the central corridor area, as it is considered the only non-tolled limited-access facility connecting the Orlando Central Business District (CBD) and the tourist attractions area (Walt Disney World). Florida officials had been skeptical of tolling I-4 prior to the recent legislation, and the public, through the media, had been complaining about the excessive toll facilities in Central Florida. So, in search of plausible mitigation to the congestion on the I-4 corridor, this research evaluates the effectiveness of different toll pricing alternatives that might divert traffic from I-4 to the toll facilities during the peak period. The network is composed of two main diverging limited-access highways, a freeway (I-4) and a toll road (SR 417), in addition to two east-west parallel toll roads, SR 408 and SR 528, intersecting the above-mentioned highways from both ends. I-4 and toll road SR 408 are the routes most frequently used by commuters. SR 417 is a relatively uncongested toll road that is 15 miles longer than I-4 and carries $5 in tolls, compared to no monetary cost on I-4 for the same trip. The results of the calibrated Orlando PARAMICS network showed that percentages of route diversion vary from one route to another and depend primarily on the travel cost between specific origin-destination (O-D) pairs. Most drivers going from Disney (O1) or Lake Buena Vista (O2) to Lake Mary (D1) were found to have a high propensity towards using I-4, even when eliminating tolls and/or providing real-time information. However, a diversion from I-4 to SR 417 for these O-D pairs occurred only in the cases of the incident and lane closure on I-4, due to the increase in delay and travel costs, and when information is provided to travelers. Furthermore, drivers that diverted from I-4 to SR 417 and SR 528 did not gain significant travel-time savings. This was attributed to the limited extra capacity of the alternative routes in the peak period and the longer traveling distance. When the remaining origin-destination pairs were analyzed, average travel time savings on I-4 ranged between 10 and 16%, amounting to 10 minutes at the most with a 10% increase in the network average speed. The propensity for diversion on the network increased significantly when eliminating tolls on SR 417 and SR 528 while doubling the tolls on SR 408, along with the incident and lane closure scenarios on I-4 and with real-time information provided. The toll roads were found to be a viable alternative to I-4 for these specific O-D pairs, depending on the user perception of the toll cost, which was reflected in their specific travel times. However, on the macroscopic level, it was concluded that route diversion through toll reduction or elimination on surrounding toll roads would only have a minimal impact on reducing I-4 congestion during the peak period.
Keywords: congestion pricing, dynamic feedback assignment, microsimulation, paramics, route diversion
Procedia PDF Downloads 178
3103 The Temporal Dimension of Narratives: A Construct of Qualitative Time
Authors: Ani Thomas
Abstract:
Every narrative is a temporal construct. Every narrative creates a qualitative experience of time for the viewer. The paper argues for the concept of a qualified time that emerges from the interaction between the narrative and the audience. The paper also challenges the conventional understanding of narrative time as either story time, real time, or discourse time. Looking at narratives through the medium of Cinema, the study examines how narratives create and manipulate duration or durée, the qualitative experience of time as theorized by Henri Bergson. The paper further analyzes how Cinema and, by extension, narratives are nothing but durée, and how the filmmaker, as an artist of durée, shapes and manipulates the perception and emotions of the viewer through the manipulation and construction of durée. The paper draws on cinematic works to demonstrate how filmmakers use techniques such as editing, sound, and compositional and production choices to create various modes of durée that challenge, amplify, or unsettle the viewer's sense of time. Bringing together the viewer's durée and exploring its interaction with the narrative construct, the paper explores the emergence of a new qualitative time, the narrative durée, that defines the audience experience.
Keywords: cinema, time, Bergson, durée
Procedia PDF Downloads 148
3102 A Genetic Algorithm Based Permutation and Non-Permutation Scheduling Heuristics for Finite Capacity Material Requirement Planning Problem
Authors: Watchara Songserm, Teeradej Wuttipornpun
Abstract:
This paper presents genetic algorithm based permutation and non-permutation scheduling heuristics (GAPNP) to solve a multi-stage finite capacity material requirement planning (FCMRP) problem in an automotive assembly flow shop with unrelated parallel machines. In the algorithm, the sequences of orders are iteratively improved by the GA operators, whereas the required operations are scheduled based on the presented permutation and non-permutation heuristics. Finally, a linear program is applied to minimize the total cost. The presented GAPNP algorithm is evaluated using real datasets from automotive companies. The required parameters for GAPNP are carefully tuned to obtain a common parameter setting for all case studies. The results show that GAPNP significantly outperforms the benchmark algorithm by about 30% on average.
Keywords: capacitated MRP, genetic algorithm, linear programming, automotive industries, flow shop, application in industry
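To illustrate the permutation-scheduling side of such a GA, here is a minimal sketch that evolves job permutations with order crossover and swap mutation to minimize a flow-shop makespan; it is a generic illustration under made-up processing times, not the authors' GAPNP heuristic or its linear-programming cost stage.

```python
import random

def makespan(perm, proc):
    """Permutation flow-shop makespan: proc[j][m] is the processing time of
    job j on machine m; jobs visit machines in order."""
    n_machines = len(proc[0])
    completion = [0.0] * n_machines
    for j in perm:
        for m in range(n_machines):
            start = max(completion[m], completion[m - 1] if m > 0 else 0.0)
            completion[m] = start + proc[j][m]
    return completion[-1]

def order_crossover(p1, p2, rng):
    """Classic OX: copy a slice from parent 1, fill the rest in parent 2's order."""
    a, b = sorted(rng.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def ga_schedule(proc, pop_size=30, generations=200, seed=0):
    rng = random.Random(seed)
    jobs = list(range(len(proc)))
    pop = [rng.sample(jobs, len(jobs)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: makespan(p, proc))
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            child = order_crossover(p1, p2, rng)
            if rng.random() < 0.2:                 # swap mutation
                i, j = rng.sample(range(len(child)), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    best = min(pop, key=lambda p: makespan(p, proc))
    return best, makespan(best, proc)

# 6 jobs x 3 machines of made-up processing times
proc = [[4, 3, 2], [2, 5, 1], [3, 2, 4], [5, 1, 3], [2, 4, 2], [1, 3, 5]]
print(ga_schedule(proc))
```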
Procedia PDF Downloads 490
3101 Augmenting History: Case Study Measuring Motivation of Students Using Augmented Reality Apps in History Classes
Authors: Kevin. S. Badni
Abstract:
Due to the rapid advances in the use of information technology and students' familiarity with technology, learning styles in higher education are being reshaped. One of the technology developments that has gained considerable attention in recent years is Augmented Reality (AR), where technology is used to combine overlays of digital data on physical real-world settings. While AR is being heavily promoted for entertainment by mobile phone manufacturers, it has had little adoption in higher education due to the required upfront investment that an instructor needs to undertake in creating relevant AR applications. This paper discusses a case study that uses a low upfront development approach and examines the impact on generation-Z students' motivation whilst studying design history over a four-semester period. Even though the upfront investment in creating the AR support was minimal, the results showed a noticeable increase in student motivation. The approach used in this paper can be easily transferred to other disciplines and other areas of design education.
Keywords: augmented reality, history, motivation, technology
Procedia PDF Downloads 166
3100 Seismic Fragility Curves for Shallow Circular Tunnels under Different Soil Conditions
Authors: Siti Khadijah Che Osmi, Syed Mohd Ahmad
Abstract:
This paper presents a methodology to develop fragility curves for shallow tunnels so as to describe a relationship between seismic hazard and tunnel vulnerability. Emphasis is given to the influence of the surrounding soil material properties, because the dynamic behaviour of the tunnel mostly depends on them. Four soil profiles ranging from stiff to soft soils are selected. A 3D nonlinear time history analysis is used to evaluate the seismic response of the tunnel when subjected to five real earthquake ground motions of different intensities. The derived curves show the future probabilistic performance of the tunnels based on the predicted level of damage states corresponding to the peak ground acceleration. A comparison of the obtained results with the previous literature is provided to validate the reliability of the proposed fragility curves. Results show the significant role of soil properties and input motions in evaluating the seismic performance and response of shallow tunnels.
Keywords: fragility analysis, seismic performance, tunnel lining, vulnerability
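Fragility curves of the kind described above are commonly expressed as a two-parameter lognormal function of the intensity measure; the sketch below evaluates such a curve, with the median capacity and dispersion chosen purely for illustration rather than taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def fragility(im, median_theta, beta):
    """Two-parameter lognormal fragility: probability of reaching or exceeding
    a damage state given an intensity measure (here PGA in g); median_theta is
    the median capacity and beta the lognormal dispersion."""
    return norm.cdf(np.log(np.asarray(im) / median_theta) / beta)

pga = np.linspace(0.05, 1.5, 6)
# Assumed illustrative parameters for a "moderate damage" state of a lining
print(np.round(fragility(pga, median_theta=0.45, beta=0.5), 3))
```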
Procedia PDF Downloads 315
3099 Seismic Performance of Reinforced Concrete Frame Structure Based on Plastic Rotation
Authors: Kahil Amar, Meziani Faroudja, Khelil Nacim
Abstract:
The principal objective of this study is the evaluation of the seismic performance of reinforced concrete frame structures, taking into account behavior laws that reflect the real behavior of materials, using the CASTEM2000 software. The finite element model used is based on the modified Takeda model with Timoshenko elements for columns and beams. This model is validated against the Vecchio experimental reinforced concrete (RC) frame. Then, a study focused on the behavior of a three-level, three-story RC frame in order to visualize the positioning of the plastic hinges (plastic rotations), determined from the curvature distribution along the elements. The results obtained show that the beams of the 1st and 2nd levels developed very large plastic rotations, exceeding the value corresponding to CP (Collapse Prevention, with θCP = 0.02 rad), whereas those developed at the 3rd level lie between IO and LS (Immediate Occupancy and Life Safety, with θIO = 0.005 rad and θLS = 0.01 rad, respectively); thus, the beams of the first and second levels sustain very significant damage.
Keywords: seismic performance, performance level, pushover analysis, plastic rotation, plastic hinge
Procedia PDF Downloads 130
3098 Selection of Intensity Measure in Probabilistic Seismic Risk Assessment of a Turkish Railway Bridge
Authors: M. F. Yilmaz, B. Ö. Çağlayan
Abstract:
The fragility curve is an effective and commonly used tool to determine the earthquake performance of structural and nonstructural components. It is also used to determine the nonlinear behavior of bridges. There are many historical bridges in the Turkish railway network, and the earthquake performance of these bridges needs to be investigated. To derive fragility curves, intensity measures (IMs) and engineering demand parameters (EDPs) must be determined, along with the relation between them. In this study, a typical simply supported steel girder riveted railway bridge is studied. Fragility curves of this bridge are derived using a two-parameter lognormal distribution. Time history analyses are performed for 60 selected real earthquake records to determine the relation between IMs and EDPs. Moreover, the efficiency, practicality, and sufficiency of three different IMs are discussed. PGA, Sa(0.2s), and Sa(1s), the IM parameters most commonly used for fragility curves in the literature, are considered in terms of efficiency, practicality, and sufficiency.
Keywords: railway bridges, earthquake performance, fragility analyses, selection of intensity measures
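One common way to compare the efficiency of candidate IMs, consistent with the discussion above, is to regress ln(EDP) on ln(IM) over the record set and compare the residual dispersions; the sketch below does this on synthetic data (the smaller dispersion marks the more efficient IM) and does not reproduce the paper's records or results.

```python
import numpy as np

def im_efficiency(im, edp):
    """Fit ln(EDP) = a + b ln(IM) by least squares and return the dispersion
    (standard deviation of the residuals); a smaller dispersion indicates a
    more 'efficient' intensity measure for this response quantity."""
    x, y = np.log(im), np.log(edp)
    b, a = np.polyfit(x, y, 1)
    residuals = y - (a + b * x)
    return float(np.std(residuals, ddof=2))

# Synthetic stand-ins for 60 records, built so EDP tracks Sa(1s) better than PGA
rng = np.random.default_rng(3)
pga = rng.lognormal(mean=-1.2, sigma=0.5, size=60)
sa1 = pga * rng.lognormal(mean=0.0, sigma=0.15, size=60)
edp = 0.01 * sa1 ** 1.1 * rng.lognormal(0.0, 0.2, size=60)
print("dispersion with PGA    :", round(im_efficiency(pga, edp), 3))
print("dispersion with Sa(1s) :", round(im_efficiency(sa1, edp), 3))
```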
Procedia PDF Downloads 357
3097 Automatic Detection of Proliferative Cells in Immunohistochemically Stained Images of Meningioma Using Fuzzy C-Means Clustering and HSV Color Space
Authors: Vahid Anari, Mina Bakhshi
Abstract:
Visual search and identification of immunohistochemically stained meningioma tissue was performed manually in pathology laboratories to detect and diagnose the cancer type of meningioma. This task is very tedious and time-consuming. Moreover, because of the complex nature of cells, it still remains a challenging task to segment cells from the background and analyze them automatically. In this paper, we develop and test a computerized scheme that can automatically identify cells in microscopic images of meningioma and classify them into positive (proliferative) and negative (normal) cells. A dataset including 150 images is used to test the scheme. The scheme uses the Fuzzy C-means algorithm as a color clustering method based on the perceptually uniform hue, saturation, value (HSV) color space. Since the cells are distinguishable by the human eye, the accuracy and stability of the algorithm are quantitatively compared through application to a wide variety of real images.
Keywords: positive cell, color segmentation, HSV color space, immunohistochemistry, meningioma, thresholding, fuzzy c-means
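A minimal sketch of fuzzy c-means clustering applied to pixels in HSV space, in the spirit of the scheme described above; the FCM implementation, the two-cluster setting, and the synthetic "brown versus blue" pixel colors are illustrative assumptions, not the authors' code or data.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def fuzzy_cmeans(X, n_clusters=3, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and the membership
    matrix U of shape (n_samples, n_clusters) with fuzzifier m."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (dist ** (2 / (m - 1)))          # membership inversely related to distance
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Synthetic "IHC image" pixels: brown-ish positive nuclei vs blue-ish background
rng = np.random.default_rng(1)
brown = np.clip(rng.normal([0.55, 0.35, 0.15], 0.05, (500, 3)), 0, 1)
blue = np.clip(rng.normal([0.55, 0.65, 0.85], 0.05, (500, 3)), 0, 1)
pixels_hsv = rgb_to_hsv(np.vstack([brown, blue]))   # cluster in HSV space
centers, U = fuzzy_cmeans(pixels_hsv, n_clusters=2)
labels = U.argmax(axis=1)
print("cluster sizes:", np.bincount(labels))        # roughly 500 / 500
```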
Procedia PDF Downloads 210
3096 Impact of the Results of Sub-Group Analysis on Performance of Recommender Systems
Authors: Ho Yeon Park, Kyoung-Jae Kim
Abstract:
The purpose of this study is to investigate whether friendship in social media can be an important factor in recommender systems, through a social scientific analysis of friendship in popular social media such as Facebook and Twitter. For this purpose, this study analyzes data on friendship in real social media using component analysis and clique analysis, two sub-group analysis methods in social network analysis. We propose an algorithm to reflect the results of the sub-group analysis in the recommender system. The key to this algorithm is to ensure that recommendations from a user's friends are more likely to be reflected in the recommendations made to that user. As a result of this study, outcomes of various sub-group analyses were derived, and it was confirmed that the results differed from those of the existing recommender system. Therefore, it is considered that the results of the sub-group analysis affect the recommendation performance of the system. Future research will attempt to generalize the results through further analysis of various social data.
Keywords: sub-group analysis, social media, social network analysis, recommender systems
Procedia PDF Downloads 365
3095 Comparison of Linear Discriminant Analysis and Support Vector Machine Classifications for Electromyography Signals Acquired at Five Positions of Elbow Joint
Authors: Amna Khan, Zareena Kausar, Saad Malik
Abstract:
Biomechatronics has extensive applications in the field of rehabilitation. It has been contributing since World War II to improving the applicability of prostheses and assistive devices in real-life scenarios. In this paper, classification accuracies are compared for two classifiers across five positions of the elbow. Electromyography (EMG) signals were acquired directly from the skeletal muscles of the human forearm for each of the three defined positions and at modified extreme positions of elbow flexion and extension, using an 8-electrode Myo armband sensor. Features were extracted from the filtered EMG signals for each position. The performance of the two classifiers, support vector machine (SVM) and linear discriminant analysis (LDA), is compared by analyzing the classification accuracies. SVM yielded classification accuracies between 90-96%, in contrast to the 84-87% obtained by LDA for the five defined elbow positions, keeping the number of samples and the selected features the same for both SVM and LDA.
Keywords: classification accuracies, electromyography, linear discriminant analysis (LDA), Myo armband sensor, support vector machine (SVM)
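The comparison protocol can be sketched with scikit-learn as below: the same feature matrix is scored with cross-validation under an SVM and under LDA; the feature layout (8 channels × 2 features) and the generated data are assumptions, so the printed numbers will not reproduce the reported 90-96% versus 84-87% accuracies.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for EMG features: 5 elbow positions x 200 windows,
# 8 channels x 2 features (e.g. mean absolute value, waveform length)
rng = np.random.default_rng(0)
X_parts, y_parts = [], []
for pos in range(5):
    center = rng.normal(0, 1, 16)
    X_parts.append(center + rng.normal(0, 0.8, (200, 16)))
    y_parts.append(np.full(200, pos))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

for name, clf in [("SVM", make_pipeline(StandardScaler(), SVC(kernel="rbf"))),
                  ("LDA", LinearDiscriminantAnalysis())]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```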
Procedia PDF Downloads 368
3094 A Method for Reduction of Association Rules in Data Mining
Authors: Diego De Castro Rodrigues, Marcelo Lisboa Rocha, Daniela M. De Q. Trevisan, Marcos Dias Da Conceicao, Gabriel Rosa, Rommel M. Barbosa
Abstract:
The use of association rule algorithms within data mining is recognized as being of great value in knowledge discovery in databases. Very often, the number of rules generated is high, sometimes even in databases of small volume, so the success of the analysis of results can be hampered by this quantity. The purpose of this research is to present a method for reducing the quantity of rules generated with association algorithms. Therefore, a computational algorithm was developed using the Weka Application Programming Interface, which allows the execution of the method on different types of databases. After the development, tests were carried out on three types of databases: synthetic, model, and real. Efficient results were obtained in reducing the number of rules, where the worst case presented a gain of more than 50%, considering support, confidence, and lift as measures. This study concluded that the proposed model is feasible and quite interesting, contributing to the analysis of the results of association rules generated from the use of algorithms.
Keywords: data mining, association rules, rules reduction, artificial intelligence
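A small sketch of the rule-filtering idea using the support, confidence, and lift measures named in the abstract; the toy transactions and the restriction to one-item antecedents and consequents are simplifications for illustration, not the Weka-based method itself.

```python
from itertools import combinations

transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "butter"},
    {"milk", "bread", "butter", "eggs"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def rules(min_support=0.4, min_confidence=0.6, min_lift=1.0):
    """Enumerate 1 -> 1 rules and keep those passing all three thresholds;
    pruning by lift is one simple way to shrink a large rule set."""
    items = sorted(set().union(*transactions))
    kept = []
    for a, b in combinations(items, 2):
        for lhs, rhs in [({a}, {b}), ({b}, {a})]:
            s = support(lhs | rhs)
            conf = s / support(lhs)
            lift = conf / support(rhs)
            if s >= min_support and conf >= min_confidence and lift >= min_lift:
                kept.append((lhs, rhs, round(s, 2), round(conf, 2), round(lift, 2)))
    return kept

for lhs, rhs, s, c, l in rules():
    print(f"{lhs} -> {rhs}  support={s} confidence={c} lift={l}")
```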
Procedia PDF Downloads 161
3093 3-Dimensional Finite Element Analysis of Tunnel-Pile Interaction Scenarios Using Abaqus Software
Authors: Haitham J. M. Odeh
Abstract:
This paper introduces an analysis of the effect of tunneling near pile foundations, accomplished by three-dimensional finite element modeling. The numerical simulation is conducted using the Abaqus finite element software by examining different tunnel-pile scenarios. The paper presents the tunnel-induced pile responses, such as pile settlement and pile internal forces, and comments on changing the vertical and transversal location of the tunnel relative to the piles. The study contains two pile-supported structure cases: a single pile and a group of piles. A comprehensive comparison between real case study results and the numerical simulation is presented. The results of the analysis reveal the critical and safe locations for tunnel construction and the positive effect of using a group of piles instead of single piles. The paper also demonstrates the changes in pile responses as the tunnel location changes.
Keywords: pile responses, single pile, group of piles, pile-tunnel interaction
Procedia PDF Downloads 144
3092 An Analytical Study of Small Unmanned Aerial Vehicle Dynamic Stability Characteristics
Authors: Abdelhakam A. Noreldien, Sakhr B. Abudarag, Muslim S. Eltoum, Salih O. Osman
Abstract:
This paper presents an analytical study of Small Unmanned Aerial Vehicle (SUAV) dynamic stability derivatives. Simulating SUAV dynamics and analyzing its behavior at the earliest design stages is an important and more efficient design approach. The approach suggested in this paper uses wind tunnel experiments to collect the aerodynamic data and obtain the dynamic stability derivatives. AutoCAD software was used to draw the case study (a wildlife surveillance SUAV). The SUAV was scaled down to 0.25% of the real SUAV dimensions and converted to a wind tunnel model. The model was tested at three different speeds for three different attitudes, namely pitch, roll, and yaw. The wind tunnel results were then used to determine the stability derivative values of the case study, and hence to calculate the roots of the characteristic equation for both the longitudinal and lateral motions. Finally, the characteristic equation roots were found and discussed for all possible cases.
Keywords: model, simulating, SUAV, wind tunnel
Procedia PDF Downloads 375
3091 Monocular 3D Person Tracking via Demographic Classification and Projective Image Processing
Authors: McClain Thiel
Abstract:
Object detection and localization has historically required two or more sensors due to the loss of information from 3D to 2D space; however, most surveillance systems currently in use in the real world only have one sensor per location. Generally, this consists of a single low-resolution camera positioned above the area under observation (mall, jewelry store, traffic camera). This is not sufficient for robust 3D tracking for applications such as security or, of more recent relevance, contact tracing. This paper proposes a lightweight system for 3D person tracking that requires no additional hardware, based on compressed object detection convolutional nets, facial landmark detection, and projective geometry. This approach involves classifying the target into a demographic category and then making assumptions about the relative locations of facial landmarks from the demographic information, and from there using simple projective geometry and known constants to find the target's location in 3D space. Preliminary testing, although severely lacking, suggests reasonable success in 3D tracking under ideal conditions.
Keywords: monocular distancing, computer vision, facial analysis, 3D localization
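The projective-geometry step can be illustrated with the pinhole-camera relations below: a known real-world width (assumed from the demographic category) plus its apparent pixel width gives depth, and the camera intrinsics back-project the image point into 3D; the focal length, interpupillary distance, and intrinsics used here are made-up values, not the paper's constants.

```python
def estimate_depth(focal_px, real_width_m, pixel_width):
    """Pinhole-camera depth from a known real-world width: an object of known
    size that appears smaller is farther away (Z = f * W / w)."""
    return focal_px * real_width_m / pixel_width

def pixel_to_world(u, v, depth, fx, fy, cx, cy):
    """Back-project an image point (u, v) at a given depth into camera
    coordinates using the intrinsics fx, fy, cx, cy."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return x, y, depth

# Assumed average interpupillary distance (~0.063 m) for the detected class,
# and made-up intrinsics for a 1280 x 720 camera
depth = estimate_depth(focal_px=900, real_width_m=0.063, pixel_width=18)
print(round(depth, 2), "m from the camera")
print(pixel_to_world(700, 300, depth, fx=900, fy=900, cx=640, cy=360))
```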
Procedia PDF Downloads 139
3090 Study on Water Level Management Criteria of Reservoir Failure Alert System
Authors: B. Lee, B. H. Choi
Abstract:
The loss of safety for reservoirs brought about by climate change and facility aging leads to reservoir failures, which result in the loss of lives and property damage in downstream areas. Therefore, it is necessary to provide a reservoir failure alert system for downstream residents that detects the early signs of failure (with sensors) in real time and supports safety management to prevent and minimize possible damage. Ten case studies were carried out to verify the water level management criteria for the four alert levels (attention, caution, alert, serious). Peak changes in the water level data were analysed. The results showed that 'Caution' and 'Alert' were close to 33% and 66% of the difference between the flood water level and the full water level. Therefore, it is adequate to use the initial water level management criteria of the reservoir failure alert system for the first year. Acknowledgment: This research was supported by a grant (2017-MPSS31-002) from the 'Supporting Technology Development Program for Disaster Management' funded by the Ministry of the Interior and Safety (MOIS).
Keywords: alert system, management criteria, reservoir failure, sensor
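One illustrative reading of the four-level criteria is sketched below, placing the caution and alert thresholds at 33% and 66% of the band between the full water level and the flood water level; the exact threshold placement and the numeric levels are assumptions, not the study's operational values.

```python
def alert_level(level, full_level, flood_level):
    """Map a measured reservoir level to the four management stages, using the
    33% / 66% splits of the band between full and flood water levels; this
    threshold placement is an illustrative reading of the study's criteria."""
    ratio = (level - full_level) / (flood_level - full_level)
    if ratio >= 1.0:
        return "serious"
    if ratio >= 0.66:
        return "alert"
    if ratio >= 0.33:
        return "caution"
    return "attention"

# Made-up levels in metres: full water level 30.0, flood water level 33.0
for reading in (30.5, 31.2, 32.1, 33.2):
    print(reading, "->", alert_level(reading, full_level=30.0, flood_level=33.0))
```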
Procedia PDF Downloads 201
3089 Termite Brick Temperature and Relative Humidity by Continuous Monitoring Technique
Authors: Khalid Abdullah Alshuhail, Syrif Junidi, Ideisan Abu-Abdoum, Abdulsalam Aldawoud
Abstract:
With the intention of reducing energy consumption, a proposed construction brick was made of imitation termite mound soil, referred to here as the termite brick (TB). To evaluate the thermal performance, a real case model was constructed using this biomimetic brick for testing purposes. This paper aims at investigating the thermal performance of this brick during different climatic months. Its thermal behaviour was thoroughly studied over the course of four months using the continuous monitoring method (CMm). The main parameters were temperature and relative humidity. It was found that the TB does not perform similarly in all four months and/or in all orientations. Each month of the four-month model study was analyzed in depth using the CMm. The measuring period generally shows that the internal temperature and internal humidity are highest at the roof, within 2 degrees, and lowest at the north wall orientation. The relative humidity was also investigated systematically. The paper reveals further interesting findings.
Keywords: building material, continuous monitoring, orientation, wall, temperature
Procedia PDF Downloads 124