Search results for: data sensitivity
24327 Application of GPRS in Water Quality Monitoring System
Authors: V. Ayishwarya Bharathi, S. M. Hasker, J. Indhu, M. Mohamed Azarudeen, G. Gowthami, R. Vinoth Rajan, N. Vijayarangan
Abstract:
Identification of water quality conditions in a river system based on limited observations is an essential task for meeting the goals of environmental management. The traditional method of water quality testing is to collect samples manually and then send them to a laboratory for analysis; however, this can no longer meet the demands of water quality monitoring today, so an automatic measurement and reporting system for water quality has been developed. In this project, water quality parameters collected by a multi-parameter water quality probe are transmitted to a data processing and monitoring center over the mobile operator's GPRS wireless communication network. The multi-parameter sensor is placed directly above the water level. The monitoring center consists of a GPRS modem and a micro-controller which receive the data, and the collected data can be monitored at any instant of time. At the pollution control board, the water quality sensor data are monitored on a computer using Visual Basic software. The system collects, transmits and processes water quality parameters automatically, so production efficiency and economic benefit are improved greatly. GPRS technology performs well in complex environments where water quality would otherwise go unmonitored, and is particularly applicable to automatic data collection, transmission and monitoring for field water analysis equipment.
Keywords: multiparameter sensor, GPRS, visual basic software, RS232
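The abstract does not specify the packet format or the receiving software beyond Visual Basic; the sketch below is a minimal, assumed illustration of the monitoring-centre step in Python: accepting a GPRS/TCP connection from the field modem and decoding comma-separated probe records. The port number and field names are hypothetical.

import socket

HOST, PORT = "0.0.0.0", 9000  # assumed listening endpoint for the GPRS modem

def parse_reading(line: str) -> dict:
    """Split one comma-separated probe record into named fields (assumed format)."""
    station, ph, turbidity, cond, temp = line.strip().split(",")
    return {"station": station, "pH": float(ph), "turbidity_NTU": float(turbidity),
            "conductivity_uS": float(cond), "temperature_C": float(temp)}

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.bind((HOST, PORT))
    srv.listen()
    conn, addr = srv.accept()               # one modem connection at a time
    with conn, conn.makefile("r") as stream:
        for line in stream:                 # each line is one probe reading
            reading = parse_reading(line)
            print(addr[0], reading)         # stand-in for storing/monitoring the data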
Procedia PDF Downloads 412
24326 Viscoelastic Modeling of Hot Mix Asphalt (HMA) under Repeated Loading by Using Finite Element Method
Authors: S. A. Tabatabaei, S. Aarabi
Abstract:
Predicting the response and performance of hot mix asphalt (HMA) is a challenging task because of the sensitivity of HMA to complex loading and environmental conditions. The behavior of HMA is a function of loading temperature and is also time- and rate-dependent, which directly affects the design criteria of the mixture; the velocity of the passing load determines the loading time and rate. Viscoelasticity describes the reaction of HMA to loading and to environmental conditions such as temperature and moisture, and this behavior has a direct effect on design criteria such as tensile strain and vertical deflection. In this paper, a computational framework for viscoelasticity and its implementation in a 3D HMA model are introduced for use in the finite element method. The model was subjected to various repeated loading conditions at constant temperature. The viscoelastic response of HMA was investigated under vehicle loading at different speeds, the sensitivity of the behavior to the range of speeds was examined, and the results were compared with an HMA assumed to behave elastically, as in conventional design methods. The results show the importance of loading pulse time, unloading time and speed on the design criteria. The importance of memory fading of the material in storing strain and stress due to repeated loading was also shown. The model was simulated with the ABAQUS finite element package.
Keywords: viscoelasticity, finite element method, repeated loading, HMA
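As a concrete illustration of the time- and rate-dependent response the paper models in ABAQUS, the sketch below evaluates a generic Prony-series relaxation modulus under repeated haversine strain pulses via the Boltzmann superposition integral. All material constants and pulse timings are assumptions for illustration, not calibrated HMA values.

import numpy as np

E_inf, E_i = 50.0, np.array([4000.0, 2000.0, 500.0])   # MPa (assumed Prony terms)
tau_i = np.array([0.01, 0.1, 1.0])                      # relaxation times, s (assumed)

def relaxation_modulus(t):
    """Prony-series relaxation modulus E(t)."""
    return E_inf + np.sum(E_i * np.exp(-t[:, None] / tau_i), axis=1)

dt = 1e-3
t = np.arange(0.0, 3.0, dt)

# Repeated loading: a 0.1 s haversine strain pulse (one vehicle pass) every 1.0 s.
pulse, period, amp = 0.1, 1.0, 1e-3
strain = np.where((t % period) < pulse,
                  amp * np.sin(np.pi * (t % period) / pulse) ** 2, 0.0)

# Boltzmann superposition: sigma(t) = integral of E(t - tau) * d(eps)/d(tau) d(tau),
# approximated here by a discrete convolution of E(t) with the strain rate.
deps = np.gradient(strain, dt)
stress = np.convolve(relaxation_modulus(t), deps)[: len(t)] * dt

print(f"peak stress during first pulse : {stress[t < pulse].max():.2f} MPa")
print(f"residual stress before pulse 2 : {stress[np.searchsorted(t, period) - 1]:.4f} MPa")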
Procedia PDF Downloads 398
24325 Decision Support System in Air Pollution Using Data Mining
Authors: E. Fathallahi Aghdam, V. Hosseini
Abstract:
Environmental pollution is not limited to a specific region or country; that is why sustainable development, as a necessary process for improvement, pays attention to issues such as the destruction of natural resources, degradation of biological systems, global pollution, and climate change, especially in developing countries. According to the World Health Organization, Tehran (the capital of Iran), as a developing city, is one of the most polluted cities in the world in terms of air pollution. In this study, three pollutants, particulate matter less than 10 microns, nitrogen oxides, and sulfur dioxide, were evaluated in Tehran using data mining techniques following the CRISP approach. Data from 21 air pollution measuring stations in different areas of Tehran were collected from 1999 to 2013, and the commercial software Clementine was selected for the analysis. Tehran was divided into distinct clusters in terms of the mentioned pollutants using the software. As a data mining technique, clustering is usually used as a prologue to other analyses; therefore, the similarity of the clusters was evaluated in this study by analyzing local conditions, traffic behavior, and industrial activities. The results of this research can support decision-making systems, help managers improve performance and decision making, and assist in urban studies.
Keywords: data mining, clustering, air pollution, CRISP approach
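The study performed its clustering in Clementine; the sketch below is an assumed re-creation of that step with scikit-learn, grouping station-level mean concentrations of the three pollutants with k-means. The station values and the choice of four clusters are synthetic stand-ins, not the study's data.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
stations = [f"station_{i:02d}" for i in range(21)]      # 21 monitoring stations
X = np.column_stack([
    rng.normal(90, 25, 21),   # PM10 (ug/m3), synthetic stand-in data
    rng.normal(60, 15, 21),   # NOx  (ppb)
    rng.normal(20, 8, 21),    # SO2  (ppb)
])

Xs = StandardScaler().fit_transform(X)                  # standardise before clustering
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(Xs)

for label in np.unique(km.labels_):
    members = [s for s, l in zip(stations, km.labels_) if l == label]
    print(f"cluster {label}: {members}")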
Procedia PDF Downloads 427
24324 Management of Caverno-Venous Leakage: A Series of 133 Patients with Symptoms, Hemodynamic Workup, and Results of Surgery
Authors: Allaire Eric, Hauet Pascal, Floresco Jean, Beley Sebastien, Sussman Helene, Virag Ronald
Abstract:
Background: Caverno-venous leakage (CVL) is a devastating although barely known disease, the first cause of major physical impairment in men under 25, and responsible for 50% of resistance to phosphodiesterase 5-inhibitors (PDE5-I), which affects 30 to 40% of users of this medication class. In this condition, too-early blood drainage from the corpora cavernosa prevents penile rigidity and penetration during sexual intercourse. The role of conservative surgery in this disease remains controversial. Aim: To assess complications and results of combined open surgery and embolization for CVL. Method: Between June 2016 and September 2021, 133 consecutive patients underwent surgery in our institution for CVL causing severe erectile dysfunction (ED) resistant to oral medical treatment. Procedures combined vein embolization and ligation with microsurgical techniques. We performed pre- and post-operative clinical (Erection Hardness Scale: EHS) and hemodynamic evaluation by duplex sonography in all patients. Before surgery, the CVL network was visualized by computed tomography cavernography. Penile EMG was performed in case of diabetes or other suspected neurological conditions. All patients were optimized for hormonal status, and data were prospectively recorded. Results: Clinical signs suggesting CVL were ED since an age lower than 25, loss of erection when changing position, and penile rigidity varying according to position. The main complications were minor pulmonary embolism in 2 patients (one after airline travel, one with a heterozygous Factor V Leiden mutation), one infection, three hematomas requiring reoperation, and one decrease in glans sensitivity lasting for more than one year. Mean pre-operative pharmacologic EHS was 2.37+/-0.64 and mean post-operative pharmacologic EHS was 3.21+/-0.60, p<0.0001 (paired t-test); the mean EHS variation was 0.87+/-0.74. After surgery, 81.5% of patients had a pharmacologic EHS equal to or over 3, allowing for intercourse with penetration. Three patients (2.2%) experienced a lower post-operative EHS. The main cause of failure was leakage from the deep dorsal aspect of the corpora cavernosa. At 14 months of follow-up, 83.2% of patients had a clinical EHS equal to or over 3, allowing for sexual intercourse with penetration, one-third of them without any medication. Five patients received a penile implant after unsuccessful conservative surgery. Conclusion: Open surgery combined with embolization is an efficient approach to CVL causing severe erectile dysfunction.
Keywords: erectile dysfunction, cavernovenous leakage, surgery, embolization, treatment, result, complications, penile duplex sonography
Procedia PDF Downloads 149
24323 Test Suite Optimization Using an Effective Meta-Heuristic BAT Algorithm
Authors: Anuradha Chug, Sunali Gandhi
Abstract:
Regression testing is a very expensive and time-consuming process carried out to ensure the validity of modified software. Because there are insufficient resources to re-execute all test cases in a time-constrained environment, efforts are ongoing to generate test data automatically, without human effort. Many search-based techniques have been proposed to generate efficient, effective and optimized test data so that the overall cost of software testing can be minimized. The generated test data should be able to uncover all potential lapses that exist in the software or product. Inspired by the natural foraging behavior of bats, the current study employed a meta-heuristic, search-based bat algorithm to optimize the test data on the basis of certain parameters without compromising their effectiveness. Mathematical functions are also applied to effectively filter out redundant test data. As many as 50 Java programs were used to check the effectiveness of the proposed test data generation, and it was found that an 86% saving in testing effort can be achieved using the bat algorithm while covering 100% of the software code under test. The bat algorithm was found to be more efficient in terms of simplicity and flexibility when the results were compared with other nature-inspired algorithms such as the Firefly Algorithm (FA), Hill Climbing (HC) and Ant Colony Optimization (ACO). The output of this study would be useful to testers, as they can achieve 100% path coverage with a minimum number of test cases.
Keywords: regression testing, test case selection, test case prioritization, genetic algorithm, bat algorithm
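For readers unfamiliar with the meta-heuristic, the sketch below shows a minimal bat algorithm minimizing a toy branch-distance fitness, the usual objective in search-based test-data generation. The fitness function, bounds and algorithm parameters are assumptions for illustration and are not taken from the paper.

import numpy as np

def fitness(v):
    x, y = v
    return abs(x**2 + y - 100.0)      # branch distance for a target "if x*x + y == 100"

rng = np.random.default_rng(1)
n_bats, dim, iters = 20, 2, 200
lb, ub = -10.0, 10.0
f_min, f_max = 0.0, 2.0
alpha, gamma = 0.9, 0.9               # loudness decay / pulse-rate growth (assumed)

pos = rng.uniform(lb, ub, (n_bats, dim))
vel = np.zeros((n_bats, dim))
loud = np.full(n_bats, 1.0)           # loudness A_i
rate = np.full(n_bats, 0.5)           # pulse emission rate r_i
fit = np.array([fitness(p) for p in pos])
best = pos[fit.argmin()].copy()

for t in range(1, iters + 1):
    for i in range(n_bats):
        freq = f_min + (f_max - f_min) * rng.random()
        vel[i] += (pos[i] - best) * freq
        cand = np.clip(pos[i] + vel[i], lb, ub)
        if rng.random() > rate[i]:                       # local random walk around the best bat
            cand = np.clip(best + 0.01 * loud.mean() * rng.normal(size=dim), lb, ub)
        f_new = fitness(cand)
        if f_new <= fit[i] and rng.random() < loud[i]:   # accept, then quieten the bat
            pos[i], fit[i] = cand, f_new
            loud[i] *= alpha
            rate[i] = 0.5 * (1.0 - np.exp(-gamma * t))
        if f_new < fitness(best):
            best = cand.copy()

print("best test input (x, y):", np.round(best, 3), "branch distance:", round(fitness(best), 4))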
Procedia PDF Downloads 380
24322 Introducing α-Oxoester (COBz) as a Protecting Group for Carbohydrates
Authors: Atul Kumar, Veeranjaneyulu Gannedi, Qazi Naveed Ahmed
Abstract:
Oligosaccharides, which are essential to all cellular organisms, play vital roles in cell recognition and signaling and are involved in a broad range of biological processes. The chemical synthesis of carbohydrates represents a powerful tool for providing homogeneous glycans. In carbohydrate synthesis, the major concern is the orthogonal protection of hydroxyl groups so that they can be unmasked independently. Classical protecting groups include benzyl ethers (Bn), which are normally cleaved through hydrogenolysis or by means of metal reduction, and acetate (Ac), benzoate (Bz) or pivaloate esters, which are removed by base-promoted hydrolysis. In the present work, a series of α-oxoester (COBz) protected saccharides with divergent base-sensitivity profiles relative to benzoyl (Bz) and acetyl (Ac) were designed, and KHSO₅/CH₃COCl in methanol was identified as an easy, mild, selective and efficient deprotecting reagent for their removal in the context of carbohydrate synthesis. Timely monitoring of the latter reagent was advantageous in establishing both sequential and simultaneous deprotection of COBz, Bz and Ac. The salient feature of our work is the ease with which different acceptors can be generated from the designed monosaccharides. In summary, we demonstrated α-oxoester (COBz) as a new protecting group for carbohydrates, and the application of this group to the synthesis of glycosylphosphatidylinositol (GPI) anchors is in progress.
Keywords: α-oxoester, oligosaccharides, new protecting group, acceptor synthesis, glycosylation
Procedia PDF Downloads 150
24321 Modified InVEST for Whatsapp Messages Forensic Triage and Search through Visualization
Authors: Agria Rhamdhan
Abstract:
WhatsApp, as the most popular mobile messaging app, has been used as evidence in many criminal cases. As the use of mobile messaging generates large amounts of data, forensic investigation faces the challenge of large-data problems. The hardest part of finding important evidence is that current practice relies on tools and techniques that require manual analysis to check all messages; analyzing large sets of mobile messaging data this way takes a great deal of time and effort. Our work offers a methodology based on forensic triage to reduce large data sets to manageable sets that are easier to review in detail, and then presents the results through interactive visualization that shows important terms, entities and relationships through intelligent ranking using Term Frequency-Inverse Document Frequency (TF-IDF) and the Latent Dirichlet Allocation (LDA) model. By implementing this methodology, investigators can improve investigation processing time and the accuracy of results.
Keywords: forensics, triage, visualization, WhatsApp
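The ranking back-end described here can be illustrated with a short scikit-learn sketch: TF-IDF surfaces important terms and LDA groups messages into topics. The message texts below are invented examples; real input would come from the extracted WhatsApp database.

from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

messages = [
    "meet me at the warehouse tonight",
    "transfer the money to the usual account",
    "warehouse delivery moved to friday",
    "send the account number again",
    "did you get the payment confirmation",
]

# 1) TF-IDF: rank terms by their maximum weight across messages.
tfidf = TfidfVectorizer(stop_words="english")
W = tfidf.fit_transform(messages)
scores = W.max(axis=0).toarray().ravel()
terms = tfidf.get_feature_names_out()
print("top TF-IDF terms:", sorted(zip(terms, scores), key=lambda p: -p[1])[:5])

# 2) LDA topic model on raw term counts.
counts = CountVectorizer(stop_words="english")
C = counts.fit_transform(messages)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(C)
vocab = counts.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    print(f"topic {k}:", [vocab[i] for i in comp.argsort()[-4:][::-1]])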
Procedia PDF Downloads 168
24320 Low Cost Webcam Camera and GNSS Integration for Updating Home Data Using AI Principles
Authors: Mohkammad Nur Cahyadi, Hepi Hapsari Handayani, Agus Budi Raharjo, Ronny Mardianto, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan
Abstract:
PDAM (the local water company) determines customer charges by considering the customer's building or house. Charge determination significantly affects PDAM income and customer costs because PDAM applies a subsidy policy for customers classified as small households. Periodic updates are needed so that pricing stays in line with the target, and a thorough customer survey in Surabaya is needed to update customer building data. However, the survey carried out so far has deployed officers to conduct one-by-one surveys of each PDAM customer, which requires a lot of effort and cost. For this reason, this research offers a technology called mobile mapping, a mapping method that is more efficient in terms of time and cost. The device is also quite simple to use: it is installed on a car so that it can record the surrounding buildings while the car is moving. Mobile mapping technology generally uses lidar sensors combined with GNSS, but this technology is expensive. To overcome this problem, this research develops a low-cost mobile mapping technology using webcam camera sensors added to GNSS and IMU sensors. The camera used has a 3 MP specification with a resolution of 720p and a diagonal field of view of 78⁰. The principle of this invention is to integrate four webcam camera sensors with GNSS and IMU to acquire photo data tagged with location (latitude, longitude) and orientation (roll, pitch, yaw). The device is also equipped with a tripod and a vacuum mount to attach it to the car's roof so that it does not fall off while the car is moving. The output data from this technology are analyzed with artificial intelligence to reduce similar data (cosine similarity) and then classify building types. Data reduction is used to eliminate similar images while keeping the image that shows the complete house, so that it can be processed for the subsequent classification of buildings. The AI method used is transfer learning, utilizing a trained model named VGG-16. From the similarity analysis, the data reduction reached 50%. Georeferencing is then done using the Google Maps API to obtain address information for the coordinates in the data. After that, a geographic join is performed to link the survey data with the customer data already held by PDAM Surya Sembada Surabaya.
Keywords: mobile mapping, GNSS, IMU, similarity, classification
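The similarity-based data reduction described above can be sketched as follows: a pre-trained VGG-16 backbone (the transfer-learning model named in the abstract) embeds each photo, and near-duplicate frames are dropped by cosine similarity. The file paths, the 0.9 threshold and the greedy reduction rule are assumptions; the abstract does not give these details, and TensorFlow/Keras is assumed to be available.

import numpy as np
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from tensorflow.keras.preprocessing import image
from sklearn.metrics.pairwise import cosine_similarity

model = VGG16(weights="imagenet", include_top=False, pooling="avg")  # 512-d embedding per photo

def embed(path):
    img = image.load_img(path, target_size=(224, 224))
    x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
    return model.predict(x, verbose=0)[0]

paths = ["frame_001.jpg", "frame_002.jpg", "frame_003.jpg"]   # hypothetical survey photos
emb = np.stack([embed(p) for p in paths])
sim = cosine_similarity(emb)

keep = []
for i, _ in enumerate(paths):                # greedy reduction: drop frames too similar
    if all(sim[i, j] < 0.9 for j in keep):   # to an already-kept frame (threshold assumed)
        keep.append(i)
print("kept frames:", [paths[i] for i in keep])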
Procedia PDF Downloads 84
24319 High Performance of Direct Torque and Flux Control of a Double Stator Induction Motor Drive with a Fuzzy Stator Resistance Estimator
Authors: K. Kouzi
Abstract:
In order to obtain stable, high-performance direct torque and flux control (DTFC) of a double star induction motor drive (DSIM), proper on-line adaptation of the stator resistance is very important. This is due to the variation of the stator resistance during operation, which introduces error into the estimated flux position and the magnitude of the stator flux. Error in the estimated stator flux deteriorates the performance of the DTFC drive, and the effect of the estimation error is especially important at low speed. Our aim is therefore to overcome the sensitivity of DTFC to stator resistance variation by proposing on-line fuzzy estimation of the stator resistance. The fuzzy estimation method is based on on-line stator resistance correction driven by the stator current estimation error and its variation; the fuzzy logic controller gives the future stator resistance increment at its output. The main advantage of the suggested control algorithm is that it avoids the drive instability that may occur in certain situations and ensures tracking of the actual stator resistance. The validity of the technique and the improvement of the whole system's performance are proved by the results.
Keywords: direct torque control, dual stator induction motor, fuzzy logic estimation, stator resistance adaptation
Procedia PDF Downloads 325
24318 An Investigation into the Views of Distant Science Education Students Regarding Teaching Laboratory Work Online
Authors: Abraham Motlhabane
Abstract:
This research analysed the written views of science education students regarding the teaching of laboratory work in the online mode. The research adopted a qualitative methodology aimed at investigating small and distinct groups, normally regarded as a single-site study. Qualitative research was used to describe and analyse the phenomena from the students' perspective; this means the research began with assumptions about the world view and used theoretical lenses of research problems to inquire into the meaning individual students make. The research was conducted with three groups of students studying for the Postgraduate Certificate in Education, the Bachelor of Education and the honours Bachelor of Education, respectively. In each of these study programmes, the science education module is compulsory. Five science education students from each study programme were purposively selected, so 15 students participated in the research. For the analysis, the data were first printed and hard copies were used; the data were read several times, and key concepts and ideas were highlighted. Themes and patterns were identified to describe the data, and coding was used as a process of organising and sorting the data. The findings of the study are very diverse: some students are in favour of online laboratory work, whereas other students argue that science can only be learnt through hands-on experimentation.
Keywords: online learning, laboratory work, views, perceptions
Procedia PDF Downloads 144
24317 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
Modern experiments in high energy physics impose great demands on the reliability, efficiency, and data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to be able to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments, and it provides a network-transparent inter-process communication layer. Using the high-performance, modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes of the iFDAQ during the 2016 run; from the software point of view, it can be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the debugging possibilities, on-line monitoring of the communication among processes via a DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in detail. Moreover, an efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided.
Keywords: data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP
Procedia PDF Downloads 316
24316 Mining Scientific Literature to Discover Potential Research Data Sources: An Exploratory Study in the Field of Haemato-Oncology
Authors: A. Anastasiou, K. S. Tingay
Abstract:
Background: Discovering suitable datasets is an important part of health research, particularly for projects working with clinical data from patients organized in cohorts (cohort data), but with the proliferation of so many national and international initiatives, it is becoming increasingly difficult for research teams to locate the real-world datasets that are most relevant to their project objectives. We present a method for identifying healthcare institutes in the European Union (EU) which may hold haemato-oncology (HO) data. A key enabler of this research was the bibInsight platform, a scientometric data management and analysis system developed by the authors at Swansea University. Method: A PubMed search was conducted using HO clinical terms taken from previous work. The resulting XML file was processed using the bibInsight platform, linking affiliations to the Global Research Identifier Database (GRID). GRID is an international, standardized list of institutions, including the city and country in which each institution exists, as well as a category for the main business type, e.g., Academic, Healthcare, Government, Company. Countries were limited to the 28 current EU members, and institute type to 'Healthcare'. An article was considered valid if at least one author was affiliated with an EU-based healthcare institute. Results: The PubMed search produced 21,310 articles, consisting of 9,885 distinct affiliations with correspondence in GRID. Of these articles, 760 were from EU countries, and 390 of these were healthcare institutes. One affiliation was excluded as being a veterinary hospital. Two EU countries did not have any publications in our analysis dataset. The results were analysed by country and by individual healthcare institute. Networks both within the EU and internationally show institutional collaborations, which may suggest a willingness to share data for research purposes. Geographical mapping can ensure that data have broad population coverage. Collaborations with industry or government may exclude healthcare institutes that have embargoes or additional costs associated with data access. Conclusions: Data reuse is becoming increasingly important both for ensuring the validity of results and for economy of available resources. The ability to identify potential, specific data sources from over twenty thousand articles in less than an hour could assist in improving knowledge of, and access to, data sources. As our method has not yet determined whether these healthcare institutes hold data, or merely publish on the topic, future work will involve text mining of data-specific concordant terms to identify numbers of participants, demographics, study methodologies, and sub-topics of interest.
Keywords: data reuse, data discovery, data linkage, journal articles, text mining
Procedia PDF Downloads 115
24315 Carbon Nanotube Network-Based Immuno-Field Effect Transistor for Highly Sensitive Detection of Human Serum Albumin
Authors: Muhamad Azuddin Hassan, Siti Shafura Karim, Ambri Mohamed, Iskandar Yahya
Abstract:
Human serum albumin (HSA) plays a significant part in the physiological functions of the human body, and HSA level monitoring is critical for early detection of HSA-related illnesses. The goal of this study is to show that a field effect transistor (FET)-based immunosensor can assess HSA using a high-aspect-ratio carbon nanotube (CNT) network as the transducer. The CNT network was deposited using an airbrush technique, and the FET device was made using a shadow mask process. Field emission scanning electron microscopy and a current-voltage measurement system were used to examine the morphology and electrical properties of the CNT network, respectively, while X-ray photoelectron spectroscopy and Fourier transform infrared spectroscopy were used to confirm the surface modification of the CNT. The detection process is based on covalent binding interactions between an antibody and the HSA target, which result in a change in the fabricated biosensor's drain current (Id). In a linear range between 1 ng/ml and 10 zg/ml, the biosensor has a high sensitivity of 0.826 mA (g/ml)⁻¹ and an LOD of 1.9 zg/ml. HSA was also identified in real serum despite interference from other biomolecules, demonstrating the CNT-FET immunosensor's ability to quantify HSA in a complex biological environment.
Keywords: carbon nanotubes network, biosensor, human serum albumin
Procedia PDF Downloads 137
24314 Using Data Mining Technique for Scholarship Disbursement
Authors: J. K. Alhassan, S. A. Lawal
Abstract:
This work is on decision tree-based classification for the disbursement of scholarships. A tree-based data mining classification technique is used to determine the generic rule to be applied to disbursement. Based on the rules derived from the tree, the system is able to determine the class (status) to which an applicant belongs, either Granted or Not Granted. Applicants that fall into the Granted class have successfully acquired the scholarship, while those in the Not Granted class are unsuccessful in the scheme. An algorithm that can be used to classify applicants based on the rules from the tree-based classification was also developed. Tree-based classification was adopted because of its efficiency, effectiveness, and ease of comprehension. The system was tested with data from the National Information Technology Development Agency (NITDA) Abuja, a parastatal of the Federal Ministry of Communication Technology that is mandated to develop and regulate information technology in Nigeria, and it was found to work according to specification. It is therefore recommended for all scholarship disbursement organizations.
Keywords: classification, data mining, decision tree, scholarship
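A decision tree of this kind can be induced and turned into readable rules in a few lines; the sketch below is illustrative only, and the applicant attributes, values and thresholds are assumptions rather than the NITDA data.

import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

data = pd.DataFrame({
    "cgpa":          [4.5, 3.9, 2.8, 4.8, 3.2, 2.5, 4.1, 3.6],
    "family_income": [120, 300, 80, 90, 450, 60, 150, 500],   # thousands (assumed unit)
    "it_aptitude":   [85, 70, 55, 90, 60, 40, 78, 65],
    "granted":       [1, 0, 0, 1, 0, 0, 1, 0],                # 1 = Granted, 0 = Not Granted
})

X, y = data.drop(columns="granted"), data["granted"]
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# The induced rules play the role of the "generic rule" used for disbursement.
print(export_text(tree, feature_names=list(X.columns)))
applicant = pd.DataFrame([{"cgpa": 4.2, "family_income": 100, "it_aptitude": 80}])
print("decision:", "Granted" if tree.predict(applicant)[0] == 1 else "Not Granted")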
Procedia PDF Downloads 376
24313 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest
Authors: Bharatendra Rai
Abstract:
Predictive data analysis and modeling involving machine learning techniques become challenging in the presence of too many explanatory variables or features. The presence of too many features in machine learning is known not only to cause algorithms to slow down, but it can also lead to a decrease in model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios, and their impact on model accuracy is captured using the coefficient of determination (r-square) and the root mean square error (RMSE).
Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error
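The pipeline described above can be sketched as follows: Boruta (via the third-party boruta package, wrapping a random forest) confirms features, and a random forest regressor is scored with r-square and RMSE on a hold-out partition. The synthetic regression data and the 70/30 split stand in for the 79-feature housing dataset and one of the five partitioning ratios.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error
from boruta import BorutaPy   # third-party package implementing the Boruta wrapper

X, y = make_regression(n_samples=400, n_features=79, n_informative=20,
                       noise=10.0, random_state=0)   # stand-in for the housing data

rf_sel = RandomForestRegressor(n_jobs=-1, max_depth=5, random_state=0)
boruta = BorutaPy(rf_sel, n_estimators="auto", random_state=0)
boruta.fit(X, y)
X_sel = X[:, boruta.support_]
print("confirmed features:", boruta.support_.sum())

# One assumed partitioning ratio (70/30) of the kind the study explores.
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
rmse = np.sqrt(mean_squared_error(y_te, pred))
print(f"r-square: {r2_score(y_te, pred):.3f}  rmse: {rmse:.2f}")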
Procedia PDF Downloads 323
24312 Image-Based (RGB) Technique for Estimating Phosphorus Levels of Different Crops
Authors: M. M. Ali, Ahmed Al- Ani, Derek Eamus, Daniel K. Y. Tan
Abstract:
In this glasshouse study, we developed a new image-based, non-destructive technique for detecting the leaf P status of different crops, namely cotton, tomato and lettuce. Plants were grown on nutrient media containing different P concentrations, i.e., 0%, 50% and 100% of the recommended P concentration (P0 = no P; P1 = 2.5 mL 10 L⁻¹ of P; and P2 = 5 mL 10 L⁻¹ of P as NaH2PO4). After 10 weeks of growth, plants were harvested, data on leaf P contents were collected using the standard destructive laboratory method, and at the same time leaf images were collected with a handheld crop image sensor. We calculated the leaf area, leaf perimeter and RGB (red, green and blue) values of these images. These data were then used in linear discriminant analysis (LDA) to estimate leaf P contents, which successfully classified the plants on the basis of their leaf P contents. The data indicate that P deficiency in crop plants can be predicted using image and morphological data. Our proposed non-destructive imaging method is precise in estimating the P requirements of different crop species.
Keywords: image-based techniques, leaf area, leaf P contents, linear discriminant analysis
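The classification step can be illustrated with a small LDA sketch over per-leaf features of the kind the study extracts (mean R, G, B plus leaf area and perimeter), predicting the P treatment class. The feature values below are synthetic, and the assumed trend (paler, smaller leaves at low P) is only for illustration.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

def leaves(r, g, b, area, n):
    """Generate n leaves as [mean R, mean G, mean B, area, perimeter] rows."""
    return np.column_stack([rng.normal(m, s, n) for m, s in
                            [(r, 8), (g, 10), (b, 6), (area, 4), (area * 3.5, 12)]])

X = np.vstack([leaves(120, 160, 60, 25, 30),   # P0: paler, smaller leaves (assumed trend)
               leaves(100, 175, 55, 35, 30),   # P1
               leaves( 85, 190, 50, 45, 30)])  # P2
y = np.repeat(["P0", "P1", "P2"], 30)

scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")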
Procedia PDF Downloads 380
24311 Design of Visual Repository, Constraint and Process Modeling Tool Based on Eclipse Plug-Ins
Authors: Rushiraj Heshi, Smriti Bhandari
Abstract:
Master Data Management requires the creation of a central repository, the application of constraints on the repository, and the design of processes to manage data. Designing repositories, repository constraints and business processes is a tedious and time-consuming task for a large enterprise; hence, visual modeling of repositories, constraints and processes (workflows) is the most critical step in Master Data Management. In this paper, we realize a visual modeling tool for implementing repositories, constraints and processes as an Eclipse plug-in based on GMF/EMF, following the principles of Model Driven Engineering (MDE).
Keywords: EMF, GMF, GEF, repository, constraint, process
Procedia PDF Downloads 497
24310 Comparative Analysis of Photosynthetic and Antioxidative Responses of Two Species of Anabaena under Ni and As(III) Stress
Authors: Shivam Yadav, Neelam Atri
Abstract:
Cyanobacteria, the photosynthetic prokaryotes, are indispensable components of paddy soil and contribute substantially to its nitrogen economy; however, they are often exposed to a metal load. They are well known to play crucial roles in the maintenance of soil fertility and rice productivity. Nickel is one metal that plays a vital role in cellular physiology, yet at higher concentrations it exerts adverse effects. Arsenic is another toxic metalloid that negatively affects cyanobacterial proliferation. However, species-specific comparative responses under As and Ni are largely unknown. The present study focuses on the comparative effects of nickel (Ni²⁺) and arsenite (As(III)) on two diazotrophic cyanobacterial species (Anabaena doliolum and Anabaena sp. PCC7120) in terms of antioxidative responses. Oxidative damage, measured in terms of lipid peroxidation and peroxide content, was significantly higher after As(III) than after Ni treatment as compared to the control. Similarly, all the studied enzymatic and non-enzymatic parameters of the antioxidative defense system except glutathione reductase (GR) showed greater induction against As(III) than against Ni. Moreover, an integrated comparative analysis of all studied parameters demonstrated interspecies variation in stress-adaptive strategies, reflected in the higher sensitivity of Anabaena doliolum over Anabaena PCC7120.
Keywords: antioxidative system, arsenic, cyanobacteria, nickel
Procedia PDF Downloads 154
24309 A Multidimensional Exploration of Narcissistic Personality Disorder Through Psycholinguistic Analysis and Neuroscientific Correlates
Authors: Dalia Elleuch
Abstract:
Narcissistic Personality Disorder (NPD) manifests as a personality disorder marked by inflated self-importance, heightened sensitivity to criticism, a lack of empathy, a preoccupation with appearance over substance, and features such as arrogance, grandiosity, a constant need for admiration, a tendency to exploit others, and an inclination towards demanding special treatment arising from a sense of excessive entitlement (APA, 2013). This interdisciplinary study examines NPD through a systematic synthesis of psycholinguistic analysis and neuroscientific correlates. The cognitive and emotional dimensions of NPD reveal linguistic patterns, including grandiosity, entitlement, and manipulative communication. Neuroscientific investigations reveal structural brain differences and alterations in functional connectivity, further explaining the neural underpinnings of the social cognition deficits observed in individuals with NPD. Genetic predispositions and neurotransmitter imbalances add a layer of complexity to the understanding of NPD. The necessity for linguistic intervention in diagnosing and treating NPD is underscored by this synthesis, which offers a comprehensive understanding of NPD's cognitive, emotional, and neural dimensions and paves the way for future practical, theoretical, and pedagogical approaches to addressing the complexities of this personality disorder.
Keywords: Narcissistic Personality Disorder (NPD), psycholinguistic analysis, neuroscientific correlates, interpersonal dysfunction, cognitive empathy
Procedia PDF Downloads 65
24308 The Admitting Hemogram as a Predictor for Severity and in-Hospital Mortality in Acute Pancreatitis
Authors: Florge Francis A. Sy
Abstract:
Acute pancreatitis (AP) is an inflammatory condition of the pancreas with local and systemic complications, and severe acute pancreatitis (SAP) has a higher mortality rate. Laboratory parameters like the neutrophil-to-lymphocyte ratio (NLR), red cell distribution width (RDW), and mean platelet volume (MPV) have been associated with SAP, but with conflicting results. This study aims to determine the predictive value of these parameters for the severity and in-hospital mortality of AP. This retrospective, cross-sectional study was done in a private hospital in Cebu City, Philippines. One hundred five patients were classified according to severity based on the modified Marshall scoring. The admitting hemogram, including the NLR, RDW, and MPV, was obtained from the complete blood count (CBC). Cut-off values for severity and in-hospital mortality were derived from the ROC. Associations between NLR, RDW, and MPV and SAP and mortality were determined, with a p-value of < 0.05 considered significant. The mean age for AP was 47.6 years, with 50.5% of patients being male. Most cases had an unknown cause (49.5%), followed by a biliary cause (37.1%). Of the 105 patients, 23 had SAP, and 4 died. Older age, longer in-hospital duration, congestive heart failure, and elevated creatinine, urea nitrogen, and white blood cell count were seen in SAP. The NLR was associated with in-hospital mortality using a cut-off of > 10.6 (OR 1.133, 95% CI, p-value 0.003), with 100% sensitivity, 70.3% specificity, 11.76% PPV and 100% NPV (AUC 0.855). The NLR was not associated with SAP, and the RDW and MPV were not associated with SAP or mortality. The admitting NLR is, therefore, an easily accessible parameter that can predict in-hospital mortality in acute pancreatitis. Although the present study did not show an association of NLR with SAP, nor of RDW and MPV with either SAP or mortality, further studies are suggested to establish their clinical value.
Keywords: acute pancreatitis, mean platelet volume, neutrophil-lymphocyte ratio, red cell distribution width
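For readers unfamiliar with how such a cut-off is obtained, the sketch below derives an NLR threshold for in-hospital mortality from an ROC curve using the Youden index. The NLR distributions are simulated with the study's 101/4 outcome split; the reported cut-off of 10.6 came from the actual patient data, not from this simulation.

import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(7)
nlr_survived = rng.lognormal(mean=1.6, sigma=0.5, size=101)   # assumed distribution
nlr_died = rng.lognormal(mean=2.6, sigma=0.3, size=4)          # assumed distribution
nlr = np.concatenate([nlr_survived, nlr_died])
died = np.concatenate([np.zeros(101), np.ones(4)])

fpr, tpr, thresholds = roc_curve(died, nlr)
best = np.argmax(tpr - fpr)                                    # Youden's J statistic
print(f"AUC: {roc_auc_score(died, nlr):.3f}")
print(f"optimal cut-off: {thresholds[best]:.1f} "
      f"(sensitivity {tpr[best]:.0%}, specificity {1 - fpr[best]:.0%})")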
Procedia PDF Downloads 123
24307 Cyclic Plastic Deformation of 20MN-MO-NI 55 Steel in Dynamic Strain Ageing Regime
Authors: Ashok Kumar, Sarita Sahu, H. N. Bar
Abstract:
The low cycle fatigue behavior of a ferritic-martensitic pressure vessel steel in the dynamic strain ageing regime of 250°C to 280°C has been investigated. Dynamic strain ageing is a mechanism that has long attracted the interest of researchers due to its fascinating, not fully explained, repetitive nature. The interaction of dynamic strain ageing and cyclic plasticity has been studied from a mechanistic point of view. Dynamic strain ageing gives rise to identical serrated flow behavior in the tensile and compressive halves of the hysteresis loops, and this has been found to produce initial cyclic hardening followed by softening, whereas in the non-DSA regime continuous cyclic softening is the dominant mechanism. The degree of hardening of the stable loop is appreciably sensitive to the nature of the serrations: the degree of hardening increases with strain amplitude in the regime where only A-type serrations are present and decreases with strain amplitude where A+B-type serrations are present. A Masing-type locus has been found in the behavior of the metal at 280°C. The cyclic stress-strain curve and the master curve have been constructed to determine the fatigue strength and ductility coefficients. Fractographic examinations have also shown a competition between the progression of striations and secondary cracking.
Keywords: dynamic strain ageing, hardening, low cycle fatigue, softening
Procedia PDF Downloads 301
24306 The Classification Performance in Parametric and Nonparametric Discriminant Analysis for a Class-Unbalanced Data of Diabetes Risk Groups
Authors: Lily Ingsrisawang, Tasanee Nacharoen
Abstract:
Introduction: The problem of unbalanced data sets generally appears in real-world applications. Due to unequal class distributions, many research papers have found that the performance of existing classifiers tends to be biased towards the majority class. The k-nearest neighbors nonparametric discriminant analysis is one method that has been proposed for classifying unbalanced classes with good performance. Hence, the methods of discriminant analysis are of interest to us in investigating misclassification error rates for class-imbalanced data of three diabetes risk groups. Objective: The purpose of this study was to compare the classification performance of parametric discriminant analysis and nonparametric discriminant analysis in a three-class classification application on class-imbalanced data of diabetes risk groups. Methods: Data from a health project for 599 staff members in a government hospital in Bangkok were obtained for the classification problem. The staff were diagnosed into one of three diabetes risk groups: non-risk (90%), risk (5%), and diabetic (5%). The original data, with the variables diabetes risk group, age, gender, cholesterol, and BMI, were analyzed and bootstrapped up to 50 and 100 samples, with 599 observations per sample, for additional estimation of the misclassification error rate. Each data set was explored for departure from multivariate normality and for equality of the covariance matrices of the three risk groups; both the original data and the bootstrap samples show non-normality and unequal covariance matrices. The parametric linear discriminant function, the quadratic discriminant function, and the nonparametric k-nearest neighbors discriminant function were applied to the 50 and 100 bootstrap samples and to the original data. In finding the optimal classification rule, the prior probabilities were set either to equal proportions (0.33:0.33:0.33) or to unequal proportions with three choices: (0.90:0.05:0.05), (0.80:0.10:0.10) or (0.70:0.15:0.15). Results: The results from the 50 and 100 bootstrap samples indicated that the k-nearest neighbors approach with k = 3 or k = 4 and prior probabilities of {non-risk:risk:diabetic} set to {0.90:0.05:0.05} or {0.80:0.10:0.10} gave the smallest misclassification error rate. Conclusion: The k-nearest neighbors approach is suggested for classifying three-class imbalanced data of diabetes risk groups.
Keywords: error rate, bootstrap, diabetes risk groups, k-nearest neighbors
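A simplified version of the comparison can be sketched with scikit-learn: parametric LDA/QDA with unequal priors versus a k-nearest-neighbour classifier on a simulated three-group problem with the paper's 90/5/5 imbalance. The feature distributions are invented, and note that scikit-learn's KNeighborsClassifier has no prior-probability argument, unlike the k-NN discriminant procedure used in the study.

import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def group(mean, n):
    """Simulate n subjects with features [age, cholesterol, BMI] (assumed values)."""
    return rng.normal(mean, [8, 30, 3], size=(n, 3))

X = np.vstack([group([40, 180, 24], 540),   # class 0 = non-risk
               group([50, 220, 28], 30),    # class 1 = risk
               group([58, 250, 31], 29)])   # class 2 = diabetic
y = np.repeat([0, 1, 2], [540, 30, 29])

models = {
    "LDA (0.90/0.05/0.05 priors)": LinearDiscriminantAnalysis(priors=[0.9, 0.05, 0.05]),
    "QDA (0.90/0.05/0.05 priors)": QuadraticDiscriminantAnalysis(priors=[0.9, 0.05, 0.05]),
    "3-nearest neighbours":        KNeighborsClassifier(n_neighbors=3),
}
for name, model in models.items():
    err = 1.0 - cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:30s} misclassification rate: {err:.3f}")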
Procedia PDF Downloads 435
24305 The Mitigation Strategy Analysis of Kuosheng Nuclear Power Plant Spent Fuel Pool Using MELCOR2.1/SNAP
Authors: Y. Chiang, J. R. Wang, J. H. Yang, Y. S. Tseng, C. Shih, S. W. Chen
Abstract:
Kuosheng nuclear power plant (NPP) is a BWR/6 plant in Taiwan. There has been greater concern for the safety of spent fuel pools (SFPs) in Taiwan since the Fukushima event. In order to evaluate the safety of the Kuosheng NPP SFP, a safety analysis was performed with MELCOR2.1 and SNAP in combination with the mitigation strategy of the NEI 06-12 report. There were several steps in this research. First, the Kuosheng NPP SFP models were established with MELCOR2.1/SNAP. Second, a station blackout (SBO) analysis of the Kuosheng SFP was performed with TRACE and MELCOR under the cooling-system-failure condition; the results showed that the MELCOR and TRACE calculations were very similar in this case. Third, the mitigation strategy analysis was performed with the MELCOR model following the NEI 06-12 report, and the results showed the effectiveness of the NEI 06-12 strategy for the Kuosheng NPP SFP. Finally, a sensitivity study of SFP quenching was performed to examine the effect of different water injection times and the phenomena during quenching. The results showed that if the cladding temperature exceeds 1600 K, water injection may make the accident more severe, with more hydrogen generation, because of the oxidation heat and the 'breakaway' effect of the zirconium-water reaction. An animation model built with SNAP is also shown in this study.
Keywords: MELCOR, SNAP, spent fuel pool, quenching
Procedia PDF Downloads 359
24304 BFDD-S: Big Data Framework to Detect and Mitigate DDoS Attack in SDN Network
Authors: Amirreza Fazely Hamedani, Muzzamil Aziz, Philipp Wieder, Ramin Yahyapour
Abstract:
Software-defined networking has in recent years come into the sight of many network designers as a successor to traditional networking. Unlike traditional networks, where the control and data planes are engaged together within a single device in the network infrastructure, such as switches and routers, the two planes are kept separated in software-defined networks (SDNs). All critical decisions about packet routing are made on the network controller, and the data-plane devices forward packets based on these decisions. This type of network is vulnerable to DDoS attacks, which degrade the overall functioning and performance of the network by continuously injecting fake flows into it. This places a substantial burden on the controller side, and the result ultimately leads to the inaccessibility of the controller and a lack of network service for legitimate users. Thus, the protection of this novel network architecture against denial-of-service attacks is essential. In the world of cybersecurity, attacks and new threats emerge every day, so it is essential to have tools capable of managing and analyzing all this new information to detect possible attacks in real time. These tools should provide a comprehensive solution to automatically detect, predict and prevent abnormalities in the network. Big data encompasses a wide range of studies, but it mainly refers to the massive amounts of structured and unstructured data that organizations deal with on a regular basis; it concerns not only the volume of data but also how data-driven information can be used to enhance decision-making processes, security, and the overall efficiency of a business. This paper presents an intelligent big data framework as a solution to handle the illegitimate traffic burden placed on the SDN network by numerous DDoS attacks. The framework entails an efficient defence and monitoring mechanism against DDoS attacks by employing state-of-the-art machine learning techniques.
Keywords: apache spark, apache kafka, big data, DDoS attack, machine learning, SDN network
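At its core, the detection stage reduces to a supervised classifier over per-flow statistics collected at the controller. The sketch below is a single-machine stand-in using scikit-learn rather than the Spark/Kafka pipeline the framework proposes, and both the flow features and the synthetic traffic are assumptions.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)

def flows(pkt_rate, mean_pkt, src_entropy, n):
    """Simulate n flows as [packets/s, mean packet size, source-IP entropy]."""
    return np.column_stack([rng.normal(pkt_rate, pkt_rate * 0.2, n),
                            rng.normal(mean_pkt, 40, n),
                            rng.normal(src_entropy, 0.3, n)])

benign = flows(pkt_rate=50,  mean_pkt=600, src_entropy=3.0, n=2000)
ddos   = flows(pkt_rate=900, mean_pkt=90,  src_entropy=6.5, n=500)   # flood-like flows
X = np.vstack([benign, ddos])
y = np.array([0] * 2000 + [1] * 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["benign", "ddos"]))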
Procedia PDF Downloads 169
24303 Welding Process Selection for Storage Tank by Integrated Data Envelopment Analysis and Fuzzy Credibility Constrained Programming Approach
Authors: Rahmad Wisnu Wardana, Eakachai Warinsiriruk, Sutep Joy-A-Ka
Abstract:
Selecting the most suitable welding process usually depends on experience or on common practice in similar companies. However, this approach generally ignores many criteria that can affect the selection of a suitable welding process. Therefore, knowledge automation through knowledge-based systems will significantly improve the decision-making process. This research proposes an integrated data envelopment analysis (DEA) and fuzzy credibility constrained programming approach for identifying the best welding process for a stainless steel storage tank in the food and beverage industry. The proposed approach uses the fuzzy concept and a credibility measure to deal with uncertain data from experts' judgments. Furthermore, 12 parameters are used to determine the most appropriate welding process among six competitive welding processes.
Keywords: welding process selection, data envelopment analysis, fuzzy credibility constrained programming, storage tank
Procedia PDF Downloads 167
24302 On the Estimation of Crime Rate in the Southwest of Nigeria: Principal Component Analysis Approach
Authors: Kayode Balogun, Femi Ayoola
Abstract:
Crime is at an alarming rate in this part of the world, and there are many factors contributing to this anti-societal behaviour among both the young and the old. In this work, principal component analysis (PCA) was used as a tool to reduce the dimensionality of the data and to identify the variables that were crime-prone in the study region, while retaining as much of the information as possible. Data were collected on twenty-eight crime variables from the National Bureau of Statistics (NBS) databank for a period of fifteen years. We use PCA in this study to determine the number of major variables and contributors to crime in Southwest Nigeria. The results of our analysis revealed that eight principal components were retained using the scree plot and loading plot, which implies that an eight-component solution is appropriate for the data. The eight components explained 93.81% of the total variation in the data set. We also found that the most frequently committed crimes in Southwestern Nigeria were assault, grievous harm and wounding, theft/stealing, burglary, house breaking, false pretence, unlawful arms possession and breach of public peace.
Keywords: crime rates, data, Southwest Nigeria, principal component analysis, variables
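The dimensionality-reduction step can be reproduced in a few lines: standardise the yearly crime counts and inspect how much variance the leading components retain (the study kept eight components explaining about 94%). The 28 crime variables below are simulated from eight latent factors purely for illustration.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
years, n_vars = 15, 28
latent = rng.normal(size=(years, 8))                   # 8 underlying crime factors (assumed)
loadings = rng.normal(size=(8, n_vars))
X = latent @ loadings + rng.normal(scale=0.5, size=(years, n_vars))

pca = PCA().fit(StandardScaler().fit_transform(X))
cum = np.cumsum(pca.explained_variance_ratio_)
print(f"variance explained by first 8 components: {cum[7]:.1%}")
print("components needed for 90% of the variance:", int(np.searchsorted(cum, 0.90) + 1))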
Procedia PDF Downloads 444
24301 On-Line Data-Driven Multivariate Statistical Prediction Approach to Production Monitoring
Authors: Hyun-Woo Cho
Abstract:
Detection of incipient abnormal events in production processes is important to improve the safety and reliability of manufacturing operations and to reduce losses caused by failures. The construction of calibration models for predicting faulty conditions is quite essential in making decisions on when to perform preventive maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of process measurement data. The calibration model is used to predict faulty conditions from historical reference data. This approach utilizes variable selection techniques, and the predictive performance of several prediction methods is evaluated using real data. The results show that the calibration model based on a supervised probabilistic model yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, prediction performance can be improved by excluding non-informative variables from the model building steps.
Keywords: calibration model, monitoring, quality improvement, feature selection
Procedia PDF Downloads 356
24300 Same-Day Detection Method of Salmonella Spp., Shigella Spp. and Listeria Monocytogenes with Fluorescence-Based Triplex Real-Time PCR
Authors: Ergun Sakalar, Kubra Bilgic
Abstract:
Faster detection and characterization of pathogens is the basis for avoiding foodborne illness. Salmonella spp., Shigella spp. and Listeria monocytogenes are common foodborne bacteria that are among the most life-threatening. Rapid and accurate detection of these pathogens is important to prevent food poisoning and outbreaks and to manage food chains. The present work promises to develop a sensitive, species-specific and reliable PCR-based detection system for the simultaneous detection of Salmonella spp., Shigella spp. and Listeria monocytogenes. For this purpose, three genes were picked out: ompC for Salmonella spp., ipaH for Shigella spp. and hlyA for L. monocytogenes. After a short pre-enrichment, the milk was passed through a vacuum filter and bacterial DNA was extracted using the commercially available kit GIDAGEN® (Turkey, İstanbul). Detection of amplicons was verified by examination of the melting temperature (Tm), which is 72°C, 78°C and 82°C for Salmonella spp., Shigella spp. and L. monocytogenes, respectively. The specificity of the method was checked against a group of bacterial strains, and a sensitivity test was also carried out, giving a detection limit of under 10² CFU mL⁻¹ of milk for each bacterial strain. Our results show that the fluorescence-based triplex qPCR method can be used routinely to detect Salmonella spp., Shigella spp. and L. monocytogenes during milk processing procedures in order to reduce cost, time of analysis and the risk of foodborne disease outbreaks.
Keywords: EvaGreen, food-borne bacteria, pathogen detection, real-time PCR
Procedia PDF Downloads 244
24299 Multilevel Gray Scale Image Encryption through 2D Cellular Automata
Authors: Rupali Bhardwaj
Abstract:
Cryptography is the science of using mathematics to encrypt and decrypt data: the data are converted into some other, gibberish form, and then the encrypted data are transmitted. The primary purpose of this paper is to provide two levels of security through a two-step process. Rather than transmitting the message bits directly, the message is first encrypted using 2D cellular automata and then scrambled with the Arnold cat map transformation; this provides an additional layer of protection and reduces the chance of the transmitted message being detected. A comparative analysis of the effectiveness of the scrambling technique is provided using the scrambling degree measurement parameters, i.e., Gray Difference Degree (GDD) and correlation coefficient.
Keywords: scrambling, cellular automata, Arnold cat map, game of life, gray difference degree, correlation coefficient
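A toy version of the two-step scheme is sketched below: the image is XOR-ed with a key-stream grown by a 2D cellular automaton (Game of Life, as in the keywords) and the pixel positions are then scrambled with the Arnold cat map. The image size, seed, bit-packing scheme and iteration counts are assumptions, since the paper does not specify them.

import numpy as np

N = 64
rng = np.random.default_rng(123)
image = rng.integers(0, 256, (N, N), dtype=np.uint8)        # stand-in grayscale image

def life_step(grid):
    """One Game-of-Life generation on a toroidal grid."""
    nbrs = sum(np.roll(np.roll(grid, dx, 0), dy, 1)
               for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0))
    return ((nbrs == 3) | ((grid == 1) & (nbrs == 2))).astype(np.uint8)

def keystream(seed_grid, rounds=8):
    """Pack 8 successive CA generations into one key byte per pixel."""
    key, grid = np.zeros((N, N), dtype=np.uint8), seed_grid
    for bit in range(rounds):
        grid = life_step(grid)
        key |= (grid << bit).astype(np.uint8)
    return key

def arnold(img, iterations=5):
    """Arnold cat map scrambling: (x, y) -> ((x + y) mod N, (x + 2y) mod N)."""
    x, y = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    out = img
    for _ in range(iterations):
        out = out[(x + y) % N, (x + 2 * y) % N]
    return out

seed = rng.integers(0, 2, (N, N), dtype=np.uint8)            # shared secret seed
cipher = arnold(image ^ keystream(seed))                     # encrypt: XOR, then scramble
print("plain/cipher correlation coefficient:",
      round(float(np.corrcoef(image.ravel(), cipher.ravel())[0, 1]), 4))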
Procedia PDF Downloads 377
24298 Survey Based Data Security Evaluation in Pakistan Financial Institutions against Malicious Attacks
Authors: Naveed Ghani, Samreen Javed
Abstract:
In today's heterogeneous network environment, there is a growing demand for distrustful clients to jointly execute a secure network to prevent malicious attacks, as the defining task of propagating malicious code is to locate new targets to attack. Residual risk is always there, no matter what solutions are implemented or whatever security methodology or standards are adopted. Security is the first and crucial phase in the field of computer science, and the main aim of computer security is the gathering of information within a secure network. No one need wonder what all that malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. Through our survey we learn about the importance of whitelisting, antimalware programs, security patches, log files, honey pots, and more, as used in banks for financial data protection; but there is also a need to implement IPv6 tunneling with cryptographic data transformation, according to the requirements of new technology, to protect the organization from new malware attacks that craft their own messages and send them to the target. In this paper, the writer proposes implementing IPv6 tunneling sessions for private data transmission from financial organizations whose secrecy needs to be safeguarded.
Keywords: network worms, malware infection propagating malicious code, virus, security, VPN
Procedia PDF Downloads 358