Search results for: data integrity and privacy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25441

25081 Modeling Of The Random Impingement Erosion Due To The Impact Of The Solid Particles

Authors: Siamack A. Shirazi, Farzin Darihaki

Abstract:

Solid particles are found in many multiphase flows, including transport pipelines and pipe fittings. Such particles interact with the pipe material and cause erosion, which threatens the integrity of the system. Therefore, predicting the erosion rate is an important factor in the design and monitoring of such systems. Mechanistic models can provide reliable predictions for many conditions while demanding only relatively low computational cost. These models utilize a representative particle trajectory to predict the impact characteristics of the majority of the particle impacts that cause the maximum erosion rate in the domain. The erosion caused by particle impacts is due not only to direct impacts but also to random impingements. In the present study, an alternative model is introduced to describe the erosion due to random impingement of particles. The model provides a realistic trend for erosion with changes in particle size and particle Stokes number. It is examined against experimental data and CFD simulation results and shows better agreement with the data in comparison to the models available in the literature.

Keywords: erosion, mechanistic modeling, particles, multiphase flow, gas-liquid-solid

Procedia PDF Downloads 166
25080 Unlocking Health Insights: Studying Data for Better Care

Authors: Valentina Marutyan

Abstract:

Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding and approach to providing healthcare. It is the process of examining huge amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. The field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories, using advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. It enables doctors to intervene early to prevent problems or improve outcomes for patients, and it assists in early disease detection and customized treatment planning for each person. Doctors can customize a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Besides helping patients, it improves the efficiency of hospitals by helping them determine the number of beds or doctors they require for the number of patients they expect. In this project, models such as logistic regression, random forests, and neural networks were used for predicting diseases and analyzing medical images. Patients were grouped by algorithms such as k-means, and connections between treatments and patient responses were identified by association rule mining. Time series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment.
Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more effectively, and operate more efficiently. It comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
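The patient-grouping step the abstract attributes to k-means can be sketched with a minimal implementation; the patient feature vectors, the choice of two features, and k=2 below are illustrative toy assumptions, not data from the study.

```python
import math

def kmeans(points, k, iters=20):
    """Minimal k-means: assign each point to its nearest centroid,
    then recompute centroids, for a fixed number of iterations."""
    centroids = list(points[:k])  # naive initialization: first k points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[i].append(p)
        centroids = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical patient vectors: (systolic blood pressure, fasting glucose)
patients = [(118, 90), (121, 95), (160, 180), (155, 170), (119, 88), (158, 175)]
cents, groups = kmeans(patients, k=2)
```

With these toy vectors the two clusters separate the lower-reading patients from the elevated-reading ones; a real study would use many more features and a proper initialization scheme.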

Keywords: data mining, healthcare, big data, large amounts of data

Procedia PDF Downloads 70
25079 Signal Processing of Barkhausen Noise Signal for Assessment of Increasing Down Feed in Surface Ground Components with Poor Micro-Magnetic Response

Authors: Tanmaya Kumar Dash, Tarun Karamshetty, Soumitra Paul

Abstract:

The Barkhausen Noise Analysis (BNA) technique has been utilized to assess the surface integrity of steels. However, the technique is not very successful in evaluating the surface integrity of ground steels that exhibit a poor micro-magnetic response. A new approach has been proposed for processing the BN signal with Fast Fourier Transforms, while wavelet transforms, with a judicious choice of the threshold value, have been used to remove noise from the BN signal when the micro-magnetic response of the work material is poor. In the present study, the effect of down feed in conventional plunge surface grinding of hardened bearing steel has been investigated, with an ultrasonically cleaned, wet-polished sample and a sample ground with the spark-out technique used for benchmarking. FFT analysis has been carried out at different sets of applied voltages and frequencies, and the pattern of the BN signal in the frequency domain is analyzed. The study also applies the wavelet transform technique with different levels of decomposition and different mother wavelets to reduce the noise in the BN signal of materials with poor micro-magnetic response, in order to standardize the procedure for all BN signals depending on the frequency of the applied voltage.
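The wavelet-thresholding idea described above can be sketched with a one-level Haar transform and soft thresholding; the study used various mother wavelets and decomposition levels, and the signal and threshold value here are purely illustrative.

```python
import math

SQRT2 = math.sqrt(2.0)

def haar_forward(x):
    """One-level Haar DWT of an even-length signal."""
    approx = [(a + b) / SQRT2 for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) / SQRT2 for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def soft_threshold(coeffs, t):
    """Shrink coefficients toward zero; small (noisy) details vanish."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def haar_inverse(approx, detail):
    out = []
    for A, D in zip(approx, detail):
        out.extend([(A + D) / SQRT2, (A - D) / SQRT2])
    return out

# Illustrative noisy signal: a slow ramp plus small alternating noise
signal = [s + n for s, n in zip([1, 1, 2, 2, 3, 3, 4, 4],
                                [0.1, -0.1, 0.1, -0.1, 0.1, -0.1, 0.1, -0.1])]
A, D = haar_forward(signal)
D = soft_threshold(D, t=0.2)   # zeroes the small noise details (|d| ~ 0.14 < 0.2)
denoised = haar_inverse(A, D)  # recovers the underlying ramp
```

A real BN-signal workflow would decompose to several levels and threshold each detail band, but the principle — shrink small coefficients, reconstruct — is the same.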

Keywords: barkhausen noise analysis, grinding, magnetic properties, signal processing, micro-magnetic response

Procedia PDF Downloads 663
25078 Functional Instruction Set Simulator of a Neural Network IP with Native Brain Float-16 Generator

Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula

Abstract:

A functional model that mimics the functional correctness of a neural network compute accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. The major challenge we faced was the incompatibility of GCC compilers with the BF-16 data type, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, working with big GEMM (General Matrix Multiplication) or SpMM (Sparse Matrix Multiplication) workloads, dense or sparse, and debugging failures related to data integrity is highly painstaking. In this paper, we address the quality challenge of such a complex neural network accelerator design by proposing a functional-model-based scoreboard, or software model, using SystemC. The proposed functional model executes assembly code based on the ISA of the processor IP, decodes all instructions, and executes them as the DUT is expected to. The model gives a great deal of visibility and debug capability into the DUT, bringing up micro-steps of execution.
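The mechanics of a BF-16 generator can be sketched as follows: bfloat16 keeps the upper 16 bits of the IEEE-754 float32 encoding (same exponent range, truncated mantissa), conventionally with round-to-nearest-even on the discarded bits. This is a generic illustration of the format, not the paper's actual generator, which is integrated into a SystemC model.

```python
import struct

def float_to_bf16_bits(x):
    """Convert a Python float to the 16-bit pattern of bfloat16:
    the top 16 bits of the float32 encoding, with round-to-nearest-even
    applied to the discarded lower 16 bits."""
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    rounding_bias = 0x7FFF + ((bits >> 16) & 1)  # round half to even
    return ((bits + rounding_bias) >> 16) & 0xFFFF

def bf16_bits_to_float(b):
    """Widen a bfloat16 bit pattern back to float32 by zero-padding."""
    return struct.unpack("<f", struct.pack("<I", b << 16))[0]

val = 1.0 / 3.0
b = float_to_bf16_bits(val)
approx = bf16_bits_to_float(b)   # 1/3 kept to ~8 bits of mantissa precision
```

Because bfloat16 shares float32's 8-bit exponent, conversion never overflows or underflows where float32 would not, which is why NN workloads favour it over IEEE half precision.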

Keywords: ISA, neural network, Brain Float-16, DUT

Procedia PDF Downloads 87
25077 Mapping a Data Governance Framework to the Continuum of Care in the Active Assisted Living Context

Authors: Gaya Bin Noon, Thoko Hanjahanja-Phiri, Laura Xavier Fadrique, Plinio Pelegrini Morita, Hélène Vaillancourt, Jennifer Teague, Tania Donovska

Abstract:

Active Assisted Living (AAL) refers to systems designed to improve the quality of life, aid in independence, and create healthier lifestyles for care recipients. As the population ages, there is a pressing need for non-intrusive, continuous, adaptable, and reliable health monitoring tools to support aging in place. AAL has great potential to support these efforts with the wide variety of solutions currently available, but insufficient efforts have been made to address concerns arising from the integration of AAL into care. The purpose of this research was to (1) explore the integration of AAL technologies and data into the clinical pathway, and (2) map data access and governance for AAL technology in order to develop standards for use by policy-makers, technology manufacturers, and developers of smart communities for seniors. This was done through four successive research phases: (1) literature search to explore existing work in this area and identify lessons learned; (2) modeling of the continuum of care; (3) adapting a framework for data governance into the AAL context; and (4) interviews with stakeholders to explore the applicability of previous work. Opportunities for standards found in these research phases included a need for greater consistency in language and technology requirements, better role definition regarding who can access and who is responsible for taking action based on the gathered data, and understanding of the privacy-utility tradeoff inherent in using AAL technologies in care settings.

Keywords: active assisted living, aging in place, internet of things, standards

Procedia PDF Downloads 127
25076 Bioinformatics High Performance Computation and Big Data

Authors: Javed Mohammed

Abstract:

Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists, for the first time, to gain a profound understanding of the deepest biological functions. Solving biological problems may require High-Performance Computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life-science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even the simulation of an entire human body. This research paper emphasizes computational biology's growing need for high-performance computing and Big Data.
It illustrates their indispensability in meeting the scientific and engineering challenges of the twenty-first century, and shows how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC, which provides sufficient capability for evaluating or solving more limited but meaningful instances. The article also indicates solutions to optimization problems and the benefits of Big Data for computational biology, and illustrates the current state of the art and future generations of HPC computing with Big Data in biology.

Keywords: high performance, big data, parallel computation, molecular data, computational biology

Procedia PDF Downloads 359
25075 Groundwater Monitoring Using a Community: Science Approach

Authors: Shobha Kumari Yadav, Yubaraj Satyal, Ajaya Dixit

Abstract:

In addressing groundwater depletion, it is important to develop an evidence base to be used in assessing the state of its degradation. Groundwater data is limited compared to meteorological data, which impedes planning for groundwater use and management. Monitoring of groundwater levels provides an information base to assess the condition of aquifers and their responses to water extraction, land-use change, and climatic variability. It is important to maintain a network of spatially distributed, long-term monitoring wells to support a groundwater management plan. Monitoring involving the local community is a cost-effective approach that generates real-time data to effectively manage groundwater use. This paper presents the relationship between rainfall and spring flow, which are the main sources of freshwater for drinking, household consumption, and agriculture in the hills of Nepal. The supply and withdrawal of water from springs depend upon local hydrology and meteorological characteristics such as rainfall, evapotranspiration, and interflow. The study offers evidence of the use of the scientific method and community-based initiatives for managing groundwater and springsheds. The approach presents a method to replicate similar initiatives in other parts of the country for maintaining the integrity of springs.

Keywords: citizen science, groundwater, water resource management, Nepal

Procedia PDF Downloads 195
25074 The Effects of Gas Metal Arc Welding Parameters on the Corrosion Behaviour of Austenitic Stainless Steel Immersed in Aqueous Sodium Hydroxide

Authors: I. M. B. Omiogbemi, D. S. Yawas, I. M. Dagwa, F. G. Okibe

Abstract:

This work presents the effects of some gas metal arc welding parameters on the corrosion behavior of austenitic stainless steel exposed to 0.5 M sodium hydroxide at ambient temperature (298 K), using conventional weight-loss determination together with surface morphology evaluation by scanning electron microscopy and the application of factorial design of experiments to determine welding conditions that enhance the integrity of the welded stainless steel. The welding variables evaluated were speed, voltage, and current. Samples of the welded stainless steel were immersed in the corrosion environment for 8, 16, 24, 32, and 40 days, and the weight loss was determined. From the results, it was found that an increase in welding current and speed at constant voltage gave the optimum performance of the austenitic stainless steel in this environment. At a speed of 40 cm/min, a current of 110 A, and a voltage of 230 V, the welded stainless steel showed only a 0.0015 mg loss in weight after 40 days. Pit-like openings were observed on the surface of the metals, indicating corrosion, but they were minimal at the optimum conditions. It was concluded that relatively high welding speed and current at a constant voltage give a welded austenitic stainless steel with better integrity.
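The weight-loss method mentioned above conventionally converts measured mass loss into a corrosion rate via the standard relation CR = K·W / (D·A·T) (as in ASTM G1-style practice). The density, exposed area, and duration below are illustrative assumptions, not values reported by the study.

```python
def corrosion_rate_mm_per_year(weight_loss_mg, density_g_cm3, area_cm2, hours):
    """Corrosion rate from weight loss.
    K = 87.6 converts mg, g/cm^3, cm^2, and hours into mm/year."""
    return 87.6 * weight_loss_mg / (density_g_cm3 * area_cm2 * hours)

# Illustrative: 0.0015 mg lost over 40 days on an assumed 10 cm^2 coupon
# of stainless steel (density ~7.9 g/cm^3)
rate = corrosion_rate_mm_per_year(0.0015, 7.9, 10.0, 40 * 24)
```

Such a tiny mass loss yields a rate on the order of microns per millennium, consistent with the abstract's conclusion that the optimized welds resisted the caustic environment well.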

Keywords: welding, current, speed, austenitic stainless steel, sodium hydroxide

Procedia PDF Downloads 313
25073 Comparing Different Frequency Ground Penetrating Radar Antennas for Tunnel Health Assessment

Authors: Can Mungan, Gokhan Kilic

Abstract:

Structural engineers and tunnel owners have good reason to attach importance to the assessment and inspection of tunnels. Regular inspection is necessary to maintain and monitor the health of the structure, not only at the present time but throughout its life cycle. Detection of flaws within the structure, such as corrosion and the formation of cracks within its internal elements, can go a long way toward ensuring that the structure maintains its integrity over the course of its life. Other issues that may be detected earlier through regular assessment include tunnel surface delamination and corrosion of the rebar. One advantage of technology such as ground penetrating radar (GPR) is the early detection of such imperfections. This study discusses and presents the effectiveness of GPR as a tool for assessing the structural integrity of a heavily used tunnel. GPR antennas of different frequencies and application methods (2 GHz and 500 MHz) are used. The paper attempts to produce a greater understanding of structural defects and to identify the correct tool for such purposes. Conquest View with 3D scanning capabilities was used throughout the analysis, reporting, and interpretation of the results. The study illustrates GPR mapping and its effectiveness in providing information of value for rebar position (lower and upper reinforcement). It also shows how such techniques can detect structural features that would otherwise remain unseen, as well as moisture ingress.

Keywords: tunnel, GPR, health monitoring, moisture ingress, rebar position

Procedia PDF Downloads 115
25072 Applying Biculturalism in Studying Tourism Host Community Cultural Integrity and Individual Member Stress

Authors: Shawn P. Daly

Abstract:

Communities heavily engaged in the tourism industry discover their values intersect, meld, and conflict with those of visitors. Maintaining cultural integrity in the face of powerful external pressures causes stress among society members. This effect represents a less studied aspect of sustainable tourism. The present paper brings a perspective unique to the tourism literature: biculturalism. The grounded theories, coherent hypotheses, and validated constructs and indicators of biculturalism represent a sound base from which to consider sociocultural issues in sustainable tourism. Five models describe the psychological state of individuals operating at cultural crossroads: assimilation (joining the new culture), acculturation (grasping the new culture but remaining of the original culture), alternation (varying behavior to cultural context), multicultural (maintaining distinct cultures), and fusion (blending cultures). These five processes divide into two units of analysis (individual and society), permitting research questions at levels important for considering sociocultural sustainability. Acculturation modelling has evolved into dual processes of acculturation (new culture adaptation) and enculturation (original culture adaptation). This dichotomy divides sustainability research questions into human impacts from assimilation (acquiring the new culture, discarding the original), separation (rejecting the new culture, keeping the original), integration (acquiring the new culture, keeping the original), and marginalization (rejecting the new culture, discarding the original). Biculturalism is often cast in terms of its emotional, behavioral, and cognitive dimensions. Required cultural adjustments and varying levels of cultural competence lead to physical, psychological, and emotional outcomes ranging from depression, lowered life satisfaction and self-esteem, headaches, and back pain to enhanced career success, social skills, and lifestyles.
Numerous studies provide empirical scales and research hypotheses for sustainability research into tourism's causality and effect on local well-being. One key issue in applying biculturalism to sustainability scholarship concerns identifying and specifying the alternative new culture that contacts the local culture. Evidence exists for a tourism-industry culture, a universal tourist culture, and location- or event-specific tourist cultures. The biculturalism paradigm holds promise for researchers examining evolving cultural identity and integrity in response to mass tourism. In particular, confirmed constructs and scales simplify the operationalization of tourism sustainability studies in terms of human impact and adjustment.

Keywords: biculturalism, cultural integrity, psychological and sociocultural adjustment, tourist culture

Procedia PDF Downloads 401
25071 An Investigation on Orthopedic Rehabilitation by Avoiding Thermal Necrosis

Authors: R. V. Dahibhate, A. B. Deoghare, P. M. Padole

Abstract:

Maintaining the natural integrity of the biosystem is of paramount significance for the orthopedic surgeon while performing surgery, and restoration is a challenging task in rehabilitating trauma patients. Drilling is an inevitable procedure in fixing implants. The task leads to a rise in temperature at the contact site, which can lead to thermal necrosis. Precise monitoring can avoid thermal necrosis. To accomplish this, a data-acquisition instrument is integrated with the drill bit, and an electronic feedback system has been developed. It not only measures temperature without any physical contact between the measuring device and the target but also visualizes the site and monitors the correct movement of the tool path. In the current research work, an infrared thermometer data acquisition system monitors the variation in temperature at the drilling site, and a camera captures the advancement of the drill bit. The results are presented in graphical form, representing the variations in temperature, drill rotation, and time. The feedback system helps keep the drill speed within the threshold limit.
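The feedback idea described above can be sketched as a simple threshold controller: when the measured temperature approaches the necrosis limit, the drill speed is stepped down. The 47 °C limit (a value commonly cited for bone thermal necrosis), the speed step, and the simulated readings are illustrative assumptions, not the paper's actual control law.

```python
def adjust_drill_speed(current_rpm, temp_c, limit_c=47.0,
                       step_rpm=100, min_rpm=200):
    """Reduce drill speed when the measured temperature reaches the
    assumed necrosis threshold; otherwise keep the commanded speed."""
    if temp_c >= limit_c:
        return max(current_rpm - step_rpm, min_rpm)
    return current_rpm

# Simulated infrared-thermometer readings during drilling (illustrative)
rpm = 1000
for reading in [35.0, 41.5, 46.0, 48.2, 47.5, 44.0]:
    rpm = adjust_drill_speed(rpm, reading)
```

In this trace the speed is cut twice while the site is over the limit (48.2 °C and 47.5 °C) and then held once the temperature falls back below it.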

Keywords: thermal necrosis, infrared thermometer, drilling tool, feedback system

Procedia PDF Downloads 227
25070 Study on High Performance Fiber Reinforced Concrete (HPFRC) Beams Subjected to Cyclic Loading

Authors: A. Siva, K. Bala Subramanian, Kinson Prabu

Abstract:

Concrete is one of the most widely used construction materials all over the world. Nowadays, fibers are used in construction due to advantages such as increases in stiffness, energy absorption, ductility, and load-carrying capacity. Fibers are added to concrete to increase the structural integrity of the member, and their use is one of the emerging techniques in the construction industry. In this paper, the effective utilization of high-performance fiber reinforced concrete (HPFRC) beams has been experimentally investigated. The investigation was conducted with different steel fibers (hooked, crimped, and hybrid) under cyclic loading, and the behaviour of the HPFRC beams is compared with that of conventional beams. A total of four specimens were cast with different fiber contents and compared against conventional concrete. The fibers were added to the concrete by base-volume replacement. Silica fume and superplasticizers were used to modify the properties of the concrete. Single-point loading was carried out for all specimens, and the beam specimens were subjected to cyclic loading. The load-deflection behaviour of the fiber mixes is compared with that of conventional concrete. The ultimate load-carrying capacity, energy absorption, and ductility of the hybrid fiber reinforced concrete are higher than those of conventional concrete by 5% to 10%.

Keywords: cyclic loading, ductility, high performance fiber reinforced concrete, structural integrity

Procedia PDF Downloads 267
25069 Data Confidentiality in Public Cloud: A Method for Inclusion of ID-PKC Schemes in OpenStack Cloud

Authors: N. Nalini, Bhanu Prakash Gopularam

Abstract:

The term data security refers to the degree of resistance or protection given to information from unintended or unauthorized access. The core principles of information security are confidentiality, integrity, and availability, also referred to as the CIA triad. Cloud computing services are classified as SaaS, IaaS, and PaaS services. With cloud adoption, confidential enterprise data are moved from organization premises to an untrusted public network, and due to this, the attack surface has increased manifold. Several cloud computing platforms, such as OpenStack, Eucalyptus, and Amazon EC2, allow users to build and configure public, hybrid, and private clouds. While traditional encryption based on PKI infrastructure still works in the cloud scenario, the management of public-private keys and trust certificates is difficult. Identity-based Public Key Cryptography (ID-PKC) overcomes this problem by using publicly identifiable information for generating keys, and it works well with decentralized systems: users can exchange information securely without having to manage any trust information. Another advantage is that access control information (such as a role-based access control policy) can be embedded into the data, unlike in PKI, where it is handled by a separate component or system. In the OpenStack cloud platform, the Keystone service acts as the identity service for authentication and authorization and has support for public key infrastructure for its services. In this paper, we explain the OpenStack security architecture and evaluate its PKI infrastructure for data confidentiality. We provide a method to integrate ID-PKC schemes for securing data in transit and at rest, and we explain the key measures for safeguarding data against security attacks. The proposed approach uses the JPBC crypto library for key-pair generation based on the IEEE P1363.3 standard and secure communication with other cloud services.

Keywords: data confidentiality, identity-based cryptography, secure communication, OpenStack Keystone, token scoping

Procedia PDF Downloads 379
25068 Functional Instruction Set Simulator (ISS) of a Neural Network (NN) IP with Native BF-16 Generator

Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula

Abstract:

A functional model that mimics the functional correctness of a neural network compute accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. The major challenge we faced was the incompatibility of GCC compilers with the BF-16 data type, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, working with big GEMM (General Matrix Multiplication) or SpMM (Sparse Matrix Multiplication) workloads, dense or sparse, and debugging failures related to data integrity is highly painstaking. In this paper, we address the quality challenge of such a complex neural network accelerator design by proposing a functional-model-based scoreboard, or software model, using SystemC. The proposed functional model executes assembly code based on the ISA of the processor IP, decodes all instructions, and executes them as the DUT is expected to. The model gives a great deal of visibility and debug capability into the DUT, bringing up micro-steps of execution.

Keywords: ISA (instruction set architecture), NN (neural network), TLM (transaction-level modeling), GEMM (general matrix multiplication)

Procedia PDF Downloads 79
25067 The Internet of Things Ecosystem: Survey of the Current Landscape, Identity Relationship Management, Multifactor Authentication Mechanisms, and Underlying Protocols

Authors: Nazli W. Hardy

Abstract:

A critical component of the Internet of Things (IoT) ecosystem is the need for secure and appropriate transmission, processing, and storage of data. Our current forms of authentication and identity and access management do not suffice because they are not designed to serve cohesive, integrated, interconnected devices and service applications. The seemingly endless opportunities of IoT are in fact circumscribed on multiple levels by concerns such as trust, privacy, security, loss of control, and related issues. This paper considers multi-factor authentication (MFA) mechanisms and cohesive identity relationship management (IRM) standards, and it surveys messaging protocols that are appropriate for the IoT ecosystem.

Keywords: identity relation management, multifactor authentication, protocols, survey of internet of things ecosystem

Procedia PDF Downloads 346
25066 Reducing the Computational Cost of a Two-way Coupling CFD-FEA Model via a Multi-scale Approach for Fire Determination

Authors: Daniel Martin Fellows, Sean P. Walton, Jennifer Thompson, Oubay Hassan, Kevin Tinkham, Ella Quigley

Abstract:

Structural integrity is a key performance parameter for cladding products, especially concerning fire performance. Cladding products such as PIR-based sandwich panels are tested rigorously, in line with industrial standards. Physical fire tests are necessary to ensure customer safety but give little information about critical behaviours that can help develop new materials. Numerical modelling is a tool that can help investigate a fire's behaviour further by replicating the fire test. However, fire is an interdisciplinary problem: it is a chemical reaction that behaves fluidly and impacts structural integrity. An analysis using both Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) is needed to capture all aspects of a fire performance test. One method is a two-way coupling analysis that imports the updated changes in thermal data, due to the fire's behaviour, into the FEA solver in a series of iterations. In our recent work with Tata Steel UK using a two-way coupling methodology to determine fire performance, it was shown that a program called FDS-2-Abaqus can predict a BS 476-22 furnace test with a degree of accuracy. The test demonstrated the fire performance of Tata Steel UK's Trisomet product, a polyisocyanurate (PIR) based sandwich panel used for cladding. Previous works demonstrated the limitations of the current version of the program, the main one being the computational cost of modelling three Trisomet panels, totalling an area of 9 m². The computational cost increases substantially with the intention to scale up to an LPS 1181-1 test, which includes a total panel surface area of 200 m². The FDS-2-Abaqus program is developed further within this paper to overcome this obstacle and better accommodate Tata Steel UK's PIR sandwich panels. The new developments aim to reduce the computational cost and the error margin compared to experimental data.
One avenue explored is a multi-scale approach in the form of Reduced Order Modelling (ROM). The approach allows the user to include refined details of the sandwich panels, such as the overlapping joints, without a computationally costly mesh size. Comparative studies will be made between the new implementations and the previous study completed using the original FDS-2-Abaqus program. Validation of the study will come from physical experiments in line with governing-body standards such as BS 476-22 and LPS 1181-1. The physical experimental data include the panels' gas and surface temperatures and mechanical deformation. Conclusions are drawn, noting the impact of the new implementations and discussing the feasibility of scaling up further to a whole warehouse.

Keywords: fire testing, numerical coupling, sandwich panels, thermo fluids

Procedia PDF Downloads 70
25065 Improving the Patient Guidance Satisfaction and Integrity of Patients Hospitalized in Iodine-131 Isolation Rooms

Authors: Yu Sin Syu

Abstract:

Objective: The study aimed to improve patient guidance satisfaction among patients hospitalized in iodine-131 isolation rooms, as well as the patient guidance completion rate for such patients. Method: A patient care guidance checklist and a patient care guidance satisfaction questionnaire were administered to 29 patients who had previously been hospitalized in iodine-131 isolation rooms. The evaluation was conducted on a one-on-one basis, and its results showed that the patients' satisfaction with patient guidance was only 3.7 points and that the completion rate for the patient guidance performed by nurses was only 67%. Therefore, various solutions were implemented to create a more complete patient guidance framework for nurses, including the incorporation of regular care-related training into in-service education courses; the establishment of patient care guidance standards for patients in iodine-131 isolation rooms; the establishment of inpatient care standards and auditing processes for iodine-131 isolation rooms; the creation of an introductory handbook on the ward environment; the involvement of other care team members in revising the iodine-131 health education brochures; the creation of visual cards and videos covering equipment operation procedures; and the introduction of QR codes. Results: Following the implementation of the above measures, the overall satisfaction of patients hospitalized in iodine-131 isolation rooms increased from 3.7 points to 4.6 points, and the completion rate for patient guidance rose from 67% to 100%. Conclusion: Given the results achieved in this study, it is hoped that this nursing project can serve as a benchmark for other relevant departments.

Keywords: admission care guidance, guidance satisfaction, integrity, iodine-131 isolation

Procedia PDF Downloads 123
25064 Radio Based Location Detection

Authors: M. Pallikonda Rajasekaran, J. Joshapath, Abhishek Prasad Shaw

Abstract:

Various techniques have been employed to determine location, such as GPS, GLONASS, Galileo, and BeiDou (Compass). This paper deals with finding location using existing FM signals that operate between 88 and 108 MHz. The location can be determined from the received signal strength of nearby FM stations by mapping the signal strength values using the trilateration concept. This approach provides security for users' data and maintains an eco-friendly environment at zero installation cost, as it relies on FM stations already operating in the commercial 88-108 MHz band. In addition to signal-strength-based trilateration, the system finds the azimuthal angle of the transmitter by employing a directional antenna, such as a Yagi-Uda antenna, at the receiver side.
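The trilateration step can be sketched as follows: a hedged Python illustration that first converts received signal strength to an approximate distance with a log-distance path-loss model (the `tx_power_dbm` and `path_loss_exp` values are illustrative assumptions, not calibrated measurements), then solves the linearized circle equations by least squares.

```python
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-30.0, path_loss_exp=2.0):
    """Convert received signal strength to distance via a log-distance
    path-loss model (illustrative parameters, not calibrated values)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(stations, distances):
    """Least-squares position fix from >= 3 known station positions and
    estimated distances, obtained by subtracting the first circle equation
    from the others to linearize the system."""
    (x1, y1), d1 = stations[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(stations[1:], distances[1:]):
        A.append([2 * (xi - x1), 2 * (yi - y1)])
        b.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol  # estimated (x, y)

# Three hypothetical FM transmitters at known coordinates (arbitrary units)
stations = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([3.0, 4.0])
# Simulated RSSI readings consistent with the inverse path-loss model
rssi = [-30.0 - 20 * np.log10(np.linalg.norm(true_pos - np.array(s)))
        for s in stations]
distances = [rssi_to_distance(r) for r in rssi]
print(trilaterate(stations, distances))  # recovers approximately [3. 4.]
```

In practice the RSSI-to-distance step dominates the error budget, which is why the abstract's additional direction-finding with a Yagi-Uda antenna is a useful complement.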

Keywords: location, existing FM signals, received signal strength, trilateration, security, eco-friendly, direction, privacy, zero installation cost

Procedia PDF Downloads 514
25063 Abnormality Detection of Persons Living Alone Using Daily Life Patterns Obtained from Sensors

Authors: Ippei Kamihira, Takashi Nakajima, Taiyo Matsumura, Hikaru Miura, Takashi Ono

Abstract:

In this research, the goal was the construction of a system in which multiple sensors observe the daily life behavior of persons living alone while respecting their privacy. This information is used to judge conditions such as poor physical condition or a fall in the home, so that these abnormal conditions can be made known to relatives and third parties. The daily life patterns of persons living alone are expressed by the number of sensor responses within each set time period. By comparing data for the prior two weeks, it was possible to judge a situation as 'normal' when the person was in good physical condition or as 'abnormal' when the person was in poor physical condition.
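A minimal sketch of this comparison rule, assuming the 'normal/abnormal' judgment is a simple deviation test of today's per-slot sensor count against the prior two weeks (the abstract does not specify its exact criterion, so the rule and threshold below are illustrative):

```python
import statistics

def is_abnormal(baseline_counts, today_count, k=2.0):
    """Flag a time slot as abnormal when today's sensor response count
    deviates from the two-week baseline by more than k standard deviations.
    Illustrative rule; the paper's exact criterion is not specified."""
    mean = statistics.mean(baseline_counts)
    stdev = statistics.pstdev(baseline_counts) or 1.0  # guard against zero spread
    return abs(today_count - mean) > k * stdev

# 14 days of motion-sensor response counts for the same evening time slot
baseline = [42, 38, 45, 40, 44, 39, 41, 43, 40, 42, 38, 44, 41, 40]
print(is_abnormal(baseline, 41))  # typical activity -> False
print(is_abnormal(baseline, 2))   # almost no activity -> True
```

A real deployment would apply this per sensor and per time slot, and only alert relatives after several consecutive abnormal slots to reduce false alarms.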

Keywords: sensors, elderly living alone, abnormality detection, lifestyle habits

Procedia PDF Downloads 246
25062 Secure E-Pay System Using Steganography and Visual Cryptography

Authors: K. Suganya Devi, P. Srinivasan, M. P. Vaishnave, G. Arutperumjothi

Abstract:

Today’s internet world is highly prone to various online attacks, of which the most harmful is phishing. Attackers host fake websites that are very similar to, and look like, legitimate ones. We propose an image-based authentication scheme using steganography and visual cryptography to prevent phishing. This paper presents a secure steganographic technique for true-color (RGB) images and uses the Discrete Cosine Transform to compress the images. The proposed method hides the secret data inside the cover image. Visual cryptography is used to preserve the privacy of an image by decomposing the original image into two shares. The original image can be identified only when both qualified shares are simultaneously available; an individual share does not reveal the identity of the original image. Thus, the existence of the secret message is hard to detect by RS steganalysis.
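As a hedged illustration of the two-share idea, the sketch below uses the simple XOR-based variant of (2,2) secret sharing on one row of binary pixels. Classical visual cryptography instead expands each pixel into subpixel patterns so that stacking printed transparencies reveals the secret, but the core security property is the same: either share alone is uniformly random.

```python
import secrets

def make_shares(secret_bits):
    """(2,2) XOR-based secret sharing: share1 is uniformly random and
    share2 = secret XOR share1, so each share alone carries no information."""
    share1 = [secrets.randbelow(2) for _ in secret_bits]
    share2 = [s ^ r for s, r in zip(secret_bits, share1)]
    return share1, share2

def reconstruct(share1, share2):
    """Combining both qualified shares recovers the original pixels."""
    return [a ^ b for a, b in zip(share1, share2)]

secret = [1, 0, 1, 1, 0, 0, 1, 0]  # one row of a binary image
s1, s2 = make_shares(secret)
print(reconstruct(s1, s2) == secret)  # True
```

In the paper's scheme one share would be stored with the server and one with the user, so a phishing site holding neither cannot present the correct combined image.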

Keywords: image security, random LSB, steganography, visual cryptography

Procedia PDF Downloads 325
25061 Protective Role of CoQ10 or L-Carnitine on the Integrity of the Myocardium in Doxorubicin Induced Toxicity

Authors: Gehan A. Hegazy, Hesham N. Mustafa, Sally A. El Awdan, Marawan AbdelBaset

Abstract:

Doxorubicin (DOX) is a chemotherapeutic agent used for the treatment of different cancers, and its clinical usage is hindered by oxidative injury-related cardiotoxicity. This work aims to determine whether the harmful effects of DOX on the heart can be alleviated with the use of Coenzyme Q10 (CoQ10) or L-carnitine. The study was performed on seventy-two female Wistar albino rats divided into six groups of 12 animals each: a control group; a DOX group (10 mg/kg); a CoQ10 group (200 mg/kg); an L-carnitine group (100 mg/kg); a DOX + CoQ10 group; and a DOX + L-carnitine group. Oral CoQ10 or L-carnitine treatment started five days before a single 10 mg/kg dose of DOX injected intraperitoneally (IP), and the treatment then continued for ten days. At the end of the study, serum biochemical parameters of cardiac damage, oxidative stress indices, and histopathological changes were investigated. CoQ10 or L-carnitine showed noticeable effects in improving cardiac function, as evidenced by reduced serum markers, including interleukin-1 beta (IL-1β), tumor necrosis factor alpha (TNF-α), leptin, lactate dehydrogenase (LDH), cardiotrophin-1, troponin-I, and troponin-T. Both also alleviated oxidative stress, decreasing cardiac malondialdehyde (MDA) and nitric oxide (NO) and restoring cardiac reduced glutathione to normal levels, and both corrected the cardiac alterations histologically and ultrastructurally, with visible improvements in the α-SMA, vimentin, and eNOS immunohistochemical markers. In conclusion, CoQ10 or L-carnitine supplementation improves the functional and structural integrity of the myocardium.

Keywords: CoQ10, doxorubicin, L-Carnitine, cardiotoxicity

Procedia PDF Downloads 164
25060 Multi-Level Security Measures in Cloud Computing

Authors: Shobha G. Ranjan

Abstract:

Cloud computing is an emerging, on-demand, internet-based technology. A variety of services, such as software, hardware, data storage, and infrastructure, can be shared through cloud computing. This technology is highly reliable, cost-effective, and scalable in nature. It is imperative that only authorized users access these services, and the time granted to access them should be recorded for proper accounting purposes. Currently, many organizations implement security measures in many different ways to provide the best cloud infrastructure to their clients, but these approaches are not without limitations. This paper presents a multi-level security measure technique that is in accordance with the OSI model. Details of the proposed multi-level security measure technique are presented along with the architecture, activities, algorithms, and the probability of success in breaking authentication.

Keywords: cloud computing, cloud security, integrity, multi-tenancy, security

Procedia PDF Downloads 496
25059 A Review: Artificial Intelligence (AI) Driven User Access Management and Identity Governance

Authors: Rupan Preet Kaur

Abstract:

This article reviews the potential of artificial intelligence in the fields of identity and access management (IAM) and identity governance and administration (IGA), the most critical pillars of any organization. It outlines the power of leveraging AI in the most complex environments with huge user bases by simplifying and streamlining user access approvals and re-certifications without any impact on user productivity, while at the same time strengthening the overall compliance of the IAM landscape. Certain challenges encountered in the current state are detailed, where the majority of organizations still lack maturity in data integrity. Finally, this paper concludes that, within the realm of possibility, users and application owners can reap the benefits of the unified approach provided by AI to improve the user experience, improve overall efficiency, and strengthen the risk posture.

Keywords: artificial intelligence, machine learning, user access review, access approval

Procedia PDF Downloads 87
25058 Predicting Daily Patient Hospital Visits Using Machine Learning

Authors: Shreya Goyal

Abstract:

The study aims to build user-friendly software to understand patient arrival patterns and compute the number of potential patients who will visit a particular health facility in a given period by using a machine learning algorithm. The underlying machine learning algorithm used in this study is the Support Vector Machine (SVM). Accurate prediction of patient arrivals allows hospitals to operate more effectively, providing timely and efficient care while optimizing resources and improving the patient experience. It allows for better allocation of staff, equipment, and other resources: if there is a projected surge in patients, additional staff or resources can be allocated to handle the influx, preventing bottlenecks or delays in care. Understanding patient arrival patterns can also help streamline processes to minimize waiting times and ensure timely access to care for patients in need. Another big advantage of using this software is adherence to strict data protection regulations, such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States, because the hospital does not have to share the data with any third party or upload it to the cloud: the software reads data locally from the machine. The data needs to be arranged in a particular format, and the software will then read it and provide meaningful output. Using software that operates locally can facilitate compliance with these regulations by minimizing data exposure. Keeping patient data within the hospital's local systems reduces the risk of unauthorized access or breaches associated with transmitting data over networks or storing it on external servers, which helps maintain the confidentiality and integrity of sensitive patient information. Historical patient data is used in this study. The input variables used to train the model include patient age, time of day, day of the week, seasonal variations, and local events.
The algorithm uses a supervised learning method to optimize the objective function and find the global minimum: it stores the value of the local minimum after each iteration and, at the end, compares all the local minima to find the global minimum. The strength of this study is the transfer function used to calculate the number of patients. The model has an output accuracy of >95%. The method proposed in this study could be used for better management planning of personnel and medical resources.
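A minimal sketch of the arrival-prediction idea, using scikit-learn's `SVR` on synthetic data built from the kinds of inputs the study names (time of day, day of week, seasonality). The feature construction, synthetic target, and parameters here are illustrative assumptions, not the study's actual pipeline or data.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 500
hour = rng.integers(0, 24, n)
weekday = rng.integers(0, 7, n)   # 0 = Monday
month = rng.integers(1, 13, n)
# Synthetic ground truth: busier at mid-day, on Mondays, and in winter
visits = (20 + 8 * np.exp(-((hour - 12) ** 2) / 18)
          + 5 * (weekday == 0) + 4 * np.isin(month, [12, 1, 2])
          + rng.normal(0, 1.5, n))

X = np.column_stack([hour, weekday, month])
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, visits)

# Expected arrivals for Monday noon in January (features: hour, weekday, month)
print(model.predict([[12, 0, 1]]))
```

Running the model locally, as the abstract emphasizes, keeps patient features on the hospital's own machine; only aggregate staffing forecasts need to leave it.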

Keywords: machine learning, SVM, HIPAA, data

Procedia PDF Downloads 62
25057 Formal Development of Electronic Identity Card System Using Event-B

Authors: Tomokazu Nagata, Jawid Ahmad Baktash

Abstract:

The goal of this paper is to explore the use of formal methods for an electronic identity card system. Nowadays, one of the core research directions in a constantly growing distributed environment is the improvement of the communication process, and the responsibility for proper verification becomes crucial. Formal methods can play an essential role in the development and testing of systems. This paper presents two different methodologies for assessing correctness. Our first approach employs abstract interpretation techniques to create a trace-based model of the electronic identity card system; the model was used to build a semi-decidable procedure for verifying the system model. We also developed the code for the eID system, covering three parts: logging in to the system and sending an acknowledgment from the user side, receiving all information from the server side, and logging out of the system. The new concepts of impasse and spawned sessions that we introduced led our research to original statements about the intruder’s knowledge and about eID system coding with respect to secrecy. Furthermore, we demonstrated that there is a bound on the number of sessions needed for the analysis of the system. Electronic identity (eID) cards promise to supply a universal, nation-wide mechanism for user authentication, and most European countries have started to deploy eID for government and private sector applications. Are government-issued electronic ID cards the proper way to authenticate users of online services? We use the eID project as a showcase to discuss eID from an application perspective. The new eID card has interesting design features: it is contactless, it aims to protect people’s privacy to the extent possible, and it supports cryptographically strong mutual authentication between users and services. Privacy features include support for pseudonymous authentication and per-service controlled access to individual data items.
The article discusses key concepts, the eID infrastructure, observed and expected problems, and open questions. The core technology seems ready for prime time, and government projects are deploying it to the masses, but application issues may hamper eID adoption for online applications.
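The session flow described above (login with acknowledgment, information transfer, logout) can be pictured as a small state machine. In Event-B this would be expressed with guarded events and machine invariants, so the Python sketch below is only an informal analogue, with hypothetical event names.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    LOGGED_IN = auto()
    ACKNOWLEDGED = auto()
    INFO_RECEIVED = auto()

# Allowed transitions mirroring the three parts of the eID session:
# login + acknowledgment, receiving server information, and logout.
TRANSITIONS = {
    (State.IDLE, "login"): State.LOGGED_IN,
    (State.LOGGED_IN, "send_ack"): State.ACKNOWLEDGED,
    (State.ACKNOWLEDGED, "receive_info"): State.INFO_RECEIVED,
    (State.INFO_RECEIVED, "logout"): State.IDLE,
}

def run_session(events):
    """Replay a sequence of events; any transition not in the table is
    a protocol violation (the analogue of a violated Event-B guard)."""
    state = State.IDLE
    for ev in events:
        if (state, ev) not in TRANSITIONS:
            raise ValueError(f"illegal event {ev!r} in state {state.name}")
        state = TRANSITIONS[(state, ev)]
    return state

print(run_session(["login", "send_ack", "receive_info", "logout"]).name)  # IDLE
```

A formal development would additionally prove invariants over such traces, e.g. that information is only received in an authenticated session, which is the kind of property the paper verifies with Event-B.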

Keywords: eID, Event-B, ProB, formal method, message passing

Procedia PDF Downloads 228
25056 Survey of Access Controls in Cloud Computing

Authors: Monirah Alkathiry, Hanan Aljarwan

Abstract:

Cloud computing is one of the most significant technologies that the world deals with, in different sectors, with different purposes and capabilities. The cloud faces various challenges in securing data from unauthorized access or modification; consequently, security risks and levels have greatly increased. Therefore, cloud service providers (CSPs) and users need secure mechanisms that ensure that data are kept secret and safe from any disclosure or exploit. For this reason, CSPs need a number of techniques and technologies to manage and secure access to cloud services in order to achieve security goals such as confidentiality, integrity, and identity and access management (IAM). This paper reviews and explores various access controls implemented in a cloud environment that achieve different security purposes. The methodology followed in this survey was to assess, evaluate, and compare these access control mechanisms and technologies based on different factors, such as the security goals they achieve, usability, and cost-effectiveness. This assessment showed that the technology used in an access control affects the security goals it achieves, and that no single access control method achieves all security goals. Consequently, such a comparison can help decision-makers properly choose the access controls that meet their requirements.

Keywords: access controls, cloud computing, confidentiality, identity and access management

Procedia PDF Downloads 128
25055 Defect Classification of Hydrogen Fuel Pressure Vessels Using Deep Learning

Authors: Dongju Kim, Youngjoo Suh, Hyojin Kim, Gyeongyeong Kim

Abstract:

Acoustic emission testing (AET) is widely used to test the structural integrity of operational hydrogen storage containers, and clustering algorithms are frequently used in pattern recognition methods to interpret AET results. However, the interpretation of AET results can vary from user to user, as the tuning of the relevant parameters relies on the user's experience and knowledge of AET. Therefore, it is necessary to use a deep learning model that identifies patterns in acoustic emission (AE) signal data to classify defects instead. In this paper, a deep learning-based model for classifying the types of defects in hydrogen storage tanks, using AE sensor waveforms, is proposed. As hydrogen storage tanks are commonly constructed from carbon fiber reinforced polymer (CFRP) composite, a defect classification dataset was collected through a tensile test on a CFRP specimen with an AE sensor attached. The classification model, using a one-dimensional convolutional neural network (1-D CNN) and synthetic minority oversampling technique (SMOTE) data augmentation, achieved 91.09% accuracy for each defect type. It is expected that the deep learning classification model in this paper, used with AET, will help in evaluating the operational safety of hydrogen storage containers.
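The core building block of such a classifier, a 1-D convolution over a sensor waveform followed by pooling into a fixed-length feature vector, can be sketched in plain NumPy. Random weights stand in for learned filters here; the paper's actual architecture and training procedure are not reproduced.

```python
import numpy as np

def conv1d(signal, kernels, stride=1):
    """Valid-mode 1-D convolution layer with ReLU activation:
    one feature map per kernel, computed as strided dot products."""
    out = []
    for k in kernels:
        span = len(k)
        feats = [signal[i:i + span] @ k
                 for i in range(0, len(signal) - span + 1, stride)]
        out.append(np.maximum(feats, 0.0))  # ReLU
    return np.array(out)  # shape: (n_kernels, n_positions)

def global_max_pool(feature_maps):
    """Collapse each feature map to its strongest response, giving a
    fixed-length vector regardless of waveform length."""
    return feature_maps.max(axis=1)

rng = np.random.default_rng(0)
waveform = rng.normal(0, 1, 256)      # stand-in for one AE sensor waveform
kernels = rng.normal(0, 1, (8, 16))   # 8 hypothetical filters of width 16
features = global_max_pool(conv1d(waveform, kernels, stride=4))
print(features.shape)  # (8,) - feature vector fed to a classifier head
```

In the trained model these filters would be learned to respond to the burst shapes of particular CFRP failure modes, and the pooled vector would feed a small dense classifier.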

Keywords: acoustic emission testing, carbon fiber reinforced polymer composite, one-dimensional convolutional neural network, SMOTE data augmentation

Procedia PDF Downloads 87
25054 Efficient Reuse of Exome Sequencing Data for Copy Number Variation Callings

Authors: Chen Wang, Jared Evans, Yan Asmann

Abstract:

With the rapid evolution of next-generation sequencing techniques, whole-exome or exome-panel data have become a cost-effective way to detect small exonic mutations, but there has been a growing desire to accurately detect copy number variations (CNVs) as well. To address these research and clinical needs, we developed a sequencing coverage pattern-based method for copy number detection, data integrity checks, CNV calling, and visualization reports. The developed methodology includes complete automation to increase usability, genome content-coverage bias correction, CNV segmentation, data quality reports, and publication-quality images. Poor-quality outlier samples were identified and removed automatically, and multiple experimental batches were routinely detected and further reduced to a clean subset of samples before analysis. Algorithm improvements were also made to improve both somatic CNV detection and germline CNV detection in trio families. Additionally, a set of utilities was included to help users produce CNV plots for focused genes of interest. We demonstrate the somatic CNV enhancements by accurately detecting CNVs in exome-wide data from The Cancer Genome Atlas cancer samples and in a lymphoma case study with paired tumor and normal samples. We also show efficient reuse of existing exome sequencing data for improved germline CNV calling in a trio family from Phase III of the 1000 Genomes Project, detecting CNVs with various modes of inheritance. The performance of the developed method is evaluated by comparing its CNV calls with results from orthogonal copy number platforms.
Through our case studies, reuse of exome sequencing data for calling CNVs offers several noticeable benefits, including better quality control of exome sequencing data, improved joint analysis with single nucleotide variant calls, and novel genomic discovery from under-utilized existing whole-exome and custom exome-panel data.
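A hedged sketch of the coverage-pattern idea: median-normalize per-exon coverage in the sample and in a reference, take log2 ratios, and threshold them into gain/loss calls. The cutoffs below are illustrative, not the paper's values, and real pipelines add GC-content correction and segmentation across adjacent exons.

```python
import numpy as np

def call_cnv(sample_cov, reference_cov, gain_thr=0.4, loss_thr=-0.6):
    """Per-exon log2 ratios of median-normalized coverage, thresholded into
    gain / loss / neutral calls (illustrative cutoffs)."""
    eps = 1e-9
    s = sample_cov / (np.median(sample_cov) + eps)      # sample normalization
    r = reference_cov / (np.median(reference_cov) + eps)  # reference normalization
    log2r = np.log2((s + eps) / (r + eps))
    calls = np.where(log2r > gain_thr, "gain",
                     np.where(log2r < loss_thr, "loss", "neutral"))
    return log2r, calls

# Hypothetical per-exon read depths: a duplicated region and a deleted region
sample = np.array([100, 105, 210, 205, 98, 45, 50, 102], float)
reference = np.array([100, 100, 100, 100, 100, 100, 100, 100], float)
log2r, calls = call_cnv(sample, reference)
print(list(calls))
# ['neutral', 'neutral', 'gain', 'gain', 'neutral', 'loss', 'loss', 'neutral']
```

The automated quality-control step the abstract describes would run before this, dropping outlier samples whose coverage profile correlates poorly with the reference pool.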

Keywords: bioinformatics, computational genetics, copy number variations, data reuse, exome sequencing, next generation sequencing

Procedia PDF Downloads 251
25053 A Lightweight Blockchain: Enhancing Internet of Things Driven Smart Buildings Scalability and Access Control Using Intelligent Direct Acyclic Graph Architecture and Smart Contracts

Authors: Syed Irfan Raza Naqvi, Zheng Jiangbin, Ahmad Moshin, Pervez Akhter

Abstract:

Currently, IoT systems depend on a centralized client-server architecture that introduces various scalability and privacy vulnerabilities. Distributed ledger technology (DLT) introduces a set of opportunities for the IoT, leading to practical ideas for existing components at all levels of existing architectures. Blockchain technology (BCT), as in Bitcoin (BTC) and Ethereum, appears to be one approach to solving several IoT problems and offers multiple possibilities. However, IoT devices are resource-constrained, with insufficient capacity and computational power to process blockchain consensus mechanisms; the traditional challenges of BCT for the IoT are poor scalability, low energy efficiency, and transaction fees. IOTA is a distributed ledger based on a Directed Acyclic Graph (DAG) that ensures machine-to-machine (M2M) micro-transactions are free of charge. IOTA has the potential to address existing IoT-related difficulties such as infrastructure scalability, privacy, and access control mechanisms. We propose an architecture, SLDBI (a Scalable, Lightweight DAG-based Blockchain Design for Intelligent IoT Systems), which adapts the DAG-based Tangle and implements a lightweight message data model to address the IoT limitations. It enables the smooth integration of new IoT devices into a variety of applications. SLDBI enables comprehensive access control, energy efficiency, and scalability in IoT ecosystems by utilizing the Masked Authentication Messaging (MAM) protocol and the IOTA Smart Contract Protocol (ISCP). Furthermore, we suggest performing proof-of-work (PoW) computation on the full node in an energy-efficient way. Experiments have been carried out to show the capability of the Tangle to achieve better scalability while maintaining energy efficiency. The findings show user access control management at granular levels and scaling up to massive networks with thousands of IoT nodes, such as smart connected buildings (SCBDs).
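The per-message proof-of-work that lets fee-less transactions be attached to a DAG ledger can be sketched as a simple nonce search. Note that real IOTA uses a trinary Curl-based PoW rather than SHA-256, so this is only an illustrative analogue of the mechanism, not the actual protocol.

```python
import hashlib

def attach_message(payload, difficulty_bits=12):
    """Illustrative per-message proof-of-work: find a nonce so that the
    message hash falls below a target, i.e. its top `difficulty_bits`
    bits are zero. Difficulty tunes the work a device must spend."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{payload}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
        nonce += 1

# A hypothetical IoT sensor message from a smart connected building
nonce, h = attach_message("sensor-42:temperature=21.3", difficulty_bits=12)
print(h[:3])  # the leading 12 bits are zero, so the hex digest starts "000"
```

The SLDBI proposal's point is precisely that this search can be offloaded to a full node, so constrained sensors only sign and transmit the message rather than burn energy on hashing.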

Keywords: blockchain, IoT, directed acyclic graph, scalability, access control, architecture, smart contract, smart connected buildings

Procedia PDF Downloads 115
25052 Enhancing Rupture Pressure Prediction for Corroded Pipes Through Finite Element Optimization

Authors: Benkouiten Imene, Chabli Ouerdia, Boutoutaou Hamid, Kadri Nesrine, Bouledroua Omar

Abstract:

Algeria is actively enhancing gas productivity by augmenting the supply flow. However, this effort has led to increased internal pressure, posing a potential risk to pipeline integrity, particularly in the presence of corrosion defects. Sonatrach relies on a vast network of pipelines spanning 24,000 kilometers for the transportation of gas and oil. The aging of these pipelines raises the likelihood of both internal and external corrosion, heightening the risk of rupture. Addressing this issue requires comprehensive in-line inspection using specialized scraping tools, which furnish a detailed assessment of all pipeline defects. It is then essential to recalculate the pressure parameters to safeguard the integrity of the corroded pipeline while ensuring the continuity of production. In this context, Sonatrach employs code-based limit pressure calculations, such as ASME B31G (2009) and the modified ASME B31G (2012). The aim of this study is to perform a comparative analysis of the various limit pressure calculation methods documented in the literature, namely DNV RP-F101, SHELL, PCORRC, NETTO, and CSA Z662, based on a dataset of 329 burst tests published in the literature. Ultimately, we introduce a novel approach grounded in the finite element method, employing ANSYS software.
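For context, the modified ASME B31G estimate that such comparative studies benchmark against can be sketched as follows, using the 0.85 dL defect-area approximation and the usual Folias bulging factor. The pipe dimensions below are hypothetical, and the standard itself should be consulted for its exact scope, units, and applicability limits.

```python
import math

def modified_b31g_pf(D, t, d, L, smys, flow_stress_add=68.95):
    """Modified ASME B31G (0.85 dL) failure pressure estimate in MPa.
    D, t: pipe diameter and wall thickness (mm); d, L: defect depth and
    axial length (mm); smys: specified minimum yield strength (MPa).
    Flow stress is taken as SMYS + 68.95 MPa (10 ksi), per the method."""
    z = L**2 / (D * t)
    if z <= 50:
        M = math.sqrt(1 + 0.6275 * z - 0.003375 * z**2)  # Folias factor
    else:
        M = 0.032 * z + 3.3
    sigma_flow = smys + flow_stress_add
    ratio = d / t
    return (2 * sigma_flow * t / D) * (1 - 0.85 * ratio) / (1 - 0.85 * ratio / M)

# Hypothetical X52 pipe: D = 508 mm, t = 8 mm, defect 3 mm deep, 120 mm long
pf = modified_b31g_pf(D=508.0, t=8.0, d=3.0, L=120.0, smys=359.0)
print(round(pf, 1))  # estimated failure pressure in MPa
```

Each competing method (DNV RP-F101, PCORRC, etc.) differs mainly in its choice of flow stress, defect-area shape factor, and bulging factor, which is what the 329-test comparison quantifies.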

Keywords: pipeline burst pressure, burst test, corrosion defect, corroded pipeline, finite element method

Procedia PDF Downloads 53