Search results for: data integrity and privacy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25834

25474 Discursive (Re/De)Construction of Objectivity-Subjectivity: Critiquing Rape/Flesh Trade-Documentaries

Authors: Muhammed Shahriar Haque

Abstract:

As an offshoot of journalistic discourse, the documentary should be objective in nature, without harbouring preconceived notions that foster ulterior motives. When it comes to a social issue like rape in South Asian countries, as the media in recent times has been inundated with this violent act in India, Pakistan, Myanmar, and Bangladesh, how does one document it in terms of objectivity and subjectivity? The objective of this study is twofold: to document the history of documentaries, and to critically analyze South Asian rape/flesh trade documentaries. The overall goal is to trace the (re/de)construction of objectivity-subjectivity in documentaries. This paper adopts a qualitative approach to documentarist discourse through the lens of critical discourse analysis (CDA). Data was gathered from 10 documentaries on the theme of rape and/or flesh trade from eight South Asian countries, predominantly the South Asian Association for Regional Cooperation (SAARC) region. The documentaries were primarily categorised using three frameworks based on six modes, six subgenres, and four basic approaches to documentary. Subsequently, the findings were critiqued from a CDA perspective. The outcome suggests that there are two schools of thought regarding documentaries. According to journalistic ethics, news and/or documentaries should be objective in orientation and focus on informing the audience and/or common people. The empirical findings tend to challenge these ethical parameters of objectivity. At times, it seems that journalistic discourse is discursively (re)constructed to give an augmented simulation of objectivity. Based on the findings, it may be recommended that if documentaries steer away from empirical facts and indulge in poetic naivety, their credibility could be questioned. Research of this nature is significant as it raises questions with regard to the ethical and moral conscience of documentary filmmakers. Furthermore, it looks at whether they uphold journalistic integrity or succumb to their biases and thereby depict subjective views, which could be tainted with political and/or propagandist ulterior motives.

Keywords: discursive (re/de)construction, documentaries, journalistic integrity, rape/flesh trade

Procedia PDF Downloads 154
25473 IoT Based Information Processing and Computing

Authors: Mannan Ahmad Rasheed, Sawera Kanwal, Mansoor Ahmad Rasheed

Abstract:

The Internet of Things (IoT) has revolutionized the way we collect and process information, making it possible to gather data from a wide range of connected devices and sensors. This has led to the development of IoT-based information processing and computing systems that are capable of handling large amounts of data in real time. This paper provides a comprehensive overview of the current state of IoT-based information processing and computing, as well as the key challenges and gaps that need to be addressed. This paper discusses the potential benefits of IoT-based information processing and computing, such as improved efficiency, enhanced decision-making, and cost savings. Despite the numerous benefits of IoT-based information processing and computing, several challenges need to be addressed to realize the full potential of these systems. These challenges include security and privacy concerns, interoperability issues, scalability and reliability of IoT devices, and the need for standardization and regulation of IoT technologies. Moreover, this paper identifies several gaps in the current research related to IoT-based information processing and computing. One major gap is the lack of a comprehensive framework for designing and implementing IoT-based information processing and computing systems.

Keywords: IoT, computing, information processing, IoT computing

Procedia PDF Downloads 188
25472 Data Transformations in Data Envelopment Analysis

Authors: Mansour Mohammadpour

Abstract:

Data transformation refers to the modification of any point in a data set by a mathematical function. When transformations are applied, the measurement scale of the data is modified. Data transformations are commonly employed to put data into an appropriate form, which can serve various functions in the quantitative analysis of the data. This study investigates the use of data transformations in Data Envelopment Analysis (DEA). Although data transformations are important options for analysis, they fundamentally alter the nature of the variable, making the interpretation of the results somewhat more complex.
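As one concrete example of such a transformation, negative inputs or outputs in DEA are often handled by a translation, i.e., adding a constant so every value becomes strictly positive. The sketch below is a minimal illustration of a column-wise shift, not a method taken from this study.

```python
import numpy as np

def translate_negative_data(X, margin=1.0):
    """Shift each column of a DEA data matrix so that all values become
    strictly positive (a simple 'translation' transformation often used
    to handle negative inputs/outputs in DEA)."""
    X = np.asarray(X, dtype=float)
    col_min = X.min(axis=0)
    # Shift only the columns that contain non-positive values.
    shifts = np.where(col_min <= 0, -col_min + margin, 0.0)
    return X + shifts, shifts

# Hypothetical example: two inputs for four decision-making units (DMUs),
# where the second column contains negative values.
X = np.array([[2.0, -3.0],
              [4.0,  1.0],
              [1.0, -0.5],
              [3.0,  2.0]])
Xt, shifts = translate_negative_data(X)
# All transformed values are now strictly positive.
```

Note that, as the abstract cautions, results computed on the shifted data must be interpreted on the translated scale, not the original one.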

Keywords: data transformation, data envelopment analysis, undesirable data, negative data

Procedia PDF Downloads 20
25471 An Integrated Approach to Handle Sour Gas Transportation Problems and Pipeline Failures

Authors: Venkata Madhusudana Rao Kapavarapu

Abstract:

The Intermediate Slug Catcher (ISC) facility was built to process nominally 234 MSCFD of export gas from the booster station on a day-to-day basis and to receive liquid slugs up to 1600 m³ (10,000 BBLS) in volume when the incoming 24” gas pipelines are pigged following upsets or production of non-dew-pointed gas from gathering centers. The maximum slug sizes expected are 812 m³ (5100 BBLS) in winter and 542 m³ (3400 BBLS) in summer after operating for a month or more at 100 MMSCFD of wet gas, being 60 MMSCFD of treated gas from the booster station, combined with 40 MMSCFD of untreated gas from gathering center. The water content is approximately 60% but may be higher if the line is not pigged for an extended period, owing to the relative volatility of the condensate compared to water. In addition to its primary function as a slug catcher, the ISC facility will receive pigged liquids from the upstream and downstream segments of the 14” condensate pipeline, returned liquids from the AGRP, pigged through the 8” pipeline, and blown-down fluids from the 14” condensate pipeline prior to maintenance. These fluids will be received in the condensate flash vessel or the condensate separator, depending on the specific operation, for the separation of water and condensate and settlement of solids scraped from the pipelines. Condensate meeting the colour and 200 ppm water specifications will be dispatched to the AGRP through the 14” pipeline, while off-spec material will be returned to BS-171 via the existing 10” condensate pipeline. When they are not in operation, the existing 24” export gas pipeline and the 10” condensate pipeline will be maintained under export gas pressure, ready for operation. The gas manifold area contains the interconnecting piping and valves needed to align the slug catcher with either of the 24” export gas pipelines from the booster station and to direct the gas to the downstream segment of either of these pipelines. 
The manifold enables the slug catcher to be bypassed if it needs to be maintained or if through-pigging of the gas pipelines is to be performed. All gas, whether bypassing the slug catcher or returning to the gas pipelines from it, passes through black powder filters to reduce the level of particulates in the stream. These items are connected to the closed drain vessel to drain the collected liquid. Condensate from the booster station is transported to the AGRP through the 14” condensate pipeline. The existing 10” condensate pipeline will be used as a standby and for utility functions, such as returning condensate from the AGRP to the ISC or booster station, or transporting off-spec fluids from the ISC back to the booster station. The manifold contains block valves that allow the two condensate export lines to be segmented at the ISC, thus facilitating independent bi-directional flow in the upstream and downstream segments, which ensures complete pipeline and facility integrity. Pipeline failures will be attended to with the latest technologies, such as remote techno-plug techniques, and repair activities will be carried out as needed. Pipeline integrity will be evaluated with in-line inspection (ILI) pigging to assess pipeline condition.

Keywords: integrity, oil & gas, innovation, new technology

Procedia PDF Downloads 72
25470 Evaluation of Deformation for Deep Excavations in the Greater Vancouver Area Through Case Studies

Authors: Boris Kolev, Matt Kokan, Mohammad Deriszadeh, Farshid Bateni

Abstract:

Due to the increasing demand for real estate and the need for efficient land utilization in Greater Vancouver, developers have been increasingly considering the construction of high-rise structures with multiple below-grade parking levels. The temporary excavations required to allow for the construction of underground levels have recently reached up to 40 meters in depth. One of the challenges with deep excavations is the prediction of wall displacements and ground settlements, due to their effect on the integrity of city utilities, infrastructure, and adjacent buildings. A large database of survey monitoring data has been collected for deep excavations in various soil conditions and shoring systems. The majority of the data collected is for tie-back anchor and shotcrete lagging systems. The data were categorized and analyzed, and the results were evaluated to find a relationship between the most dominant parameters controlling displacement, such as the depth of excavation, soil properties, and tie-back anchor loading and arrangement. For a select number of deep excavations, finite element modeling was used for analysis. The lateral displacements from the simulation results were compared to the recorded survey monitoring data. The study concludes with a discussion and comparison of the available empirical and numerical modeling methodologies for evaluating lateral displacements in deep excavations.

Keywords: deep excavations, lateral displacements, numerical modeling, shoring walls, tieback anchors

Procedia PDF Downloads 182
25469 Study on the Pavement Structural Performance of Highways in the North China Region Based on Pavement Distress and Ground Penetrating Radar

Authors: Mingwei Yi, Liujie Guo, Zongjun Pan, Xiang Lin, Xiaoming Yi

Abstract:

With the rapid expansion of road construction mileage in China, the scale of road maintenance needs has escalated concurrently. As the service life of roads extends, the design of pavement repair and maintenance emerges as a crucial component in preserving good pavement performance. The remaining service life of the asphalt pavement structure is a vital parameter in the lifecycle maintenance design of asphalt pavements. Based on an analysis of pavement structural integrity, this study introduces a characterization and assessment of the remaining life of existing asphalt pavement structures. It proposes indicators such as transverse crack spacing and the length of longitudinal cracks. Transverse crack spacing decreases with an increase in the maintenance interval and with the extended use of semi-rigid base layer structures, although this trend becomes less pronounced after maintenance intervals exceed four years. The length of longitudinal cracks increases with longer maintenance intervals, but this trend weakens after five years. This system can support greater standardization and more scientific design in highway maintenance decision-making processes.

Keywords: structural integrity, highways, pavement evaluation, asphalt concrete pavement

Procedia PDF Downloads 70
25468 Study on Effect of Reverse Cyclic Loading on Fracture Resistance Curve of Equivalent Stress Gradient (ESG) Specimen

Authors: Jaegu Choi, Jae-Mean Koo, Chang-Sung Seok, Byungwoo Moon

Abstract:

Since massive earthquakes around the world have been reported recently, the safety of nuclear power plants under seismic loading has become a significant issue. Seismic loading is a reverse cyclic loading, consisting of repeated tension and compression caused by longitudinal and transverse waves. To date, studies on the characteristics of fracture toughness under reverse cyclic loading have been insufficient. Therefore, it is necessary to obtain the fracture toughness under reverse cyclic loading for the integrity estimation of nuclear power plants under seismic load. Fracture resistance (J-R) curves, which are used for the determination of fracture toughness or for integrity estimation in terms of elastic-plastic fracture mechanics, can be derived from fracture resistance tests using the single-specimen technique. The objective of this paper is to study the effects of reverse cyclic loading on the fracture resistance curve of the ESG specimen, which has a stress gradient similar to that of the crack surface of a real pipe. For this, we carried out fracture toughness tests under reverse cyclic loading while varying the incremental plastic displacement. Test results showed that the J-R curves decreased with decreasing incremental plastic displacement.

Keywords: reverse cyclic loading, J-R curve, ESG specimen, incremental plastic displacement

Procedia PDF Downloads 388
25467 Encryption and Decryption of Nucleic Acid Using Deoxyribonucleic Acid Algorithm

Authors: Iftikhar A. Tayubi, Aabdulrahman Alsubhi, Abdullah Althrwi

Abstract:

The deoxyribonucleic acid (DNA) sequence text provides a single, high-quality source for cryptography on DNA sequences for structural biologists. We provide an intuitive, well-organized, and user-friendly web interface that allows users to encrypt and decrypt DNA sequence text, using an algorithm to secure the sequence during encryption and decryption. The utility of this tool is that it offers a user-friendly interface for users to encrypt, decrypt, and store information about DNA sequences. The interfaces created in this project will satisfy the demands of the scientific community by fully encrypting DNA sequences through the website. We adopted a methodology using C# and ASP.NET for programming, which is efficient and secure. The tool is well suited to encrypting large quantities of sequence data efficiently, and users can navigate between encodings and stored text depending on their field of interest. The algorithm protects the DNA sequence from change, whether an alteration or error occurs during data transfer, and checks the integrity of the DNA sequence data on access.
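The abstract does not specify the encryption algorithm, and the authors' implementation is in C# and ASP.NET. As a hypothetical illustration only, the general idea of DNA-sequence encryption with an integrity check can be sketched as a 2-bit base encoding combined with a symmetric XOR cipher and a hash digest:

```python
# Hypothetical sketch: encrypt/decrypt a DNA string by mapping each base
# to a 2-bit code, XOR-ing it with a repeating key, and verifying
# integrity with a SHA-256 digest. Not the authors' actual algorithm.
import hashlib
from itertools import cycle

BASE_TO_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BITS_TO_BASE = {v: k for k, v in BASE_TO_BITS.items()}

def dna_xor(seq, key):
    """Encrypt or decrypt (XOR is its own inverse) a DNA string."""
    ks = cycle(key)
    out = [BASE_TO_BITS[b] ^ (next(ks) & 0b11) for b in seq]
    return "".join(BITS_TO_BASE[c] for c in out)

def digest(seq):
    """Integrity check: detect alteration during data transfer."""
    return hashlib.sha256(seq.encode()).hexdigest()

plain = "ACGTTGCA"
key = b"secret"
cipher = dna_xor(plain, key)
recovered = dna_xor(cipher, key)  # same key restores the plaintext
ok = digest(recovered) == digest(plain)
```

A real deployment would use an authenticated cipher rather than a toy XOR, but the encrypt/decrypt/verify structure mirrors the workflow described above.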

Keywords: algorithm, ASP.NET, DNA, encrypt, decrypt

Procedia PDF Downloads 234
25466 Signal Processing of Barkhausen Noise Signal for Assessment of Increasing Down Feed in Surface Ground Components with Poor Micro-Magnetic Response

Authors: Tanmaya Kumar Dash, Tarun Karamshetty, Soumitra Paul

Abstract:

The Barkhausen Noise Analysis (BNA) technique has been utilized to assess the surface integrity of steels, but it is not very successful in evaluating the surface integrity of ground steels that exhibit a poor micro-magnetic response. A new approach has been proposed for processing the BN signal with fast Fourier transforms (FFT), while wavelet transforms, with a judicious choice of the threshold value, have been used to remove noise from the BN signal when the micro-magnetic response of the work material is poor. In the present study, the effect of down feed in conventional plunge surface grinding of hardened bearing steel has been investigated, along with ultrasonically cleaned, wet-polished, and spark-out-ground samples for benchmarking. Moreover, FFT analysis has been performed at different sets of applied voltages and frequencies, and the pattern of the BN signal in the frequency domain is analyzed. The study also applies the wavelet transform technique with different levels of decomposition and different mother wavelets to reduce the noise in BN signals of materials with poor micro-magnetic response, in order to standardize the procedure for all BN signals depending on the frequency of the applied voltage.
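As an illustration of the wavelet-threshold denoising step described above (the study's actual mother wavelets, decomposition levels, and threshold choice are not specified here), a one-level Haar soft-threshold sketch on a synthetic signal might look like this:

```python
# Illustrative one-level Haar wavelet soft-threshold denoising.
# The signal, noise level, and threshold below are invented for the
# example; they are not values from the study.
import numpy as np

def haar_denoise(x, threshold):
    x = np.asarray(x, dtype=float)
    assert len(x) % 2 == 0, "even-length signal assumed for one Haar level"
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)  # low-pass (trend)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)  # high-pass (noise + edges)
    # Soft thresholding: shrink small detail coefficients toward zero.
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    # Inverse Haar transform.
    y = np.empty_like(x)
    y[0::2] = (approx + detail) / np.sqrt(2)
    y[1::2] = (approx - detail) / np.sqrt(2)
    return y

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)
denoised = haar_denoise(noisy, threshold=0.3)
```

In practice, libraries such as PyWavelets provide multi-level decompositions and many mother wavelets, so the threshold and wavelet can be tuned to the micro-magnetic response of the material, as the study proposes.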

Keywords: barkhausen noise analysis, grinding, magnetic properties, signal processing, micro-magnetic response

Procedia PDF Downloads 667
25465 The Effects of Gas Metal Arc Welding Parameters on the Corrosion Behaviour of Austenitic Stainless Steel Immersed in Aqueous Sodium Hydroxide

Authors: I. M. B. Omiogbemi, D. S. Yawas, I. M. Dagwa, F. G. Okibe

Abstract:

This work presents the effects of some gas metal arc welding parameters on the corrosion behaviour of austenitic stainless steel exposed to 0.5 M sodium hydroxide at ambient temperature (298 K), using conventional weight-loss determination together with surface morphology evaluation by scanning electron microscopy, and applies a factorial design of experiment to determine the welding conditions that enhance the integrity of the welded stainless steel. The welding variables evaluated were speed, voltage, and current. Samples of the welded stainless steel were immersed in the corrosion environment for 8, 16, 24, 32, and 40 days, and the weight loss was determined. From the results, it was found that increasing the welding current and speed at constant voltage gave the optimum performance of the austenitic stainless steel in this environment. At a speed of 40 cm/min, a current of 110 A, and a voltage of 230 V, the welded stainless steel showed only a 0.0015 mg loss in weight after 40 days. Pit-like openings were observed on the surface of the metals, indicating corrosion, but they were minimal at the optimum conditions. It was concluded that a relatively high welding speed and current at constant voltage give a welded austenitic stainless steel with better integrity.
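As background, weight-loss measurements of the kind described above are conventionally converted to a corrosion rate with the ASTM G31-style relation CR = K·W/(A·T·D). The specimen area and alloy density in the sketch below are illustrative assumptions, not values reported in the study.

```python
# Corrosion rate from weight loss: CR = (K * W) / (A * T * D),
# where W is weight loss (g), A is exposed area (cm^2), T is exposure
# time (h), and D is density (g/cm^3).
def corrosion_rate_mm_per_year(weight_loss_g, area_cm2, hours, density_g_cm3):
    K = 8.76e4  # unit constant giving mm/year for g, cm^2, h, g/cm^3
    return (K * weight_loss_g) / (area_cm2 * hours * density_g_cm3)

# Hypothetical example: 0.0015 g lost over 40 days (960 h) by a 10 cm^2
# austenitic stainless steel coupon (density ~7.9 g/cm^3).
rate = corrosion_rate_mm_per_year(0.0015, 10.0, 40 * 24, 7.9)
```

Such a conversion lets weight-loss results from different coupon sizes and exposure times be compared on a common penetration-rate scale.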

Keywords: welding, current, speed, austenitic stainless steel, sodium hydroxide

Procedia PDF Downloads 318
25464 Integration of “FAIR” Data Principles in Longitudinal Mental Health Research in Africa: Lessons from a Landscape Analysis

Authors: Bylhah Mugotitsa, Jim Todd, Agnes Kiragga, Jay Greenfield, Evans Omondi, Lukoye Atwoli, Reinpeter Momanyi

Abstract:

The INSPIRE network aims to build an open, ethical, sustainable, and FAIR (Findable, Accessible, Interoperable, Reusable) data science platform, particularly for longitudinal mental health (MH) data. While studies have been done at the clinical and population levels, limitations in data and research still exist in LMICs, which poses a risk of underrepresentation of mental disorders. It is vital to examine the existing longitudinal MH data, focusing on how FAIR the datasets are. This landscape analysis aimed to provide both an overall level of evidence on the availability of longitudinal datasets and the degree of consistency in the longitudinal studies conducted. Utilizing prompts proved instrumental in streamlining the analysis process, facilitating access, crafting code snippets, and categorizing and analyzing extensive data repositories related to depression, anxiety, and psychosis in Africa. Leveraging artificial intelligence (AI), we filtered over 18,000 scientific papers spanning 1970 to 2023. This AI-driven approach enabled the identification of 228 longitudinal research papers meeting the inclusion criteria. Quality assurance revealed 10% incorrectly identified articles and 2 duplicates, underscoring the prevalence of longitudinal MH research in South Africa, with a focus on depression. From the analysis, evaluating data and metadata adherence to FAIR principles remains crucial for enhancing the accessibility and quality of MH research in Africa. While AI has the potential to enhance research processes, challenges such as privacy concerns and data security risks must be addressed. Ethical and equity considerations in data sharing and reuse are also vital. There is a need for collaborative efforts across disciplinary and national boundaries to improve the Findability and Accessibility of data. Current efforts should also focus on creating integrated data resources and tools to improve the Interoperability and Reusability of MH data. Practical steps for researchers include careful study planning, data preservation, machine-actionable metadata, and promoting data reuse to advance science and improve equity. Metrics and recognition should be established to incentivize adherence to FAIR principles in MH research.

Keywords: longitudinal mental health research, data sharing, FAIR data principles, Africa, landscape analysis

Procedia PDF Downloads 90
25463 Modeling Of The Random Impingement Erosion Due To The Impact Of The Solid Particles

Authors: Siamack A. Shirazi, Farzin Darihaki

Abstract:

Solid particles are found in many multiphase flows, including transport pipelines and pipe fittings. Such particles interact with the pipe material and cause erosion, which threatens the integrity of the system. Therefore, predicting the erosion rate is an important factor in the design and monitoring of such systems. Mechanistic models can provide reliable predictions for many conditions while demanding only relatively low computational cost. They utilize a representative particle trajectory to predict the impact characteristics of the majority of the particle impacts that cause the maximum erosion rate in the domain. The erosion caused by particle impacts is due not only to direct impacts but also to random impingements. In the present study, an alternative model is introduced to describe the erosion due to random impingement of particles. The model provides a realistic trend for erosion with changes in particle size and particle Stokes number. It is examined against experimental data and CFD simulation results and shows better agreement with the data in comparison to the available models in the literature.
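The abstract does not give the equations of the proposed random-impingement model. As background only, mechanistic sand-erosion predictions are often built on a correlation of the E/CRC form, sketched below with representative literature coefficients; these are assumptions for illustration, not the model proposed here.

```python
# Generic E/CRC-style mechanistic erosion correlation (illustrative):
#   ER = C * BH**-0.59 * Fs * V**n * F(theta)
# ER is the erosion ratio (mass of wall material removed per mass of
# impacting sand), BH the Brinell hardness, Fs a particle sharpness
# factor, V the impact speed, and F(theta) an impact-angle function.
import math

def erosion_ratio(v_impact, theta_rad, BH=120.0, Fs=0.5, C=2.17e-7, n=2.41):
    """Erosion ratio for one representative impact; coefficients are
    representative literature values, not this paper's model."""
    # Polynomial impact-angle function F(theta), a representative fit.
    A = [5.40, -10.11, 10.93, -6.33, 1.42]
    F = sum(a * theta_rad ** (i + 1) for i, a in enumerate(A))
    return C * BH ** -0.59 * Fs * v_impact ** n * F

# Erosion ratio for a 15 m/s impact at 30 degrees:
er = erosion_ratio(15.0, math.radians(30.0))
```

A mechanistic model combines such a per-impact relation with a representative trajectory and impact frequency to obtain a wall penetration rate, which is where the direct-impact and random-impingement contributions discussed above enter.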

Keywords: erosion, mechanistic modeling, particles, multiphase flow, gas-liquid-solid

Procedia PDF Downloads 169
25462 Mapping a Data Governance Framework to the Continuum of Care in the Active Assisted Living Context

Authors: Gaya Bin Noon, Thoko Hanjahanja-Phiri, Laura Xavier Fadrique, Plinio Pelegrini Morita, Hélène Vaillancourt, Jennifer Teague, Tania Donovska

Abstract:

Active Assisted Living (AAL) refers to systems designed to improve the quality of life, aid in independence, and create healthier lifestyles for care recipients. As the population ages, there is a pressing need for non-intrusive, continuous, adaptable, and reliable health monitoring tools to support aging in place. AAL has great potential to support these efforts with the wide variety of solutions currently available, but insufficient efforts have been made to address concerns arising from the integration of AAL into care. The purpose of this research was to (1) explore the integration of AAL technologies and data into the clinical pathway, and (2) map data access and governance for AAL technology in order to develop standards for use by policy-makers, technology manufacturers, and developers of smart communities for seniors. This was done through four successive research phases: (1) literature search to explore existing work in this area and identify lessons learned; (2) modeling of the continuum of care; (3) adapting a framework for data governance into the AAL context; and (4) interviews with stakeholders to explore the applicability of previous work. Opportunities for standards found in these research phases included a need for greater consistency in language and technology requirements, better role definition regarding who can access and who is responsible for taking action based on the gathered data, and understanding of the privacy-utility tradeoff inherent in using AAL technologies in care settings.

Keywords: active assisted living, aging in place, internet of things, standards

Procedia PDF Downloads 132
25461 Comparing Different Frequency Ground Penetrating Radar Antennas for Tunnel Health Assessment

Authors: Can Mungan, Gokhan Kilic

Abstract:

Structural engineers and tunnel owners have good reason to attach importance to the assessment and inspection of tunnels. Regular inspection is necessary to maintain and monitor the health of the structure, not only at the present time but throughout its life cycle. Detection of flaws within the structure, such as corrosion and the formation of cracks within its internal elements, can go a long way toward ensuring that the structure maintains its integrity over the course of its life. Other issues that may be detected early through regular assessment include tunnel surface delamination and corrosion of the rebar. One advantage of newer technology such as ground penetrating radar (GPR) is the early detection of such imperfections. This study discusses and presents the effectiveness of GPR as a tool for assessing the structural integrity of a heavily used tunnel. GPR was used with antennas of different frequencies and application methods (2 GHz and 500 MHz). The paper attempts to produce a greater understanding of structural defects and to identify the correct tool for such purposes. Conquest View, with 3D scanning capabilities, was used throughout the analysis, reporting, and interpretation of the results. This study illustrates GPR mapping and its effectiveness in providing valuable information on rebar position (lower and upper reinforcement). It also shows how such techniques can detect structural features that would otherwise remain unseen, as well as moisture ingress.

Keywords: tunnel, GPR, health monitoring, moisture ingress, rebar position

Procedia PDF Downloads 119
25460 Unlocking Health Insights: Studying Data for Better Care

Authors: Valentina Marutyan

Abstract:

Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of, and approach to, providing healthcare. Healthcare data mining is the process of examining huge amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. The field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories, using advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. It enables doctors to intervene early to prevent problems or improve outcomes for patients, and it assists in early disease detection and customized treatment planning for each person. Doctors can customize a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Beyond helping patients, it improves the efficiency of hospitals, helping them determine the number of beds or doctors they require given the number of patients they expect. In this project, models such as logistic regression, random forests, and neural networks are used for predicting diseases and analyzing medical images. Algorithms such as k-means helped group patients, and connections between treatments and patient responses were identified by association rule mining. Time series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment. Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more effectively, and work more efficiently; it comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
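As an illustration of the predictive models mentioned above, a minimal logistic regression trained by gradient descent on synthetic patient-like data might look like the following. The features and labels are invented for the example; the project's actual data and model configurations are not shown in the abstract.

```python
# Toy sketch of the predictive-analysis step: logistic regression
# trained by gradient descent on a tiny synthetic cohort.
import numpy as np

def train_logistic(X, y, lr=0.1, epochs=2000):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted risk in (0, 1)
        grad_w = X.T @ (p - y) / len(y)         # gradient of the log-loss
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Hypothetical cohort: [age (scaled), biomarker level]; label = disease.
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.8, 0.9],
              [0.9, 0.7], [0.3, 0.2], [0.7, 0.8]])
y = np.array([0, 0, 1, 1, 0, 1])
w, b = train_logistic(X, y)
pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
```

Real EHR-based work would add feature engineering, regularization, and careful validation, but the fit/predict loop is the same shape.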

Keywords: data mining, healthcare, big data, large amounts of data

Procedia PDF Downloads 76
25459 Functional Instruction Set Simulator of a Neural Network IP with Native Brain Float-16 Generator

Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula

Abstract:

A functional model that mimics the functional correctness of a neural network compute accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. The major challenge we faced was the incompatibility of GCC compilers with the BF-16 datatype, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, working with big GEMM (General Matrix Multiplication) or SpMM (Sparse Matrix Multiplication) workloads (dense or sparse) and debugging failures related to data integrity is highly painstaking. In this paper, we address the quality challenge of such a complex neural network accelerator design by proposing a functional-model-based scoreboard, or software model, using SystemC. The proposed functional model executes assembly code based on the ISA of the processor IP, decoding all instructions and executing them as the DUT is expected to. The model provides substantial visibility and debug capability into the DUT, exposing the micro-steps of execution.
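As background on the datatype, BF-16 keeps the IEEE float32 sign bit and 8-bit exponent but only the top 7 mantissa bits, so a float32 can be converted by rounding its upper 16 bits. The sketch below is a minimal, hypothetical illustration of such a generator (in Python rather than the paper's SystemC, and not the IP's actual implementation):

```python
# Minimal BF-16 generator sketch: convert float32 <-> bfloat16 bit
# patterns using round-to-nearest-even on the upper 16 bits.
# NaN payloads and overflow edge cases are not handled here.
import struct

def float_to_bf16_bits(x):
    """Return the 16-bit BF-16 pattern for a Python float (via float32)."""
    bits32 = struct.unpack("<I", struct.pack("<f", x))[0]
    # Round to nearest even: bias depends on the LSB of the kept part.
    rounding_bias = 0x7FFF + ((bits32 >> 16) & 1)
    return ((bits32 + rounding_bias) >> 16) & 0xFFFF

def bf16_bits_to_float(bits16):
    """Expand a BF-16 pattern back to float by zero-filling the low 16 bits."""
    return struct.unpack("<f", struct.pack("<I", bits16 << 16))[0]

val = bf16_bits_to_float(float_to_bf16_bits(1.5))        # exactly representable
approx = bf16_bits_to_float(float_to_bf16_bits(3.14159)) # ~2-3 decimal digits
```

Because BF-16 shares the float32 exponent range, this truncate-and-round conversion is cheap, which is one reason the format is popular for neural network accelerators.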

Keywords: ISA, neural network, Brain Float-16, DUT

Procedia PDF Downloads 94
25458 Applying Biculturalism in Studying Tourism Host Community Cultural Integrity and Individual Member Stress

Authors: Shawn P. Daly

Abstract:

Communities heavily engaged in the tourism industry discover their values intersect, meld, and conflict with those of visitors. Maintaining cultural integrity in the face of powerful external pressures causes stress among society members. This effect represents a less-studied aspect of sustainable tourism. The present paper brings a perspective unique to the tourism literature: biculturalism. The grounded theories, coherent hypotheses, and validated constructs and indicators of biculturalism represent a sound base from which to consider sociocultural issues in sustainable tourism. Five models describe the psychological state of individuals operating at cultural crossroads: assimilation (joining the new culture), acculturation (grasping the new culture but remaining of the original culture), alternation (varying behavior to suit the cultural context), multiculturalism (maintaining distinct cultures), and fusion (blending cultures). These five processes divide into two units of analysis (individual and society), permitting research questions at levels important for considering sociocultural sustainability. Acculturation modelling has morphed into the dual processes of acculturation (new-culture adaptation) and enculturation (original-culture adaptation). This dichotomy divides sustainability research questions into the human impacts of assimilation (acquiring the new culture, discarding the original), separation (rejecting the new culture, keeping the original), integration (acquiring the new culture, keeping the original), and marginalization (rejecting the new culture, discarding the original). Biculturalism is often cast in terms of its emotional, behavioral, and cognitive dimensions. Required cultural adjustments and varying levels of cultural competence lead to physical, psychological, and emotional outcomes, including depression, lowered life satisfaction and self-esteem, headaches, and back pain, or enhanced career success, social skills, and lifestyles. Numerous studies provide empirical scales and research hypotheses for sustainability research into tourism's causality and effect on local well-being. One key issue in applying biculturalism to sustainability scholarship concerns the identification and specification of the alternative new culture contacting the local culture. Evidence exists for a tourism-industry culture, a universal tourist culture, and location/event-specific tourist cultures. The biculturalism paradigm holds promise for researchers examining evolving cultural identity and integrity in response to mass tourism. In particular, its confirmed constructs and scales simplify the operationalization of tourism sustainability studies in terms of human impact and adjustment.

Keywords: biculturalism, cultural integrity, psychological and sociocultural adjustment, tourist culture

Procedia PDF Downloads 409
25457 Bioinformatics High Performance Computation and Big Data

Authors: Javed Mohammed

Abstract:

Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists for the first time to gain a profound understanding of the deepest biological functions. Solving biological problems may require high-performance computing (HPC) due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life-science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even the simulation of an entire human body. This research paper emphasizes computational biology's growing need for high-performance computing and Big Data.
It illustrates their indispensability in meeting the scientific and engineering challenges of the twenty-first century, and how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC, which provides sufficient capability for evaluating or solving more limited but meaningful instances. The paper also outlines solutions to optimization problems and the benefits of Big Data for computational biology. It surveys the current state of the art and future generations of HPC computing with Big Data in biology.
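The all-to-all comparison workload mentioned above is easy to make concrete: even a cheap polynomial-time kernel explodes when every sequence must be compared with every other. A minimal illustrative sketch (real pipelines would use optimized alignment libraries, not this toy identity score):

```python
from itertools import combinations

def pairwise_identity(a, b):
    """Fraction of positions at which two equal-length sequences agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def all_to_all(seqs):
    """All-to-all comparison: n sequences yield n*(n-1)/2 pairwise scores,
    which is why the workload demands HPC at genome scale."""
    return {(i, j): pairwise_identity(seqs[i], seqs[j])
            for i, j in combinations(range(len(seqs)), 2)}

seqs = ["ACGTACGT", "ACGTACGA", "TTGTACGT"]
scores = all_to_all(seqs)
# 3 sequences -> 3 pairs; 20,000 sequences would need ~2e8 comparisons
```

The quadratic pair count, not the per-pair cost, is what pushes such analyses onto parallel hardware.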

Keywords: high performance, big data, parallel computation, molecular data, computational biology

Procedia PDF Downloads 364
25456 Groundwater Monitoring Using a Community: Science Approach

Authors: Shobha Kumari Yadav, Yubaraj Satyal, Ajaya Dixit

Abstract:

In addressing groundwater depletion, it is important to develop an evidence base that can be used to assess the state of its degradation. Groundwater data is limited compared to meteorological data, which impedes groundwater use and management planning. Monitoring of groundwater levels provides an information base to assess the condition of aquifers and their responses to water extraction, land-use change, and climatic variability. It is important to maintain a network of spatially distributed, long-term monitoring wells to support a groundwater management plan. Monitoring involving the local community is a cost-effective approach that generates real-time data to effectively manage groundwater use. This paper presents the relationship between rainfall and spring flow, which are the main sources of freshwater for drinking, household consumption, and agriculture in the hills of Nepal. The supply and withdrawal of water from springs depend upon the local hydrology and meteorological characteristics, such as rainfall, evapotranspiration, and interflow. The study offers evidence of the use of scientific methods and community-based initiatives for managing groundwater and springsheds. The approach presents a method for replicating similar initiatives in other parts of the country to maintain the integrity of springs.
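The rainfall–spring-flow relationship described above can be examined with a simple lagged correlation, since infiltration and interflow delay the spring's response to rain. A sketch with made-up numbers purely to illustrate the method (the paper's own data and analysis are not reproduced here):

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def lagged_correlation(rainfall, flow, lag):
    """Correlate rainfall with spring flow observed `lag` periods later,
    reflecting the delay introduced by infiltration and interflow."""
    return pearson(rainfall[:len(rainfall) - lag], flow[lag:])

rain = [0, 10, 50, 5, 0, 30, 60, 10]       # hypothetical rainfall series
flow = [2, 2, 4, 12, 4, 2, 8, 14]          # spring responds ~1 period later
best = max(range(4), key=lambda k: lagged_correlation(rain, flow, k))
```

The lag with the strongest correlation gives a crude estimate of the aquifer's response time, the kind of evidence base community monitoring can supply.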

Keywords: citizen science, groundwater, water resource management, Nepal

Procedia PDF Downloads 202
25455 Study on High Performance Fiber Reinforced Concrete (HPFRC) Beams on Subjected to Cyclic Loading

Authors: A. Siva, K. Bala Subramanian, Kinson Prabu

Abstract:

Concrete is the most widely used construction material in the world. Nowadays, fibers are used in construction due to advantages such as increased stiffness, energy absorption, ductility, and load-carrying capacity. Fibers are added to concrete to increase the structural integrity of the member. It is one of the emerging techniques used in the construction industry. In this paper, the effective utilization of high-performance fiber reinforced concrete (HPFRC) beams has been experimentally investigated. The experimental investigation was conducted on different steel fibers (hooked, crimped, and hybrid) under cyclic loading. The behaviour of HPFRC beams is compared with that of conventional beams. In total, four specimens were cast with different fiber contents and compared with conventional concrete. The fibers were added to the concrete by base-volume replacement. Silica fume and superplasticizers were used to modify the properties of the concrete. Single-point loading was applied to all specimens, and the beam specimens were subjected to cyclic loading. The load-deflection behaviour of the fiber specimens is compared with that of conventional concrete. The ultimate load-carrying capacity, energy absorption, and ductility of the hybrid fiber reinforced concrete are 5% to 10% higher than those of conventional concrete.

Keywords: cyclic loading, ductility, high performance fiber reinforced concrete, structural integrity

Procedia PDF Downloads 275
25454 An Investigation on Orthopedic Rehabilitation by Avoiding Thermal Necrosis

Authors: R. V. Dahibhate, A. B. Deoghare, P. M. Padole

Abstract:

Maintaining the natural integrity of the biosystem is of paramount significance for the orthopedic surgeon while performing surgery. Restoration is a challenging task in rehabilitating a trauma patient. Drilling is an inevitable procedure for fixing implants, but it leads to a rise in temperature at the contact site, which can cause thermal necrosis. Precise monitoring can avoid thermal necrosis. To accomplish this, a data-acquisition instrument is integrated with the drill bit, complemented by an electronic feedback system. The system not only measures temperature without any physical contact between the measuring device and the target but also visualizes the site and monitors the correct movement of the tool path. In the current research work, an infrared thermometer data-acquisition system monitors variation in temperature at the drilling site, and a camera captures the advancement of the drill bit. The results are presented in graphical form, showing variations in temperature and drill rotation over time. The feedback system helps keep the drill speed within the threshold limit.
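The feedback rule at the heart of such a system can be sketched in a few lines. This is an illustrative sketch, not the authors' implementation: the 47 °C threshold is a commonly cited figure for bone thermal necrosis, and the speed steps and limits here are assumed values.

```python
NECROSIS_THRESHOLD_C = 47.0   # commonly cited necrosis limit (assumed here)

def adjust_drill_speed(current_rpm, temperature_c,
                       threshold_c=NECROSIS_THRESHOLD_C,
                       step_rpm=100, min_rpm=200, max_rpm=1200):
    """Simple feedback rule: back off when the IR reading reaches the
    threshold, speed back up (within limits) when there is clear margin."""
    if temperature_c >= threshold_c:
        return max(min_rpm, current_rpm - step_rpm)   # too hot: slow down
    if temperature_c < threshold_c - 5.0:
        return min(max_rpm, current_rpm + step_rpm)   # safe margin: speed up
    return current_rpm                                # guard band: hold

readings = [35.0, 40.0, 44.0, 48.0, 49.0, 45.0]       # hypothetical IR samples
rpm = 800
history = []
for t in readings:
    rpm = adjust_drill_speed(rpm, t)
    history.append(rpm)
```

The guard band between "slow down" and "speed up" prevents the controller from oscillating around the threshold.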

Keywords: thermal necrosis, infrared thermometer, drilling tool, feedback system

Procedia PDF Downloads 231
25453 The Internet of Things Ecosystem: Survey of the Current Landscape, Identity Relationship Management, Multifactor Authentication Mechanisms, and Underlying Protocols

Authors: Nazli W. Hardy

Abstract:

A critical component of the Internet of Things (IoT) ecosystem is the need for secure and appropriate transmission, processing, and storage of data. Our current forms of authentication and identity and access management do not suffice because they are not designed to serve cohesive, integrated, interconnected devices and service applications. The seemingly endless opportunities of IoT are in fact circumscribed on multiple levels by concerns such as trust, privacy, security, loss of control, and related issues. This paper considers multi-factor authentication (MFA) mechanisms and cohesive identity relationship management (IRM) standards. It also surveys messaging protocols that are appropriate for the IoT ecosystem.

Keywords: identity relationship management, multifactor authentication, protocols, survey of internet of things ecosystem

Procedia PDF Downloads 355
25452 Improving the Patient Guidance Satisfaction and Integrity of Patients Hospitalized in Iodine-131 Isolation Rooms

Authors: Yu Sin Syu

Abstract:

Objective: The study aimed to improve the patient guidance satisfaction of patients hospitalized in iodine-131 isolation rooms, as well as the patient guidance completion rate for such patients. Method: A patient care guidance checklist and a patient care guidance satisfaction questionnaire were administered to 29 patients who had previously been hospitalized in iodine-131 isolation rooms. The evaluation was conducted on a one-on-one basis, and its results showed that the patients’ satisfaction with patient guidance was only 3.7 points and that the completion rate for the patient guidance performed by nurses was only 67%. Therefore, various solutions were implemented to create a more complete patient guidance framework for nurses, including the incorporation of regular care-related training in in-service education courses; the establishment of patient care guidance standards for patients in iodine-131 isolation rooms; the establishment of inpatient care standards and auditing processes for iodine-131 isolation rooms; the creation of an introductory handbook on the ward environment; the involvement of other care teams; the revision of iodine-131 health education brochures; the creation of visual cards and videos covering equipment operation procedures; and the introduction of QR codes. Results: Following the implementation of the above measures, the overall satisfaction of patients hospitalized in iodine-131 isolation rooms increased from 3.7 points to 4.6 points, and the completion rate for patient guidance rose from 67% to 100%. Conclusion: Given the excellent results achieved in this study, it is hoped that this nursing project can serve as a benchmark for other relevant departments.

Keywords: admission care guidance, guidance satisfaction, integrity, iodine-131 isolation

Procedia PDF Downloads 127
25451 Radio Based Location Detection

Authors: M. Pallikonda Rajasekaran, J. Joshapath, Abhishek Prasad Shaw

Abstract:

Various techniques have been employed to determine location, such as GPS, GLONASS, Galileo, and BeiDou (Compass). This paper deals with finding location using existing FM signals that operate between 88 and 108 MHz. The location can be determined from the received signal strength of nearby FM stations by mapping the signal strength values using the trilateration concept. This approach protects users' data and maintains an eco-friendly environment at zero installation cost, as it relies on FM stations already operating in the commercial FM band (88-108 MHz). Along with signal-strength-based trilateration, the system also finds the azimuthal angle of the transmitter by employing a directional antenna, such as a Yagi-Uda antenna, at the receiver.
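The trilateration step can be sketched concretely: convert each station's received signal strength to a range estimate, then intersect the range circles. The sketch below is illustrative only; the path-loss parameters are assumed, not calibrated to real FM transmitters.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-30.0, path_loss_exp=2.0):
    """Log-distance path-loss model (parameters assumed, not calibrated):
    RSSI = tx_power - 10*n*log10(d)  =>  d = 10 ** ((tx_power - RSSI) / (10n))."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(stations, distances):
    """Position from three known station positions and range estimates.
    Linearize by subtracting the first circle equation, then solve the 2x2
    linear system with Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = stations
    d1, d2, d3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

stations = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]   # hypothetical FM towers
true_pos = (3.0, 4.0)
dists = [((x - true_pos[0])**2 + (y - true_pos[1])**2) ** 0.5
         for x, y in stations]
est = trilaterate(stations, dists)
```

With noisy real-world RSSI, one would use more than three stations and a least-squares fit rather than this exact three-circle solve.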

Keywords: location, existing FM signals, received signal strength, trilateration, security, eco-friendly, direction, privacy, zero installation cost

Procedia PDF Downloads 519
25450 Abnormality Detection of Persons Living Alone Using Daily Life Patterns Obtained from Sensors

Authors: Ippei Kamihira, Takashi Nakajima, Taiyo Matsumura, Hikaru Miura, Takashi Ono

Abstract:

In this research, the goal was the construction of a system in which multiple sensors observe the daily life behavior of persons living alone (while respecting their privacy). The system uses this information to judge conditions such as poor physical condition or a fall in the home, so that these abnormal conditions can be made known to relatives and third parties. The daily life patterns of persons living alone are expressed as the number of sensor responses within each set time period. By comparing data against the prior two weeks, it was possible to judge a situation as 'normal' when the person was in a good physical condition or as 'abnormal' when the person was in a bad physical condition.
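The two-week comparison can be sketched as a per-slot deviation test: build a baseline of sensor response counts for each time slot, then flag a day whose counts deviate strongly from that pattern. This is an illustrative sketch with invented numbers, not the authors' actual decision rule.

```python
def baseline(history):
    """Mean and population std of sensor response counts per time slot,
    computed over the prior two weeks of daily count vectors."""
    n, slots = len(history), len(history[0])
    means, stds = [], []
    for s in range(slots):
        vals = [day[s] for day in history]
        m = sum(vals) / n
        var = sum((v - m) ** 2 for v in vals) / n
        means.append(m)
        stds.append(var ** 0.5)
    return means, stds

def is_abnormal(today, history, z_limit=2.5):
    """Flag a day as abnormal when any slot deviates strongly from the
    two-week pattern (e.g. no motion where there is normally activity)."""
    means, stds = baseline(history)
    for count, m, s in zip(today, means, stds):
        if abs(count - m) > z_limit * max(s, 1.0):  # floor std to avoid div-by-0
            return True
    return False

# 14 days of counts in 4 daily time slots (night, morning, afternoon, evening)
history = [[1, 20, 15, 10]] * 7 + [[2, 22, 14, 11]] * 7
normal_day = [1, 21, 15, 10]
quiet_day = [1, 0, 0, 10]     # no daytime movement: possible fall or illness
```

An alert on `quiet_day` would then be forwarded to relatives or a third party, as the abstract describes.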

Keywords: sensors, elderly living alone, abnormality detection, lifestyle habits

Procedia PDF Downloads 253
25449 Data Confidentiality in Public Cloud: A Method for Inclusion of ID-PKC Schemes in OpenStack Cloud

Authors: N. Nalini, Bhanu Prakash Gopularam

Abstract:

The term data security refers to the degree of resistance or protection given to information from unintended or unauthorized access. The core principles of information security are confidentiality, integrity, and availability, also referred to as the CIA triad. Cloud computing services are classified as SaaS, IaaS, and PaaS services. With cloud adoption, confidential enterprise data is moved from organization premises to an untrusted public network, and due to this the attack surface has increased manifold. Several cloud computing platforms, such as OpenStack, Eucalyptus, and Amazon EC2, allow users to build and configure public, hybrid, and private clouds. While traditional encryption based on PKI infrastructure still works in the cloud scenario, the management of public-private keys and trust certificates is difficult. Identity-Based Public Key Cryptography (ID-PKC) overcomes this problem by using publicly identifiable information for generating the keys, and it works well with decentralized systems. Users can exchange information securely without having to manage any trust information. Another advantage is that access control information (such as a role-based access control policy) can be embedded into the data itself, unlike in PKI, where it is handled by a separate component or system. In the OpenStack cloud platform, the Keystone service acts as the identity service for authentication and authorization and has support for public key infrastructure for its services. In this paper, we explain the OpenStack security architecture and evaluate the PKI infrastructure piece for data confidentiality. We provide a method to integrate ID-PKC schemes for securing data in transit and at rest, and we explain the key measures for safeguarding data against security attacks. The proposed approach uses the JPBC crypto library for key-pair generation based on the IEEE P1363.3 standard and for secure communication with other cloud services.
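The central idea of ID-PKC, that a key is derived directly from a publicly identifiable string, can be illustrated with a toy sketch. To be clear about the assumptions: real ID-PKC schemes (e.g. the pairing-based constructions standardized in IEEE P1363.3 and implemented by JPBC) use bilinear pairings; the symmetric HMAC derivation below is only a stand-in to show the workflow, and is not a public-key scheme.

```python
import hashlib
import hmac

# Toy illustration only: the key generation centre derives a user's key
# directly from a public identity string, so no certificate is needed to
# bind key and identity. HMAC stands in for the real pairing-based math.
MASTER_SECRET = b"kgc-master-secret"   # held only by the key generation centre

def derive_key(identity: str) -> bytes:
    """Derive a per-identity key from the publicly identifiable string."""
    return hmac.new(MASTER_SECRET, identity.encode(), hashlib.sha256).digest()

def tag_message(identity: str, message: bytes) -> bytes:
    """Authenticate a message under the identity-derived key."""
    return hmac.new(derive_key(identity), message, hashlib.sha256).digest()

# Access-control attributes can be embedded directly in the identity string,
# so the role is cryptographically bound to the key rather than tracked by a
# separate PKI component (the identity below is hypothetical).
identity = "alice@tenant-a.example|role=block-storage-admin"
tag = tag_message(identity, b"volume-attach request")
```

In a genuine ID-PKC deployment the verifier needs only the public system parameters and the identity, whereas in this symmetric toy both sides would need keys from the centre; the sketch captures the certificate-free binding, not the asymmetry.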

Keywords: data confidentiality, identity based cryptography, secure communication, OpenStack Keystone, token scoping

Procedia PDF Downloads 384
25448 Artificial Intelligence in Ethiopian Higher Education: The Impact of Digital Readiness Support, Acceptance, Risk, and Trust on Adoption

Authors: Merih Welay Welesilassie

Abstract:

Understanding educators' readiness to incorporate AI tools into their teaching methods requires a comprehensive examination of the influencing factors. This understanding is crucial, given the potential of these technologies to personalise learning experiences, improve instructional effectiveness, and foster innovative pedagogical approaches. This study evaluated factors affecting teachers' adoption of AI tools in their English language instruction by extending the Technology Acceptance Model (TAM) to encompass digital readiness support, perceived risk, and trust. A cross-sectional quantitative survey was conducted with 128 English language teachers, supplemented by qualitative data collection from 15 English teachers. The structural model analysis indicated that implementing AI tools in Ethiopian higher education was notably influenced by digital readiness support, perceived ease of use, perceived usefulness, perceived risk, and trust. Digital readiness support positively impacted perceived ease of use, usefulness, and trust while reducing safety and privacy risks. Perceived ease of use positively correlated with perceived usefulness but negatively influenced trust. Furthermore, perceived usefulness strengthened trust in AI tools, while perceived safety and privacy risks significantly undermined trust. Trust was crucial in increasing educators' willingness to adopt AI technologies. The qualitative analysis revealed that the teachers exhibited strong content and pedagogical knowledge but needed more technology-related knowledge. Moreover, it was found that the teachers did not utilise digital tools to teach English. The study identified several obstacles to incorporating digital tools into English lessons, such as insufficient digital infrastructure, a shortage of educational resources, inadequate professional development opportunities, and challenging policies and governance.
The findings provide valuable guidance for educators, inform policymakers about creating supportive digital environments, and offer a foundation for further investigation into technology adoption in educational settings in Ethiopia and similar contexts.

Keywords: digital readiness support, AI acceptance, perceived risk, AI trust

Procedia PDF Downloads 18
25447 Functional Instruction Set Simulator (ISS) of a Neural Network (NN) IP with Native BF-16 Generator

Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula

Abstract:

A functional model that mimics the functional correctness of a Neural Network Compute Accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. The major challenge we faced was the gcc compiler's lack of support for the BF-16 datatype, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, working with big GEMM (General Matrix Multiplication) or SpMM (Sparse Matrix Multiplication) workloads (dense or sparse) and debugging failures related to data integrity is highly painstaking. In this paper, we address the quality challenge of such a complex Neural Network Accelerator design by proposing a functional-model-based scoreboard, or software model, written in SystemC. The proposed functional model executes the assembly code based on the ISA of the processor IP, decoding all instructions and executing them as the DUT is expected to do. The model provides substantial visibility and debug capability during DUT bring-up, exposing the micro-steps of execution.
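A native BF-16 generator of the kind described is typically built by keeping the top 16 bits of the IEEE-754 float32 encoding, since bfloat16 shares float32's sign and 8-bit exponent and simply truncates the mantissa. A Python sketch of that conversion with round-to-nearest-even (the paper's generator is presumably C/C++; NaN and overflow handling are omitted here):

```python
import struct

def float_to_bf16_bits(x: float) -> int:
    """Bfloat16 bit pattern of x: take the top 16 bits of the float32
    encoding, rounding the discarded lower 16 bits to nearest-even."""
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    lower = bits & 0xFFFF
    upper = bits >> 16
    # round to nearest; on an exact tie, round toward an even retained LSB
    if lower > 0x8000 or (lower == 0x8000 and (upper & 1)):
        upper += 1
    return upper & 0xFFFF

def bf16_bits_to_float(bits16: int) -> float:
    """Expand a bfloat16 pattern back to float32 by zero-padding."""
    (x,) = struct.unpack("<f", struct.pack("<I", bits16 << 16))
    return x
```

Because bfloat16 keeps only 7 mantissa bits, values like 1.0 survive the round trip exactly, while 1 + 2⁻⁸ rounds back to 1.0; this truncated precision is exactly what NN accelerator test stimulus must model.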

Keywords: ISA (instruction set architecture), NN (neural network), TLM (transaction-level modeling), GEMM (general matrix multiplication)

Procedia PDF Downloads 86
25446 Reducing the Computational Cost of a Two-way Coupling CFD-FEA Model via a Multi-scale Approach for Fire Determination

Authors: Daniel Martin Fellows, Sean P. Walton, Jennifer Thompson, Oubay Hassan, Kevin Tinkham, Ella Quigley

Abstract:

Structural integrity is a key performance parameter for cladding products, especially concerning fire performance. Cladding products such as PIR-based sandwich panels are tested rigorously, in line with industrial standards. Physical fire tests are necessary to ensure the customer's safety but can give little information about critical behaviours that could help develop new materials. Numerical modelling is a tool that can help investigate a fire's behaviour further by replicating the fire test. However, fire is an interdisciplinary problem, as it is a chemical reaction that behaves fluidly and impacts structural integrity. An analysis using Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) is needed to capture all aspects of a fire performance test. One method is a two-way coupling analysis that imports the updated changes in thermal data, due to the fire's behaviour, to the FEA solver in a series of iterations. In light of our recent work with Tata Steel U.K. using a two-way coupling methodology to determine fire performance, it has been shown that a program called FDS-2-Abaqus can predict a BS 476-22 furnace test with a degree of accuracy. The test demonstrated the fire performance of Tata Steel U.K.'s Trisomet product, a polyisocyanurate (PIR) based sandwich panel used for cladding. Previous works demonstrated the limitations of the current version of the program, the main limitation being the computational cost of modelling three Trisomet panels, totalling an area of 9 m². The computational cost increases substantially with the intention to scale up to an LPS 1181-1 test, which includes a total panel surface area of 200 m². The FDS-2-Abaqus program is developed further within this paper to overcome this obstacle and better accommodate Tata Steel U.K. PIR sandwich panels. The new developments aim to reduce the computational cost and the error margin compared to experimental data.
One avenue explored is a multi-scale approach in the form of Reduced Order Modeling (ROM). The approach allows the user to include refined details of the sandwich panels, such as the overlapping joints, without a computationally costly mesh size. Comparative studies will be made between the new implementations and the previous study completed using the original FDS-2-Abaqus program. Validation of the study will come from physical experiments in line with governing-body standards such as BS 476-22 and LPS 1181-1. The physical experimental data include the panels' gas and surface temperatures and mechanical deformation. Conclusions are drawn, noting the impact factors of the new implementations and discussing the feasibility of scaling up further to a whole warehouse.
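Reduced Order Modeling of the kind mentioned above typically projects the full-order state onto a small basis extracted from solution snapshots (Proper Orthogonal Decomposition). The pure-Python sketch below computes the leading POD mode of a rank-one snapshot set via power iteration on the snapshot correlation matrix; it illustrates the idea only and is not the paper's implementation, whose ROM details are not given here.

```python
def dominant_mode(snapshots, iters=50):
    """Leading POD mode via power iteration on C = S S^T, where each column
    of S is one solution snapshot (e.g. a nodal temperature field)."""
    n, m = len(snapshots), len(snapshots[0])
    v = [1.0] * n
    for _ in range(iters):
        # w = S (S^T v), computed without forming the n x n matrix C
        proj = [sum(snapshots[i][j] * v[i] for i in range(n)) for j in range(m)]
        w = [sum(snapshots[i][j] * proj[j] for j in range(m)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Synthetic snapshots: one spatial shape scaled by a time-varying amplitude,
# so a single mode reproduces the data exactly.
shape = [1.0, 2.0, 3.0]
amps = [1.0, 0.5, 2.0]
S = [[shape[i] * a for a in amps] for i in range(3)]
mode = dominant_mode(S)
# Reduced coordinates: one number per snapshot instead of the full field
coords = [sum(mode[i] * S[i][j] for i in range(3)) for j in range(3)]
```

In a real coupled fire model the same projection replaces thousands of degrees of freedom with a handful of modal coordinates, which is where the computational saving comes from.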

Keywords: fire testing, numerical coupling, sandwich panels, thermo fluids

Procedia PDF Downloads 79
25445 Secure E-Pay System Using Steganography and Visual Cryptography

Authors: K. Suganya Devi, P. Srinivasan, M. P. Vaishnave, G. Arutperumjothi

Abstract:

Today’s internet world is highly prone to various online attacks, the most harmful of which is phishing. Attackers host fake websites that look very similar to legitimate ones. We propose an image-based authentication using steganography and visual cryptography to prevent phishing. This paper presents a secure steganographic technique for true color (RGB) images and uses the Discrete Cosine Transform to compress the images. The proposed method hides the secret data inside the cover image. Visual cryptography is used to preserve the privacy of an image by decomposing the original image into two shares. The original image can be identified only when both qualified shares are simultaneously available; an individual share does not reveal the identity of the original image. Thus, the existence of the secret message is hard to detect by RS steganalysis.
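The two building blocks can be sketched in a few lines. This is a toy illustration: random-LSB embedding (per the keywords; the abstract's full scheme additionally compresses with a DCT) plus an XOR-based two-share split, which stands in for true visual cryptography's subpixel stacking.

```python
import random

def embed_lsb(pixels, bits, seed=42):
    """Random-LSB embedding: hide secret bits in the least-significant bits
    of pseudo-randomly chosen pixel bytes. The seed acts as a shared key."""
    out = list(pixels)
    positions = random.Random(seed).sample(range(len(out)), len(bits))
    for pos, bit in zip(positions, bits):
        out[pos] = (out[pos] & ~1) | bit
    return out

def extract_lsb(pixels, n_bits, seed=42):
    """Recover the bits using the same seeded position sequence."""
    positions = random.Random(seed).sample(range(len(pixels)), n_bits)
    return [pixels[pos] & 1 for pos in positions]

def make_shares(bits, seed=7):
    """Two-share split in the spirit of visual cryptography: one share is
    random, the other is secret XOR share, so neither alone reveals anything
    and the secret appears only when both shares are combined."""
    rng = random.Random(seed)
    share1 = [rng.randint(0, 1) for _ in bits]
    share2 = [b ^ s for b, s in zip(bits, share1)]
    return share1, share2

cover = [123, 200, 45, 90, 17, 250, 66, 8, 180, 33]   # toy cover "pixels"
secret = [1, 0, 1, 1]
stego = embed_lsb(cover, secret)
share1, share2 = make_shares(secret)
```

Each stego byte differs from the cover by at most one intensity level, which is why LSB changes are visually imperceptible.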

Keywords: image security, random LSB, steganography, visual cryptography

Procedia PDF Downloads 330