Search results for: data quality filtering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 31930


29140 Quantification Model for Capability Evaluation of Optical-Based in-Situ Monitoring System for Laser Powder Bed Fusion (LPBF) Process

Authors: Song Zhang, Hui Wang, Johannes Henrich Schleifenbaum

Abstract:

Due to the increasing demand for quality assurance and reliability in additive manufacturing, an advanced in-situ monitoring system is required to detect process anomalies as input for further process control. Optical-based monitoring systems, such as CMOS and NIR cameras, have proved effective for monitoring geometrical distortion and abnormal thermal distribution. Many studies and applications therefore focus on the suitability of optical-based monitoring systems for detecting various types of defects. However, the capability of the monitoring setup itself is usually not quantified. In this study, a quantification model is presented that evaluates the capability of monitoring setups for an LPBF machine based on monitoring data acquired from a designed test artifact, and the design of the relevant test artifacts is discussed. The monitoring setup is evaluated with respect to its hardware properties, integration location, and lighting conditions, and the data-processing methodology used to quantify the capability for each aspect is described. The minimal detectable feature size of the monitoring setup in the application is estimated by quantifying its resolution and accuracy. The quantification model is validated using a CCD camera-based monitoring system for LPBF machines in the laboratory with different setups. The results show that the model quantifies the monitoring system's performance, which makes it possible to compare monitoring systems that share the same concept but differ in setup for the LPBF process, and indicates directions for improving the setups.
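
As a rough illustration of how such a capability estimate can be derived from hardware properties, the sketch below computes the pixel size projected onto the build plane and a minimal detectable feature size; the pixel pitch, magnification, pixels-per-feature requirement and positional accuracy are hypothetical placeholders, not values from the study.

```python
# Illustrative sketch only: estimate the minimal detectable feature size of an
# optical in-situ monitoring setup from its hardware properties. All numbers
# are hypothetical placeholders, not values from the study.

def object_pixel_size(sensor_pixel_pitch_um: float, magnification: float) -> float:
    """Size of one camera pixel projected onto the build plane, in micrometres."""
    return sensor_pixel_pitch_um / magnification

def minimal_detectable_size(pixel_size_um: float,
                            pixels_required: int = 3,
                            positional_accuracy_um: float = 10.0) -> float:
    """A feature must span a few pixels and exceed the setup's positional accuracy."""
    return max(pixels_required * pixel_size_um, positional_accuracy_um)

if __name__ == "__main__":
    px = object_pixel_size(sensor_pixel_pitch_um=5.5, magnification=0.5)
    print(f"pixel size on build plane: {px:.1f} um")
    print(f"estimated minimal detectable size: {minimal_detectable_size(px):.1f} um")
```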

Keywords: data processing, in-situ monitoring, LPBF process, optical system, quantification model, test artifact

Procedia PDF Downloads 200
29139 Standardization of the Roots of Gnidia stenophylla Gilg: A Potential Medicinal Plant of South Eastern Ethiopia Traditionally Used as an Antimalarial

Authors: Mebruka Mohammed, Daniel Bisrat, Asfaw Debella, Tarekegn Birhanu

Abstract:

The lack of quality control standards for medicinal plants and their preparations is considered a major barrier to their integration into effective primary health care in Ethiopia. Poor-quality herbal preparations have led to countless adverse reactions, extending to death. The exclusion of Ethiopian medicinal plants from the world’s booming herbal market is another significant loss resulting from the absence of a herbal quality control system. Thus, in the present study, Gnidia stenophylla Gilg (a popular antimalarial plant of south eastern Ethiopia) is standardized and a full monograph is produced that can serve as a guideline for quality control of the crude drug. Morphologically, the roots are cylindrical and taper towards the end. They have a hard, corky and friable touch with a saddle-brown colour externally and are relatively smooth and pale brown internally, with a characteristic pungent odour and a very bitter taste. Microscopically, the roots showed lignified xylem vessels, wide medullary rays with some calcium oxalate crystals, reddish-brown secondary metabolite contents and slender, long fibres. The physicochemical standards quantified were: foreign matter (5.25%), moisture content (6.69%), total ash (40.80%), acid-insoluble ash (8.00%), water-soluble ash (2.30%), alcohol-soluble extractive (15.27%), water-soluble extractive (10.98%), foaming index (100.01 ml/g) and swelling index (7.60 ml/g). Phytochemically, phenols, flavonoids, steroids, tannins and saponins were detected in the root extract; TLC and HPLC fingerprints were produced, and an analytical marker was tentatively characterized as 3-(3,4-dihydro-3,5-dihydroxy-2-(4-hydroxy-5-methylhex-1-en-2-yl)-7-methoxy-4-oxo-2H-chromen-8-yl)-5-hydroxy-2-(4-hydroxyphenyl)-7-methoxy-4H-chromen-4-one. Pesticide residues (i.e. DDT, DDE, g-BHC) and radiochemical levels fall below the WHO limits, while heavy metals (i.e. Co, Ni, Cr, Pb, and Cu), total aerobic count and fungal load lie well above the WHO limits. In conclusion, the results signal that employing non-standardized medicinal plants could pose many health risks to the Ethiopian people and to Africans at large, as 80% of the continent’s inhabitants depend on such plants for primary health care. Therefore, a more universal approach to herbal quality, adopting the WHO guidelines and developing monographs using the various quality parameters, is indispensable to minimize quality breaches and promote effective herbal drug usage.

Keywords: Gnidia stenophylla Gilg, standardization/monograph, pharmacognostic, residue/impurity, quality

Procedia PDF Downloads 297
29138 A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling

Authors: Addin Osman, Anwar Ali Yahya, Mohammed Basit Kamal

Abstract:

Datasets or collections are becoming important assets in themselves and can now be accepted as a primary intellectual output of research. The quality and usage of datasets depend mainly on the context under which they have been collected, processed, analyzed, validated, and interpreted. This paper presents a collection of program educational objectives mapped to student outcomes, collected from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of such data is a notoriously tedious, time-consuming process that also requires experts in the area, who are mostly not available. The operational settings under which the collection has been produced are described. The collection has been cleansed and preprocessed, some features have been selected, and preliminary exploratory data analysis has been performed to illustrate the properties and usefulness of the collection. Finally, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-label sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors and Back-Propagation Multi-Label Learning). The techniques have been compared to each other using well-known measures, including Accuracy, Hamming Loss, Micro-F, and Macro-F. The Ensemble of Classifier Chains and Ensemble of Pruned Sets achieved encouraging performance compared to the other multi-label classification methods, while the Classifier Chains method showed the worst performance. To recap, the benchmark has achieved promising results by utilizing the preliminary exploratory data analysis performed on the collection, proposing new trends for research and providing a baseline for future studies.
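
As a hedged sketch of how such a benchmark could be reproduced with scikit-learn (not the authors' code), the snippet below runs Binary Relevance (approximated by a one-vs-rest wrapper) and Classifier Chains on synthetic multi-label data that stands in for the mapped collection, and reports the listed measures.

```python
# Hedged sketch (not the authors' code): benchmarking two of the listed multi-label
# methods -- Binary Relevance (via OneVsRestClassifier) and Classifier Chains --
# on synthetic data that stands in for the PEO-to-outcome collection.
from sklearn.datasets import make_multilabel_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.multioutput import ClassifierChain
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, hamming_loss, f1_score

X, Y = make_multilabel_classification(n_samples=500, n_features=100,
                                      n_classes=11, n_labels=2, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)

models = {
    "Binary Relevance": OneVsRestClassifier(LogisticRegression(max_iter=1000)),
    "Classifier Chains": ClassifierChain(LogisticRegression(max_iter=1000), random_state=0),
}
for name, model in models.items():
    Y_pred = model.fit(X_tr, Y_tr).predict(X_te)
    print(name,
          "subset accuracy:", round(accuracy_score(Y_te, Y_pred), 3),
          "Hamming loss:", round(hamming_loss(Y_te, Y_pred), 3),
          "micro-F1:", round(f1_score(Y_te, Y_pred, average="micro"), 3),
          "macro-F1:", round(f1_score(Y_te, Y_pred, average="macro"), 3))
```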

Keywords: ABET, accreditation, benchmark collection, machine learning, program educational objectives, student outcomes, supervised multi-class classification, text mining

Procedia PDF Downloads 173
29137 Modeling the Effects of Leachate-Impacted Groundwater on the Water Quality of a Large Tidal River

Authors: Emery Coppola Jr., Marwan Sadat, Il Kim, Diane Trube, Richard Kurisko

Abstract:

Contamination sites like landfills often pose significant risks to receptors like surface water bodies. Surface water bodies are often a source of recreation, including fishing and swimming, which not only enhances their value but also serves as a direct exposure pathway to humans, increasing their need for protection from water quality degradation. In this paper, a case study presents the potential effects of leachate-impacted groundwater from a large closed sanitary landfill on the surface water quality of the nearby Raritan River, situated in New Jersey. The study, performed over a two year period, included in-depth field evaluation of both the groundwater and surface water systems, and was supplemented by computer modeling. The analysis required delineation of a representative average daily groundwater discharge from the Landfill shoreline into the large, highly tidal Raritan River, with a corresponding estimate of daily mass loading of potential contaminants of concern. The average daily groundwater discharge into the river was estimated from a high-resolution water level study and a 24-hour constant-rate aquifer pumping test. The significant tidal effects induced on groundwater levels during the aquifer pumping test were filtered out using an advanced algorithm, from which aquifer parameter values were estimated using conventional curve match techniques. The estimated hydraulic conductivity values obtained from individual observation wells closely agree with tidally-derived values for the same wells. Numerous models were developed and used to simulate groundwater contaminant transport and surface water quality impacts. MODFLOW with MT3DMS was used to simulate the transport of potential contaminants of concern from the down-gradient edge of the Landfill to the Raritan River shoreline. A surface water dispersion model based upon a bathymetric and flow study of the river was used to simulate the contaminant concentrations over space within the river. The modeling results helped demonstrate that because of natural attenuation, the Landfill does not have a measurable impact on the river, which was confirmed by an extensive surface water quality study.
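
The study's "advanced algorithm" for removing the tidal signal is not reproduced here; the sketch below is a simplified stand-in that de-tides a synthetic water-level record with a centred moving average spanning roughly two semidiurnal cycles.

```python
# Simplified stand-in (not the study's "advanced algorithm"): removing the tidal
# signal from a water-level record with a centred moving average spanning roughly
# two semidiurnal tidal cycles (~24.8 h). Data below are synthetic.
import numpy as np

def detide(levels: np.ndarray, dt_hours: float, window_hours: float = 24.8) -> np.ndarray:
    """Centred moving average that suppresses periodic tidal fluctuations."""
    n = max(1, int(round(window_hours / dt_hours)))
    kernel = np.ones(n) / n
    return np.convolve(levels, kernel, mode="same")

t = np.arange(0, 72, 0.25)                      # 72 h record, 15-min samples
drawdown = -0.5 * np.log1p(t)                   # synthetic pumping-test response
tide = 0.3 * np.sin(2 * np.pi * t / 12.42)      # synthetic semidiurnal tide
observed = drawdown + tide

filtered = detide(observed, dt_hours=0.25)
core = slice(60, -60)                           # ignore moving-average edge effects
print("residual tidal amplitude ~", round(float(np.std((filtered - drawdown)[core])), 3), "m")
```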

Keywords: groundwater flow and contaminant transport modeling, groundwater/surface water interaction, landfill leachate, surface water quality modeling

Procedia PDF Downloads 265
29136 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is if the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data, e.g., the area under the curve for some models is higher than 80% when trained on the data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
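
A minimal sketch of the simulation idea, using synthetic data rather than the clinical corpus: corrupt a fraction of the training labels, train a linear SVM, and compare the test AUC with the accuracy of the noisy training labels.

```python
# Hedged sketch of the simulation idea (synthetic data, not the clinical corpus):
# corrupt 40% of training labels, train a linear SVM, and check whether the
# model's AUC on clean test labels exceeds the accuracy of its noisy training labels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=5000, n_features=50, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
flip = rng.random(len(y_tr)) < 0.40            # 40% label error rate
y_noisy = np.where(flip, 1 - y_tr, y_tr)

clf = LinearSVC(max_iter=5000).fit(X_tr, y_noisy)
auc = roc_auc_score(y_te, clf.decision_function(X_te))
print(f"training-label accuracy: {np.mean(y_noisy == y_tr):.2f}, test AUC: {auc:.2f}")
```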

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 203
29135 Perusing the Influence of a Visual Editor in Enabling PostgreSQL Query Learn-Ability

Authors: Manuela Nayantara Jeyaraj

Abstract:

PostgreSQL is an Object-Relational Database Management System (ORDBMS) with an architecture that ensures high-quality data management. However, due to the overshadowing growth of similar ORDBMSs, PostgreSQL has not become well known among the database user community. Despite having its features and built-in functionalities overshadowed, PostgreSQL offers a vast range of utilities for data manipulation and hence deserves to be promoted more widely among users. Introducing PostgreSQL in a way that highlights its advantageous features requires treating learn-ability as an essential add-on, since the target groups considered consist of both amateur and professional PostgreSQL users. The scope of this paper is to provide easy comprehension of query formulations and flows through a visual editor designed according to user interface principles, intended to support every aspect of making PostgreSQL learn-able through self-operation and the creation of queries within the visual editor. The paper scrutinizes the importance of choosing PostgreSQL as the working database environment, the visual perspectives that influence human behaviour and ultimately learning, the ways in which learn-ability can be supported via visualization, and the advantages reaped from the implementation of the proposed system features.

Keywords: database, learn-ability, PostgreSQL, query, visual-editor

Procedia PDF Downloads 178
29134 The Secret Ingredient of Student Involvement: Applied Science Case Studies to Enhance Sustainability

Authors: Elizelle Juanee Cilliers

Abstract:

Recent planning thinking has laid the foundations for a general sense of best practice that aims to enhance the quality of life, suggesting an open and participatory process. It is accepted that the integration of top-down and bottom-up approaches may lead to effective action in environments and to sustainable planning and development, although it is also accepted that such an integrated approach faces various implementation challenges. A flexible framework in which the strengths of both the top-down and bottom-up approaches are combined was explored in this research, based on the EU Interreg VALUE Added project and five case studies in which student education and student involvement played a crucial role in the participation process of redesigning the urban environment. It was found that international student workshops are an effective tool for integrating bottom-up and top-down structures, as they act as a catalyst for communication, interaction, creative design, quick transformation from planning to implementation, building social cohesion and finding mutual ground between stakeholders, thus enhancing overall quality of life and quality of environments. The workshops offered a good alternative to traditional participation modes and created a platform for an integrative planning approach. The role and importance of education and integration within the urban environment are emphasized.

Keywords: top-down, bottom-up, flexible, student involvement

Procedia PDF Downloads 212
29133 Pre-harvest Application of Nutrients on Quality and Storability of Litchi CV Bombai

Authors: Nazmin Akter, Tariqul Islam, Abu Sayed

Abstract:

Food loss and waste have become critical global issues, with approximately one-third of the world's food production being wasted. Among the various food products, horticultural fruits and vegetables are especially susceptible to loss due to their relatively short shelf lives. Litchi (Litchi chinensis) is one of Bangladesh's most important horticultural fruits, but it has a short shelf life and loses weight rapidly after harvest. The experiment was carried out at Hajee Mohammad Danesh Science and Technology University, Dinajpur-5200, Bangladesh, during 2020-2021. The objective was to assess the impact of nutrients, viz. urea (1%), calcium chloride (1%), borax (1%), and their combinations, on the fruit quality and shelf life of litchi cv. Bombai. The experiment was laid out in a randomized block design with 7 treatments and 3 replications. Two sprays of each treatment were applied from the last week of May to June (at 20-day intervals). The results indicated that all the treatments significantly improved the quality parameters of litchi fruits as compared to the control. In terms of physicochemical characteristics, maximum fruit weight (20.30 g), fruit volume (20 ml), and pulp percentage (17.14) with minimum stone percentage (11.09) were found with the application of urea 1% + borax 1% + calcium chloride 1%. Maximum TSS (19.62 °Brix), TSS/acidity ratio (24.57), and ascorbic acid (45.19 mg/100 g pulp) and minimum acidity (0.80%) were recorded with the T6 (urea 1% + borax 1% + calcium chloride 1%) treatment, whereas fruits treated with urea 1% + borax 1% gave maximum total sugars (26.64%) and reducing sugars (19.19%) as compared to the control. In the case of storage characteristics, the application of urea 1% + borax 1% + calcium chloride 1% resulted in the minimum physiological loss in weight: 6.11%, 8.41%, and 10.65% after 2, 4, and 6 days, respectively. In conclusion, to obtain better quality and a longer storage period of litchi fruits, two sprays of urea, borax, and calcium chloride (1%) could be applied during the fruit growth and development period at fortnightly intervals.

Keywords: litchi chinensis, preharvest, quality, shelf life, postharvest

Procedia PDF Downloads 74
29132 Transforming Healthcare Data Privacy: Integrating Blockchain with Zero-Knowledge Proofs and Cryptographic Security

Authors: Kenneth Harper

Abstract:

Blockchain technology presents solutions for managing healthcare data, addressing critical challenges in privacy, integrity, and access. This paper explores how privacy-preserving technologies, such as zero-knowledge proofs (ZKPs) and homomorphic encryption (HE), enhance decentralized healthcare platforms by enabling secure computations and patient data protection. The mathematical foundations of these methods, their practical applications, and how they meet the evolving demands of healthcare data security are examined. Using real-world examples, this research highlights industry-leading implementations and offers a roadmap for future applications in secure, decentralized healthcare ecosystems.
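
To make the additive homomorphic property concrete, the toy sketch below implements a minimal Paillier scheme with tiny primes; it is an illustration only, not production cryptography and not an implementation from the paper.

```python
# Toy illustration (not production crypto, not the paper's implementation) of the
# additive homomorphic property: a minimal Paillier scheme with tiny primes.
import math, random

p, q = 293, 433                      # tiny primes -- for illustration only
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                 # because L(g^lam mod n^2) = lam when g = n + 1

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:       # r must be coprime with n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

c1, c2 = encrypt(12), encrypt(30)
# Multiplying ciphertexts adds the underlying plaintexts: E(m1)*E(m2) -> m1+m2
total = decrypt((c1 * c2) % n2)
assert total == 42
print("homomorphic sum recovered:", total)
```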

Keywords: blockchain, cryptography, data privacy, decentralized data management, differential privacy, healthcare, healthcare data security, homomorphic encryption, privacy-preserving technologies, secure computations, zero-knowledge proofs

Procedia PDF Downloads 25
29131 Key Success Factors for Malaysian SMEs Companies’ Entrepreneurial Leaders

Authors: Zainal Abu Zarim, Hafizah Omar Zaki

Abstract:

The objective of this study is to analyse the success factors of entrepreneurs in Malaysian SMEs in order to discover their entrepreneurial leadership characteristics. Data were collected from the top 50 SME award-winning companies. The study used a qualitative approach to data collection, in which interviews were conducted at these selected companies. Of these 50 SMEs, only 25 accepted the interview request, with one entrepreneur from each SME answering the questions. To carry out this study, we administered questions based on Hornaday's 42 characteristics of an entrepreneur, as well as some structured questions to determine what makes a company successful. The results show that entrepreneurs are confident, determined, diligent, flexible, responsive to challenges, responsible, far-sighted, courageous, aggressive, and committed. Consistent with this, the elements that make a company successful include (1) strong financial control, (2) continuous improvement, (3) product quality and product safety as top priorities, (4) hard work and teamwork, and (5) eagerness in taking on challenges. These results indicate that entrepreneurs are, in many respects, also leaders who are risk-averse and determined, and who are eager to work on continuous improvement in a financially strong company.

Keywords: characteristics of entrepreneurs, success of a company, key success factors, Malaysian SMEs

Procedia PDF Downloads 594
29130 A Combination of Filtration and Coagulation Processes for Tannery Effluent Treatment

Authors: M. G. Mostafa, Manjushree Chowdhury, Tapan Kumar Biswas, Ananda Kumar Saha

Abstract:

This study focused on the characterization of tannery effluents and on a treatment process to reduce their toxicity. The tanning industry is one of the oldest industries in the world and is typically characterized as a pollutant-generating industry that produces a wide variety of high-strength toxic chemicals. The study was conducted during 2008 to 2009, and the tannery effluents were collected three times a year from the outlets of selected leather industries located in the Hazaribagh industrial zone, Dhaka, Bangladesh. The analysis of the raw effluents reveals that they were yellowish-brown in colour, with basic pH, very high values of BOD5, COD, TDS, TSS and TS, and high concentrations of Cr, Na, SO42-, Cl- and other organic and inorganic constituents. The tannery effluents were treated with various doses of FeCl3 after settling and a subsequent filtration through sand-stone. The study observed that a coagulant (FeCl3) dose of 150 mg/L at around neutral pH showed the best removal efficiency for the major physico-chemical parameters. The analysis results illustrate that most of the physical and chemical parameters of the treated effluent were well below the prescribed permissible limits for effluent discharge. The study suggests that tannery effluents could be treated by a combined process consisting of settling, filtering and coagulating with FeCl3.
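
As an illustration of how the dose comparison and removal efficiencies can be tabulated, the sketch below uses hypothetical jar-test figures (not the study's measurements) to pick the best FeCl3 dose.

```python
# Illustrative only: selecting the best FeCl3 dose from jar-test results and
# computing percent removal. The concentrations below are hypothetical
# placeholders, not measurements from the study.

def percent_removal(influent: float, effluent: float) -> float:
    return 100.0 * (influent - effluent) / influent

raw_cod = 2800.0                                              # hypothetical raw-effluent COD, mg/L
jar_test = {50: 1900.0, 100: 1400.0, 150: 700.0, 200: 900.0}  # dose (mg/L) -> residual COD

best_dose = min(jar_test, key=jar_test.get)
for dose, cod in sorted(jar_test.items()):
    print(f"FeCl3 {dose} mg/L: COD removal {percent_removal(raw_cod, cod):.1f}%")
print("best dose:", best_dose, "mg/L")
```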

Keywords: characterization, effluent, tannery, treatment

Procedia PDF Downloads 453
29129 Experimental Study on the Heating Characteristics of Transcritical CO₂ Heat Pumps

Authors: Lingxiao Yang, Xin Wang, Bo Xu, Zhenqian Chen

Abstract:

Due to its outstanding environmental performance, higher heating temperature and excellent low-temperature performance, transcritical carbon dioxide (CO₂) heat pumps are receiving more and more attention. However, improperly set operating parameters have a serious negative impact on the performance of the transcritical CO₂ heat pump because of the properties of CO₂. In this study, the heat transfer characteristics of the gas cooler are studied based on a modified “three-stage” gas cooler, and the effects of three operating parameters, compressor speed, gas cooler water-inlet flow rate and gas cooler water-inlet temperature, on the heating process of the system are investigated from the perspective of thermal quality and heat capacity. The results show that in the heat transfer process of the gas cooler, the temperature distribution of CO₂ and water shows a typical “two region” and “three zone” pattern; a rise in the cooling pressure of CO₂ increases the thermal quality on the CO₂ side of the gas cooler, which in turn improves the heating temperature of the system; nevertheless, the elevated thermal quality on the CO₂ side can exacerbate the mismatch of heat capacity on the two sides of the gas cooler, thereby adversely affecting the system coefficient of performance (COP); furthermore, increasing the compressor speed mitigates the mismatch in heat capacity caused by elevated thermal quality, which is exacerbated by a decreasing gas cooler water-inlet flow rate and a rising gas cooler water-inlet temperature. As a representative case, varying the compressor speed results in a 7.1°C increase in heating temperature within the experimental range, accompanied by a 10.01% decrease in COP and an 11.36% increase in heating capacity. This study can not only provide an important reference for the theoretical analysis and control strategy of the transcritical CO₂ heat pump but also guide related simulations and the design of the gas cooler. However, the range of experimental parameters in the current study is small, and the conclusions drawn are not further analysed quantitatively. Therefore, expanding the range of parameters studied and proposing corresponding quantitative conclusions and indicators with universal applicability could greatly increase the practical applicability of this study; this is the goal of our next research.
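
A hedged sketch of how the gas-cooler heat rejection and COP can be evaluated from CO₂ properties follows; CoolProp and all state-point values below are our assumptions, not the paper's instrumentation or data.

```python
# Hedged sketch (CoolProp is our assumption, not the paper's tool): heating
# capacity and COP from made-up CO2 state points around the gas cooler.
from CoolProp.CoolProp import PropsSI

fluid = "CO2"
p_gc = 10.0e6                              # hypothetical gas-cooler (cooling) pressure, Pa
t_in, t_out = 95 + 273.15, 35 + 273.15     # hypothetical CO2 inlet/outlet temperatures, K
m_dot = 0.03                               # hypothetical CO2 mass flow rate, kg/s
w_comp = 2.0e3                             # hypothetical compressor power, W

h_in = PropsSI("H", "T", t_in, "P", p_gc, fluid)
h_out = PropsSI("H", "T", t_out, "P", p_gc, fluid)
q_heating = m_dot * (h_in - h_out)         # heat rejected to the water side, W
print(f"heating capacity: {q_heating/1000:.2f} kW, COP: {q_heating / w_comp:.2f}")
```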

Keywords: transcritical CO₂ heat pump, gas cooler, heat capacity, thermal quality

Procedia PDF Downloads 31
29128 Pre-Processing of Ultrasonography Image Quality Improvement in Cases of Cervical Cancer Using Image Enhancement

Authors: Retno Supriyanti, Teguh Budiono, Yogi Ramadhani, Haris B. Widodo, Arwita Mulyawati

Abstract:

Cervical cancer is a leading cause of cancer-related mortality. To diagnose it, doctors usually perform several tests to determine the presence of cervical cancer in a patient. However, these examinations require supporting equipment to obtain more detailed results, and one option is ultrasonography. In developing countries, however, most of the available ultrasonography equipment has a low resolution. The goal of this research is to detect abnormalities in low-resolution ultrasound images, especially for cervical cancer cases. In this paper, we emphasize the use of image enhancement as a pre-processing step for image quality improvement. The results show that this pre-processing stage is promising for supporting further analysis.
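
One common enhancement route is sketched below using OpenCV's CLAHE followed by a light median filter; the paper's exact enhancement method is not specified here, and the file name is a placeholder.

```python
# Hedged sketch: one common pre-processing route (CLAHE) for low-resolution
# ultrasound frames. The exact enhancement method of the paper is not specified
# here, and "cervix_us.png" is a placeholder path.
import cv2

img = cv2.imread("cervix_us.png", cv2.IMREAD_GRAYSCALE)
if img is None:
    raise FileNotFoundError("place a grayscale ultrasound frame at cervix_us.png")

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(img)                      # local contrast enhancement
denoised = cv2.medianBlur(enhanced, 3)           # light speckle suppression
cv2.imwrite("cervix_us_enhanced.png", denoised)
```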

Keywords: cervical cancer, mortality, low-resolution, image enhancement

Procedia PDF Downloads 642
29127 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data

Authors: S. Nickolas, Shobha K.

Abstract:

The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications and are an unavoidable problem in big data management and analysis. Numerous techniques, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary algorithms or optimized techniques, and hot deck imputation, have been introduced by researchers for handling missing data. Among these, imputation techniques play a positive role in filling in missing values when it is necessary to use all records in the data rather than discard records with missing values. In this paper, we propose a novel artificial neural network based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly. It carries out model-based clustering by using competitive learning and a self-stabilizing mechanism in a dynamic environment without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
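
ART2 is not available in common Python libraries, so the sketch below keeps the paper's idea, cluster first and then impute within the cluster, but substitutes k-means for ART2 and uses synthetic data; it is a simplified stand-in, not the proposed algorithm.

```python
# Simplified stand-in for the paper's idea (cluster first, then impute within the
# cluster): k-means replaces ART2 here, and the data are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4)) + rng.integers(0, 3, size=(200, 1)) * 5.0
X_missing = X.copy()
X_missing[rng.random(X.shape) < 0.1] = np.nan                   # 10% missing values

col_means = np.nanmean(X_missing, axis=0)
X_filled = np.where(np.isnan(X_missing), col_means, X_missing)  # crude initial fill

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_filled)
for k in range(3):                                              # refine: impute with cluster means
    members = labels == k
    cluster_means = np.nanmean(X_missing[members], axis=0)
    rows, cols = np.where(np.isnan(X_missing) & members[:, None])
    X_filled[rows, cols] = cluster_means[cols]
print("imputation RMSE:", round(float(np.sqrt(np.mean((X_filled - X) ** 2))), 3))
```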

Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing

Procedia PDF Downloads 277
29126 The Effect That the Data Assimilation of Qinghai-Tibet Plateau Has on a Precipitation Forecast

Authors: Ruixia Liu

Abstract:

The Qinghai-Tibet Plateau has an important influence on the precipitation of its lower reaches. Remote sensing data have their own advantages, and a numerical prediction model that assimilates remote sensing data performs better than one that does not. We obtained assimilation data from MHS, terrestrial observations and soundings through GSI, introduced the result into WRF, and then obtained relative humidity (RH) and precipitation forecasts. By comparing the 1 h, 6 h, 12 h, and 24 h results, we found that assimilating the MHS, terrestrial and sounding data made the forecast of precipitation amount, area and centre more accurate. By analyzing the differences in the initial field, we found that data assimilation over the Qinghai-Tibet Plateau influences the forecast for its lower reaches by affecting the initial temperature and RH.
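
A hedged sketch of how the forecast improvement could be quantified for such comparisons, using RMSE and the threat score, is shown below; the arrays are synthetic stand-ins for WRF output and observations.

```python
# Hedged sketch: quantifying forecast improvement with RMSE and the threat score
# (CSI). The arrays below are synthetic stand-ins for WRF output and observations.
import numpy as np

def rmse(fcst, obs):
    return float(np.sqrt(np.mean((fcst - obs) ** 2)))

def threat_score(fcst, obs, threshold=1.0):
    hit = np.sum((fcst >= threshold) & (obs >= threshold))
    miss = np.sum((fcst < threshold) & (obs >= threshold))
    false_alarm = np.sum((fcst >= threshold) & (obs < threshold))
    return float(hit / (hit + miss + false_alarm))

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 2.0, size=1000)                 # "observed" 6-h precipitation, mm
fcst_ctrl = obs + rng.normal(0, 3.0, size=1000)      # run without assimilation
fcst_da = obs + rng.normal(0, 1.5, size=1000)        # run with MHS + conventional DA
for name, f in [("control", fcst_ctrl), ("assimilated", fcst_da)]:
    print(f"{name}: RMSE {rmse(f, obs):.2f} mm, threat score {threat_score(f, obs):.2f}")
```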

Keywords: Qinghai-Tibet Plateau, precipitation, data assimilation, GSI

Procedia PDF Downloads 238
29125 Factors Impacting Geostatistical Modeling Accuracy and Modeling Strategy of Fluvial Facies Models

Authors: Benbiao Song, Yan Gao, Zhuo Liu

Abstract:

Geostatistical modeling is the key technique for reservoir characterization, and the quality of geological models greatly influences the prediction of reservoir performance, but few studies have been done to quantify the factors impacting geostatistical reservoir modeling accuracy. In this study, 16 fluvial prototype models were established to represent different degrees of geological complexity, and 6 cases ranging from 16 to 361 wells were defined to reproduce all 16 prototype models using different methodologies, including SIS, object-based and MPFS algorithms, together with different constraint parameters. A modeling accuracy ratio was defined to quantify the influence of each factor, and ten realizations were averaged to represent each accuracy ratio under the same modeling condition and parameter association. In total, 5760 simulations were performed to quantify the relative contribution of each factor to the simulation accuracy, and the results can be used as a strategy guide for facies modeling under similar conditions. It is found that data density, geological trend and geological complexity have a great impact on modeling accuracy. Modeling accuracy may reach up to 90% when the channel sand width reaches 1.5 times the well spacing, under whatever condition, for the SIS and MPFS methods. When well density is low, the contribution of the geological trend may increase the modeling accuracy from 40% to 70%, while the use of a proper variogram may contribute very little for the SIS method. This implies that when well data are dense enough to cover simple geobodies, little effort is needed to construct an acceptable model, whereas when geobodies are complex and data are insufficient, it is better to construct a set of robust geological trends than to rely on a reliable variogram function. For the object-based method, the modeling accuracy does not increase as obviously with data density as for the SIS method, but it keeps a rational appearance when data density is low. The MPFS method shows a similar trend to the SIS method, but the SIS method with a proper geological trend accompanied by a rational variogram may achieve better modeling accuracy than the MPFS method. This implies that the geological modeling strategy for a real reservoir case needs to be optimized through an evaluation of the dataset, geological complexity, geological constraint information and the modeling objective.
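
The paper's exact accuracy-ratio definition is not reproduced here; a plausible cell-by-cell sketch is shown below, where accuracy is the fraction of grid cells whose facies code matches the prototype model, averaged over ten realizations of a synthetic grid.

```python
# Plausible sketch only (the paper's exact definition is not reproduced here):
# modeling accuracy as the fraction of grid cells whose facies code matches the
# prototype model, averaged over ten realizations. Grids below are synthetic.
import numpy as np

def accuracy_ratio(prototype: np.ndarray, realizations: list) -> float:
    matches = [np.mean(r == prototype) for r in realizations]
    return float(np.mean(matches))

rng = np.random.default_rng(0)
prototype = (rng.random((100, 100)) < 0.3).astype(int)       # 1 = channel sand, 0 = floodplain
realizations = []
for _ in range(10):
    noise = rng.random(prototype.shape) < 0.15               # 15% of cells mis-simulated
    realizations.append(np.where(noise, 1 - prototype, prototype))
print("average modeling accuracy ratio:", round(accuracy_ratio(prototype, realizations), 3))
```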

Keywords: fluvial facies, geostatistics, geological trend, modeling strategy, modeling accuracy, variogram

Procedia PDF Downloads 269
29124 Novel Use of a Quality Assurance Tool for Integrating Technology to HSE

Authors: Ragi Poyyara, V. Vivek, Ashish Khaparde

Abstract:

The Product Development Process (PDP) in the technology group plays a very important role in the launch of any product. While a manufacturing process encourages the use of certain measures to reduce Health, Safety and Environmental (HSE) risks on the shop floor, the PDP concentrates on the use of Geometric Dimensioning and Tolerancing (GD&T) to develop a flawless design. Furthermore, PDP distributes and coordinates activities between different departments such as marketing, purchasing, and manufacturing. However, it is seldom realized that PDP makes a significant contribution to developing a product that reduces HSE risks by encouraging the Technology group to use effective GD&T. The GD&T is a precise communication tool that uses a set of symbols, rules, and definitions to mathematically define parts to be manufactured. It is a quality assurance method widely used in the oil and gas sector. Traditionally it is used to ensure the interchangeability of a part without affecting its form, fit, and function. Parts that do not meet these requirements are rejected during quality audits. This paper discusses how the Technology group integrates this quality assurance tool into the PDP and how the tool plays a major role in helping the HSE department in its goal towards eliminating HSE incidents. The PDP involves a thorough risk assessment and establishes a method to address those risks during the design stage. An illustration shows how GD&T helped reduce safety risks by ergonomically improving assembling operations. A brief discussion explains how tolerances provided on a part help prevent finger injury. This tool has equipped Technology to produce fixtures, which are used daily in operations as well as manufacturing. By applying GD&T to create good fits, HSE risks are mitigated for operating personnel. Both customers and service providers benefit from reduced safety risks.

Keywords: HSE risks, product development process, geometric dimensioning and tolerances, mechanical engineering

Procedia PDF Downloads 230
29123 Methodologies for Crack Initiation in Welded Joints Applied to Inspection Planning

Authors: Guang Zou, Kian Banisoleiman, Arturo González

Abstract:

Crack initiation and propagation threaten the structural integrity of welded joints, and inspections are normally scheduled based on crack propagation models. However, the approach based on crack propagation models may not be applicable to some high-quality welded joints, because the initial flaws in them may be so small that it takes a long time for the flaws to develop to a detectable size. This raises a concern regarding the inspection planning of high-quality welded joints, as there is no generally accepted approach for modeling the whole fatigue process that includes the crack initiation period. In order to address the issue, this paper reviews treatment methods for the crack initiation period and the initial crack size in crack propagation models applied to inspection planning. Generally, there are four approaches: 1) neglecting the crack initiation period and fitting a probabilistic distribution for the initial crack size based on statistical data; 2) extrapolating the crack propagation stage back to a very small fictitious initial crack size, so that the whole fatigue process can be modeled by crack propagation models; 3) assuming a fixed detectable initial crack size and fitting a probabilistic distribution for the crack initiation time based on specimen tests; and 4) modeling the crack initiation and propagation stages separately using small-crack growth theories and the Paris law or similar models. The conclusion is that, in view of the trade-off between accuracy and computational effort, calibration of a small fictitious initial crack size to S-N curves is the most efficient approach.
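
Approach 2) can be illustrated by numerically integrating the Paris law from a small fictitious initial crack size; the constants, geometry factor and stress range below are generic illustrative values, not calibrated to any S-N curve.

```python
# Illustrative sketch of approach 2): propagate a small fictitious initial crack
# with the Paris law, da/dN = C * (dK)^m, with dK = Y * dS * sqrt(pi * a).
# C, m, Y and the stress range are generic illustrative values, not calibrated data.
import math

C, m = 2.0e-13, 3.0          # Paris constants (units consistent with mm and MPa*sqrt(mm))
Y = 1.12                     # geometry factor for a surface crack (illustrative)
delta_sigma = 80.0           # stress range, MPa
a, a_crit = 0.05, 10.0       # fictitious initial and critical crack sizes, mm

cycles, d_n = 0, 1000        # integrate in blocks of 1000 cycles
while a < a_crit:
    delta_k = Y * delta_sigma * math.sqrt(math.pi * a)
    a += C * delta_k ** m * d_n
    cycles += d_n
print(f"predicted propagation life: {cycles:.3e} cycles")
```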

Keywords: crack initiation, fatigue reliability, inspection planning, welded joints

Procedia PDF Downloads 356
29122 Implications on the Training Program for Clinical Psychologists in South Korea

Authors: Chorom Baek, Sungwon Choi

Abstract:

The purpose of this study is to analyze the supervision systems and the training and continuing education of mental health professionals in the USA, the UK, Australia (and New Zealand), Japan, and other countries, and to derive implications for the Korean mental health service system. To accomplish this purpose, the following methodologies were adopted: a review of the related literature, statistical data, relevant manuals, online materials, and previous studies concerning issues in those countries over the past five years. The training program in Korea was compared with the others through this literature analysis. The findings were organized into areas such as the training program, continuing education, educational procedures, and curriculum. Based on the analysis, discussion and implications, the conclusions and further suggestions of this study are as follows: First, the Korean Clinical Psychology Association (KCPA) should become a more powerful main training agency for quality control. Second, actual authority as a main training agency should be granted to training centers. Third, quality control of mental health professionals should be achieved through the standardization and systematization of promotion and qualification management. Fourth, education and training on the work of supervisors and unification of the criteria for supervision should be provided. Fifth, the training program for the mental health license should be offered by graduate schools. Sixth, a legally established system to protect the rights of mental health trainees is needed. Seventh, regular continuing education after licensure should be compulsory to maintain certification. Eighth, the training programs in training centers should meet KCPA requirements; if not, the KCPA should be able to cancel the certification of the centers.

Keywords: clinical psychology, Korea, mental health system, training program

Procedia PDF Downloads 230
29121 Error Analysis of Wavelet-Based Image Steganography Scheme

Authors: Geeta Kasana, Kulbir Singh, Satvinder Singh

Abstract:

In this paper, a steganographic scheme for digital images using the Integer Wavelet Transform (IWT) is proposed. The cover image is decomposed into wavelet subbands using IWT. Each subband is divided into blocks of equal size, and secret data is embedded into the largest and smallest pixel values of each block of the subband. The visual quality of the stego images is acceptable, as the PSNR between the cover image and the stego image is above 40 dB, so imperceptibility is maintained. Experimental results show a better trade-off between capacity and visual quality compared to existing algorithms. The maximum possible error is also analyzed for each of the wavelet subbands of an image.
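
The quality measure quoted above can be sketched as follows; random arrays stand in for the cover and stego images, and the toy embedding simply flips a few least significant bits.

```python
# Hedged sketch of the quality measure quoted in the abstract: MSE and PSNR
# between a cover image and its stego version (random arrays stand in for images).
import numpy as np

def psnr(cover: np.ndarray, stego: np.ndarray, peak: float = 255.0) -> float:
    mse = np.mean((cover.astype(np.float64) - stego.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
stego = cover.copy()
stego[::8, ::8] ^= 1                          # toy embedding: flip one LSB in a few pixels
print(f"PSNR: {psnr(cover, stego):.1f} dB")   # > 40 dB indicates imperceptible changes
```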

Keywords: DWT, IWT, MSE, PSNR

Procedia PDF Downloads 510
29120 Positive Affect, Negative Affect, Organizational and Motivational Factor on the Acceptance of Big Data Technologies

Authors: Sook Ching Yee, Angela Siew Hoong Lee

Abstract:

Big data technologies have become a trend for exploiting business opportunities and providing valuable business insights through the analysis of big data. However, many organizations have yet to adopt big data technologies, especially small and medium organizations (SMEs). This study uses the technology acceptance model (TAM) and examines several constructs in the TAM together with additional constructs: positive affect, negative affect, organizational factor and motivational factor. The conceptual model proposed in the study will be tested on the relationship and influence of positive affect, negative affect, organizational factor and motivational factor on the intention to use big data technologies. Empirical research is used in this study, with a survey conducted to collect data.

Keywords: big data technologies, motivational factor, negative affect, organizational factor, positive affect, technology acceptance model (TAM)

Procedia PDF Downloads 365
29119 Big Data Analysis with Rhipe

Authors: Byung Ho Jung, Ji Eun Shin, Dong Hoon Lim

Abstract:

Rhipe, which integrates R with the Hadoop environment, makes it possible to process and analyze massive amounts of data using a distributed processing environment. In this paper, we implemented multiple regression analysis using Rhipe with various sizes of actual data. Experimental results comparing the performance of Rhipe with the stats and biglm packages available on bigmemory showed that Rhipe was faster than the other packages, owing to parallel processing in which the number of map tasks increases as the size of the data increases. We also compared the computing speeds of the pseudo-distributed and fully-distributed modes for configuring a Hadoop cluster. The results showed that the fully-distributed mode was faster than the pseudo-distributed mode, and the computing speed of the fully-distributed mode improved as the number of data nodes increased.

Keywords: big data, Hadoop, parallel regression analysis, R, Rhipe

Procedia PDF Downloads 502
29118 Adolescent Health Risk Behaviors and the Mediating Effects of Family Dynamics and Socio-Demographic Factors

Authors: Rufina C. Abul, Dylan Kyle D. Apostol, Darius Rex G. Binuya, Alyanah Mae F. Cauilan, Darren A. Diaz, Angelica Jones A. Gallang, Charisse G. Kiwang, Alyanna Nicole G. Mactal, Nadine Beatrize V. Nerona, Janella Nicole R. Posadas, Charisse Purie C. Toledo

Abstract:

Background: Dramatic physical development, socioemotional adjustment, and cognitive changes highlight adolescent development. Adolescent brains are susceptible to emotional reactivity, making them likely to engage in risk-taking and impulsive behaviors. The family is crucial in laying the foundations of good health. Aims: This study determined the degree of family cohesion, quality of father-child and mother-child relationships, and degree of academic pressure across cultures, age groups, and sexual orientations. Further, it sought the prevalence of adolescent health concerns, including suicide risks, risk-taking behaviors, social media engagement, and self-care deviations. Finally, the correlations between health risk behaviors and the elements of family dynamics were unraveled. Methods: The descriptive-correlational design served as the blueprint for this study. Data were collected from 1095 adolescents aged 12-21 in two high schools and two universities in Baguio City using self-report questionnaires. Data was analyzed using Microsoft Excel Toolpak and IBM SPSS Statistics to identify significant differences and relationships among variables through descriptive statistics (frequency, %, means and figures) and inferential statistics (ANOVA and logistic regression). Results and Discussion: Adolescents generally have strong family cohesion (FC), high-quality father-child relationships (F-CR), very high-quality mother-child relationships(M-CR), and experience high academic pressure (AP). Cultural affiliation does not influence the 4 elements of family dynamics; the higher the age, the stronger the family cohesion; males score significantly higher on family cohesion and mother-child relationship while significantly lower in perceived academic pressure compared to their female and LGBT counterparts. Suicide risk is prevalent among 29-63% of the population, safety issues have the lowest prevalence for having an abusive relationship (8.22%) and the highest for encountering major family changes (53.52%). Substance use was highest for vaping (22.74%), sexual engagement occurs in 14.61% of the population, while 63% are engaged in social media for >5 hours/day. The self-care deviation is highest for weight concerns (63.39%), lack of visits to health care professionals (64.65%) and lack of exercise (49.94%). All 4 elements of family dynamic (FC, F-CR, M-CR and AP) are significantly associated with safety concerns, suicide risks and social media engagement, while M-CR significantly influences cigarette smoking, alcohol drinking, rugby use and engagement in sex. Conclusion and Recommendations: Strong family cohesion and quality parent-child interactions improve emotional and behavioral outcomes. Sexual orientation has a significant impact on academic pressure and social media use, demanding targeted treatments. The link between family dynamics and health-risk behaviors emphasizes the importance of promoting positive family relationships and encouraging safer behaviors, which are critical for increasing adolescents' well-being.

Keywords: adolescent health, family cohesion, health risk behaviors, suicide risk

Procedia PDF Downloads 22
29117 Utilising Indigenous Knowledge to Design Dykes in Malawi

Authors: Martin Kleynhans, Margot Soler, Gavin Quibell

Abstract:

Malawi is one of the world’s poorest nations and consequently, the design of flood risk management infrastructure comes with a different set of challenges. There is a lack of good quality hydromet data, both in spatial terms and in the quality thereof and the challenge in the design of flood risk management infrastructure is compounded by the fact that maintenance is almost completely non-existent and that solutions have to be simple to be effective. Solutions should not require any further resources to remain functional after completion, and they should be resilient. They also have to be cost effective. The Lower Shire Valley of Malawi suffers from frequent flood events. Various flood risk management interventions have been designed across the valley during the course of the Shire River Basin Management Project – Phase I, and due to the data poor environment, indigenous knowledge was relied upon to a great extent for hydrological and hydraulic model calibration and verification. However, indigenous knowledge comes with the caveat that it is ‘fuzzy’ and that it can be manipulated for political reasons. The experience in the Lower Shire valley suggests that indigenous knowledge is unlikely to invent a problem where none exists, but that flood depths and extents may be exaggerated to secure prioritization of the intervention. Indigenous knowledge relies on the memory of a community and cannot foresee events that exceed past experience, that could occur differently to those that have occurred in the past, or where flood management interventions change the flow regime. This complicates communication of planned interventions to local inhabitants. Indigenous knowledge is, for the most part, intuitive, but flooding can sometimes be counter intuitive, and the rural poor may have a lower trust of technology. Due to a near complete lack of maintenance of infrastructure, infrastructure has to be designed with no moving parts and no requirement for energy inputs. This precludes pumps, valves, flap gates and sophisticated warning systems. Designs of dykes during this project included ‘flood warning spillways’, that double up as pedestrian and animal crossing points, which provide warning of impending dangerous water levels behind dykes to residents before water levels that could cause a possible dyke failure are reached. Locally available materials and erosion protection using vegetation were used wherever possible to keep costs down.

Keywords: design of dykes in low-income countries, flood warning spillways, indigenous knowledge, Malawi

Procedia PDF Downloads 289
29116 Security in Resource Constraints Network Light Weight Encryption for Z-MAC

Authors: Mona Almansoori, Ahmed Mustafa, Ahmad Elshamy

Abstract:

A wireless sensor network is formed by a combination of nodes that systematically transmit their data to base stations. This transmitted data can easily be compromised, bearing in mind the limited processing power of these nodes and the need for data consistency, and there is an ongoing discussion on how to secure data transfer or transmission in real time. This paper presents a mechanism to securely transmit data over a chain of sensor nodes, without compromising the throughput of the network, by utilizing the battery resources available in the sensor node. Our methodology takes advantage of the efficiency of the Z-MAC protocol and provides a unique key through a sharing mechanism that uses the neighbor node's MAC address. We present a lightweight data integrity layer that is embedded in the Z-MAC protocol, and we show that our protocol performs better than Z-MAC when different attack scenarios are introduced.
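
A hedged sketch of the key-sharing and integrity idea (our construction, not necessarily the paper's exact scheme): derive a pairwise key from the two neighbours' MAC addresses and a pre-shared network secret, then protect each frame with a truncated HMAC tag.

```python
# Hedged sketch (our construction, not necessarily the paper's exact scheme):
# derive a pairwise key from the two neighbours' MAC addresses and a pre-shared
# network secret, then protect each frame with a truncated HMAC tag.
import hashlib, hmac

NETWORK_SECRET = b"pre-shared-network-secret"   # placeholder value

def pairwise_key(mac_a: str, mac_b: str) -> bytes:
    material = "|".join(sorted([mac_a, mac_b])).encode()   # order-independent
    return hmac.new(NETWORK_SECRET, material, hashlib.sha256).digest()

def protect(frame: bytes, key: bytes, tag_len: int = 8) -> bytes:
    tag = hmac.new(key, frame, hashlib.sha256).digest()[:tag_len]
    return frame + tag

def verify(packet: bytes, key: bytes, tag_len: int = 8) -> bool:
    frame, tag = packet[:-tag_len], packet[-tag_len:]
    expected = hmac.new(key, frame, hashlib.sha256).digest()[:tag_len]
    return hmac.compare_digest(tag, expected)

key = pairwise_key("00:11:22:33:44:55", "66:77:88:99:aa:bb")
packet = protect(b"sensor-reading:23.5C", key)
print("integrity check:", verify(packet, key))
```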

Keywords: hybrid MAC protocol, data integrity, lightweight encryption, neighbor-based key sharing, sensor node data processing, Z-MAC

Procedia PDF Downloads 148
29115 In-Depth Investigations on the Sequences of Accidents of Powered Two Wheelers Based on Police Crash Reports of Medan, North Sumatera Province Indonesia, Using Decision Aiding Processes

Authors: Bangun F., Crevits B., Bellet T., Banet A., Boy G. A., Katili I.

Abstract:

This paper seeks the incoherencies in the cognitive process during accidents of Powered Two Wheelers (PTW) by understanding the factual sequences of events and the causal relations in each accident case. The principle of this approach is to undertake in-depth, case-by-case investigations of PTW accidents based on elaborate data acquisition at accident sites, officially recorded in the 2012 Police Crash Reports (PCRs) of Medan, with the criteria that at least one PTW was involved and that the crash resulted in serious injury or fatalities. The analysis takes into account four modules: accident chronologies; perpetrators and victims; injury surveillance; and vehicles and road infrastructure, comprising traffic facilities, road geometry, road alignments and weather. The proposed improvements could have a favorable influence on the chain of functional processes and events leading to a collision. Decision Aiding Processes (DAP) assist in structuring different entities at different decisional levels, as each of these entities has its own objectives and constraints. The entities (A) are classified into 6 groups of accidents: solo PTW accidents; PTW vs. PTW; PTW vs. pedestrian; PTW vs. motor-trishaw; PTW vs. other vehicles; and consecutive crashes. The entities are also distinguished into 4 decisional levels: the level of road users and street systems; the operational level (crash-attending police officers, or CAPO, and road engineers); the tactical level (Regional Traffic Police, Department of Transportation, and Department of Public Work); and the strategic level (Traffic Police Headquarters (TCPHI), parliament, Ministry of Transportation and Ministry of Public Work). These classifications lead to the conceptualization of Problem Situations (P) and Problem Formulations (I) in the DAP context. The DAP concerns the sequence of incidents up to the time the accident occurs, which can be modelled in terms of five activities of procedural rationality: identification of initial human features (IHF); investigation of proponent attributes (PrAt); investigation of Injury Surveillance (IS); investigation of the interactions among IHF, PrAt and IS (intercorrelation); and then unravelling the sequences of incidents, filtering and disclosure, which includes what needs to be activated, modified, changed or removed, what is new and what is a priority. These can relate to the activation, modification or new establishment of laws. The PrAt encompasses the problems of the environment, road infrastructure, road and traffic facilities, and road geometry. The evaluation model (MP) is generated to bridge P and I, since MP is produced by the intercorrelations among IHF, PrAt and IS extracted from the 2012 PCRs of Medan. There are 7 findings of incoherencies: lack of knowledge and awareness of the traffic regulations and the risks of accidents, especially when riding between 0 < x < 10 km from home or between 22:00 and 05:30; lack of engagement in the procurement of IHF data by CAPO; lack of competency of CAPO in data procurement at accident sites; no intercorrelation among IHF, PrAt and IS in the PCR database systems; lack of maintenance and supervision of the availability and capacity of traffic facilities and road infrastructure; instrumental bias with wash-back impacts on the TCPHI; and technical robustness with wash-back impacts on the CAPO and TCPHI.

Keywords: decision aiding processes, evaluation model, PTW accidents, police crash reports

Procedia PDF Downloads 162
29114 Symphony of Healing: Exploring Music and Art Therapy’s Impact on Chemotherapy Patients with Cancer

Authors: Sunidhi Sood, Drashti Narendrakumar Shah, Aakarsh Sharma, Nirali Harsh Panchal, Maria Karizhenskaia

Abstract:

Cancer is a global health concern, causing a significant number of deaths, with chemotherapy being a standard treatment method. However, chemotherapy often induces side effects that profoundly impact the physical and emotional well-being of patients, lowering their overall quality of life (QoL). This research aims to investigate the potential of music and art therapy as holistic adjunctive therapy for cancer patients undergoing chemotherapy, offering non-pharmacological support. This is achieved through a comprehensive review of existing literature with a focus on the following themes, including stress and anxiety alleviation, emotional expression and coping skill development, transformative changes, and pain management with mood upliftment. A systematic search was conducted using Medline, Google Scholar, and St. Lawrence College Library, considering original, peer-reviewed research papers published from 2014 to 2023. The review solely incorporated studies focusing on the impact of music and art therapy on the health and overall well-being of cancer patients undergoing chemotherapy in North America. The findings from 16 studies involving pediatric oncology patients, females affected by breast cancer, and general oncology patients show that music and art therapies significantly reduce anxiety (standardized mean difference: -1.10) and improve perceived stress (median change: -4.0) and overall quality of life in cancer patients undergoing chemotherapy. Furthermore, music therapy has demonstrated the potential to decrease anxiety, depression, and pain during infusion treatments (average changes in resilience scale: 3.4 and 4.83 for instrumental and vocal music therapy, respectively). This data calls for consideration of the integration of music and art therapy into supportive care programs for cancer patients undergoing chemotherapy. Moreover, it provides guidance to healthcare professionals and policymakers, facilitating the development of patient-centered strategies for cancer care in Canada. Further research is needed in collaboration with qualified therapists to examine its applicability and explore and evaluate patients' perceptions and expectations in order to optimize the therapeutic benefits and overall patient experience. In conclusion, integrating music and art therapy in cancer care promises to substantially enhance the well-being and psychosocial state of patients undergoing chemotherapy. However, due to the small population size considered in existing studies, further research is needed to bridge the knowledge gap and ensure a comprehensive, patient-centered approach, ultimately enhancing the quality of life (QoL) for individuals facing the challenges of cancer treatment.

Keywords: anxiety, cancer, chemotherapy, depression, music and art therapy, pain management, quality of life

Procedia PDF Downloads 81
29113 Survival Data with Incomplete Missing Categorical Covariates

Authors: Madaki Umar Yusuf, Mohd Rizam B. Abubakar

Abstract:

Censored survival data with incomplete covariate information are a common occurrence in many studies in which the outcome is survival time. When the missing covariates are categorical, a useful technique for obtaining parameter estimates is the EM algorithm by the method of weights. The survival outcome for the class of generalized linear models is applied, and this method requires the estimation of the parameters of the distribution of the covariates. In this paper, we propose an analysis of clinical trial data with five covariates, four of which have some missing values, and which clearly show that the data were censored.

Keywords: EM algorithm, incomplete categorical covariates, ignorable missing data, missing at random (MAR), Weibull Distribution

Procedia PDF Downloads 409
29112 DWDM Network Implementation in the Honduran Telecommunications Company "Hondutel"

Authors: Tannia Vindel, Carlos Mejia, Damaris Araujo, Carlos Velasquez, Darlin Trejo

Abstract:

DWDM (Dense Wavelength Division Multiplexing) is in constant growth around the world, driven by consumer demand. Since its inception, this operation has created the need for a system that enables the expansion of an entire nation's communications, in order to improve the computing trends of its society according to its customs and geographical location. The Honduran Company of Telecommunications (HONDUTEL) provides internet services and data transport with PDH and SDH technology, which in the Republic of Honduras, C.A., represents a viable option for the consumer in terms of purchase price and ease of acquisition, but lacks efficiency in terms of technological advancement and thus represents an obstacle that limits long-term socio-economic development in comparison with other countries in the region, as well as the ability to establish competition between the telecommunications companies engaged in this field. For that reason, we propose to adopt a technological trend already implemented in Europe and apply it in our country to provide broadband data transfer, namely DWDM; in this way we will have a stable, high-quality service that will allow us to compete in this globalized world, and the existing system should be replaced by one that provides a better service and remains at the forefront. Once implemented, DWDM builds upon existing resources, such as the equipment already in use, and will give life to a new stage, providing a business image for the Republic of Honduras, C.A., as a nation and ensuring data transport and broadband internet in a meaningful way. The same benefits will accrue, in the first instance, to existing customers and to all the public and private institutions that have requested such services.

Keywords: demultiplexers, light detectors, multiplexers, optical amplifiers, optical fibers, PDH, SDH

Procedia PDF Downloads 269
29111 A Study of Blockchain Oracles

Authors: Abdeljalil Beniiche

Abstract:

A limitation of smart contracts is that they cannot access external data that might be required to control the execution of business logic. Oracles can be used to provide external data to smart contracts. An oracle is an interface that delivers data from external sources outside the blockchain to a smart contract to consume, and oracles can deliver different types of data depending on the industry and requirements. In this paper, we study and describe the widely used blockchain oracles. We then elaborate on their potential role, technical architecture, and design patterns. Finally, we discuss the human oracle and its key role in solving the truth problem by reaching a consensus about a certain inquiry or task.

Keywords: blockchain, oracles, oracles design, human oracles

Procedia PDF Downloads 142