Search results for: Joseph E. Estevez
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 91

31 Optimization of Surface Roughness in Additive Manufacturing Processes via Taguchi Methodology

Authors: Anjian Chen, Joseph C. Chen

Abstract:

This paper studies a case in which the targeted surface roughness of a fused deposition modeling (FDM) additive manufacturing process is improved. The process is designed to reduce or eliminate defects and improve the process capability indices Cp and Cpk for an FDM additive manufacturing process. The baseline Cp is 0.274 and Cpk is 0.654. This research utilizes the Taguchi methodology to eliminate defects and improve the process. The Taguchi method is used to optimize the printing parameters that affect the targeted surface roughness of FDM additive manufacturing. A Taguchi L9 orthogonal array is used to organize the effectiveness of the parameters (four controllable parameters and one non-controllable parameter) on the FDM additive manufacturing process. The four controllable parameters are nozzle temperature [°C], layer thickness [mm], nozzle speed [mm/s], and extruder speed [%]. The non-controllable parameter is the environmental temperature [°C]. After the parameters were optimized, a confirmation print demonstrated that the results reduce the number of defects and improve the process capability index Cp from 0.274 to 1.605 and Cpk from 0.654 to 1.233 for the FDM additive manufacturing process. The final results confirmed that the Taguchi methodology is sufficient to improve the surface roughness of the FDM additive manufacturing process.
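As a concrete illustration of the analysis step, the sketch below computes smaller-the-better signal-to-noise (S/N) ratios for the nine runs of an L9 array; the roughness readings are hypothetical stand-ins, not the paper's data:

```python
import numpy as np

# Hypothetical surface-roughness readings (µm), 3 replicates per run,
# for the 9 runs of a Taguchi L9 orthogonal array (illustrative only).
runs = np.array([
    [12.1, 12.4, 11.9],
    [10.2, 10.5, 10.1],
    [ 9.8,  9.6, 10.0],
    [11.5, 11.2, 11.8],
    [ 8.9,  9.1,  8.7],
    [10.9, 11.0, 10.6],
    [ 9.2,  9.5,  9.0],
    [12.8, 12.5, 13.0],
    [10.0, 10.3,  9.7],
])

# Smaller-the-better S/N ratio: SN = -10 * log10(mean(y^2)).
sn = -10 * np.log10((runs ** 2).mean(axis=1))

# The run with the highest S/N ratio is the most favorable combination.
best_run = int(np.argmax(sn))
```

The per-factor main effects are then obtained by averaging `sn` over the runs sharing each factor level, which is how the optimal parameter setting is read off the array.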

Keywords: Additive manufacturing, fused deposition modeling, surface roughness, Six-Sigma, Taguchi method, 3D printing.

30 Stabilizing Effects of Deep Eutectic Solvents on Alcohol Dehydrogenase Mediated Systems

Authors: Fatima Zohra Ibn Majdoub Hassani, Ivan Lavandera, Joseph Kreit

Abstract:

This study explored the effects of different organic solvents, temperature, and the amount of glycerol on the alcohol dehydrogenase (ADH)-catalysed stereoselective reduction of different ketones. The conversions were then analyzed by gas chromatography. It was found that increasing the amount of deep eutectic solvent (DES) can improve the stereoselectivity of the enzyme, although it reduces the enzyme's ability to convert the substrate into the corresponding alcohol. Moreover, glycerol was found to have a strong stabilizing effect on the ADH from Ralstonia sp. (E. coli/RasADH). In the case of organic solvents, the best conversions into the alcohols were achieved with DMSO and hexane. It was also observed that higher temperature decreased the ability of the enzyme to convert the substrates into the products and also affected the selectivity. In addition, recycling the DES up to three times gave good conversion and enantiomeric excess results, and glycerol showed a positive effect on the stability of various ADHs. Using RasADH, good conversion and enantiomeric excess into the S-alcohol were obtained. It was found that raising the temperature negated the stabilizing effect of glycerol and decreased the stereoselectivity of the enzyme. However, for other ADHs a temperature increase had the opposite, positive effect, especially with ADH-T from Thermoanaerobium sp. One of the objectives of this study was to examine the effect of cofactors such as NAD(P) on the biocatalytic activities of ADHs.
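For reference, the conversion and enantiomeric excess values of the kind reported from the GC traces can be computed from peak areas as sketched below; the areas are hypothetical, purely for illustration:

```python
# Conversion and enantiomeric excess (ee) from GC peak areas.
# All peak areas below are hypothetical, for illustration only.
area_substrate = 12.0   # unreacted ketone
area_R = 8.0            # (R)-alcohol
area_S = 80.0           # (S)-alcohol

# Conversion: fraction of substrate turned into product, in percent.
conversion = (area_R + area_S) / (area_substrate + area_R + area_S) * 100

# Enantiomeric excess toward the S-alcohol (positive = S-selective).
ee_S = (area_S - area_R) / (area_S + area_R) * 100
```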

Keywords: Alcohol dehydrogenases, DES, gas chromatography, RasADH.

29 Taguchi-Based Optimization of Surface Roughness and Dimensional Accuracy in Wire EDM Process with S7 Heat Treated Steel

Authors: Joseph C. Chen, Joshua Cox

Abstract:

This research focuses on using the Taguchi method to reduce the surface roughness and improve the dimensional accuracy of parts machined by Wire Electrical Discharge Machining (EDM) from S7 heat treated steel. Due to its high impact toughness, the material is a candidate for a wide variety of tooling applications which require high precision in dimension and desired surface roughness. This paper demonstrates that the Taguchi Parameter Design methodology is able to optimize both dimensional accuracy and surface roughness successfully by investigating seven controllable wire-EDM parameters: pulse on time (ON), pulse off time (OFF), servo voltage (SV), voltage (V), servo feed (SF), wire tension (WT), and wire speed (WS). The temperature of the water in the wire EDM process is investigated as the noise factor in this research. Experimental design and analysis based on an L18 Taguchi orthogonal array are conducted. This paper demonstrates that the Taguchi-based system enables the wire EDM process to produce (1) high-precision parts with an average dimension of 0.6601 inches, against a desired dimension of 0.6600 inches; and (2) a surface roughness of 1.7322 microns, significantly improved from 2.8160 microns.

Keywords: Taguchi parameter design, surface roughness, dimensional accuracy, Wire EDM.

28 AI-Driven Cloud Security: Proactive Defense Against Evolving Cyber Threats

Authors: Ashly Joseph

Abstract:

Cloud computing has become an essential component of enterprises and organizations globally in the current era of digital technology. The cloud offers a multitude of advantages, including scalability, flexibility, and cost-effectiveness, rendering it an appealing choice for data storage and processing. The increasing storage of sensitive information in cloud environments has raised significant concerns over the security of such systems. The frequency of cyber threats and attacks specifically aimed at cloud infrastructure has been increasing, presenting substantial dangers to the data, reputation, and financial stability of enterprises. Conventional security methods can become inadequate when confronted with increasingly intricate and dynamic threats. Artificial Intelligence (AI) technologies have the capacity to significantly transform cloud security through their ability to promptly identify and thwart attacks, adjust to emerging risks, and offer intelligent perspectives for proactive security actions. The objective of this research study is to investigate the utilization of AI technologies in augmenting the security measures within cloud computing systems. This paper aims to offer significant insights and recommendations for businesses seeking to protect their cloud-based assets by analyzing the present state of cloud security, the capabilities of AI, and the possible advantages and obstacles associated with integrating AI into cloud security policies.

Keywords: Machine Learning, Natural Language Processing, Denial-of-Service attacks, Sentiment Analysis, Cloud computing.

27 Material Density Mapping on Deformable 3D Models of Human Organs

Authors: Petru Manescu, Joseph Azencot, Michael Beuve, Hamid Ladjal, Jacques Saade, Jean-Michel Morreau, Philippe Giraud, Behzad Shariat

Abstract:

Organ motion, especially respiratory motion, is a technical challenge to radiation therapy planning and dosimetry. This motion induces displacements and deformations of the organ tissues within the irradiated region, which need to be taken into account when simulating dose distribution during treatment. Finite element modeling (FEM) can provide great insight into the mechanical behavior of the organs, since it is based on the biomechanical material properties, complex geometry, and anatomical boundary conditions of the organs. In this paper, we present an original approach that offers the possibility to combine image-based biomechanical models with particle transport simulations. We propose a new method to map material density information issued from CT images onto deformable tetrahedral meshes. Based on the principle of mass conservation, our method can correlate the density variation of organ tissues with geometrical deformations during the different phases of the respiratory cycle. The first results are particularly encouraging: local error quantification of the density mapping on the organ geometry and of the density variation with organ motion is performed to evaluate and validate our approach.
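The mass-conservation step can be sketched for a single tetrahedral element: if an element deforms from volume V to V', its mapped density scales by V/V' so that mass is preserved. The vertices and density below are illustrative, not taken from the paper:

```python
import numpy as np

def tet_volume(verts):
    """Volume of a tetrahedron from its 4 vertices (rows of a 4x3 array)."""
    a, b, c, d = verts
    return abs(np.linalg.det(np.stack([b - a, c - a, d - a]))) / 6.0

# Reference tetrahedron and its CT-derived density (illustrative values).
ref = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
rho_ref = 1.05  # g/cm^3

# Deformed configuration (e.g. a respiratory phase): uniform 10% compression in z.
deformed = ref * np.array([1.0, 1.0, 0.9])

# Mass conservation: rho_def * V_def = rho_ref * V_ref.
rho_def = rho_ref * tet_volume(ref) / tet_volume(deformed)
```

Applied per element over the whole mesh, this keeps total organ mass constant across the phases of the respiratory cycle.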

Keywords: Biomechanical simulation, dose distribution, image guided radiation therapy, organ motion, tetrahedral mesh, 4D-CT.

26 Application of Neural Network in User Authentication for Smart Home System

Authors: A. Joseph, D.B.L. Bong, D.A.A. Mat

Abstract:

Security has been an important issue and concern in smart home systems. Because smart home networks consist of a wide range of wired and wireless devices, there is a possibility that illegal access to some restricted data or devices may happen. Password-based authentication is widely used to identify authorized users because this method is cheap, easy, and quite accurate. In this paper, a neural network is trained to store the passwords instead of using a verification table. This method is useful in solving security problems that have occurred in some authentication systems. The conventional way to train the network using backpropagation (BPN) requires a long training time. Hence, a faster training algorithm, Resilient Backpropagation (RPROP), is embedded in the MLP neural network to accelerate the training process. For the data set, 200 pairs of user IDs and passwords were created and encoded into binary as the input. Simulations were carried out to evaluate the performance for different numbers of hidden neurons and combinations of transfer functions. Mean square error (MSE), training time, and number of epochs are used to determine the network performance. From the results obtained, using tansig and purelin in the hidden and output layers with 250 hidden neurons gave the best performance. As a result, a password-based user authentication system for smart homes using a neural network was developed successfully.
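The Rprop idea the abstract relies on, adapting per-weight step sizes from the sign of successive gradients, can be sketched in a few lines. This is a simplified variant demonstrated on a toy quadratic loss standing in for the MLP's MSE, not the authors' implementation:

```python
import numpy as np

def rprop_minimize(grad_fn, w, steps=100, eta_plus=1.2, eta_minus=0.5,
                   delta0=0.1, delta_min=1e-6, delta_max=50.0):
    """Minimal Rprop-style loop: the per-weight step size grows when the
    gradient keeps its sign and shrinks when it flips; only the sign of
    the gradient is used, never its magnitude."""
    delta = np.full_like(w, delta0)
    prev_g = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        same = prev_g * g
        delta = np.where(same > 0, np.minimum(delta * eta_plus, delta_max), delta)
        delta = np.where(same < 0, np.maximum(delta * eta_minus, delta_min), delta)
        w = w - np.sign(g) * delta
        prev_g = g
    return w

# Toy quadratic loss L(w) = sum((w - target)^2), gradient 2*(w - target).
target = np.array([0.3, -1.2])
w_opt = rprop_minimize(lambda w: 2 * (w - target), np.zeros(2))
```

Because the update ignores gradient magnitude, Rprop is insensitive to the vanishing-gradient scaling of sigmoid-style transfer functions, which is what makes it train faster than plain backpropagation here.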

Keywords: Neural Network, User Authentication, Smart Home, Security.

25 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes

Authors: Sky Chou, Joseph C. Chen

Abstract:

This paper focuses on using Six Sigma methodologies to improve the surface roughness of a part produced on a CNC milling machine. It presents a case study in which the surface roughness of milled aluminum needs to be improved in order to reduce or eliminate defects and to raise the process capability indices Cp and Cpk for a CNC milling process. The Six Sigma methodology, with its DMAIC (define, measure, analyze, improve, and control) approach, was applied in this study to improve the process, reduce defects, and ultimately reduce costs. The Taguchi-based Six Sigma approach was applied to identify the optimized processing parameters that led to the targeted surface roughness specified by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness. The noise factor is the difference between the old cutting tool and the new cutting tool. The confirmation run with the optimal parameters verified that the new parameter settings are correct, and the new settings also improved the process capability index. This study shows that the Taguchi-based Six Sigma approach can be efficiently used to phase out defects and improve the process capability index of the CNC milling process.
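The process capability indices Cp and Cpk referred to above are computed from the sample mean, sample standard deviation, and the specification limits; the measurements and limits below are hypothetical, for illustration only:

```python
import numpy as np

def process_capability(samples, lsl, usl):
    """Cp and Cpk from measured samples and lower/upper spec limits."""
    mu = np.mean(samples)
    sigma = np.std(samples, ddof=1)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)                  # spread vs. tolerance
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)     # penalizes off-center mean
    return cp, cpk

# Hypothetical roughness measurements (µm) and spec limits, illustrative only.
samples = [1.62, 1.58, 1.65, 1.60, 1.61, 1.59, 1.63, 1.60]
cp, cpk = process_capability(samples, lsl=1.40, usl=1.80)
```

A value above 1.33 is a common rule of thumb for a capable process; Cpk < Cp signals that the process mean is off-center within the tolerance band.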

Keywords: CNC machining, Six Sigma, Surface roughness, Taguchi methodology.

24 Optimization of Two Quality Characteristics in Injection Molding Processes via Taguchi Methodology

Authors: Joseph C. Chen, Venkata Karthik Jakka

Abstract:

The main objective of this research is to optimize the tensile strength and dimensional accuracy of injection molding processes using Taguchi Parameter Design. An L16 orthogonal array (OA) is used in the Taguchi experimental design, with five control factors at four levels each and with vibration as the non-controllable factor. A total of 32 experiments were designed to obtain the optimal parameter settings for the process. The optimal parameters identified for shrinkage are shot volume, 1.7 cubic inches (A4); mold temperature, 130 ºF (B1); hold pressure, 3200 psi (C4); injection speed, 0.61 inch³/sec (D2); and hold time, 14 seconds (E2). The optimal parameters identified for tensile strength are shot volume, 1.7 cubic inches (A4); mold temperature, 160 ºF (B4); hold pressure, 3100 psi (C3); injection speed, 0.69 inch³/sec (D4); and hold time, 14 seconds (E2). The Taguchi-based optimization framework was systematically and successfully implemented to obtain an adjusted optimal setting in this research. The mean shrinkage of the confirmation runs is 0.0031%, and the tensile strength was found to be 3148.1 psi. Both outcomes are far better than the baseline, and defects have been further reduced in the injection molding processes.
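In a two-characteristic Taguchi study like this one, each response typically gets its own S/N formulation: smaller-the-better for shrinkage and larger-the-better for tensile strength. A minimal sketch with hypothetical replicate values (not the paper's data):

```python
import numpy as np

# Hypothetical replicate measurements for one L16 run (illustrative only).
shrinkage = np.array([0.0031, 0.0029, 0.0033])   # %
tensile = np.array([3148.1, 3120.4, 3160.2])     # psi

# Smaller-the-better (shrinkage): SN = -10 * log10(mean(y^2))
sn_shrink = -10 * np.log10(np.mean(shrinkage ** 2))

# Larger-the-better (tensile strength): SN = -10 * log10(mean(1/y^2))
sn_tensile = -10 * np.log10(np.mean(1.0 / tensile ** 2))
```

Since the two responses favor different factor levels (e.g. B1 vs. B4 for mold temperature), the final "adjusted optimal setting" is a trade-off between the two S/N rankings.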

Keywords: Injection molding processes, Taguchi Parameter Design, tensile strength, shrinkage test, high-density polyethylene, HDPE.

23 Displacement Fields in Footing-Sand Interactions under Cyclic Loading

Authors: S. Joseph Antony, Z. K. Jahanger

Abstract:

Soils are subjected to cyclic loading in situ, for example during earthquakes and in the compaction of pavements. Investigations involving local-scale measurement of grain displacements and failure patterns within a soil bed under cyclic loading are rather limited. In this paper, using digital particle image velocimetry (DPIV), local-scale displacement fields of a dense sand medium interacting with a rigid footing are measured under the plane-strain condition for two commonly used types of cyclic loading, as well as under quasi-static loading for purposes of comparison. From the displacement measurements of the grains, the failure envelopes of the sand media are also presented. The results show that the ultimate cyclic bearing capacity (qult,cyc) occurred at a relatively higher settlement value than under quasi-static loading. For the sand media under the cyclic loading conditions considered here, the displacement fields spread more widely in the horizontal direction and less deeply in the vertical direction than under quasi-static loading. The 'dead zone' in the sand grains beneath the footing is identified for all the loading conditions studied here. These grain-scale characteristics have implications for the resulting bulk bearing capacity of the sand media in footing-sand interaction problems.
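The core DPIV operation is cross-correlating interrogation windows from successive images to recover the local grain displacement. A minimal FFT-based sketch on synthetic speckle (not the study's images) follows; sub-pixel refinement and window overlap are omitted:

```python
import numpy as np

def dpiv_shift(win_a, win_b):
    """Integer-pixel displacement between two interrogation windows via
    FFT-based circular cross-correlation (the core DPIV operation)."""
    f = np.conj(np.fft.fft2(win_a)) * np.fft.fft2(win_b)
    corr = np.fft.ifft2(f).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak indices to signed shifts.
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))

# Synthetic speckle window shifted by a known (dy, dx) = (3, -2).
rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, shift=(3, -2), axis=(0, 1))
dy, dx = dpiv_shift(a, b)
```

Repeating this over a grid of windows yields the displacement field from which the failure envelopes and the 'dead zone' are identified.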

Keywords: Cyclic loading, DPIV, settlement, soil-structure interactions, strip footing.

22 Growth and Anatomical Responses of Lycopersicon esculentum (Tomatoes) under Microgravity and Normal Gravity Conditions

Authors: Gbenga F. Akomolafe, Joseph Omojola, Ezekiel S. Joshua, Seyi C. Adediwura, Elijah T. Adesuji, Michael O. Odey, Oyinade A. Dedeke, Ayo H. Labulo

Abstract:

Microgravity is known to be a major abiotic stress in space which affects plants depending on the duration of exposure. In this work, tomato seeds were exposed to long hours of simulated microgravity using a one-axis clinostat. The seeds were sown on a 1.5% combination of plant nutrient and agar-agar solidified medium in three Petri dishes. One of the Petri dishes was mounted on the clinostat and allowed to rotate at a speed of 20 rpm for 72 hours, while the others were subjected to the normal gravity vector. Anatomical sections of both clinorotated and normal-gravity plants were made after 72 hours and observed using a phase-contrast digital microscope. The percentage germination, as well as the growth rate, of the normal-gravity seeds was higher than that of the clinorotated ones. The germinated clinorotated roots grew in different directions, unlike the normal-gravity ones, which grew in the direction of the gravity vector; the clinostat was thus able to switch off gravistimulation. A distinct cellular arrangement was observed for tomatoes under the normal gravity condition, unlike the clinorotated ones, and the root epidermis and cortex of the normal-gravity plants were thicker than those of the clinorotated ones. This implies that under long-term microgravity influence, plants alter their anatomical features as a way of adapting to the stress condition.

Keywords: Anatomy, Clinostat, Germination, Microgravity, Lycopersicon esculentum.

21 Six Sigma-Based Optimization of Shrinkage Accuracy in Injection Molding Processes

Authors: Sky Chou, Joseph C. Chen

Abstract:

This paper focuses on using Six Sigma methodologies to reach the desired shrinkage of a manufactured high-density polyethylene (HDPE) part produced by an injection molding machine. It presents a case study where the correct shrinkage is required to reduce or eliminate defects and to improve the process capability indices Cp and Cpk for an injection molding process. To improve this process and keep the product within specifications, the Six Sigma methodology with its define, measure, analyze, improve, and control (DMAIC) approach was implemented in this study. The Six Sigma approach was paired with the Taguchi methodology to identify the optimized processing parameters that keep the shrinkage rate within the specifications set by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of cooling time, melt temperature, holding time, and metering stroke. The noise factor is the difference between material brand 1 and material brand 2. After the confirmation run was completed, measurements verified that the new parameter settings are optimal. With the new settings, the process capability index improved dramatically. This study shows that the Six Sigma and Taguchi methodologies can be efficiently used to determine the important factors that improve the process capability index of the injection molding process.

Keywords: Injection molding, shrinkage, six sigma, Taguchi parameter design.

20 Intensity Analysis to Link Changes in Land-Use Pattern in the Abuakwa North and South Municipalities, Ghana, from 1986 to 2017

Authors: Isaac Kwaku Adu, Jacob Doku Tetteh, John Joseph Puthenkalam, Kwabena Effah Antwi

Abstract:

The continuous increase in population implies an increase in food demand. There is, therefore, the need to increase agricultural production and other forest products to ensure food security and economic development. This paper employs three-level intensity analysis to assess the total change of land use in two time intervals (1986-2002 and 2002-2017), the net change and swap, as well as gross gains and losses in the two intervals. The results revealed that the overall change in the 31-year period was greater in the second interval (2002-2017). The agriculture and forest categories lost in the first interval while the other land classes gained. However, in the second interval agriculture and built-up increased greatly while forest, water bodies and thick bushes/shrubland experienced losses. The assessment revealed a reduction of forest in both intervals, greater in the second, and an expansion of agricultural land as the population increased. The pixels gaining built-up targeted agricultural land in both intervals; they also targeted thick bushes/shrubland and water bodies in the second interval only. Built-up avoided forest in both intervals, as well as water bodies and thick bushes/shrubland. To help develop the best land-use strategies and policies, further validation of the social factors is necessary.
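At the first (interval) level, intensity analysis compares each interval's annualized change rate against the uniform rate over the whole study period; an interval is "fast" if its intensity exceeds the uniform line. A minimal sketch with hypothetical areas (not the paper's figures):

```python
# Interval-level intensity analysis: annualized change intensity per interval
# versus the uniform intensity across all intervals.
# Areas are hypothetical (km^2), for illustration only.
total_area = 1000.0
intervals = {
    "1986-2002": {"changed": 180.0, "years": 16},
    "2002-2017": {"changed": 270.0, "years": 15},
}

# Percent of the landscape changing per year within each interval.
intensity = {k: v["changed"] / v["years"] / total_area * 100
             for k, v in intervals.items()}

# Uniform intensity: total change spread evenly over all years.
total_years = sum(v["years"] for v in intervals.values())
uniform = (sum(v["changed"] for v in intervals.values())
           / total_years / total_area * 100)

fast = {k: intensity[k] > uniform for k in intensity}  # True = "fast" interval
```

The category and transition levels of the method refine this comparison per land class and per from-to transition, which is how "targeting" and "avoiding" are diagnosed.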

Keywords: Agricultural land-use, forest, intensity analysis, land-cover change, sustainable land-use.

19 Automatic Road Network Recognition and Extraction for Urban Planning

Authors: D. B. L. Bong, K.C. Lai, A. Joseph

Abstract:

Road maps are used in numerous daily activities, but constructing and updating a road map whenever there are changes is a hassle. At Universiti Malaysia Sarawak, research on Automatic Road Extraction (ARE) was carried out to ease the difficulties of updating road maps. The research started with satellite imagery (SI); in short, the ARE-SI project. A Hybrid Simple Colour Space Segmentation and Edge Detection (Hybrid SCSS-EDGE) algorithm was developed to extract roads automatically from satellite images. In order to extract the road network accurately, the satellite image must be analyzed prior to the extraction process: the characteristics of the image elements are analyzed and the relationships among them determined. In this study, the road regions are extracted based on colour space elements and the edge details of roads, and edge detection is applied to further filter out non-road regions. The extracted road regions are validated by using a segmentation method. These results are valuable for building road maps and detecting changes to the existing road database. The proposed Hybrid SCSS-EDGE algorithm performs the task fully automatically: the user only needs to input a high-resolution satellite image and wait for the result. Moreover, the system can work on complex road networks and generate the extraction result in seconds.
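The two stages named in the algorithm, colour-space thresholding followed by edge filtering, can be sketched as below. The tiny synthetic image and the threshold are hypothetical stand-ins, not the Hybrid SCSS-EDGE implementation:

```python
import numpy as np

def sobel_magnitude(gray):
    """Gradient magnitude via Sobel kernels, using only NumPy."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    pad = np.pad(gray, 1, mode="edge")
    h, w = gray.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):          # accumulate the 3x3 convolution
        for j in range(3):
            patch = pad[i:i + h, j:j + w]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy)

# Toy 8x8 "satellite image": a dark vertical road on a bright background.
img = np.full((8, 8), 200.0)
img[:, 3:5] = 60.0  # road pixels

road_mask = img < 120               # colour-space style threshold
edges = sobel_magnitude(img) > 0    # non-zero gradient marks road borders
```

In the real pipeline the threshold is derived from colour-space statistics rather than fixed, and the edge map is used to reject segmented regions without road-like boundaries.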

Keywords: Road Network Recognition, Colour Space, Edge Detection, Urban Planning.

18 Application of Synthetic Monomers Grafted Xanthan Gum for Rhodamine B Removal in Aqueous Solution

Authors: T. Moremedi, L. Katata-Seru, S. Sardar, A. Bandyopadhyay, E. Makhado, M. Joseph Hato

Abstract:

Rapid industrialisation and population growth have led to a steady fall in freshwater supplies worldwide, and water systems are increasingly affected by secondary contamination. The application of novel adsorbents derived from natural polymers holds great promise for addressing challenges in water treatment. In this study, the UV irradiation technique was used to prepare an acrylamide (AAm) and acrylic acid (AA) monomer grafted xanthan gum (XG) copolymer. Furthermore, the factors affecting rhodamine B (RhB) adsorption from aqueous media, such as pH, dosage, concentration, and time, were investigated. The FTIR results confirmed the formation of the graft copolymer by the strong vibrational bands at 1709 cm⁻¹ and 1612 cm⁻¹ for AA and AAm, respectively. Additionally, the more irregular, porous and wrinkled surface observed in the SEM of XG-g-AAm/AA indicated copolymerization interaction of the monomers. The optimum conditions for removing RhB dye from aqueous solution, with a maximum adsorption capacity of 313 mg/g at 25 °C, were pH approximately 5, initial dye concentration 200 ppm, and adsorbent dose 30 mg. A detailed investigation of the isotherms and adsorption kinetics of RhB from aqueous solution showed that the adsorption of the dye followed the Freundlich model (R² = 0.96333) and pseudo-second-order kinetics. The results further indicated that this XG-based adsorbent could universally remove the dye through a chemical adsorption mechanism. The outstanding adsorption potential of the grafted copolymer could be used to remove cationic dyes from aqueous solution as a low-cost product.
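Fitting the linearised Freundlich isotherm, log qe = log Kf + (1/n) log Ce, reduces to a straight-line regression. The equilibrium data below are synthetic, generated from assumed constants rather than the study's measurements:

```python
import numpy as np

# Synthetic equilibrium data: Ce (mg/L) and qe (mg/g), illustrative only.
Ce = np.array([5.0, 20.0, 50.0, 100.0, 200.0])
Kf_true, n_inv_true = 25.0, 0.45          # assumed Freundlich constants
qe = Kf_true * Ce ** n_inv_true           # ideal (noise-free) isotherm data

# Linearised Freundlich: log qe = log Kf + (1/n) * log Ce
slope, intercept = np.polyfit(np.log10(Ce), np.log10(qe), 1)
Kf_fit = 10 ** intercept
n_inv_fit = slope
```

With real data, the R² of this regression is what the abstract reports (0.96333) when comparing the Freundlich fit against alternatives such as Langmuir.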

Keywords: Xanthan gum, adsorbents, rhodamine B, Freundlich model.

17 Detecting Fake News: A Natural Language Processing, Reinforcement Learning, and Blockchain Approach

Authors: Ashly Joseph, Jithu Paulose

Abstract:

In an era where misleading information can quickly circulate on digital news channels, it is crucial to have efficient and trustworthy methods to detect and reduce the impact of misinformation. This research proposes an innovative framework that combines Natural Language Processing (NLP), Reinforcement Learning (RL), and blockchain technologies to precisely detect and minimize the spread of false information in news articles on social media. The framework starts by gathering a variety of news items from different social media sites and preprocessing the data to ensure its quality and uniformity. NLP methods are utilized to extract comprehensive linguistic and semantic characteristics, effectively capturing the subtleties and contextual aspects of the language used. These features are utilized as input for an RL model, which learns the most effective tactics for detecting and mitigating the impact of false material by modeling the intricate dynamics of user engagements and incentives on social media platforms. The integration of blockchain technology establishes a decentralized and transparent method for storing and verifying the accuracy of information. The blockchain component guarantees the immutability and safety of verified news records, while encouraging user engagement in detecting and fighting false information through a token-based incentive system. The proposed framework seeks to provide a thorough and resilient solution to the problems presented by misinformation in social media articles.

Keywords: Natural Language Processing, Reinforcement Learning, Blockchain, fake news mitigation, misinformation detection.

16 Policy Brief/Note of Philippine Health Issues: Human Rights Violations Committed on Healthcare Workers

Authors: Trina Isabel D. Santiago, Daniel C. Chua, Jumee F. Tayaban, Joseph Daniel S. Timbol, Joshua M. Yanes

Abstract:

Numerous instances of human rights violations against healthcare workers were reported during the COVID-19 pandemic in the Philippines. This paper aims to explore these civil and political rights violations and propose recommendations to address them. Our review shows that a wide range of civil and political human rights violations have been committed by individual citizens and government agencies against individual healthcare workers and health worker groups. These violations include discrimination, red-tagging, evictions, illegal arrests, and acts of violence ranging from chemical attacks to homicide. If left unchecked, these issues, compounded by the pandemic, may lead to the exacerbation of the pre-existing problems of the Philippine healthcare system. Despite the existing reports by human rights groups and public media articles, there still seems to be a lack of government action to condemn and prevent these violations. The existence of government agencies which directly contribute to these violations, coupled with the lack of condemnation from other agencies, further propagates the problem. Given these issues, this policy brief recommends the establishment of an inter-agency task force for the protection of the human rights of healthcare workers, as well as the expedited passage of current legislative bills toward the same goal. For more immediate action, we call for the establishment of a dedicated hotline for these incidents, with adequate appointment and training of point persons, construction of clear guidelines, and closer collaboration between government agencies in standing united against these issues.

Keywords: COVID-19 pandemic, healthcare workers, human rights violations, Philippines.

15 Quality of Groundwater in the Shallow Aquifers of a Paddy Dominated Agricultural River Basin, Kerala, India

Authors: N. Kannan, Sabu Joseph

Abstract:

Groundwater is an essential and vital component of our life support system. Groundwater resources are utilized for drinking, irrigation, and industrial purposes, and there is growing concern about the deterioration of groundwater quality due to geogenic and anthropogenic activities. Being fragile, groundwater must be carefully managed to maintain its purity within standard limits, so quality assessment and management must be carried out hand-in-hand to ensure a pollution-free environment and sustainable use. In order to assess its quality for human consumption and for use in agriculture, groundwater from the shallow aquifers (dug wells) in the Palakkad and Chittur taluks of the Bharathapuzha river basin, a paddy-dominated agricultural basin (order = 8; L = 209 km; area = 6186 km²) in Kerala, India, was selected. The water samples (n = 120), collected across seasons, viz., monsoon-MON (August 2005), postmonsoon-POM (December 2005) and premonsoon-PRM (April 2006), were analyzed for important physico-chemical attributes. Spatial and temporal variations of attributes exist in the study area, and based on major cations and anions, different hydrochemical facies have been identified. Using Gibbs' diagram, rock dominance has been identified as the mechanism controlling groundwater chemistry. Further, the suitability of the water for irrigation was determined by analyzing the salinity hazard indicated by the sodium adsorption ratio (SAR), residual sodium carbonate (RSC), and sodium percent (%Na). Finally, stress zones in the study area were delineated using ArcGIS spatial analysis, and various management options were recommended to restore the ecosystem.
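The irrigation indices named above follow standard definitions on ion concentrations expressed in meq/L: SAR = Na / sqrt((Ca + Mg)/2), RSC = (HCO3 + CO3) − (Ca + Mg), and %Na = (Na + K)/(Na + K + Ca + Mg) × 100. A sketch with hypothetical concentrations, not the study's measurements:

```python
# Irrigation-suitability indices from major-ion concentrations in meq/L.
# All sample values below are hypothetical, for illustration only.
Na, Ca, Mg, K = 4.2, 3.0, 1.5, 0.3
HCO3, CO3 = 3.8, 0.2

SAR = Na / ((Ca + Mg) / 2) ** 0.5          # sodium adsorption ratio
RSC = (HCO3 + CO3) - (Ca + Mg)             # residual sodium carbonate
Na_percent = (Na + K) / (Na + K + Ca + Mg) * 100
```

Computed over the well network, these indices are the attributes interpolated in the GIS layer to delineate the stress zones.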

Keywords: Groundwater quality, agricultural basin, Kerala, India.

14 Wavelet Based Qualitative Assessment of Femur Bone Strength Using Radiographic Imaging

Authors: Sundararajan Sangeetha, Joseph Jesu Christopher, Swaminathan Ramakrishnan

Abstract:

In this work, the primary compressive strength components of human femur trabecular bone are qualitatively assessed using image processing and wavelet analysis. The primary compressive (PC) component in planar radiographic femur trabecular images (N = 50) is delineated by a semi-automatic image processing procedure. An auto-threshold binarization algorithm is employed to recognize the presence of mineralization in the digitized images. Qualitative parameters such as apparent mineralization and the total area associated with the PC region are derived for normal and abnormal images. The two-dimensional discrete wavelet transform is utilized to obtain appropriate features that quantify texture changes in the medical images. The normal and abnormal samples of the human femur are comprehensively analyzed using the Haar wavelet. Six statistical parameters, namely mean, median, mode, standard deviation, mean absolute deviation, and median absolute deviation, are derived at level 4 decomposition for both the approximation and horizontal wavelet coefficients. The correlation coefficients of the various wavelet-derived parameters with normal and abnormal status are estimated for both the approximation and horizontal coefficients. It is seen that in almost all cases the abnormal samples show a higher degree of correlation than the normal ones. Further, the parameters derived from the approximation coefficients show more correlation than those derived from the horizontal coefficients. The mean and median computed at the output of the level 4 Haar wavelet channel were found to be useful predictors to delineate the normal and abnormal groups.
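One level of the 2-D Haar decomposition, producing the approximation and a detail sub-band of the kind whose statistics the study uses as features, can be sketched with plain NumPy. Sub-band orientation labels vary between toolkits, and the patch below is illustrative, not radiographic data:

```python
import numpy as np

def haar2d_level(x):
    """One level of the 2-D Haar DWT: returns the approximation (LL)
    sub-band and one detail sub-band. Input sides must be even."""
    # Rows: pairwise average / difference (orthonormal scaling).
    lo_r = (x[:, 0::2] + x[:, 1::2]) / np.sqrt(2)
    hi_r = (x[:, 0::2] - x[:, 1::2]) / np.sqrt(2)
    # Columns: lowpass both results to form LL and one detail band.
    ll = (lo_r[0::2, :] + lo_r[1::2, :]) / np.sqrt(2)
    detail = (hi_r[0::2, :] + hi_r[1::2, :]) / np.sqrt(2)
    return ll, detail

# Toy 4x4 "radiograph" patch, illustrative values only.
patch = np.arange(16, dtype=float).reshape(4, 4)
ll, detail = haar2d_level(patch)

# Statistics of the kind used as texture features in the abstract:
feat_mean, feat_std = ll.mean(), ll.std()
```

Applying the function recursively to `ll` four times yields the level-4 coefficients from which the six statistical parameters are computed.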

Keywords: Image processing, planar radiographs, trabecular bone and wavelet analysis.

PDF Downloads 1493
13 Data Privacy and Safety with Large Language Models

Authors: Ashly Joseph, Jithu Paulose

Abstract:

Large language models (LLMs) have revolutionized natural language processing, enabling applications such as chatbots, dialogue agents, and image and video generators. Nevertheless, their training on extensive datasets comprising personal information poses notable privacy and safety hazards. This study examines methods for addressing these challenges, focusing on approaches to secure LLM outputs, safeguard user privacy, and adhere to data protection rules. We explore several methods, including post-processing detection algorithms, content filtering, and reinforcement learning from human and AI feedback, as well as the difficulty of maintaining a balance between model safety and performance. The study also emphasizes the dangers of unintentional data leakage, privacy issues related to user prompts, and the possibility of data breaches. We highlight the significance of corporate data governance rules and best practices for engaging with chatbots. In addition, we analyze the development of data protection frameworks, evaluate the adherence of LLMs to the General Data Protection Regulation (GDPR), and examine privacy legislation in academic and business policies. Using case studies and real-life instances, we demonstrate the difficulties and remedies involved in preserving data privacy and security in the age of sophisticated artificial intelligence. This article seeks to educate stakeholders on practical strategies for improving the security and privacy of LLMs while assuring their responsible and ethical implementation.
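A post-processing detection step of the kind mentioned above can be as simple as pattern-based redaction of personally identifiable information in model output before it reaches the user. The patterns below are illustrative, not a production-grade detector:

```python
import re

# Minimal output filter: redact common PII patterns from LLM text.
PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text):
    """Replace each detected PII span with a bracketed type label."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

out = redact("Contact jane.doe@example.com or 555-123-4567.")
# out == "Contact [EMAIL] or [PHONE]."
```

In practice such filters are layered with statistical detectors and policy checks, since regexes alone miss context-dependent leaks (names, addresses, quoted training data).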

Keywords: Data privacy, large language models, artificial intelligence, machine learning, cybersecurity, general data protection regulation, data safety.

PDF Downloads 106
12 Application of Particle Image Velocimetry in the Analysis of Scale Effects in Granular Soil

Authors: Zuhair Kadhim Jahanger, S. Joseph Antony

Abstract:

Studies in the literature that systematically deal with the scale effects of strip footings on different sand packings remain scarce. In this research, the variation of the ultimate bearing capacity and the deformation pattern of soil beneath strip footings of different widths under plane-strain conditions on the surface of loose, medium-dense and dense sand have been systematically studied using experimental and non-invasive methods for measuring microscopic deformations. The presented analyses are based on model-scale compression tests analysed using the Particle Image Velocimetry (PIV) technique. Upper bound analysis of the current study shows that the maximum vertical displacement of the sand under the ultimate load increases with the width of the footing, but at a rate that decreases with the relative density of the sand, whereas the relative vertical displacement in the sand decreases with an increase in the width of the footing. Good agreement is observed between the experimental results for the different footing widths and relative densities. The experimental analyses show that a pronounced scale effect exists for strip surface footings. The bearing capacity factors rapidly decrease up to footing widths B = 0.25 m, 0.35 m, and 0.65 m for loose, medium-dense and dense sand respectively, after which there is no significant decrease. Both the deformation modes of the soil and the ultimate bearing capacity values are affected by the footing width. The obtained results could be used to improve settlement calculations for foundations interacting with granular soil.
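At its core, PIV estimates the displacement of an interrogation window between two frames by locating the peak of a cross-correlation. A minimal integer-pixel sketch under assumed synthetic data (a random texture shifted by a known amount), not the study's actual PIV processing chain:

```python
import numpy as np

def piv_displacement(win_a, win_b, max_shift=4):
    """Estimate the integer-pixel displacement of an interrogation window
    between two frames by maximising the direct cross-correlation score."""
    best, best_shift = -np.inf, (0, 0)
    a = win_a - win_a.mean()
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Shift frame B back by the candidate displacement and score overlap.
            b = np.roll(np.roll(win_b, -dy, axis=0), -dx, axis=1)
            score = np.sum(a * (b - b.mean()))
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift

# Synthetic check: a random texture shifted by (2, 1) pixels (with wrap-around).
rng = np.random.default_rng(0)
frame1 = rng.random((32, 32))
frame2 = np.roll(np.roll(frame1, 2, axis=0), 1, axis=1)
dy, dx = piv_displacement(frame1, frame2)
# (dy, dx) == (2, 1)
```

Real PIV software adds sub-pixel peak fitting and window overlap, but the peak-search principle is the same.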

Keywords: PIV, granular mechanics, scale effect, upper bound analysis.

PDF Downloads 1009
11 Density of Hydrocarbonoclastic Bacteria and Polycyclic Aromatic Hydrocarbon Accumulation in Iko River Mangrove Ecosystem, Nigeria

Authors: Ime R. Udotong, Samuel I. Eduok, Joseph P. Essien, Basil N. Ita

Abstract:

Sediment and mangrove root samples from the Iko River Estuary, Nigeria were analyzed for microbial and polycyclic aromatic hydrocarbon (PAH) content. The total heterotrophic bacterial (THB) count ranged from 1.1×10⁷ to 5.1×10⁷ cfu/g, the total fungal (TF) count from 1.0×10⁶ to 2.7×10⁶ cfu/g, and the total coliform (TC) count from 2.0×10⁴ to 8.0×10⁴ cfu/g, while the hydrocarbon-utilizing bacterial (HUB) count ranged from 1.0×10⁵ to 5.0×10⁵ cfu/g. There was a positive correlation (r = 0.72 to 0.93) between the THB and HUB counts. The organisms were Staphylococcus aureus, Bacillus cereus, Flavobacterium breve, Pseudomonas aeruginosa, Erwinia amylovora, Escherichia coli, Enterobacter sp., Desulfovibrio sp., Acinetobacter lwoffii, Chromobacterium violaceum, Micrococcus sedentarius, Corynebacterium sp., and Pseudomonas putrefaciens. The PAHs were naphthalene, 2-methylnaphthalene, acenaphthylene, acenaphthene, fluorene, phenanthrene, anthracene, fluoranthene, pyrene, benzo(a)anthracene, chrysene, benzo(b)fluoranthene, benzo(k)fluoranthene, benzo(a)pyrene, dibenzo(a,h)anthracene, benzo(g,h,i)perylene and indeno(1,2,3-cd)pyrene, with individual PAH concentrations that ranged from 0.20 mg/kg to 1.02 mg/kg, 0.20 mg/kg to 1.07 mg/kg and 0.2 mg/kg to 4.43 mg/kg in the benthic sediment, epipellic sediment and mangrove roots, respectively. Total PAH ranged from 6.30 to 9.93 mg/kg, 6.30 to 9.13 mg/kg and 9.66 to 16.68 mg/kg in the benthic sediment, epipellic sediment and mangrove roots, respectively. The high concentrations in the mangrove roots are indicative of bioaccumulation of the pollutant in the plant tissue. The microorganisms are of ecological significance, and the detectable quantities of polycyclic aromatic hydrocarbons could be partitioned and accumulated in the tissues of infaunal and epifaunal organisms in the study area.
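The reported THB–HUB correlation is a Pearson coefficient over paired counts across stations, directly computable with NumPy. The paired counts below are illustrative values in the reported ranges, not the study's raw data:

```python
import numpy as np

# Illustrative paired counts (cfu/g) across five sampling stations:
thb = np.array([1.1e7, 2.3e7, 3.4e7, 4.0e7, 5.1e7])  # total heterotrophic bacteria
hub = np.array([1.0e5, 2.1e5, 3.0e5, 4.2e5, 5.0e5])  # hydrocarbon-utilizing bacteria

# Pearson correlation coefficient between the two counts:
r = np.corrcoef(thb, hub)[0, 1]
```

A high positive r, as the study reports (0.72–0.93), is consistent with the hydrocarbon-degrading fraction tracking the overall heterotrophic population.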

Keywords: Hydrocarbonoclastic bacteria, Iko River estuary, Mangrove, Polycyclic aromatic hydrocarbon.

PDF Downloads 2736
10 IT Systems of the US Federal Courts, Justice, and Governance

Authors: Joseph Zernik

Abstract:

The validity, integrity, and impacts of the IT systems of the US federal courts have been studied as part of the Human Rights Alert-NGO (HRA) submission for the 2015 Universal Periodic Review (UPR) of human rights in the United States by the Human Rights Council (HRC) of the United Nations (UN). The current report includes an overview of IT system analysis, data mining, and case studies. System analysis and data mining show: development and implementation with no lawful authority; servers of unverified identity; invalidity in the implementation of electronic signatures, authentication instruments and procedures, and authorities and permissions; discrimination in access against the public and unrepresented (pro se) parties and in favor of attorneys; and widespread publication of invalid judicial records and dockets, leading to their false representation and false enforcement. A series of case studies documents the impacts on individuals' human rights, on banking regulation, and on international matters. The significance of these findings is discussed in the context of various media and expert reports, which describe unprecedented corruption of the US justice system today and question whether the US Constitution has in fact been suspended. Similar findings were previously reported for IT systems of the State of California and the State of Israel, which were incorporated, subject to professional HRC staff review, into the UN UPR reports (2010 and 2013). Solutions are proposed, based on the principles of publicity of the law and the separation of powers: reliance on US IT and legal experts accountable to the legislative branch, enhanced transparency, and ongoing vigilance by human rights and internet activists. IT experts should assume more prominent civic duties in safeguarding civil society in our era.

Keywords: E-justice, federal courts, United States, human rights, banking regulation.

PDF Downloads 2149
9 The Use of Software and Internet Search Engines to Develop the Encoding and Decoding Skills of a Dyslexic Learner: A Case Study

Authors: Rabih Joseph Nabhan

Abstract:

This case study explores the impact of two major computer software programs, Learn to Speak English and Learn English Spelling and Pronunciation, together with Internet search engines such as Google, on mending the decoding and spelling deficiency of Simon X, a dyslexic student. Improvement in decoding and spelling may result in better reading comprehension and composition writing, and such computer programs and Internet materials can help regain the missing awareness and consequently restore his self-confidence and self-esteem. In addition, this study provides a systematic plan comprising a set of activities (four computer programs and Internet materials) which address the problem from the lowest to the highest levels of phonemic and phonological awareness. Four methods of data collection (accounts, observations, published tests, and interviews) create the triangulation needed to validly and reliably collect data before, during, and after the plan. The data collected are analyzed quantitatively, qualitatively, or through a combination of both, and tables and figures are utilized to provide a clear and uncomplicated illustration of some of the data. The improvement in decoding, spelling, reading comprehension, and composition writing skills is demonstrated through authentic materials produced by the student under study: a comparison between two sample passages written by the learner before and after the plan, a genuine computer chat conversation, and the scores of the academic year that followed the execution of the plan. Based on these results, the researcher recommends further studies on other Lebanese dyslexic learners using the computer to mend their language problems, in order to design a more reliable software program that can address this disability more efficiently and successfully.

Keywords: Analysis, awareness, dyslexic, software.

PDF Downloads 645
8 Extraction of Forest Plantation Resources in Selected Forest of San Manuel, Pangasinan, Philippines Using LiDAR Data for Forest Status Assessment

Authors: Mark Joseph Quinto, Roan Beronilla, Guiller Damian, Eliza Camaso, Ronaldo Alberto

Abstract:

Forest inventories are essential to assess the composition, structure and distribution of forest vegetation, which can be used as baseline information for management decisions. Classical forest inventory is labor-intensive, time-consuming and sometimes even dangerous. The use of Light Detection and Ranging (LiDAR) in forest inventory can overcome these restrictions. This study was conducted to determine the possibility of using LiDAR-derived data to extract high-accuracy forest biophysical parameters as a non-destructive method of forest status analysis for San Manuel, Pangasinan. Forest resource extraction was carried out using LAStools, GIS, ENVI and .bat scripts with the available LiDAR data. The process includes the generation of derivatives such as the Digital Terrain Model (DTM), Canopy Height Model (CHM) and Canopy Cover Model (CCM) in .bat scripts, followed by the generation of 17 composite bands used to extract forest cover classifications in ENVI 4.8 and GIS software. The Diameter at Breast Height (DBH), Above Ground Biomass (AGB) and Carbon Stock (CS) were estimated for each classified forest cover, and tree count extraction was carried out using GIS. Subsequently, field validation was conducted for accuracy assessment. Results showed that the forest of San Manuel has 73% forest cover, which is much higher than the 10% canopy cover requirement. Of the extracted canopy heights, 80% of the trees range from 12 m to 17 m. The CS of the three forest covers, based on the AGB, was 20819.59 kg per 20 m × 20 m plot for closed broadleaf, 8609.82 kg per 20 m × 20 m plot for broadleaf plantation and 15545.57 kg per 20 m × 20 m plot for open broadleaf. The average tree count for the forest plantation was 413 trees/ha. As such, the forest of San Manuel has a high percent forest cover and a high CS.
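The CHM derivative named above is, conceptually, the cell-wise difference between a surface model and the terrain model, and percent canopy cover follows from thresholding it. A minimal NumPy sketch on toy grids (the 5 m height threshold and the synthetic terrain are assumptions for the example, not the study's parameters):

```python
import numpy as np

def canopy_metrics(dsm, dtm, height_threshold=5.0):
    """Derive a Canopy Height Model (CHM = DSM - DTM) and the percent
    canopy cover (fraction of cells taller than the height threshold)."""
    chm = dsm - dtm
    cover_pct = 100.0 * np.mean(chm > height_threshold)
    return chm, cover_pct

# Toy grids: flat terrain at 100 m elevation, half the cells carry 15 m canopy.
dtm = np.full((10, 10), 100.0)
dsm = dtm.copy()
dsm[:, :5] += 15.0
chm, cover = canopy_metrics(dsm, dtm)
# cover == 50.0 (percent), chm.max() == 15.0 (metres)
```

In the LiDAR workflow the DSM and DTM come from classified first and ground returns respectively; the thresholded CHM then feeds the cover classification and tree-count extraction.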

Keywords: Carbon stock, forest inventory, LiDAR, tree count.

PDF Downloads 1281
7 Taguchi-Based Surface Roughness Optimization for Slotted and Tapered Cylindrical Products in Milling and Turning Operations

Authors: Vineeth G. Kuriakose, Joseph C. Chen, Ye Li

Abstract:

This research follows a systematic approach to optimizing the parameters for parts machined by turning and milling processes. The quality characteristic chosen is surface roughness, since the surface finish plays an important role for parts that require surface contact. A tapered cylindrical surface is designed as the test specimen. The material chosen for machining is aluminum alloy 6061 due to its wide variety of industrial and engineering applications. A HAAS VF-2 TR computer numerical control (CNC) vertical machining center is used for milling and a HAAS ST-20 CNC machine is used for turning. Taguchi analysis is used to optimize the surface roughness of the machined parts. The Taguchi L9 orthogonal array is designed for four controllable factors with three levels each, resulting in 18 experimental runs. The Signal-to-Noise (S/N) ratio is calculated for achieving the specific target value of 75 ± 15 µin. The controllable parameters chosen for the turning process are feed rate, depth of cut, coolant flow and finish cut, and for the milling process they are feed rate, spindle speed, step-over and coolant flow. The uncontrollable factors are tool geometry for the turning process and tool material for the milling process. Hypothesis testing is conducted to study the significance of the uncontrollable factors on the surface roughness. The optimal parameter settings were identified from the Taguchi analysis, and the process capability Cp and the process capability index Cpk were improved from 1.76 and 0.02 to 3.70 and 2.10 respectively for the turning process, and from 0.87 and 0.19 to 3.85 and 2.70 respectively for the milling process. The surface roughness was improved from 60.17 µin to 68.50 µin, reducing the defect rate from 52.39% to 0% for the turning process, and from 93.18 µin to 79.49 µin, reducing the defect rate from 71.23% to 0% for the milling process. The purpose of this study is to efficiently utilize Taguchi design analysis to improve the surface roughness.
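For a target-value characteristic like 75 ± 15 µin, the standard Taguchi nominal-the-best S/N ratio and the Cp/Cpk capability indices are computed as below. The replicate roughness values in the example are illustrative, not the paper's measurements:

```python
import numpy as np

def sn_nominal_the_best(y):
    """Taguchi nominal-the-best signal-to-noise ratio: 10*log10(ybar^2 / s^2)."""
    y = np.asarray(y, dtype=float)
    return 10 * np.log10(y.mean() ** 2 / y.var(ddof=1))

def cp_cpk(y, lsl, usl):
    """Process capability Cp and capability index Cpk against spec limits."""
    mu, sigma = np.mean(y), np.std(y, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative roughness replicates (µin) against the 75 ± 15 µin spec:
y = [72.0, 74.5, 76.0, 77.5]
cp, cpk = cp_cpk(y, lsl=60.0, usl=90.0)
sn = sn_nominal_the_best(y)
```

When the process mean sits exactly on target, Cpk equals Cp, as in this centered example; the gap between them measures how far the mean has drifted toward a spec limit.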

Keywords: CNC milling, CNC turning, surface roughness, Taguchi analysis.

PDF Downloads 750
6 Determination of Some Organochlorine Pesticide Residues in Vegetable and Soil Samples from Alau Dam and Gongulong Agricultural Sites, Borno State, North Eastern Nigeria

Authors: Joseph Clement Akan, Lami Jafiya, Zaynab Muhammad Chellube, Zakari Mohammed, Fanna Inna Abdulrahman

Abstract:

Five vegetables (spinach, lettuce, cabbage, tomato, and onion) were freshly harvested from the Alau Dam and Gongulong agricultural areas for the determination of some organochlorine pesticide residues (o,p'-DDE, p,p'-DDD, o,p'-DDD, p,p'-DDT, α-BHC, γ-BHC, methoxychlor, lindane, endosulfan, dieldrin, and aldrin). Soil samples were also collected at different depths for the determination of the same pesticides. Sample collection and preparation were conducted using standard procedures. The concentrations of the pesticides in the soil and vegetable samples were determined using a SHIMADZU GC-17A GC/MS equipped with an electron capture detector (ECD). The highest concentration was that of p,p'-DDD (132.4 ± 13.45 µg/g), observed in the leaf of cabbage, while the lowest was that of p,p'-DDT (2.34 µg/g), observed in the root of spinach. Similar trends were observed at the Gongulong agricultural area, with p,p'-DDD having the highest concentration of 153.23 µg/g in the leaf of cabbage, while the lowest concentration was that of p,p'-DDT (12.45 µg/g), observed in the root of spinach. α-BHC, γ-BHC, methoxychlor, and lindane were detected in all the vegetable samples studied. The concentrations of all the pesticides in the soil samples were highest at a depth of 21–30 cm and lowest at a depth of 0–10 cm. The concentrations of the pesticides in the vegetable and soil samples from the two agricultural sites were at alarming levels, much higher than the maximum residue limits (MRLs) and acceptable daily intake values (ADIs). The levels observed are of such a magnitude as to call for special attention and laws to regulate the use and circulation of such chemicals. Routine monitoring of pesticide residues in these study areas is necessary for the prevention, control and reduction of environmental pollution, so as to minimize health risks.
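The MRL comparison underlying the "alarming levels" claim is a straightforward screening step: each measured residue is checked against its limit. A sketch in which the MRL figures are placeholders for illustration, not regulatory values:

```python
def flag_exceedances(residues, mrls):
    """Return the residues exceeding their maximum residue limit (MRL).
    Both dicts map pesticide name -> concentration/limit in µg/g."""
    return {p: c for p, c in residues.items()
            if p in mrls and c > mrls[p]}

# Measured values from the abstract; MRL figures below are illustrative only.
measured = {"p,p'-DDD": 132.4, "lindane": 0.8, "aldrin": 0.05}
mrl = {"p,p'-DDD": 0.05, "lindane": 1.0, "aldrin": 0.1}
over = flag_exceedances(measured, mrl)
# over == {"p,p'-DDD": 132.4}
```

In routine monitoring this screen would run per crop and per pesticide, since MRLs are commodity-specific.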

Keywords: Alau Dam, Gongulong, organochlorine, pesticide residues, soil, vegetables.

PDF Downloads 3476
5 Design Development of Floating Performance Structure for Coastal Areas in the Maltese Islands

Authors: Rebecca E. Dalli Gonzi, Joseph Falzon

Abstract:

Background: Islands in the Mediterranean region offer opportunities for various industries to take advantage of versatile floating structures in coastal areas. In the context of dense land use, marine structures can help ensure both terrestrial and marine resource sustainability. Objective: The aim of this paper is to present and critically discuss an array of issues that characterize the design process of a floating structure for coastal areas, and to present the challenges and opportunities of providing such multifunctional and versatile structures around the Maltese coastline. Research Design: A three-tier research design commenced with a systematic literature review. In the second stage, semi-structured interviews were conducted with stakeholders including a naval architect, a marine engineer and civil designers. This was followed by a focus group with stakeholders in the design and construction of marine lightweight structures. The three-tier research design ensured the triangulation of issues, and all phases of the study were governed by research ethics. Findings: Findings were grouped into three main themes: excellence, impact and implementation. These included design considerations, applications and potential impacts on local industry. The literature on the design and construction of marine structures in the Maltese Islands presents multiple gaps in the application of marine structures to local industries. Weather conditions, the depth of the seabed and wave action place limitations on the design capabilities of the structure; construction costs and adverse weather conditions were amongst the main constraints that shaped the design capacities of the water structures. Conclusion: Water structures offer great potential, and the conclusions demonstrate the applicability of such designs for Maltese waters. There is still no provision within Maltese coastal areas for multi-purpose use. The introduction of such facilities would present a range of benefits for visiting tourists and locals, offering a wide range of services to the tourism and marine industries.

Keywords: Coastal areas, lightweight, marine structure, multipurpose, versatile, floating device.

PDF Downloads 941
4 Specification Requirements for a Combined Dehumidifier/Cooling Panel: A Global Scale Analysis

Authors: Damien Gondre, Hatem Ben Maad, Abdelkrim Trabelsi, Frédéric Kuznik, Joseph Virgone

Abstract:

The use of a radiant cooling solution would lower cooling needs, which is of great interest when the demand is initially high (hot climates). However, radiant systems are not naturally compatible with humid climates, since a low-temperature surface leads to condensation risks as soon as the surface temperature is close to or lower than the dew point temperature. A radiant cooling system combined with a dehumidification system would remove humidity from the space, thereby lowering the dew point temperature. The humidity removal needs to be especially effective near the cooled surface. This requirement could be fulfilled by a system using a single desiccant fluid for the removal of both excessive heat and moisture. This work aims at providing an estimate of the specification requirements of such a system in terms of the cooling power and dehumidification rate required to satisfy comfort constraints and to prevent any condensation risk on the cool panel surface. The present paper develops a preliminary study of the specification requirements, performance and behavior of a combined dehumidifier/cooling ceiling panel under different operating conditions. The study has been carried out using the TRNSYS software, which allows nodal calculations of thermal systems. It consists of the dynamic modeling of the heat and vapor balances of a 5 m × 3 m × 2.7 m office space. In a first design estimation, this room is equipped with an ideal heating, cooling, humidification and dehumidification system, so that the room temperature is always maintained between 21 °C and 25 °C with a relative humidity between 40% and 60%. The room is also equipped with a ventilation system that includes a heat recovery heat exchanger and another heat exchanger connected to a heat sink. The main results show that the system should be designed to meet a cooling power of 42 W·m⁻² and a desiccant rate of 45 gH₂O·h⁻¹. Subsequently, a parametric study of comfort and system performance was carried out on a more realistic system (including a chilled ceiling) under different operating conditions, enabling an estimation of an acceptable range of operating conditions. This preliminary study is intended to provide useful information for the system design.
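A nodal zone balance of the kind TRNSYS solves can be sketched as explicit time-stepping of coupled heat and vapour balances on a single air node. The sketch below only checks the sizing logic: internal gains matching the 42 W·m⁻² panel over the 15 m² ceiling, and moisture generation matching the 45 gH₂O·h⁻¹ desiccant rate, hold the zone state steady. The gains and initial state are assumptions for the example, not the paper's inputs:

```python
# Single-node room model; room dimensions 5 m x 3 m x 2.7 m as in the study.
RHO, CP = 1.2, 1005.0        # air density [kg/m3], specific heat [J/(kg.K)]
V = 5 * 3 * 2.7              # zone air volume [m3]

def step(T, w, q_gain, q_cool, m_moist, m_dehum, dt=60.0):
    """One explicit step of the coupled heat and vapour balances.
    T in degC, w in kg water per kg dry air; powers in W, moisture in kg/s."""
    m_air = RHO * V
    T += dt * (q_gain - q_cool) / (m_air * CP)     # sensible heat balance
    w += dt * (m_moist - m_dehum) / m_air          # vapour mass balance
    return T, w

T, w = 26.0, 0.012
for _ in range(60):                                # one hour at 60 s steps
    T, w = step(T, w,
                q_gain=630.0,                      # assumed gains = 42 W/m2 * 15 m2
                q_cool=42.0 * 15.0,                # panel extraction per the sizing
                m_moist=45e-3 / 3600,              # assumed generation, 45 g/h
                m_dehum=45e-3 / 3600)              # desiccant rate per the sizing
```

When gains exceed the sized capacities the same balances drift toward discomfort, which is exactly the sensitivity the parametric study explores with a full envelope model.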

Keywords: Dehumidification, nodal calculation, radiant cooling panel, system sizing.

PDF Downloads 732
3 Developing a Practice Guideline for Enhancing Communication in Hearing Families with Deaf Children

Authors: Nomataru P. Gontse, Lavanithum Joseph

Abstract:

Deafness, coupled with a lack of support and resources in developing countries, poses a serious threat to the well-being of children. The mismatch between the needs of persons with disabilities and the resources available to them is a key factor in service provision in resource-constrained contexts. Furthermore, deafness is the most common childhood sensory disorder in developing countries, and is thus seriously affected by resource constraints. This paper discusses the issues and research protocol for a Ph.D. study that aims to develop a practice guideline that is contextually sensitive and includes an interdisciplinary approach to improving the outcomes of learners and the relationships in hearing households with deaf learners in rural areas of the Eastern Cape, one of the poorest provinces in South Africa. The guideline will consider the lived experiences of deaf children and their hearing families regarding the impact deafness has on their relationships and communication at home. Ethical clearance for the study has been obtained. The methodology is a mixed-methods approach in the form of a survey using questionnaires and semi-structured interviews with deaf learners in primary and high school and their hearing parents. The study is conducted with adolescent learners from Grades 7 to 12 (excluding learners younger than 12 years and older than 21 years). An audiologist, teachers, and support staff will also give their views on how intervention is currently done and suggestions on how management could be done differently. Data collection will be conducted in isiXhosa by the researcher, as isiXhosa is dominant in this region; interviews with deaf learners and education professionals will be conducted in South African Sign Language through a sign language interpreter.
An expected outcome of this study is the development of recommendations and a practice guideline for deaf children diagnosed late in rural or under-resourced environments. To ensure the implementation of the findings, professionals will be given feedback on the outcomes of the study so that they can identify areas within their practices that require updated knowledge. The developed guideline is expected to inform Department of Education policies both regionally and nationally, providing recommendations for a strategic management plan and practice guidelines for this vulnerable and marginalized population. The isiXhosa-specific context could be generalized to other, similar contexts.

Keywords: Deafness, family-centred approach, early identification, rural communities.

PDF Downloads 382
2 Utilization of Process Mapping Tool to Enhance Production Drilling in Underground Metal Mining Operations

Authors: Sidharth Talan, Sanjay Kumar Sharma, Eoin Joseph Wallace, Nikita Agrawal

Abstract:

Underground mining is at the core of the rapidly evolving metals and minerals sector due to increasing global mineral consumption. Even though surface mines are still more abundant, the scales of the industry are slowly tipping towards underground mining due to the rising depth and complexity of orebodies. The efficient and productive functioning of underground operations thus depends significantly on the synchronized performance of key elements such as the operating site, mining equipment, manpower and mine services. Production drilling is the process of drilling long holes that are then charged and blasted for the production of ore in underground metal mines, and is therefore a crucial segment of the underground metal mining value chain. This paper presents a process mapping tool to evaluate the production drilling process in underground metal mining operations by dividing the process into three segments, namely Input, Process and Output, which are further segregated into factors and sub-factors. As per the study, the major input factors crucial for the efficient functioning of the production drilling process are power, drilling water, geotechnical support of the drilling site, skilled drilling operators, a services installation crew, oils and drill accessories for the drilling machine, survey markings at the drill site, proper housekeeping, regular maintenance of the drill machine, suitable transportation for reaching the drilling site and proper ventilation. The major outputs of the production drilling process are ore, waste as a result of dilution, timely reporting and investigation of unsafe practices, optimized process time and well-fragmented blasted material within the specifications set by the mining company.
The paper also exhibits the drilling loss matrix, which is utilized to appraise the loss in planned production meters per day on account of availability loss due to machine breakdowns, utilization loss due to underutilization of the machine, and productivity loss, measured in drilling meters per percussion hour relative to the planned productivity for the day. These three losses are essential for detecting bottlenecks in the process map of the production drilling operation, so that action plans can be instigated to suppress or prevent the causes of the operational performance deficiency. The tool helps mine management focus on the critical factors negatively impacting the production drilling operation and design the operational and maintenance strategies necessary to mitigate them.
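The three losses in the drilling loss matrix compose multiplicatively, in the manner of an OEE calculation: achieved metres equal planned metres times availability, utilization and productivity. A sketch in which every figure in the example call is hypothetical:

```python
def drilling_losses(planned_hours, operated_hours, percussion_hours,
                    drilled_m, planned_rate_m_per_hr):
    """Decompose the gap between planned and achieved drilling metres into
    availability, utilization and productivity factors (OEE-style)."""
    availability = operated_hours / planned_hours       # breakdown losses
    utilization = percussion_hours / operated_hours     # idle / underuse losses
    productivity = (drilled_m / percussion_hours) / planned_rate_m_per_hr
    planned_m = planned_hours * planned_rate_m_per_hr
    return {
        "availability": availability,
        "utilization": utilization,
        "productivity": productivity,
        "lost_metres": planned_m - drilled_m,
    }

# Hypothetical shift: 10 planned hours at a planned 50 m/h percussion rate.
kpis = drilling_losses(planned_hours=10, operated_hours=8,
                       percussion_hours=6, drilled_m=240,
                       planned_rate_m_per_hr=50)
```

The product of the three factors times the planned metres recovers the drilled metres, so each factor isolates one bottleneck for management attention.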

Keywords: Process map, drilling loss matrix, availability, utilization, productivity, percussion rate.

PDF Downloads 1089