Search results for: process security
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17246

15686 Two Step Biodiesel Production from High Free Fatty Acid Spent Bleaching Earth

Authors: Rajiv Arora

Abstract:

Biodiesel may be economical if produced from inexpensive feedstocks, which commonly contain high levels of free fatty acids (FFA) that inhibit methyl ester production. In this study, a two-step process for biodiesel production from high-FFA spent bleaching earth oil in a batch reactor is developed. Oil extracted from spent bleaching earth (SBE) was used as the feedstock. In the first step, the FFA content of the SBE oil was reduced to 1.91% through sulfuric acid catalyzed esterification. In the second step, the product of the esterification step was transesterified with an alkaline catalyst. The influence of four variables on the conversion efficiency to methyl ester, i.e., methanol/SBE oil molar ratio, catalyst amount, reaction temperature and reaction time, was studied in the second stage. The optimum transesterification conditions were a methanol/oil molar ratio of 6:1, a heterogeneous catalyst concentration of 5 wt%, a reaction temperature of 65 °C and a reaction time of 60 minutes. Major fuel properties of the SBE biodiesel were measured and complied with ASTM and EN standards. Thus, an optimized process for producing biodiesel from a low-cost, high-FFA source was accomplished.
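
As a quick illustration of the reagent quantities implied by the optimum 6:1 methanol-to-oil molar ratio, the sketch below estimates the methanol required per batch. It assumes a typical average triglyceride molar mass of about 870 g/mol, which is not a figure reported in the abstract.

```python
# Rough reagent calculation for the optimised transesterification step
# (6:1 methanol-to-oil molar ratio). The triglyceride molar mass below is a
# typical literature value for vegetable oils, not a figure from the paper.

MW_OIL_G_PER_MOL = 870.0      # assumed average triglyceride molar mass
MW_METHANOL_G_PER_MOL = 32.04
MOLAR_RATIO = 6.0             # methanol : oil, as optimised in the study


def methanol_required(oil_mass_g: float) -> float:
    """Return the methanol mass (g) needed for a given oil mass (g)."""
    moles_oil = oil_mass_g / MW_OIL_G_PER_MOL
    return moles_oil * MOLAR_RATIO * MW_METHANOL_G_PER_MOL


if __name__ == "__main__":
    oil = 1000.0  # 1 kg of SBE oil
    print(f"Methanol needed for {oil:.0f} g oil: {methanol_required(oil):.1f} g")
```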

Keywords: biodiesel, esterification, free fatty acids, residual oil, spent bleaching earth, transesterification

Procedia PDF Downloads 167
15685 Evaluation of Medication Administration Process in a Paediatric Ward

Authors: Zayed Alsulami, Asma Aldosseri, Ahmed Ezziden, Abdulrahman Alonazi

Abstract:

Children are more susceptible to medication errors than adults. Medication administration is the last stage of the medication treatment process, and most errors are detected at this stage. Little research has been undertaken on medication errors in children in Middle Eastern countries. This study aimed to evaluate how paediatric nurses adhere to the medication administration policy and to identify medication preparation and administration errors and associated risk factors. An observational, prospective study of the medication administration process, from medication preparation through administration, was conducted in Saudi Arabia from May to August 2014. Twelve paediatric nurses serving 90 paediatric patients were observed, and a total of 456 administered doses were evaluated. Adherence rates varied across 7 of the 16 steps. Checking patient allergy information, dose calculation and drug expiry date were the steps with the lowest adherence rates. Sixty-three medication preparation and administration errors were identified, an error rate of 13.8% of medication administrations. No potentially life-threatening errors were observed. A few logistic and administrative factors were reported. The results showed that the medication administration policy and procedure need urgent revision to be more practicable for nurses. Nurses' knowledge and skills regarding the medication administration process should also be improved.

Keywords: medication safety, paediatric, medication errors, paediatric ward

Procedia PDF Downloads 384
15684 Facial Recognition of University Entrance Exam Candidates using FaceMatch Software in Iran

Authors: Mahshid Arabi

Abstract:

In recent years, remarkable advancements in the fields of artificial intelligence and machine learning have led to the development of facial recognition technologies. These technologies are now employed in a wide range of applications, including security, surveillance, healthcare, and education. In the field of education, the identification of university entrance exam candidates has been one of the fundamental challenges. Traditional methods such as using ID cards and handwritten signatures are not only inefficient and prone to fraud but also susceptible to errors. In this context, utilizing advanced technologies like facial recognition can be an effective and efficient solution to increase the accuracy and reliability of identity verification in entrance exams. This article examines the use of FaceMatch software for recognizing the faces of university entrance exam candidates in Iran. The main objective of this research is to evaluate the efficiency and accuracy of FaceMatch software in identifying university entrance exam candidates to prevent fraud and ensure the authenticity of individuals' identities. Additionally, this research investigates the advantages and challenges of using this technology in Iran's educational systems. This research was conducted using an experimental method and random sampling. In this study, 1000 university entrance exam candidates in Iran were selected as samples. The facial images of these candidates were processed and analyzed using FaceMatch software. The software's accuracy and efficiency were evaluated using various metrics, including accuracy rate, error rate, and processing time. The research results indicated that FaceMatch software could accurately identify candidates with a precision of 98.5%. The software's error rate was less than 1.5%, demonstrating its high efficiency in facial recognition. Additionally, the average processing time for each candidate's image was less than 2 seconds, indicating the software's high efficiency. Statistical evaluation of the results using precise statistical tests, including analysis of variance (ANOVA) and t-test, showed that the observed differences were significant, and the software's accuracy in identity verification is high. The findings of this research suggest that FaceMatch software can be effectively used as a tool for identifying university entrance exam candidates in Iran. This technology not only enhances security and prevents fraud but also simplifies and streamlines the exam administration process. However, challenges such as preserving candidates' privacy and the costs of implementation must also be considered. The use of facial recognition technology with FaceMatch software in Iran's educational systems can be an effective solution for preventing fraud and ensuring the authenticity of university entrance exam candidates' identities. Given the promising results of this research, it is recommended that this technology be more widely implemented and utilized in the country's educational systems.
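
A minimal sketch of how the reported evaluation metrics (accuracy rate, error rate, mean processing time) could be computed from match results is shown below. The data structure and field names are hypothetical; this is not FaceMatch's actual API.

```python
# Illustrative computation of the evaluation metrics reported in the study
# (accuracy rate, error rate, mean processing time). The data structure is
# hypothetical; FaceMatch's actual interface is not shown here.

from dataclasses import dataclass
from statistics import mean


@dataclass
class MatchResult:
    predicted_id: str
    true_id: str
    seconds: float  # processing time for this candidate's image


def evaluate(results: list[MatchResult]) -> dict[str, float]:
    correct = sum(r.predicted_id == r.true_id for r in results)
    accuracy = correct / len(results)
    return {
        "accuracy_pct": 100 * accuracy,
        "error_pct": 100 * (1 - accuracy),
        "mean_seconds": mean(r.seconds for r in results),
    }


if __name__ == "__main__":
    sample = [
        MatchResult("C001", "C001", 1.4),
        MatchResult("C002", "C002", 1.7),
        MatchResult("C003", "C004", 1.9),  # one misidentification
    ]
    print(evaluate(sample))
```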

Keywords: facial recognition, FaceMatch software, Iran, university entrance exam

Procedia PDF Downloads 31
15683 Cross Country Comparison: Business Process Management Maturity, Social Business Process Management and Organizational Culture

Authors: Dalia Suša Vugec

Abstract:

In recent decades, business process management (BPM) has been the focus of a great number of researchers and organizations. There are many benefits derived from the implementation of BPM in organizations. However, it has also been noticed that traditional BPM lately faces some difficulties, such as the divide between models and their execution, lost innovations, and a lack of information fusion. As a result, a new discipline has emerged, called social BPM, which incorporates principles of social software into BPM. On the other hand, many researchers identify organizational culture as a vital part of BPM success and maturity. Therefore, the goal of this study is to investigate the current state of BPM maturity and the usage of social BPM among organizations from Croatia, Slovenia and Austria, with regard to organizational culture as well. The paper presents the results of a survey conducted as part of the PROSPER project (IP-2014-09-3729), financed by the Croatian Science Foundation. The results indicate differences in the level of BPM maturity, the usage of social BPM and the dominant organizational culture in the observed organizations from different countries. These differences are further discussed in the paper.

Keywords: business process management, BPM maturity, organizational culture, social BPM

Procedia PDF Downloads 166
15682 A Technique for Image Segmentation Using K-Means Clustering Classification

Authors: Sadia Basar, Naila Habib, Awais Adnan

Abstract:

The paper presents a technique for image segmentation using K-means clustering classification. Previously presented algorithms tend to be problem-specific, miss neighbouring information and require high-speed computing machines to run the segmentation. Clustering is the process of partitioning a group of data points into a small number of clusters. The proposed method is a content-aware, feature-extraction-based approach that is able to run on low-end machines; it is a simple, efficient algorithm that requires only low-quality streaming and can be used for security purposes. It has the capability to highlight the boundary and the object. First, the user supplies the input data. In the next step, the digital image is partitioned into clusters, and the clusters are divided into regions. Pixels with the same features are assembled within the same cluster, while pixels with different features are placed in other clusters. Finally, the clusters are combined with respect to similar features and then represented in the form of segments. The clustered image gives a clear representation of the digital image, highlighting its regions and boundaries. The final image is presented in the form of segments, with all the colors of the image separated into clusters.
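
A minimal sketch of K-means colour-based image segmentation in the spirit of the described technique is shown below, using scikit-learn and Pillow; the cluster count k is an assumed parameter, not a value taken from the paper.

```python
# A minimal K-means colour segmentation sketch in the spirit of the paper,
# using scikit-learn and an image loaded with Pillow; the cluster count k
# is an assumed parameter, not a value taken from the study.

import numpy as np
from PIL import Image
from sklearn.cluster import KMeans


def segment(image_path: str, k: int = 4) -> np.ndarray:
    """Return an image where every pixel is replaced by its cluster centre."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.float64)
    h, w, _ = img.shape
    pixels = img.reshape(-1, 3)                    # one row per pixel

    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)
    segmented = km.cluster_centers_[km.labels_]    # map each pixel to its centre
    return segmented.reshape(h, w, 3).astype(np.uint8)


if __name__ == "__main__":
    out = segment("input.jpg", k=4)
    Image.fromarray(out).save("segmented.jpg")
```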

Keywords: clustering, image segmentation, K-means function, local and global minimum, region

Procedia PDF Downloads 364
15681 The Security Challenges of Urbanization and Environmental Degradation in the Niger-Delta Area of Nigeria

Authors: Gloria Ogungbade, Ogaba Oche, Moses Duruji, Chris Ehiobuche, Lady Ajayi

Abstract:

Humans' continued sustenance on earth and their quality of life depend heavily on the environment. The major components of the environment, namely air, water and land, are the supporting pillars of human existence, depended on directly or indirectly for survival and well-being. Unfortunately, some human activities have created what seems to be a war between humans and the environment, evident in the over-exploitation and inadequate management of its basic components. Since the discovery of crude oil in the Niger Delta, the region has experienced various forms of degradation caused by pollution from oil spillage, gas flaring and other forms of environmental pollution, as a result of the reckless manner in which oil is exploited by the International Oil Corporations (IOCs) operating within the region. The Nigerian government, for its part, lacks strong regulations guiding the operations of these IOCs and has done almost nothing to curtail their activities because of the revenue they generate; as a result, the region is deprived of basic social amenities and infrastructure. The degree of environmental pollution suffered within the region affects its major sources of livelihood, fishing and farming, and has left the region in poverty, which has driven large numbers of people to migrate to urban areas. This paper investigates how environmental degradation impacts urbanization and security in the region.

Keywords: environmental degradation, environmental pollution, gas flaring, oil spillage, urbanization

Procedia PDF Downloads 273
15680 Biohydrogen and Potential Vinegar Production from Agricultural Wastes Using Thermotoga neopolitana

Authors: Nidhi Nalin

Abstract:

This study is a theoretical model of the fermentation of glucose in agricultural wastes, such as discarded peaches, to produce hydrogen, acetic acid and carbon dioxide using Thermotoga neopolitana bacteria. The hydrogen gas produced in this process can be used in hydrogen fuel cells to generate power, and the fermented broth containing acetic acid and salts could be utilized as salty vinegar if enough acetic acid is produced. The modelling was done using SuperPro software, and the results indicated how much sugar (from discarded peaches) is required to produce both hydrogen and vinegar for the process to be profitable.
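
For orientation, the sketch below estimates the theoretical hydrogen and acetic acid yields from glucose via the standard dark fermentation stoichiometry C6H12O6 + 2 H2O -> 2 CH3COOH + 2 CO2 + 4 H2 (the theoretical maximum of 4 mol H2 per mol glucose); the efficiency factor is an assumed illustration, not a SuperPro result from the study.

```python
# Back-of-the-envelope yield estimate for the dark fermentation of glucose,
# C6H12O6 + 2 H2O -> 2 CH3COOH + 2 CO2 + 4 H2 (theoretical maximum).
# Real Thermotoga cultures reach only a fraction of this, so the efficiency
# value below is an assumed illustration, not a result from the paper.

MW_GLUCOSE = 180.16   # g/mol
MW_H2 = 2.016         # g/mol
MW_ACETIC = 60.05     # g/mol


def theoretical_yields(glucose_g: float, efficiency: float = 1.0):
    """Return (hydrogen_g, acetic_acid_g) produced from a given glucose mass."""
    mol_glucose = glucose_g / MW_GLUCOSE
    h2_g = mol_glucose * 4 * MW_H2 * efficiency
    acetic_g = mol_glucose * 2 * MW_ACETIC * efficiency
    return h2_g, acetic_g


if __name__ == "__main__":
    h2, acetic = theoretical_yields(1000.0, efficiency=0.8)  # 1 kg sugar, assumed 80%
    print(f"H2: {h2:.1f} g, acetic acid: {acetic:.1f} g")
```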

Keywords: fermentation, thermotoga, hydrogen, vinegar, biofuel

Procedia PDF Downloads 147
15679 Evaluation of Deformable Boundary Condition Using Finite Element Method and Impact Test for Steel Tubes

Authors: Abed Ahmed, Mehrdad Asadi, Jennifer Martay

Abstract:

Stainless steel pipelines are crucial components for transportation and storage in the oil and gas industry. However, the rise of random attacks and vandalism on these pipes, owing to the valuable materials they transport, has created a need for greater security and protection against incoming surface impacts. Such surface impacts can lead to large global deformations of the pipe and place it under strain, causing the eventual failure of the pipeline. Therefore, understanding how these surface impact loads affect the pipes is vital to improving their security and protection. In this study, experimental tests and finite element analysis (FEA) have been carried out on EN3B stainless steel specimens to study the impact behaviour. Low-velocity impact tests at 9 m/s with a 16 kg dome impactor were used to simulate high-momentum impact leading to localised failure. FEA models with clamped and deformable boundaries were built to study the effect of the boundary conditions on the pipes' impact behaviour and impact resistance, using both experimental and FEA approaches. Comparison of the experiments and FE simulations shows good correlation for the deformable boundaries, validating the robustness of the FE model for implementation in pipe models with complex anisotropic structure.
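
As a back-of-the-envelope check of the test conditions quoted above (a 16 kg dome impactor at 9 m/s), the sketch below computes the delivered impact energy; it is a simple kinetic energy calculation, not part of the LS-DYNA model.

```python
# Quick check of the impact energy delivered in the low-velocity test
# (16 kg dome impactor at 9 m/s); the mass and speed are taken from the abstract.

def impact_energy(mass_kg: float, velocity_ms: float) -> float:
    """Kinetic energy E = 1/2 m v^2, in joules."""
    return 0.5 * mass_kg * velocity_ms ** 2


if __name__ == "__main__":
    print(f"Impact energy: {impact_energy(16.0, 9.0):.0f} J")  # 648 J
```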

Keywords: dynamic impact, deformable boundary conditions, finite element modelling, LS-DYNA, stainless steel pipe

Procedia PDF Downloads 140
15678 Malware Beaconing Detection by Mining Large-scale DNS Logs for Targeted Attack Identification

Authors: Andrii Shalaginov, Katrin Franke, Xiongwei Huang

Abstract:

One of the leading problems in cyber security today is the emergence of targeted attacks conducted by adversaries with access to sophisticated tools. These attacks usually steal senior-level employees' system privileges in order to gain unauthorized access to confidential knowledge and valuable intellectual property. The malware used for the initial compromise of the systems is sophisticated and may target zero-day vulnerabilities. In this work, we utilize a common malware behaviour called "beaconing", in which infected hosts communicate with Command and Control (C2) servers at regular intervals with relatively small time variations. By analysing such beacon activity through passive network monitoring, it is possible to detect potential malware infections. Therefore, we focus on time gaps as indicators of possible C2 activity in targeted enterprise networks. We represent DNS log files as a graph, whose vertices are destination domains and whose edges carry timestamps. Then, using four periodicity detection algorithms for each internal-external communication pair, we check the timestamp sequences to identify beaconing activity. Finally, based on the graph structure, we infer the existence of other infected hosts and malicious domains involved in the attack activities.
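
A simplified sketch of the beaconing idea is given below: for each internal host and external domain pair, the gaps between successive DNS queries are examined and near-periodic pairs (low coefficient of variation) are flagged. The threshold, log format and single variance-based check are assumptions for illustration; the paper itself applies four periodicity detection algorithms over a graph representation.

```python
# A simplified beacon-detection sketch in the spirit of the paper: for each
# (internal host, external domain) pair, look at the gaps between successive
# DNS queries and flag pairs whose gaps are nearly constant (low coefficient
# of variation). Threshold and record format are illustrative assumptions.

from collections import defaultdict
from statistics import mean, pstdev

# (timestamp_seconds, internal_host, queried_domain)
DnsRecord = tuple[float, str, str]


def find_beacon_candidates(records: list[DnsRecord],
                           max_cv: float = 0.1,
                           min_queries: int = 5) -> list[tuple[str, str, float]]:
    by_pair: dict[tuple[str, str], list[float]] = defaultdict(list)
    for ts, host, domain in records:
        by_pair[(host, domain)].append(ts)

    candidates = []
    for (host, domain), stamps in by_pair.items():
        if len(stamps) < min_queries:
            continue
        stamps.sort()
        gaps = [b - a for a, b in zip(stamps, stamps[1:])]
        avg = mean(gaps)
        cv = pstdev(gaps) / avg if avg > 0 else float("inf")
        if cv <= max_cv:                       # near-periodic -> possible beacon
            candidates.append((host, domain, avg))
    return candidates
```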

Keywords: malware detection, network security, targeted attack, computational intelligence

Procedia PDF Downloads 250
15677 Determination of Economic and Ecological Potential of Bio Hydrogen Generated through Dark Photosynthesis Process

Authors: Johannes Full, Martin Reisinger, Alexander Sauer, Robert Miehe

Abstract:

The use of biogenic residues for the biotechnological production of chemical energy carriers for electricity and heat generation, as well as for mobile applications, is an important lever for the shift away from fossil fuels towards a carbon dioxide neutral, post-fossil future. A multitude of promising biotechnological processes therefore need to be compared against each other. For this purpose, a multi-objective target system and a corresponding methodology for the evaluation of the underlying key figures are presented in this paper, which can serve as a basis for decision-making by companies and for promotional policy measures. In this paper, the methodology is applied for the first time to the economic and ecological potential of bio-hydrogen production, using the example of hydrogen production from fruit and milk production waste with the purple bacterium R. rubrum (the so-called dark photosynthesis process). The substrate used in this cost-effective and scalable process is fructose from waste material and waste deposits. Based on an estimation of the biomass potential of such fructose residues, the new methodology is used to compare different scenarios for the production and usage of bio-hydrogen through the considered process. In conclusion, this paper presents, using the example of the promising dark photosynthesis process, a methodology to evaluate the ecological and economic potential of the biotechnological production of bio-hydrogen from residues and waste.

Keywords: biofuel, hydrogen, R. rubrum, bioenergy

Procedia PDF Downloads 186
15676 Opportunity Integrated Assessment Facilitating Critical Thinking and Science Process Skills Measurement on Acid Base Matter

Authors: Anggi Ristiyana Puspita Sari, Suyanta

Abstract:

Given the importance of developing critical thinking and science process skills, assessment instruments should attend to the characteristics of chemistry. Therefore, constructing an accurate instrument for measuring those skills is important. However, integrated assessment instruments are limited in number. The purpose of this study is to validate an integrated assessment instrument for measuring students' critical thinking and science process skills on acid-base matter. The development of the test instrument adapted the McIntire model. The sample consisted of 392 second-grade high school students in the 2015/2016 academic year in Yogyakarta. Exploratory factor analysis (EFA) was conducted to explore construct validity, whereas content validity was substantiated by Aiken's formula. The results show that the KMO statistic is 0.714, which indicates sufficient items for each factor, and that Bartlett's test is significant (p < 0.05). Furthermore, the content validity coefficient, based on 8 expert judgments, is 0.85. The findings support the use of the integrated assessment instrument to measure critical thinking and science process skills on acid-base matter.
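
For reference, the sketch below computes Aiken's V, the content validity coefficient used in the study (reported as 0.85 over 8 expert judgments). The ratings and the 5-point scale are hypothetical, shown only to make the formula concrete.

```python
# Aiken's V, the content-validity coefficient used in the study. The ratings
# below are invented to show the calculation; a 5-point rating scale is assumed.

def aikens_v(ratings: list[int], lowest: int = 1, categories: int = 5) -> float:
    """V = sum(r_i - lowest) / (n * (categories - 1))."""
    n = len(ratings)
    s = sum(r - lowest for r in ratings)
    return s / (n * (categories - 1))


if __name__ == "__main__":
    expert_ratings = [5, 4, 5, 4, 4, 5, 4, 4]   # hypothetical scores from 8 judges
    print(f"Aiken's V = {aikens_v(expert_ratings):.2f}")
```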

Keywords: acid base matter, critical thinking skills, integrated assessment instrument, science process skills, validity

Procedia PDF Downloads 312
15675 An Automated Business Process Management for Smart Medical Records

Authors: K. Malak, A. Nourah, S. Liyakathunisa

Abstract:

Nowadays, healthcare services face many challenges as they become more complex and more in demand. Every detail of a patient's interactions with healthcare providers is maintained in Electronic Health Records (EHR) and Healthcare Information Systems (HIS). However, most of the existing systems are focused on documenting what happens in manual healthcare processes, rather than on providing the highest quality of patient care. Healthcare stakeholders can no longer rely on manual processes; to provide better patient care and efficient utilization of resources, healthcare processes must be automated wherever possible. In this research, a detailed survey and analysis of the existing healthcare systems in Saudi Arabia is performed, and an automated smart medical healthcare business process model is proposed. Business process management methods and rules are followed for process discovery, information collection, analysis, redesign, implementation, and performance improvement analysis in terms of time and cost. The simulation results show that the proposed smart medical records system can improve the quality of service by reducing time and cost and increasing efficiency.

Keywords: business process management, electronic health records, efficiency, cost, time

Procedia PDF Downloads 328
15674 Optimization of Process Parameters in Wire Electrical Discharge Machining of Inconel X-750 for Dimensional Deviation Using Taguchi Technique

Authors: Mandeep Kumar, Hari Singh

Abstract:

The effective optimization of machining process parameters dramatically affects the cost and production time of machined components, as well as the quality of the final products. This paper presents the optimization of a Wire Electrical Discharge Machining (WEDM) operation using Inconel X-750 as the work material. The objective considered in this study is the minimization of dimensional deviation. Six input process parameters of WEDM, namely spark gap voltage, pulse-on time, pulse-off time, wire feed rate, peak current and wire tension, were chosen as variables to study the process performance. Taguchi's design of experiments methodology was used for planning and designing the experiments. The analysis of variance was carried out for the raw data as well as for the signal-to-noise ratio. Four input parameters and one two-factor interaction were found to be statistically significant in their effects on the response of interest. Confirmation experiments were also performed to validate the predicted results.
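
A small sketch of the signal-to-noise ratio used in Taguchi analysis for a smaller-the-better response such as dimensional deviation is given below; the replicate values are illustrative, not experimental data from the paper.

```python
# Signal-to-noise ratio for the "smaller-the-better" case, the usual Taguchi
# criterion for a response such as dimensional deviation. The sample
# measurements are invented for illustration.

import math


def sn_smaller_the_better(values: list[float]) -> float:
    """S/N = -10 * log10(mean(y_i^2)); a larger S/N means less deviation."""
    mean_sq = sum(v * v for v in values) / len(values)
    return -10.0 * math.log10(mean_sq)


if __name__ == "__main__":
    trial_deviation_mm = [0.021, 0.018, 0.025]   # hypothetical replicates
    print(f"S/N ratio: {sn_smaller_the_better(trial_deviation_mm):.2f} dB")
```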

Keywords: ANOVA, DOE, inconel, machining, optimization

Procedia PDF Downloads 195
15673 Climate Change and Its Impact on Water Security and Health in Coastal Community: A Gender Outlook

Authors: Soorya Vennila

Abstract:

The present study answers two questions: how does climate change affect water security in drought-prone Ramanathapuram district, and what has water insecurity done to the health of the coastal community? The study area chosen is Devipattinam in Ramanathapuram district. Climate change has evidently wreaked havoc on the community through saltwater intrusion, water quality degradation and water scarcity, with consequent economic hazards, social hazards such as power inequality within family and community, and health hazards. Climatological data, namely rainfall, minimum temperature and maximum temperature, were statistically analyzed for trends using the Mann-Kendall test; the test was conducted on 14 years (1989-2002) of rainfall, maximum temperature and minimum temperature data. Water quality samples were collected from Devipattinam to test physical and chemical parameters and their spatial variation, and the results were mapped in ArcGIS. A water quality index was then framed from the test results. Finally, key informant interviews and questionnaires were conducted to capture gender perceptions and problems. The collected data were then analyzed using SPSS software to derive recommendations and suggestions for overcoming water scarcity and health problems.
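
A bare-bones version of the Mann-Kendall trend test applied to such annual series is sketched below (tie correction omitted for brevity); the example rainfall series is invented for illustration.

```python
# A bare-bones Mann-Kendall trend test of the kind applied to the annual
# rainfall and temperature series (tie correction omitted for brevity).
# The example series is invented for illustration.

import math


def mann_kendall(series: list[float]) -> tuple[float, float]:
    """Return (Z statistic, two-sided p-value) for a monotonic trend."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0     # variance of S, no tie correction
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p


if __name__ == "__main__":
    annual_rainfall_mm = [812, 790, 845, 760, 730, 755, 710,
                          698, 720, 680, 665, 690, 650, 640]  # hypothetical 14 years
    z, p = mann_kendall(annual_rainfall_mm)
    print(f"Z = {z:.2f}, p = {p:.4f}")   # negative Z suggests a decreasing trend
```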

Keywords: health, water security, water quality, climate change

Procedia PDF Downloads 64
15672 A Range of Steel Production in Japan towards 2050

Authors: Reina Kawase

Abstract:

Japan set the goal of an 80% reduction in GHG emissions by 2050. To consider countermeasures for reducing GHG emissions, estimating the production of energy-intensive materials, such as steel, is essential. About 50% of Japan's steel production is exported, so steel production including exports must be considered. Steel production in Japan from 2005 to 2050 was estimated under various global assumptions based on combinations of scenarios, such as goods trade scenarios and steel-making process selection scenarios. Process selection scenarios determine the volume of steel production by process (basic oxygen furnace and electric arc furnace), considering projected steel consumption, the supply-demand balance of steel, and the scrap surplus. The range of steel production by process was analyzed. Maximum steel production occurs under the scenario in which scrap is consumed in domestic steel production at the maximum level. In 2035, steel production reaches 149 million tons because of an increase in electric arc furnace steel. However, it decreases towards 2050 and amounts to 120 million tons, which is almost the same as the current level. Minimum steel production occurs under the scenario that assumes technological progress in steel making and considers the supply-demand balance in each region. In that case, steel production decreases from the base year to 44 million tons in 2050.

Keywords: goods trade scenario, steel making process selection scenario, steel production, global warming

Procedia PDF Downloads 373
15671 Product Development Process to Obtain Community Standard Product Certificate: A Case of Bangkhonthi, Samut Songkhram, Thailand

Authors: Supattra Pranee

Abstract:

The objectives of this research were to study the product development process for obtaining a community standard product certificate and to set a guideline for that process. A focus group discussion was conducted with experts in the field, local government officials, and representatives of local producers in Bangkhonthi district. The findings revealed seven important steps for obtaining the community product certificate: 1) prepare the documents, 2) submit the documents, 3) set up an appointment for onsite inspection, 4) undergo onsite inspection and sample collection, 5) have the samples evaluated, 6) obtain the test results, and 7) obtain the certificate.

Keywords: perceived values, tourist destination, visiting, product development

Procedia PDF Downloads 433
15670 A Glycerol-Free Process of Biodiesel Production through Chemical Interesterification of Jatropha Oil

Authors: Ratna Dewi Kusumaningtyas, Riris Pristiyani, Heny Dewajani

Abstract:

Biodiesel is commonly produced via two main routes, i.e., the transesterification of triglycerides and the esterification of free fatty acids (FFA) using short-chain alcohols. Both routes have a drawback in terms of the side product yielded during the reaction. Transesterification of triglycerides yields glycerol as a side product, while FFA esterification yields water. In biodiesel production, both glycerol and water are managed as waste. Hence, a separation process is necessary to obtain high-purity biodiesel. Separation processes, however, are generally the most capital- and energy-intensive part of an industrial process. Therefore, to reduce separation requirements, it is desirable to produce biodiesel via an alternative route that eliminates the glycerol or water side-products. In this work, biodiesel synthesis was performed using a glycerol-free process through chemical interesterification of jatropha oil with ethyl acetate in the presence of a sodium acetate catalyst. With this method, triacetine, which is known as a fuel bio-additive, is yielded instead of glycerol. This research studied the effect of catalyst concentration on the jatropha oil interesterification process in the range of 0.5-1.25% w/w of oil. The reaction temperature and the molar ratio of oil to ethyl acetate were varied at 50, 60, and 70 °C, and 1:6, 1:9, 1:15, 1:30, and 1:60, respectively. The reaction time was evaluated from 0 to 8 hours. The best yield was obtained with a catalyst concentration of 0.5%, a reaction temperature of 70 °C, an oil-to-ethyl acetate molar ratio of 1:60, and a reaction time of 6 hours.

Keywords: biodiesel, interesterification, glycerol-free, triacetine, jatropha oil

Procedia PDF Downloads 408
15669 Integration of Agile Philosophy and Scrum Framework to Missile System Design Processes

Authors: Misra Ayse Adsiz, Selim Selvi

Abstract:

In today's world, technology is competing with time. In order to keep up with the world's companies and adapt quickly to change, it is necessary to speed up processes and keep pace with the rate of technological change. The missile system design processes, which are handled with classical methods, lag behind in this race, because customer requirements are not clear and demands change again and again during the design process. Therefore, a methodology suited to the dynamics of missile system design has been investigated, and the processes capable of keeping pace with the era are examined. When commonly used design processes are analyzed, it is seen that none of them is dynamic enough for today's conditions, so a hybrid design process is established. After a detailed review of existing processes, it was decided to focus on the Scrum framework and agile philosophy. Scrum is a process framework focused on developing software and handling change management with rapid methods. In addition, agile philosophy is intended to respond quickly to changes. In this study, the aim is to integrate the Scrum framework and agile philosophy, which are the most appropriate ways to achieve rapid production and change adaptation, into the missile system design process. With this approach, the design team involved in the system design process stays in communication with the customer and takes an iterative approach to change management. These methods, which are currently used in the software industry, have been integrated with the product design process. A team is created for the system design process, and the Scrum roles are realized by including the customer; a Scrum team consists of the product owner, the development team and the Scrum master. Scrum events, which are short, purposeful and time-boxed, are organized to serve coordination rather than long meetings. Instead of the classic system design methods used in product development studies, a missile design is made with this blended method. With this design approach, it becomes easier to anticipate changing customer demands, produce quick solutions to those demands and cope with uncertainties in the product development process. With the feedback of the customer included in the process, the work moves towards marketing, design and financial optimization.

Keywords: agile, design, missile, scrum

Procedia PDF Downloads 155
15668 Novel Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification

Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park

Abstract:

In this paper, we propose a novel inference algorithm for the multi-class Gaussian process classification model that can be used in the field of human behavior recognition. This algorithm can simultaneously derive both the posterior distribution of a latent function and estimators of the hyper-parameters in a multi-class Gaussian process classification model. Our algorithm is based on the Laplace approximation (LA) technique and a variational EM framework, and is performed in two steps, called the expectation and maximization steps. First, in the expectation step, using the Bayesian formula and the LA technique, we approximately derive the posterior distribution of the latent function indicating the probability that each observation belongs to a certain class in the Gaussian process classification model. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimator of the hyper-parameters of the covariance matrix that defines the prior distribution of the latent function. These two steps are repeated iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to a human action classification problem using a public database, namely the KTH human action data set. Experimental results reveal that the proposed algorithm shows good performance on this data set.
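
As a readily available reference point, scikit-learn's GaussianProcessClassifier also relies on the Laplace approximation and handles multi-class problems via one-vs-rest, although it is not the variational EM scheme proposed in the paper. The sketch below runs it on a small synthetic data set rather than on the KTH videos.

```python
# Multi-class Gaussian process classification with scikit-learn, which uses a
# Laplace approximation internally. Shown on synthetic data; this is a point
# of comparison, not the authors' variational EM algorithm.

from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=10, n_informative=6,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Kernel hyper-parameters are optimised by maximising the approximate marginal
# likelihood, playing a role analogous to the M-step described in the abstract.
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
gpc.fit(X_tr, y_tr)

print("Test accuracy:", gpc.score(X_te, y_te))
print("Class probabilities for one sample:", gpc.predict_proba(X_te[:1]))
```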

Keywords: bayesian rule, gaussian process classification model with multiclass, gaussian process prior, human action classification, laplace approximation, variational EM algorithm

Procedia PDF Downloads 324
15667 Exploring the Social Factors of a Country that Influence International Migration: A Sociological Perspective

Authors: Md. Shahriar Sabuz

Abstract:

Different social factors influence individuals to migrate from their native lands. This qualitative study was designed to analyze the main social factors that play a significant role in the movement of people across borders. Two research questions were formulated: 1) which social factors of a country significantly influence a person's decision to migrate from their homeland? and 2) how do different social factors of a country influence the process of international migration? Relevant data were analyzed to answer these two questions. The data analysis revealed that people migrate in large numbers due to deplorable and unsafe social conditions in their home countries. Sometimes migration occurs due to a lack of basic facilities in native countries. Significantly, these social conditions create a sense of deprivation and insecurity in individuals, who move to other lands to gain a sense of achievement and greater security for themselves and their families. This study is significant and distinct from previous studies in that it provides comprehensive information about the major social factors responsible for international migration and their role in influencing an individual's proclivity to migrate. It also opens new horizons of research and analysis for other researchers working on international migration.

Keywords: International migration, social factors, income inequality, social discrimination

Procedia PDF Downloads 59
15666 Evaluation of the Integration of a Direct Reduction Process into an Existing Steel Mill

Authors: Nils Mueller, Gregor Herz, Erik Reichelt, Matthias Jahn

Abstract:

In the context of climate change, the reduction of greenhouse gas emissions in all economic sectors is considered an important factor in meeting the demands of a sustainable energy system. The steel industry, as one of the large industrial CO₂ emitters, is currently highly dependent on fossil resources. In order to reduce coke consumption and thereby CO₂ emissions while still being able to further utilize existing blast furnaces, the possibility of integrating a direct reduction process (DRP) into a fully integrated steel mill was investigated. A blast furnace model, derived from literature data and implemented in Aspen Plus, was used to analyze the impact of DRI in the blast furnace process. Furthermore, a state-of-the-art DRP was modeled to investigate the possibility of substituting the reducing agent natural gas with hydrogen. A sensitivity analysis was carried out in order to find the limiting percentage of hydrogen as a reducing agent without penalty to the DRI quality. Lastly, the two modeled process steps were combined to form a route for producing pig iron. By varying the boundary conditions of the DRP while recording the CO₂ emissions of the two process steps, the overall potential for the reduction of CO₂ emissions was estimated. Within the simulated range, a maximum reduction of CO₂ emissions of 23.5% relative to the typical emissions of a blast furnace could be determined.

Keywords: blast furnace, CO₂ mitigation, DRI, hydrogen

Procedia PDF Downloads 274
15665 Enhancing Scalability in Ethereum Network Analysis: Methods and Techniques

Authors: Stefan K. Behfar

Abstract:

The rapid growth of the Ethereum network has brought forth the urgent need for scalable analysis methods to handle the increasing volume of blockchain data. In this research, we propose efficient methodologies for making Ethereum network analysis scalable. Our approach leverages a combination of graph-based data representation, probabilistic sampling, and parallel processing techniques to achieve unprecedented scalability while preserving critical network insights. Data Representation: We develop a graph-based data representation that captures the underlying structure of the Ethereum network. Each block transaction is represented as a node in the graph, while the edges signify temporal relationships. This representation ensures efficient querying and traversal of the blockchain data. Probabilistic Sampling: To cope with the vastness of the Ethereum blockchain, we introduce a probabilistic sampling technique. This method strategically selects a representative subset of transactions and blocks, allowing for concise yet statistically significant analysis. The sampling approach maintains the integrity of the network properties while significantly reducing the computational burden. Graph Convolutional Networks (GCNs): We incorporate GCNs to process the graph-based data representation efficiently. The GCN architecture enables the extraction of complex spatial and temporal patterns from the sampled data. This combination of graph representation and GCNs facilitates parallel processing and scalable analysis. Distributed Computing: To further enhance scalability, we adopt distributed computing frameworks such as Apache Hadoop and Apache Spark. By distributing computation across multiple nodes, we achieve a significant reduction in processing time and enhanced memory utilization. Our methodology harnesses the power of parallelism, making it well-suited for large-scale Ethereum network analysis. Evaluation and Results: We extensively evaluate our methodology on real-world Ethereum datasets covering diverse time periods and transaction volumes. The results demonstrate its superior scalability, outperforming traditional analysis methods. Our approach successfully handles the ever-growing Ethereum data, empowering researchers and developers with actionable insights from the blockchain. Case Studies: We apply our methodology to real-world Ethereum use cases, including detecting transaction patterns, analyzing smart contract interactions, and predicting network congestion. The results showcase the accuracy and efficiency of our approach, emphasizing its practical applicability in real-world scenarios. Security and Robustness: To ensure the reliability of our methodology, we conduct thorough security and robustness evaluations. Our approach demonstrates high resilience against adversarial attacks and perturbations, reaffirming its suitability for security-critical blockchain applications. Conclusion: By integrating graph-based data representation, GCNs, probabilistic sampling, and distributed computing, we achieve network scalability without compromising analytical precision. This approach addresses the pressing challenges posed by the expanding Ethereum network, opening new avenues for research and enabling real-time insights into decentralized ecosystems. Our work contributes to the development of scalable blockchain analytics, laying the foundation for sustainable growth and advancement in the domain of blockchain research and application.
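
A toy version of the first two ingredients described above (a transaction graph with temporal edges and probabilistic node sampling) is sketched below using networkx; the field names, edge rule and uniform sampling scheme are simplified assumptions, not the authors' exact pipeline.

```python
# Toy sketch of the graph-based representation and probabilistic sampling:
# transactions become nodes carrying a timestamp, edges link time-ordered
# transactions that share an address, and a random subset is kept for analysis.
# All names and the sampling scheme are simplified assumptions.

import random
import networkx as nx

# (tx_hash, sender, receiver, block_timestamp)
transactions = [
    ("0xaaa", "0x01", "0x02", 1_700_000_000),
    ("0xbbb", "0x02", "0x03", 1_700_000_013),
    ("0xccc", "0x01", "0x03", 1_700_000_027),
]

g = nx.DiGraph()
for tx_hash, sender, receiver, ts in transactions:
    g.add_node(tx_hash, sender=sender, receiver=receiver, timestamp=ts)

# Edges capture temporal relationships between transactions that share an address.
for a in g.nodes:
    for b in g.nodes:
        if a != b and g.nodes[a]["timestamp"] < g.nodes[b]["timestamp"]:
            shared = {g.nodes[a]["sender"], g.nodes[a]["receiver"]} & \
                     {g.nodes[b]["sender"], g.nodes[b]["receiver"]}
            if shared:
                g.add_edge(a, b, dt=g.nodes[b]["timestamp"] - g.nodes[a]["timestamp"])

# Probabilistic sampling: keep a fixed fraction of transactions for analysis.
sample = random.Random(0).sample(list(g.nodes), k=max(1, int(0.5 * g.number_of_nodes())))
subgraph = g.subgraph(sample)
print(subgraph.number_of_nodes(), "sampled nodes,", subgraph.number_of_edges(), "edges")
```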

Keywords: Ethereum, scalable network, GCN, probabilistic sampling, distributed computing

Procedia PDF Downloads 60
15664 Household Earthquake Absorptive Capacity Impact on Food Security: A Case Study in Rural Costa Rica

Authors: Laura Rodríguez Amaya

Abstract:

The impact of natural disasters on food security can be devastating, especially in rural settings where livelihoods are closely tied to productive assets. In hazards studies, absorptive capacity is seen as a threshold that impacts the degree of people's recovery after a natural disaster. Increasing our understanding of households' capacity to absorb natural disaster shocks can provide the international community with viable measurements for assessing at-risk communities' resilience to food insecurity. The purpose of this study is to identify the most important factors in determining a household's capacity to absorb the impact of a natural disaster. This is an empirical study conducted in six communities in Costa Rica affected by earthquakes. The Earthquake Impact Index was developed for the selection of the communities in this study. The households coded as total loss in the selected communities constituted the sampling frame from which the sample population was drawn. Because the study area is geographically dispersed over a large surface, a hybrid stratified clustered sampling technique was selected. Of the 302 households identified as total loss in the six communities, a total of 126 households were surveyed, constituting 42 percent of the sampling frame. A list of indicators for the absorptive capacity construct, compiled on theoretical and exploratory grounds, guided the survey development. These indicators were grouped into the following variables: (1) use of informal safety nets, (2) coping strategy, (3) physical connectivity, and (4) infrastructure damage. A multivariate data analysis was conducted using the Statistical Package for the Social Sciences (SPSS). The results show that informal safety nets, such as assistance from family and friends, exerted the greatest influence on the ability of households to absorb the impact of earthquakes. In conclusion, communities that experienced the highest environmental impact and human loss became disconnected from the social networks needed to absorb the shock's impact, which resulted in higher levels of household food insecurity.

Keywords: absorptive capacity, earthquake, food security, rural

Procedia PDF Downloads 243
15663 Analysis of Process for Solution of Fiber-Ends after Biopolishing on the Surface of Cotton Knit Fabric

Authors: P. Altay, G. Kartal, B. Kizilkaya, S. Kahraman, N. C. Gursoy

Abstract:

Biopolishing is applied to remove fuzz or pills from the fiber or fabric surface, which reduces the fabric's tendency to pill or fuzz after repeated launderings. After the biopolishing process, the fuzz fibers ripped by the cellulase enzymes cannot be thoroughly removed from the fabric surface; they remain on the fabric or fiber surface, disturb the user and decrease the productivity of the drying process. The main objective of this study is to develop a method for removing weakened fuzz fibers and surface pills from the biofinished fabric surface before the drying process. Fuzz in the lattice structure of the fabric was completely removed from the internal structure of the fabric by air blowing. The presence of fuzz leads to pilling and a faded appearance; its removal results in a reduced tendency to pill, a cleaner, smoother and softer surface, and improved handling properties of the fabric while maintaining the original color.

Keywords: biopolishing, fuzz fiber, weakened fiber, biofinished cotton fabric

Procedia PDF Downloads 367
15662 Quality Improvement of the Sand Moulding Process in Foundries Using Six Sigma Technique

Authors: Cindy Sithole, Didier Nyembwe, Peter Olubambi

Abstract:

The sand casting process involves pattern making, mould making, metal pouring and shake-out. Every step in the sand moulding process is critical for the production of good-quality castings. However, waste generated during the sand moulding operation and a lack of quality contribute to performance inefficiencies and a lack of competitiveness in South African foundries. Defects produced in the sand moulding process only become visible in the final product (the casting), which results in increased scrap, reduced sales and increased costs in the foundry. The purpose of this research is to propose a Six Sigma (DMAIC: Define, Measure, Analyze, Improve, Control) intervention in sand moulding foundries in order to reduce variation caused by deficiencies in the sand moulding process in South African foundries. Its objective is to create sustainability and enhance productivity in the South African foundry industry. Six Sigma is a data-driven approach to process improvement that aims to eliminate variation in business processes using statistical control methods. It focuses on business performance improvement through quality initiatives using Ishikawa's seven basic tools of quality. The objectives of Six Sigma are to eliminate features that affect productivity, profit and the ability to meet customers' demands. Six Sigma has become one of the most important tools and techniques for attaining competitive advantage. Competitive advantage for sand casting foundries in South Africa means improved plant maintenance processes, improved product quality and proper utilization of resources, especially scarce resources. Defects such as sand inclusion, flashes and sand burn-on were identified, using the Six Sigma technique, as resulting from sand moulding process inefficiencies. The causes were found to be incorrect mould design due to the pattern used and poor ramming of the moulding sand in the foundry. Six Sigma tools such as the voice of the customer, the fishbone diagram, the voice of the process and process mapping were used to define the problem in the foundry and to outline the critical-to-quality elements. The SIPOC (Supplier, Input, Process, Output, Customer) diagram was also employed to ensure that the material and process parameters were achieved for quality improvement in the foundry. The process capability of the sand moulding process was measured to understand current performance and enable improvement. The expected results of this research are reduced sand moulding process variation, increased productivity and competitive advantage.
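
For the Measure phase mentioned above, process capability is typically summarized with the Cp and Cpk indices; a short sketch is given below. The specification limits and sample data are hypothetical, and an approximately normal, stable process is assumed.

```python
# Process capability indices of the kind measured for the sand moulding
# process in the Measure phase. Specification limits and sample data are
# hypothetical; a stable, approximately normal process is assumed.

from statistics import mean, stdev


def capability(samples: list[float], lsl: float, usl: float) -> tuple[float, float]:
    """Return (Cp, Cpk) for the given lower/upper specification limits."""
    mu, sigma = mean(samples), stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk


if __name__ == "__main__":
    green_strength_kpa = [151, 148, 153, 150, 149, 152, 147, 154]  # invented data
    cp, cpk = capability(green_strength_kpa, lsl=140, usl=160)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```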

Keywords: defects, foundries, quality improvement, sand moulding, six sigma (DMAIC)

Procedia PDF Downloads 178
15661 The Effects of the Inference Process in Reading Texts in Arabic

Authors: May George

Abstract:

Inference plays an important role in the learning process and can lead to rapid acquisition of a second language. When learning a non-native language, i.e., a critical language like Arabic, students depend on the teacher's support most of the time to learn new concepts. Students focus on memorizing new vocabulary and on learning all the grammatical rules. As a result, they become mechanical and cannot produce the language easily. Relying heavily on the teacher, they are unable to predict the meaning of words in context; they cannot link their prior knowledge or even identify the meaning of words without the teacher's support. This study explores how the teacher guides students' learning during the inference process and which learning processes can direct students' inference.

Keywords: inference, reading, Arabic, language acquisition

Procedia PDF Downloads 522
15660 Optimization of Friction Stir Spot Welding Process Parameters for Joining 6061 Aluminum Alloy Using Taguchi Method

Authors: Mohammed A. Tashkandi, Jawdat A. Al-Jarrah, Masoud Ibrahim

Abstract:

This paper investigates the shear strength of joints produced by the friction stir spot welding (FSSW) process. FSSW parameters such as tool rotational speed, plunge depth, shoulder diameter of the welding tool and dwell time play the major role in determining the shear strength of the joints. The effect of these four parameters on the FSSW process, as well as on the shear strength of the welded joints, was studied using five levels of each parameter. The Taguchi method was used to minimize the number of experiments required to determine the fracture load of the friction stir spot-welded joints by incorporating the independently controllable FSSW parameters. Taguchi analysis was applied to optimize the FSSW parameters to attain the maximum shear strength of the spot weld for this type of aluminum alloy.

Keywords: Friction Stir Spot Welding, Al6061 alloy, Shear Strength, FSSW process parameters

Procedia PDF Downloads 417
15659 Modelling the Choice of Global Systems of Mobile Networks in Nigeria Using the Analytical Hierarchy Process

Authors: Awal Liman Sale

Abstract:

The world is fast becoming a global village, and a necessary tool for this process is communication, in which telecommunication is a key player. Development is very rapid, with one innovation replacing another in a matter of weeks. Calls across the different Nigerian telecom service providers are often difficult to connect and are frequently diverted, incurring unnecessary charges for customers. This compels consumers to register and use multiple subscriber information modules (SIMs) so that they can switch to another if one fails. This study aims to identify and prioritize the key factors subscribers in Nigeria consider when selecting a telecom service provider, using the Analytical Hierarchy Process (AHP), in order to match the factors with the GSM network providers and create a hierarchical structure. The opinions of 400 randomly selected subscribers of different service providers will be sought using a questionnaire. In total, four components and ten sub-components will be examined in this study. After determining the weights of these components, the importance of each in choosing a service in Nigeria will be prioritized.
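
A compact sketch of the AHP calculation the study proposes is given below: criterion weights are derived from a pairwise comparison matrix via its principal eigenvector, and the consistency ratio is checked against Saaty's random index. The four criteria and the judgment values in the matrix are illustrative assumptions, not survey results.

```python
# AHP weight derivation and consistency check. The pairwise judgments and the
# four assumed criteria (call quality, tariff, coverage, customer care) are
# illustrative, not data from the study's questionnaire.

import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}  # Saaty's random index

# pairwise[i, j] = how much more important criterion i is than criterion j
pairwise = np.array([
    [1.0, 3.0, 2.0, 5.0],
    [1/3, 1.0, 1/2, 2.0],
    [1/2, 2.0, 1.0, 3.0],
    [1/5, 1/2, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(pairwise)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # normalised criterion weights

n = pairwise.shape[0]
lambda_max = eigvals[k].real
ci = (lambda_max - n) / (n - 1)               # consistency index
cr = ci / RI[n]                               # consistency ratio

print("Criterion weights:", np.round(weights, 3))
print(f"Consistency ratio: {cr:.3f} (acceptable if below 0.10)")
```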

Keywords: analytical hierarchy process, global village, Nigerian telecommunication, subscriber information modules

Procedia PDF Downloads 225
15658 Filtering Intrusion Detection Alarms Using Ant Clustering Approach

Authors: Ghodhbani Salah, Jemili Farah

Abstract:

With the growth of cyber attacks, information safety has become an important issue all over the world. Many firms rely on security technologies such as intrusion detection systems (IDSs) to manage information technology security risks. IDSs are considered to be the last line of defense for securing a network and play a very important role in detecting a large number of attacks. However, the main problem with today's most popular commercial IDSs is that they generate a high volume of alerts and a huge number of false positives. This drawback has become the main motivation for many research papers in the IDS area. Hence, in this paper we present a data mining technique to assist network administrators in analyzing and reducing the false positive alarms produced by an IDS and in increasing detection accuracy. Our data mining technique is an unsupervised clustering method based on a hybrid ant algorithm. This algorithm discovers clusters of intruders' behavior without prior knowledge of the possible number of classes; we then apply the K-means algorithm to improve the convergence of the ant clustering. Experimental results on a real dataset show that our proposed approach is efficient, with a high detection rate and a low false alarm rate.

Keywords: intrusion detection system, alarm filtering, ANT class, ant clustering, intruders’ behaviors, false alarms

Procedia PDF Downloads 394
15657 Social Workers’ Reactions and Coping Strategies: An Exploratory Study about the Social Worker-Client Contacting Experiences in Hong Kong

Authors: Sze Ming Yau

Abstract:

The social worker-client relating experience has scarcely been studied in Hong Kong. Through this qualitative study, the experiences of Hong Kong social work practitioners in relating with clients provide new insights into social worker training and development. Thematic analysis was applied to data collected through in-depth interviews with six local social work practitioners. The results show that all practitioners have experienced both positive and challenging situations during the relating process. Their reactions either facilitate or hinder the process. Most of the practitioners' strong reactions can be accounted for by the concept of countertransference during interview sessions with clients. Moreover, they rarely reviewed the implications of those reactions after the session. In addition to countertransference, practitioners' self-expectations also influence the relating process: their self-expectation of being capable of helping leads to anxiety. Although practitioners' countertransference and anxiety significantly influence the relating process, practitioners do not adequately address their personal issues and anxiety. Enhancing case conceptualization ability is their major coping strategy. The study has implications for the enhancement of social work training, workplace support, practitioners' self-reflection, and the integration of theory and practice.

Keywords: coping, countertransference, reactions, relating process, social workers

Procedia PDF Downloads 254