Search results for: process modeling advancements
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18719

14609 A New Method for Fault Detection

Authors: Mehmet Hakan Karaata, Ali Hamdan, Omer Yusuf Adam Mohamed

Abstract:

Consider a distributed system that delivers messages from one process to another. Such a system is often required to deliver each message to its destination regardless of whether or not the system components experience arbitrary forms of faults. In addition, each message received by the destination must be a message sent by a system process. In this paper, we first identify the necessary and sufficient conditions for detecting a restricted form of Byzantine faults referred to as modifying Byzantine faults. An observable form of Byzantine fault whose effect is limited to the modification of message metadata or content, timing and omission faults, and message replay is referred to as a modifying Byzantine fault. We then present a distributed protocol to detect modifying Byzantine faults using an optimal number of messages over node-disjoint paths.
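
As a toy illustration of the detection idea (not the authors' protocol, which also covers timing, omission and replay faults with an optimal number of messages), the sketch below flags a modifying fault by comparing the copies of a message delivered over node-disjoint paths; the digest choice and majority rule are our own assumptions.

```python
# Illustrative sketch only: expose a modified message by comparing the copies
# received over node-disjoint paths. Not the authors' protocol.
import hashlib
from collections import Counter

def digest(payload):
    """Content digest recomputed from each delivered copy."""
    return hashlib.sha256(payload).hexdigest()

def detect_modifying_fault(copies):
    """Flag a modifying fault when copies of one message, received over
    node-disjoint paths, disagree; otherwise return the agreed message."""
    if not copies:
        return True, None                      # omission: nothing arrived
    counts = Counter(digest(c) for c in copies)
    if len(counts) == 1:
        return False, copies[0]                # all copies agree
    # Copies disagree: at least one path modified the message in transit.
    majority_digest, _ = counts.most_common(1)[0]
    recovered = next(c for c in copies if digest(c) == majority_digest)
    return True, recovered

# Example: three node-disjoint paths, one of which corrupts the payload.
sent = b"sensor reading 42"
received = [sent, b"sensor reading 99", sent]
fault_detected, message = detect_modifying_fault(received)
print(fault_detected, message)                 # True b'sensor reading 42'
```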

Keywords: Byzantine faults, distributed systems, fault detection, network protocols, node-disjoint paths

Procedia PDF Downloads 449
14608 Hypothesis about the Origin of Lightning

Authors: Igor Kuzminov

Abstract:

To date, the nature of lightning has not been established. A hypothesis of the origin of lightning is proposed: the lightning charge is formed by electromagnetic induction, with the air mass of the cloud playing the role of the conductor. This conductor moves in the Earth's magnetic field, and the upper and lower edges of the cloud act as the plates of a capacitor. Lightning is thus a special case of electromagnetic processes in the atmosphere. The discharge of lightning occurs in the process of charge accumulation; the accumulation goes on constantly, but the charge is not fixed. Naturally, the hypothesis requires additional experiments and official acknowledgement. Supporting evidence is that the maximal lightning activity occurs in the equatorial zone, where cos φ is close to 1. An experiment conducted privately showed that there is a potential difference in the atmosphere at different levels. The prospects for developing a power installation of applied value are considerable.

Keywords: electromagnetic induction, Earth's magnetic field, plates of the capacitors, charge accumulation

Procedia PDF Downloads 90
14607 Identifying Physiological Markers That Are Sensitive to Cognitive Load in Preschoolers

Authors: Priyashri Kamlesh Sridhar, Suranga Nanayakkara

Abstract:

Current frameworks in assessment follow lesson delivery and rely heavily on test performance or the teacher's observations. This, however, neglects the underlying cognitive load during the learning process. Identifying the pivotal points at which the load occurs helps in designing effective pedagogies and tools that respond to the learner's cognitive state. There has been limited research on quantifying cognitive load in preschoolers in real time. In this study, we recorded electrodermal activity and heart rate variability (HRV) from 10 kindergarteners performing executive function tasks and the Woodcock-Johnson test of cognitive abilities. Preliminary findings suggest that there are indeed sensitive task-dependent markers in skin conductance (number of SCRs and average amplitude of SCRs) and HRV (mean heart rate and low-frequency component) captured during the learning process.

Keywords: early childhood, learning, methodologies, pedagogies

Procedia PDF Downloads 322
14606 Computational Assistance of the Research, Using Dynamic Vector Logistics of Processes for Critical Infrastructure Subjects Continuity

Authors: Urbánek Jiří J., Krahulec Josef, Urbánek Jiří F., Johanidesová Jitka

Abstract:

This paper deals with computational assistance for the research and modelling of critical infrastructure subject continuity. It enables the use of the prevailing MS Office environment (SmartArt and similar tools) for mathematical models built with the DYVELOP (Dynamic Vector Logistics of Processes) method, which serves for the investigation and modelling of crisis situations within critical infrastructure organizations. The first part of the paper introduces the entities, operators and actors of the DYVELOP method. It uses just three operators of Boolean algebra and four types of entities: the Environment, the Process System, the Case and the Controlling. The Process System (PrS) has five "brothers": Management PrS, Transformation PrS, Logistic PrS, Event PrS and Operation PrS. The Case has three "sisters": Process Cell Case, Use Case and Activity Case. All of them need special Ctrl actors for the controlling of their functions, except the Environment (ENV), which can do without Ctrl. The model maps are named Blazons, and they can express mathematically and graphically the relationships among entities, actors and processes. In the second part of the paper, the rich blazons of the DYVELOP method are used for discovering and modelling cycling cases and their phases; a live PowerPoint presentation of the blazons aids comprehension of the paper's mission. The crisis management of an energy critical infrastructure organization is obliged to use these cycles for successful coping with crisis situations. Cycling through these cases several times is a necessary condition for encompassing both the emergency event and the mitigation of the organization's damages. An uninterrupted and continuous cycling process brings fruitfulness to crisis management, and it is a good indicator and controlling actor of organizational continuity and of advanced possibilities for its sustainable development. Reliable rules are derived for the safe and reliable continuity of an energy critical infrastructure organization in a crisis situation.
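
For illustration only, the entity taxonomy named in the abstract can be sketched as a small data model; the names below come from the text, but the class layout itself is our assumption, not part of the DYVELOP method.

```python
# Minimal sketch of the DYVELOP entity taxonomy described in the abstract.
from enum import Enum

class ProcessSystemKind(Enum):      # the five "brothers" of the Process System
    MANAGEMENT = "Management PrS"
    TRANSFORMATION = "Transformation PrS"
    LOGISTIC = "Logistic PrS"
    EVENT = "Event PrS"
    OPERATION = "Operation PrS"

class CaseKind(Enum):               # the three "sisters" of the Case
    PROCESS_CELL = "Process Cell Case"
    USE = "Use Case"
    ACTIVITY = "Activity Case"

class Entity:
    """Base DYVELOP entity; every entity needs a special Ctrl actor except ENV."""
    needs_ctrl_actor = True

class Environment(Entity):
    needs_ctrl_actor = False        # ENV can do without Ctrl

class ProcessSystem(Entity):
    def __init__(self, kind: ProcessSystemKind):
        self.kind = kind

class Case(Entity):
    def __init__(self, kind: CaseKind):
        self.kind = kind

class Controlling(Entity):
    pass
```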

Keywords: blazons, computational assistance, DYVELOP method, critical infrastructure

Procedia PDF Downloads 385
14605 The Modeling of Viscous Microenvironment for the Coupled Enzyme System of Bioluminescence Bacteria

Authors: Irina E. Sukovataya, Oleg S. Sutormin, Valentina A. Kratasyuk

Abstract:

The effect of the viscosity of the medium on the kinetic parameters of the coupled enzyme system NADH:FMN-oxidoreductase–luciferase was investigated with the addition of organic solvents (glycerol and sucrose), because bioluminescent enzyme systems based on bacterial luciferases offer a unique and general tool for the analysis of many analytes and enzymes in environmental, research, clinical and other laboratories. The possibility of stabilization and increased activity of the coupled enzyme system NADH:FMN-oxidoreductase–luciferase in viscous aqueous-organic mixtures has been shown.

Keywords: coupled enzyme system of bioluminescence bacteria NAD(P)H:FMN-oxidoreductase–luciferase, glycerol, stabilization of enzymes, sucrose

Procedia PDF Downloads 399
14604 Mixed Integer Programming-Based One-Class Classification Method for Process Monitoring

Authors: Younghoon Kim, Seoung Bum Kim

Abstract:

One-class classification plays an important role in detecting outliers and abnormalities among normal observations. In previous research, several attempts were made to extend the scope of application of one-class classification techniques to statistical process control problems. In most previous approaches, such as the support vector data description (SVDD) control chart, the design of the control limits is based on the assumption that the proportion of abnormal observations is approximately equal to an expected Type I error rate in the Phase I process. Because of the limitation of one-class classification techniques based on convex optimization, the proportion of abnormal observations cannot be made exactly equal to the expected Type I error rate: controlling the Type I error rate requires optimizing constraints with integer decision variables, a requirement that convex optimization cannot satisfy. This limitation is undesirable from both theoretical and practical perspectives for constructing effective control charts. In this work, to address the limitation of previous approaches, we propose a one-class classification algorithm based on mixed integer programming, which can solve problems formulated with both continuous and integer decision variables. The proposed method minimizes the radius of a spherically shaped boundary subject to the constraint that the number of enclosed normal observations equals a constant value specified by the user. By modifying this constant value, users can exactly control the proportion of normal data described by the spherically shaped boundary. Thus, the proportion of abnormal observations can be made theoretically equal to an expected Type I error rate in the Phase I process. Moreover, analogously to SVDD, the boundary can be made to describe complex structures by using kernel functions. A new multivariate control chart applying the algorithm is proposed. This chart uses a monitoring statistic that characterizes the degree to which a point is abnormal, as obtained through the proposed one-class classification, and its control limit is established by the radius of the boundary. The usefulness of the proposed method was demonstrated through experiments with simulated data and real process data from a thin film transistor-liquid crystal display process.
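
Written out under our own notational assumptions (the authors' exact formulation may differ in detail), the optimization described above takes the general form

\[
\begin{aligned}
\min_{R^{2},\;\mathbf{a},\;z}\quad & R^{2}\\
\text{s.t.}\quad & \lVert \phi(x_i) - \mathbf{a} \rVert^{2} \le R^{2} + M z_i, \qquad i = 1,\dots,n,\\
& \sum_{i=1}^{n} (1 - z_i) = n_0,\\
& z_i \in \{0,1\}, \qquad i = 1,\dots,n,
\end{aligned}
\]

where a is the sphere centre, z_i = 1 leaves observation i outside the boundary, M is a large constant, n_0 is the user-specified number of normal observations that must be enclosed, and φ is an optional kernel-induced feature map (the identity map gives the plain spherical boundary).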

Keywords: control chart, mixed integer programming, one-class classification, support vector data description

Procedia PDF Downloads 176
14603 Process of Production of an Artisanal Brewery in a City in the North of the State of Mato Grosso, Brazil

Authors: Ana Paula S. Horodenski, Priscila Pelegrini, Salli Baggenstoss

Abstract:

The brewing industry based on artisanal concepts seeks to serve a specific market with diversified production, and it has been gaining ground nationally and also in the Amazon region. This growth is driven by more demanding consumers with diversified tastes who want to try new types of beer and enjoy products with new aromas and flavors, as a differential from what is widely offered by the big industrial brands. Thus, through qualitative research methods, the study aimed to investigate how production is managed in a craft brewery in a city in the northern State of Mato Grosso (Brazil), providing knowledge of the industry's production processes and strategies. With the efficient use of resources, it is possible to obtain the necessary quality and to provide better performance and differentiation for the company, besides analyzing the best management model. The research is descriptive with a qualitative approach through a case study. For data collection, a semi-structured interview was elaborated, covering the following areas: microbrewery characterization, the artisanal beer production process, and the company's supply chain management. Production processes were also observed during technical visits. The study verified that the artisanal brewery researched develops preventive maintenance strategies for inputs, machines, and equipment so that the quality of the product and of the production process is achieved. It was observed that the distance from supply centers requires process and supply chain management to be carried out with a longer planning horizon so that delivery of the final product remains satisfactory. The brewery's production process is composed of machines and equipment that allow control and quality of the product, and the manager states that, for the industry's productive capacity and its consumer market, the available equipment meets demand. This study also highlights one of the challenges for the development of small breweries facing the market giants, namely the legislation, which classifies microbreweries as producers of alcoholic beverages. This causes the micro and small business segment to be taxed like major producers, which have advantages in purchasing large batches of raw materials and receive tax incentives because they are large employers and tax collectors. It was possible to observe that the supply chain management system relies on spreadsheets and notes kept manually, which could be simplified with a computer program to streamline procedures and reduce the risks and failures of the manual process. The control of waste and effluents generated by the industry is outsourced and meets the company's needs. Finally, the results showed that the industry uses preventive maintenance as a productive strategy, which allows better conditions for the production and quality of artisanal beer. Quality is directly related to the satisfaction of the final consumer; it is prized and pursued throughout the production process through the selection of better inputs, the effectiveness of the production processes, and the relationship with commercial partners.

Keywords: artisanal brewery, production management, production processes, supply chain

Procedia PDF Downloads 123
14602 The Fake News Impact on the Public Policy Cycle: A Systemic Analysis through Documentary Survey

Authors: Aron Miranda Burgos, Ergon Cugler de Moraes Silva

Abstract:

This article observes that the constant advance of issues related to misinformation impacts the guarantee of the public policy cycle. It finds that the dissemination of false information has a direct influence on each of the stages composing this cycle. Therefore, in order to maintain scientific and theoretical credibility in the qualitative analysis process, it was necessary to logically interpose the concepts of firehosing of falsehood, fake news, and the public policy cycle, using epistemological and pragmatic mechanisms at the intersection of these academic concepts, following the scientific method. Through the analysis of official documents and public notes, it was found how multiple theoretical perspectives evidence the compromising of the provision and elaboration of public policies, verifying the way in which fake news impacts each part of the process in this atmosphere.

Keywords: firehosing of falsehood, governance, misinformation, post-truth

Procedia PDF Downloads 143
14601 Separation of CO2 Using MFI-Alumina Nanocomposite Hollow Fibre Ion-Exchanged with Alkali Metal Cation

Authors: A. Alshebani, Y. Swesi, S. Mrayed, F. Altaher, I. Musbah

Abstract:

A Cs-type nanocomposite zeolite membrane was successfully synthesized on an alumina ceramic hollow fibre with a mean outer diameter of 1.7 mm, a mean wall thickness of 230 μm, and an average crossing pore size smaller than 0.2 μm; the cesium cation exchange was carried out inside the test module. The n-butane/H2 separation factor obtained, close to 20, indicates a relatively high membrane quality, and Maxwell-Stefan modeling provides an equivalent thickness lower than 1 µm. For comparison, an application to CO2/N2 separation was carried out, reaching separation factors close to 4 and 18 before and after cation exchange, respectively, on the H-zeolite membrane formed within the pores of the ceramic alumina substrate.

Keywords: MFI membrane, CO2, nanocomposite, ceramic hollow fibre, ion-exchange

Procedia PDF Downloads 486
14600 The Variable Sampling Interval Xbar Chart versus the Double Sampling Xbar Chart

Authors: Michael B. C. Khoo, J. L. Khoo, W. C. Yeong, W. L. Teoh

Abstract:

The Shewhart Xbar control chart is a useful process monitoring tool in the manufacturing industry for detecting the presence of assignable causes. However, it is insensitive in detecting small process shifts. To circumvent this problem, adaptive control charts have been suggested. An adaptive chart enables at least one of the chart's parameters to be adjusted to increase the chart's sensitivity. Two common adaptive charts in the literature are the double sampling (DS) Xbar and variable sampling interval (VSI) Xbar charts. This paper compares the performances of the DS and VSI Xbar charts based on the average time to signal (ATS) criterion. The ATS profiles of the DS Xbar and VSI Xbar charts are obtained using Mathematica and Statistical Analysis System (SAS) programs, respectively. The results show that the VSI Xbar chart is generally superior to the DS Xbar chart.
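
For readers who want a qualitative feel for the ATS criterion, a rough Monte Carlo estimate for a VSI Xbar chart can be obtained as sketched below; the limit, warning and interval constants are illustrative assumptions, not the settings compared in the paper.

```python
# Rough Monte Carlo sketch of the average time to signal (ATS) of a VSI Xbar chart.
import numpy as np

def vsi_ats(shift, n=5, L=3.0, w=1.0, d_long=1.5, d_short=0.1,
            reps=5000, seed=0):
    """Estimate ATS for a mean shift of `shift` process standard deviations.
    Sample means are compared against +/- L*sigma/sqrt(n) control limits; the
    next sampling interval is d_short when the mean falls in the warning region
    (beyond +/- w*sigma/sqrt(n)), otherwise d_long."""
    rng = np.random.default_rng(seed)
    se = 1.0 / np.sqrt(n)              # standard error of the sample mean (sigma = 1)
    times = np.empty(reps)
    for r in range(reps):
        t, interval = 0.0, d_long      # start with the long interval
        while True:
            t += interval
            z = rng.normal(loc=shift, scale=se)
            if abs(z) > L * se:        # out-of-control signal
                times[r] = t
                break
            interval = d_short if abs(z) > w * se else d_long
    return times.mean()

print(vsi_ats(shift=0.5))              # estimated ATS for a 0.5-sigma mean shift
```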

Keywords: adaptive charts, average time to signal, double sampling charts, variable sampling interval charts

Procedia PDF Downloads 290
14599 A Multi-Layer Based Architecture for the Development of an Open Source CAD/CAM Integration Virtual Platform

Authors: Alvaro Aguinaga, Carlos Avila, Edgar Cando

Abstract:

This article proposes an n-layer architecture, with a web client as the front-end, for the development of a virtual platform for process simulation on CNC machines. This open-source platform includes a CAD-CAM interface for drawing primitives, which are then used to generate a CNC program that drives a touch-screen virtual simulator. The objectives of this project are twofold. The first is an educational component that fosters new alternatives for the CAD-CAM/CNC learning process in undergraduate and graduate schools and in technical and technological institutes, emphasizing the development of critical skills, discussion, and collaborative work. The second objective puts together a research and technological component that will take the state of the art in CAD-CAM integration to a new level through the development of optimal algorithms and virtual platforms with on-line availability, paving the way for the long-term goal of this project: to have a visible and active graduate school in Ecuador and a worldwide open-innovation community in the area of CAD-CAM integration and the operation of CNC machinery. The virtual platform developed as part of this study: (1) improves the training process of students, (2) creates a multidisciplinary team and a collaborative workspace that will push the new generation of students to face future technological challenges, (3) implements industry standards for CAD/CAM, and (4) presents a platform for the development of industrial applications. A prototype of this system was developed and implemented in a network of universities and technological institutes in Ecuador.

Keywords: CAD-CAM integration, virtual platforms, CNC machines, multi-layer based architecture

Procedia PDF Downloads 432
14598 Modeling SET Effect on Charge Pump Phase Locked Loop

Authors: Varsha Prasad, S. Sandya

Abstract:

Cosmic ray effects in microelectronics, such as the single event effect (SET) and total ionizing dose (TID), have been a major concern in space electronics since 1970. Advanced CMOS technologies have demonstrated reduced sensitivity to the TID effect. However, the charge pump phase locked loop is very vulnerable to single event transient effects. This paper presents an SET analysis model in which the SET is modeled as a double exponential pulse. The time domain analysis reveals that the settling time of the voltage controlled oscillator (VCO) depends on the SET pulse strength, the settling time constant, and the damping factor. The analysis of the proposed SET model is confirmed by the simulation results.
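
The double exponential pulse mentioned above is conventionally written as i(t) = I0·[exp(−t/τ_fall) − exp(−t/τ_rise)]; the short sketch below evaluates it with assumed, purely illustrative parameter values.

```python
# Sketch of a double exponential current pulse used to model an SET strike.
import numpy as np

def set_pulse(t, q=0.5e-12, tau_rise=10e-12, tau_fall=200e-12):
    """SET current pulse i(t) for t >= 0.
    q: collected charge [C]; tau_rise, tau_fall: rise/fall time constants [s]."""
    i0 = q / (tau_fall - tau_rise)          # normalises the time integral to q
    return i0 * (np.exp(-t / tau_fall) - np.exp(-t / tau_rise))

t = np.linspace(0.0, 2e-9, 2000)            # 0 to 2 ns
i = set_pulse(t)
dt = t[1] - t[0]
print(f"peak current ~ {i.max() * 1e3:.2f} mA, "
      f"collected charge ~ {(i * dt).sum() * 1e12:.2f} pC")
```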

Keywords: charge pump, phase locked loop, SET, VCO

Procedia PDF Downloads 435
14597 Bioinformatics High Performance Computation and Big Data

Authors: Javed Mohammed

Abstract:

Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists, for the first time, to gain a profound understanding of the deepest biological functions. Solving biological problems may require high-performance computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life-science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even the simulation of an entire human body. This paper emphasizes computational biology's growing need for high-performance computing and Big Data. It illustrates their indispensability in meeting the scientific and engineering challenges of the twenty-first century, and shows how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC that provides sufficient capability for evaluating or solving more limited but meaningful instances. The article also indicates solutions to optimization problems, discusses the benefits of Big Data for computational biology, and surveys the current state of the art and future generations of HPC computing with Big Data in biology.

Keywords: high performance, big data, parallel computation, molecular data, computational biology

Procedia PDF Downloads 366
14596 Implementing Internet of Things through Building Information Modelling in Order to Assist with the Maintenance Stage of Commercial Buildings

Authors: Ushir Daya, Zenadene Lazarus, Dimelle Moodley, Ehsan Saghatforoush

Abstract:

A review of the literature showed a lack of implementation of the Internet of Things (IoT) incorporated into Building Information Modelling (BIM) in South Africa. The research aims to determine whether implementing IoT within BIM will make BIM more useful during the maintenance stage of buildings and assist facility managers in doing their job. The research examines the existing problematic areas of building information modelling, specifically BIM 7D. This paper considers the capabilities of IoT, the issues IoT can resolve in BIM software, how integrating IoT into BIM will assist facility managers, and whether such an implementation will make a facility manager's job more efficient.

Keywords: internet of things, building information modeling, facilities management, structural health monitoring

Procedia PDF Downloads 214
14595 Role of Self-Concept in the Relationship between Emotional Abuse and Mental Health of Employees in the North West Province, South Africa

Authors: L. Matlawe, E. S. Idemudia

Abstract:

Stability is an important topic in planning and managing energy in microgrids, just as in conventional power systems. Voltage and frequency stability is one of the most important issues recently studied in microgrids. The objectives of this paper are the modeling and design of the components and optimal controllers for voltage and frequency control of an AC/DC hybrid microgrid under different disturbances. Since PI controllers have the advantages of a simple structure and easy implementation, they were designed and modeled in this paper. The harmony search (HS) algorithm is used to optimize the controllers' parameters. According to the results achieved, the PI controllers show good performance in the voltage and frequency control of the microgrid.

Keywords: emotional abuse, employees, mental health, self-concept

Procedia PDF Downloads 259
14594 Knowledge Elicitation Approach for Formal Ontology Design: An Exploratory Study Applied in Industry for Knowledge Management

Authors: Ouassila Labbani-Narsis, Christophe Nicolle

Abstract:

Building formal ontologies remains a complex process for companies. In the literature, this process is based on the technical knowledge and expertise of domain experts, without further detail on the methodologies used. Problems such as disagreements between experts, the expression of tacit knowledge related to high-level know-how that is rarely verbalized, the qualification of results through use cases, or simply the adhesion of the group of experts currently remain unsolved. This paper proposes a methodological approach based on knowledge elicitation for the conception of formal, consensual, and shared ontologies. The proposed approach is experimentally tested on industrial collaboration projects in the field of manufacturing (associating knowledge sources from multinational companies) and in the field of viticulture (associating explicit knowledge with implicit knowledge acquired through observation).

Keywords: collaborative ontology engineering, knowledge elicitation, knowledge engineering, knowledge management

Procedia PDF Downloads 81
14593 Level Set and Morphological Operation Techniques in Application of Dental Image Segmentation

Authors: Abdolvahab Ehsani Rad, Mohd Shafry Mohd Rahim, Alireza Norouzi

Abstract:

Medical image analysis is one of the great applications of computer image processing. There are several steps in analysing medical images, of which segmentation is one of the most challenging and important. In this paper, a segmentation method is proposed for dental radiograph images. A thresholding method is applied to simplify the images, and a morphological opening of the binary image is performed to eliminate unnecessary regions. Furthermore, horizontal and vertical integral projection techniques are used to extract each individual tooth from the radiograph images. Segmentation is then performed by applying the level set method to each extracted image. Experimental results with 90% accuracy demonstrate that the proposed method achieves high accuracy and promising results.
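
A partial sketch of the pre-processing chain described above (thresholding, morphological opening, integral projections) is given below using OpenCV; the input file, kernel size and margin crop are assumptions, and the level set refinement itself is omitted.

```python
# Sketch of the dental radiograph pre-processing steps; the level set stage is omitted.
import cv2
import numpy as np

img = cv2.imread("radiograph.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input
if img is None:                                             # fall back to a toy image
    img = np.full((200, 300), 40, np.uint8)
    for x in (40, 110, 180, 250):                           # eight bright "teeth"
        cv2.rectangle(img, (x - 20, 20), (x + 20, 90), 180, -1)
        cv2.rectangle(img, (x - 20, 120), (x + 20, 180), 180, -1)

# 1) Thresholding simplifies the image (Otsu picks the threshold automatically).
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# 2) Morphological opening of the binary image removes unnecessary small regions.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

# 3) Horizontal and vertical integral projections: valleys in the vertical
#    projection suggest gaps between neighbouring teeth, and the deepest valley
#    of the horizontal projection (away from the margins) suggests the boundary
#    between the upper and lower jaws.
horizontal_projection = opened.sum(axis=1)                  # one value per row
vertical_projection = opened.sum(axis=0)                    # one value per column
interior = horizontal_projection[20:-20]                    # ignore the image margins
jaw_split_row = 20 + int(np.argmin(interior))
print("estimated jaw boundary at row", jaw_split_row)
```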

Keywords: integral projection, level set method, morphological operation, segmentation

Procedia PDF Downloads 322
14592 A Unified Fitting Method for the Set of Unified Constitutive Equations for Modelling Microstructure Evolution in Hot Deformation

Authors: Chi Zhang, Jun Jiang

Abstract:

Constitutive equations are very important in finite element (FE) modelling, and the accuracy of the material constants in the equations has significant effects on the accuracy of the FE models. A wide range of constitutive equations is available; however, fitting the material constants in the constitutive equations can be complex and time-consuming due to the strong non-linearity of, and relationships between, the constants. This work focuses on the development of a set of unified MATLAB programs for fitting the material constants in the constitutive equations efficiently. Users will only need to supply experimental data in the required format and run the program, without modifying functions, precisely guessing initial values, or looking up parameters from previous works, and will be able to fit the material constants efficiently.
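
As a hedged illustration of this kind of fitting, written in Python rather than the paper's MATLAB, the sketch below fits the constants of a sinh-type Arrhenius equation often used for hot deformation to synthetic data; the equation choice, parameter values and data are our assumptions, not the paper's unified equation set.

```python
# Non-linear fitting of material constants for strain_rate = A*sinh(alpha*sigma)^n*exp(-Q/(R*T)).
import numpy as np
from scipy.optimize import curve_fit

R = 8.314  # universal gas constant, J/(mol*K)

def flow_stress(X, lnA, alpha, n, Q):
    """Flow stress sigma predicted from strain rate and temperature (A = exp(lnA))."""
    strain_rate, T = X
    Z = strain_rate * np.exp(Q / (R * T))          # Zener-Hollomon parameter
    return np.arcsinh((Z * np.exp(-lnA)) ** (1.0 / n)) / alpha

# Synthetic "experimental" data generated from known constants plus 1% noise.
rng = np.random.default_rng(1)
true_params = (np.log(1e10), 0.012, 4.5, 300e3)    # lnA, alpha [1/MPa], n, Q [J/mol]
strain_rate = np.tile([0.01, 0.1, 1.0, 10.0], 4)   # 1/s
T = np.repeat([1123.0, 1173.0, 1223.0, 1273.0], 4) # K
sigma = flow_stress((strain_rate, T), *true_params)
sigma *= 1 + 0.01 * rng.standard_normal(sigma.size)

popt, _ = curve_fit(flow_stress, (strain_rate, T), sigma,
                    p0=[np.log(1e9), 0.01, 4.0, 250e3], maxfev=20000)
print(dict(zip(["lnA", "alpha", "n", "Q"], np.round(popt, 4))))
```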

Keywords: constitutive equations, FE modelling, MATLAB program, non-linear curve fitting

Procedia PDF Downloads 101
14591 An Evaluation of the Artificial Neural Network and Adaptive Neuro Fuzzy Inference System Predictive Models for the Remediation of Crude Oil-Contaminated Soil Using Vermicompost

Authors: Precious Ehiomogue, Ifechukwude Israel Ahuchaogu, Isiguzo Edwin Ahaneku

Abstract:

Vermicompost is the product of a decomposition process in which various species of worms break down a mixture of decomposing vegetable or food waste, bedding materials, and vermicast. This process is called vermicomposting, while the rearing of worms for this purpose is called vermiculture. Several works have verified the adsorption of toxic metals using vermicompost, but its application to the retention of organic compounds is still scarce. This research demonstrates the effectiveness of earthworm waste (vermicompost) for the remediation of crude oil-contaminated soils. The remediation methods adopted in this study were two soil washing methods, namely batch and column processes, which represent laboratory and in-situ remediation, respectively. Characterization of the vermicompost and the crude oil-contaminated soil was performed before and after soil washing using Fourier transform infrared spectroscopy (FTIR), scanning electron microscopy (SEM), X-ray fluorescence (XRF), X-ray diffraction (XRD), and atomic absorption spectrometry (AAS). The optimization of the washing parameters, using response surface methodology (RSM) based on a Box-Behnken design, was performed on the responses from the laboratory experimental results. This study also investigated the application of machine learning models [artificial neural network (ANN) and adaptive neuro-fuzzy inference system (ANFIS)], which were evaluated using the coefficient of determination (R²) and mean square error (MSE). Removal efficiency obtained from the Box-Behnken design experiment ranged from 29% to 98.9% for the batch process remediation. Optimization of the experimental factors, carried out using numerical optimization techniques applying the desirability function method of RSM, produced the highest removal efficiency of 98.9% at an adsorbent dosage of 34.53 g, an adsorbate concentration of 69.11 g/ml, a contact time of 25.96 min, and a pH value of 7.71. Removal efficiency obtained from the multilevel general factorial design experiment ranged from 56% to 92% for the column process remediation. The coefficient of determination (R²) for ANN was 0.9974 and 0.9852 for the batch and column processes, respectively, showing agreement between experimental and predicted results. For the batch and column processes, respectively, the coefficient of determination (R²) for RSM was 0.9712 and 0.9614, which also demonstrates agreement between experimental and predicted findings. For the batch and column processes, the ANFIS coefficient of determination was 0.7115 and 0.9978, respectively. It can be concluded that machine learning models can predict the removal of crude oil from polluted soil using vermicompost. Therefore, it is recommended to use machine learning models to predict the removal of crude oil from contaminated soil using vermicompost.
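
For orientation only, the sketch below trains a small neural network regressor on synthetic data shaped like the four washing factors named above and reports R² and MSE; it stands in for neither the study's ANN/ANFIS models nor its measurements.

```python
# Illustrative ANN regression of removal efficiency from washing factors (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(7)
n = 200
X = np.column_stack([
    rng.uniform(10, 50, n),     # adsorbent dosage (g)
    rng.uniform(20, 100, n),    # adsorbate concentration (g/ml)
    rng.uniform(5, 40, n),      # contact time (min)
    rng.uniform(4, 10, n),      # pH
])
# Placeholder response surface standing in for measured removal efficiency (%).
y = (30 + 1.2 * X[:, 0] - 0.2 * X[:, 1] + 0.8 * X[:, 2]
     - 1.5 * (X[:, 3] - 7.7) ** 2 + rng.normal(0, 3, n)).clip(0, 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                                   random_state=0))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print("R2 :", round(r2_score(y_te, pred), 3))
print("MSE:", round(mean_squared_error(y_te, pred), 2))
```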

Keywords: ANFIS, ANN, crude oil, contaminated soil, remediation, vermicompost

Procedia PDF Downloads 114
14590 Use of Diatomite for the Elimination of Chromium Three from Wastewater Annaba, Algeria

Authors: Sabiha Chouchane, Toufik Chouchane, Azzedine Hani

Abstract:

The wastewater was treated with a natural adsorbent, diatomite, to eliminate chromium(III). The diatomite comes from Sig (in the west of Algeria). Physicochemical characterization revealed that the diatomite is mainly made up of silica and lime, with a lower proportion of alumina. The process, considered in a static regime at 20°C, with a stirring speed of 150 rpm, a pH of 4, and a grain diameter between 100 and 150 µm, shows that one gram of purified diatomite can fix, according to the Langmuir model, up to 39.64 mg/g of chromium with pseudo-first-order kinetics. The pseudo-equilibrium time highlighted is 25 minutes. The value of the RL ratio, which reflects the affinity between the adsorbent and the adsorbate, indicates that the solid used has a good adsorption capacity. External transport of the metal ions from the solution to the adsorbent appears to be a rate-controlling step of the overall process; on the other hand, internal transport in the pores is not the only mechanism limiting the sorption kinetics. Thermodynamic parameters show that chromium sorption is spontaneous and exothermic, with negative entropy.
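
For reference, the Langmuir model and the RL separation factor referred to above are conventionally written (in our notation) as

\[
q_e = \frac{q_{\max} K_L C_e}{1 + K_L C_e},
\qquad
R_L = \frac{1}{1 + K_L C_0},
\]

where q_e is the amount adsorbed at equilibrium (mg/g), q_max the maximum monolayer capacity (here 39.64 mg/g), K_L the Langmuir constant (L/mg), C_e the equilibrium concentration and C_0 the initial concentration; 0 < R_L < 1 indicates favourable adsorption.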

Keywords: adsorption, diatomite, Cr(III), wastewater

Procedia PDF Downloads 60
14589 Numerical Simulation of Bio-Chemical Diffusion in Bone Scaffolds

Authors: Masoud Madadelahi, Amir Shamloo, Seyedeh Sara Salehi

Abstract:

Previously, materials like solid metals and their alloys have been used as implants in the human body. To improve the fixation of these artificial hard tissues, porous structures have been introduced; tissues in the vicinity of the porous structure can then attach more easily to the inserted implant. In particular, porous bone scaffolds are useful because they can deliver important biomolecules such as growth factors and proteins. This study focuses on the properties of degradable porous hard tissues using a three-dimensional numerical finite element method (FEM). The most important properties studied for these structures are the diffusive flux and the concentration of different species, such as glucose, oxygen, and lactate. The process of cell migration into the scaffold is treated as a diffusion process, and the related parameters are studied for different values of the production/consumption rates.
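
In the standard form usually assumed for such transport problems (our notation, not necessarily the exact model used in the study), the concentration C of each species obeys a reaction-diffusion equation

\[
\frac{\partial C}{\partial t} = D\,\nabla^{2} C + R,
\qquad
\mathbf{J} = -D\,\nabla C,
\]

where D is the diffusion coefficient, R the net production (positive) or consumption (negative) rate of the species, and J its diffusive flux.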

Keywords: bone scaffolds, diffusivity, numerical simulation, tissue engineering

Procedia PDF Downloads 389
14588 The Impact of Bitcoin and Cryptocurrency on the Development of Community

Authors: Felib Ayman Shawky Salem

Abstract:

Nowadays, cryptocurrency has become a global phenomenon known to most people. People use this alternative digital money for transactions in many ways (e.g., online shopping, wealth management, and fundraising). However, this digital asset is also widely used in criminal activities, since it relies on decentralized control, as opposed to centralized electronic money and central banking systems, which makes a user of this currency effectively invisible. The high-value exchange of these digital currencies has also been a target of criminal activities. Cryptocurrency crimes have become a challenge for law enforcement to analyze and to prove with evidence from criminal devices. In this paper, our focus is on the bitcoin cryptocurrency and the possible artifacts that can be obtained from different types of digital wallets, namely software and browser-based applications. The process memory and the physical hard disk are examined with the aim of identifying and recovering potential digital evidence. The data acquisition stage is divided into three states: the initial creation of the wallet, transactions consisting of transferring and receiving coins, and the state after the wallet has been deleted. Findings from this study suggest that process memory from both software and browser wallets is a valuable source of evidence, and many of the artifacts found in process memory are also available from the application and wallet files on the client computer's storage.

Keywords: cryptocurrency, bitcoin, payment methods, blockchain, appropriation, online retailers, TOE framework, disappropriation, non-appropriation, financial protection, money laundering, digital wallet, digital forensics

Procedia PDF Downloads 48
14587 Musical Notation Reading versus Alphabet Reading-Comparison and Implications for Teaching Music Reading to Students with Dyslexia

Authors: Ora Geiger

Abstract:

Reading is a cognitive process of deciphering visual signs to produce meaning. During the reading process, written information in the form of symbols and signs is received by the eye and processed in the brain. This definition applies both to the reading of letters and to the reading of musical notation. But while the letters of the alphabet are signs determined arbitrarily, notes are recorded systematically on a staff, with the location of each note on the staff indicating its relative pitch. In this paper, the researcher specifies the characteristics of alphabet reading in comparison to musical notation reading and discusses whether a person diagnosed with dyslexia will necessarily have difficulty in reading musical notes. Dyslexia is a learning disorder that makes it difficult to acquire alphabet-reading skills due to difficulties in the identification of letters, spelling, and other language-deciphering skills. In order to read, one must be able to connect a symbol with a sound and to join the sounds into words. A person who has dyslexia finds it difficult to translate a graphic symbol into the sound that it represents. When teaching reading to children diagnosed with dyslexia, the multi-sensory approach, which supports the activation and involvement of most of the senses in the learning process, has been found to be particularly effective: when most senses participate, learning becomes more effective. Over years of experience, the researcher, a music specialist, has followed the music-reading learning process of elementary-school-age students, some of them diagnosed with dyslexia, while they studied to play the soprano (descant) recorder. She argues that learning music reading while studying to play a musical instrument is by nature a multi-sensory experience. The senses involved are sight, hearing, touch, and the kinesthetic sense (motion), which provides the brain with information on the relative positions of the body; the learner thus experiences visual, auditory, tactile, and kinesthetic impressions simultaneously. The researcher concludes that there should be no contraindication to teaching standard music reading to children with dyslexia if an appropriate process is offered. This conclusion is based on two main characteristics of music reading: (1) the musical notation system is a systematic, logical, relative set of symbols written on a staff; and (2) learning music reading in connection with playing a musical instrument is by nature a multi-sensory activity, since it combines sight, hearing, touch, and movement. This paper describes music-reading teaching procedures and provides unique teaching methods that have been found effective for students diagnosed with dyslexia. It provides theoretical explanations in addition to guidelines for music education practices.

Keywords: alphabet reading, dyslexia, multisensory teaching method, music reading, recorder playing

Procedia PDF Downloads 367
14586 An Investigation into Mechanical Properties of Laser Fabricated 308LSi Stainless Steel Walls by Wire Feedstock

Authors: Taiwo Ebenezer Abioye, Alexis Medrano-Tellez, Peter Kayode Farayibi, Peter Kayode Oke

Abstract:

Laser metal deposition with wire feedstock has been established as a process that can provide a high material deposition rate with good quality. Sound mechanical properties of the deposited parts are a prerequisite for real applications of this process. This paper investigates the laser metal deposition of 308LSi stainless steel wire within a process window. Single tracks and multi-layer thin walls of 308LSi stainless steel were deposited on a 304 stainless steel substrate. The grain structures of the built walls were examined using optical microscopy. The mechanical properties of the built walls, including the micro-hardness and the tensile properties along the transverse and longitudinal directions, were investigated using a Vickers hardness tester and a tensile testing machine. Long columnar grains were found growing in the wall-building (transverse) direction, and nucleation was observed at the boundary between two deposited layers due to remelting of the previously deposited layer. The results showed that the hardness values of the deposited walls (ranging between 194 HV and 167 HV) decreased from the track-substrate interface to the top of the wall. The ultimate tensile strength (UTS) of the wall (518 ± 7 MPa) showed dependence on the wall-building direction.

Keywords: laser metal deposition, ultimate tensile strength, hardness, wall, microstructure

Procedia PDF Downloads 417
14585 Optimizing of Machining Parameters of Plastic Material Using Taguchi Method

Authors: Jumazulhisham Abdul Shukor, Mohd. Sazali Said, Roshanizah Harun, Shuib Husin, Ahmad Razlee Ab Kadir

Abstract:

This paper applies the Taguchi optimization method to determine the best machining parameters for a pocket milling process on polypropylene (PP) using a CNC milling machine, where surface roughness is the considered response and carbide insert cutting tools are used. Three machining parameters (speed, feed rate, and depth of cut) are investigated at three levels each (low, medium, and high) using a Taguchi orthogonal array. The settings of the machining parameters were determined using the Taguchi method, and the signal-to-noise (S/N) ratios were assessed to define the optimal levels and to predict the effect on surface roughness of the parameters assigned in the L9 array. The final experimental outcomes are presented to prove that the optimization parameters recommended by the manufacturer are accurate.
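
As a small aside, the "smaller-the-better" S/N ratio typically used when minimising surface roughness can be computed as sketched below; the roughness readings are made-up placeholders, not values from the L9 experiment.

```python
# Smaller-the-better signal-to-noise ratio used in Taguchi analysis of roughness.
import numpy as np

def sn_smaller_is_better(y):
    """S/N = -10 * log10(mean(y^2)) for responses where lower is better."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Replicated Ra readings (um) for one hypothetical L9 trial:
print(round(sn_smaller_is_better([1.82, 1.75, 1.91]), 2))   # about -5.24 dB
```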

Keywords: inserts, milling process, signal-to-noise (S/N) ratio, surface roughness, Taguchi Optimization Method

Procedia PDF Downloads 642
14584 Analyzing the Technology Affecting the Social Integration of Students at University

Authors: Sujit K. Basak, Simon Collin

Abstract:

The aim of this paper is to examine how technology access and use affect the social integration of local students at university. This aim is achieved by designing a structural equation model (SEM) in terms of integration with peers, integration with faculty, and faculty support, and, on the other hand, by examining the socio-demographic impact on technology access and use. The collected data were analyzed using the WarpPLS 5.0 software. This survey-based study was conducted at a public university in Canada. The results indicated that technology has a strong impact on integration with faculty and on faculty support, but not on integration with peers. The socio-demographic variables also have an impact on technology access and use.

Keywords: faculty, integration, peer, technology access and use

Procedia PDF Downloads 516
14583 Business Intelligence for Profiling of Telecommunication Customer

Authors: Rokhmatul Insani, Hira Laksmiwati Soemitro

Abstract:

Business intelligence is a methodology that exploits data to produce information and knowledge systematically; it can thereby support the decision-making process. Among the methods in business intelligence are the data warehouse and data mining. A data warehouse can store historical data derived from transactional data. For data modelling in the data warehouse, we apply dimensional modelling following Kimball. Data mining is used to extract patterns from the data and to gain insight from it. Data mining has many techniques, one of which is segmentation. For profiling telecommunication customers, we use customer segmentation according to the customers' usage of services, customer invoices, and customer payments. Customers can be grouped according to their characteristics, and the profitable customers can be identified. We apply the K-Means clustering algorithm for segmentation, using the RFM (Recency, Frequency, and Monetary) model as the input variables. All data mining processes are performed with the IBM SPSS Modeler tool.
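
A hedged sketch of the segmentation step in Python (the paper itself uses IBM SPSS Modeler): K-Means applied to RFM features; the column names, number of clusters and sample data below are assumptions made for illustration.

```python
# K-Means customer segmentation on RFM (Recency, Frequency, Monetary) features.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Placeholder customer table: days since last transaction, transaction count,
# and total billed amount.
rng = np.random.default_rng(42)
customers = pd.DataFrame({
    "recency_days": rng.integers(1, 365, 500),
    "frequency": rng.integers(1, 60, 500),
    "monetary": rng.gamma(2.0, 150.0, 500),
})

X = StandardScaler().fit_transform(customers)        # put the RFM features on one scale
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
customers["segment"] = kmeans.labels_

# Profile each segment; high-frequency / high-monetary clusters flag the
# profitable customers mentioned in the abstract.
print(customers.groupby("segment").mean().round(1))
```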

Keywords: business intelligence, customer segmentation, data warehouse, data mining

Procedia PDF Downloads 488
14582 Separation of Rare-Earth Metals from E-Wastes

Authors: Gulsara Akanova, Akmaral Ismailova, Duisek Kamysbayev

Abstract:

The separation of rare earth metals (REM) from a neodymium magnet has been widely studied in the last year. The waste from a computer hard disk contains 25.41% neodymium, 64.09% iron, and less than 1% boron. For the separation of the rare earth metals, the magnet was dissolved in nitric acid in both open and closed systems. In the closed system, the magnet was dissolved in a microwave sample preparation system at different temperatures and pressures, and the dissolution process lasted 1 hour. In the open system, acid dissolution of the magnet was conducted at room temperature, and the process lasted 30-40 minutes. To remove the iron in the magnet, oxalic acid was used, with precipitation as oxalates under both conditions. A sorption method is used to separate the rare earth metals (Nd, Pr, and Dy) from the magnet waste.

Keywords: dissolution of the magnet, neodymium magnet, rare earth metals, separation, sorption

Procedia PDF Downloads 212
14581 Determining of Importance Level of Factors Affecting Job Selection with the Method of AHP

Authors: Nurullah Ekmekci, Ömer Akkaya, Kazım Karaboğa, Mahmut Tekin

Abstract:

Job selection is one of the most important decisions people make, affecting their lives in terms of being more useful to themselves and to society. There are many criteria to consider in job selection, and the number of criteria makes it a multi-criteria decision-making (MCDM) problem. In this study, job selection is treated as a multi-criteria decision-making problem and is solved by the Analytic Hierarchy Process (AHP), one of the multi-criteria decision-making methods. A survey containing five job selection criteria drawn from the many possible criteria (ease of finding a job, salary status, social security, the job's reputation in the community, and the degree of difficulty of the work) and four job alternatives (being an academician, working in the civil service, working in the private sector, and running one's own business) was administered to students of the Selcuk University Faculty of Economics and Administrative Sciences. As a result of the pairwise comparisons, the highest-weighted criteria in job selection and the most preferred job alternatives were identified.
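
To make the AHP step concrete, the sketch below computes priority weights and a consistency ratio from a pairwise comparison matrix over the five criteria; the judgements in the matrix are invented for illustration and are not the survey results; only the method (principal eigenvector plus consistency check) follows standard AHP.

```python
# AHP priority weights from a reciprocal pairwise comparison matrix (Saaty 1-9 scale).
import numpy as np

criteria = ["ease of finding a job", "salary status", "social security",
            "reputation in the community", "degree of difficulty"]

# A[i, j] = judged importance of criterion i relative to criterion j (invented values).
A = np.array([
    [1,   1/3, 1/2, 2,   3],
    [3,   1,   2,   4,   5],
    [2,   1/2, 1,   3,   4],
    [1/2, 1/4, 1/3, 1,   2],
    [1/3, 1/5, 1/4, 1/2, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # priority vector

# Consistency check: CI = (lambda_max - n)/(n - 1), CR = CI/RI with RI = 1.12 for n = 5.
lambda_max = eigvals.real[k]
ci = (lambda_max - len(A)) / (len(A) - 1)
cr = ci / 1.12
for name, w in sorted(zip(criteria, weights), key=lambda p: -p[1]):
    print(f"{name:28s} {w:.3f}")
print(f"consistency ratio CR = {cr:.3f} (acceptable if < 0.10)")
```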

Keywords: analytical hierarchy process, job selection, multi-criteria decision making

Procedia PDF Downloads 405
14580 Modeling Thin Shell Structures by a New Flat Shell Finite Element

Authors: Djamal Hamadi, Ashraf Ayoub, Ounis Abdelhafid, Chebili Rachid

Abstract:

In this paper, a new computationally efficient rectangular flat shell finite element named 'ACM_RSBEC' is presented. The formulated element is obtained by superposition of a new rectangular membrane element, 'RSBEC', based on the strain approach, and the well-known plate bending element 'ACM'. This element can be used for the analysis of thin shell structures, whatever the geometrical shape might be. Tests on standard problems have been examined, and the convergence of the newly formulated element is compared to that of other types of quadrilateral shell elements. The presented shell element 'ACM_RSBEC' is demonstrated to be effective and useful in analysing thin shell structures.

Keywords: finite element, flat shell element, strain based approach, static condensation

Procedia PDF Downloads 434