Search results for: four step
2880 The Effect of Cross-Curriculum of L1 and L2 on Elementary School Students’ Linguistic Proficiency: To Sympathize with Others
Authors: Reiko Yamamoto
Abstract:
This paper reports on a project to integrate Japanese (as a first language) and English (as a second language) education. This study focuses on the mutual effects of the two languages on the linguistic proficiency of elementary school students. The research team consisted of elementary school teachers and researchers at a university. The participants of the experiment were students between 3rd and 6th grades at an elementary school. The research process consisted of seven steps: 1) specifying linguistic proficiency; 2) developing the cross-curriculum of L1 and L2; 3) forming can-do statements; 4) creating a self-evaluation questionnaire; 5) executing the self-evaluation questionnaire at the beginning of the school year; 6) instructing L1 and L2 based on the curriculum; and 7) executing the self-evaluation questionnaire at the beginning of the next school year. In Step 1, the members of the research team brainstormed ways to specify elementary school students’ linguistic proficiency that can be observed in various scenes. It was revealed that the teachers evaluate their students’ linguistic proficiency not only on the basis of the students’ utterances but also on their non-verbal communication abilities. This led to the idea that competency for understanding others’ minds through the use of physical movement or bodily senses in communication in L1, that is, to sympathize with others, can be transferred to that same competency in communication in L2. Based on the specification of linguistic proficiency that L1 and L2 have in common, a cross-curriculum of L1 and L2 was developed in Step 2. In Step 3, can-do statements based on the curriculum were formed, building on the action-oriented approach from the Common European Framework of Reference for Languages (CEFR) used in Europe. A self-evaluation questionnaire consisting of the main can-do statements was given to the students between 3rd grade and 6th grade at the beginning of the school year (Step 4 and Step 5), and all teachers gave L1 and L2 instruction based on the curriculum to the students for one year (Step 6). The same questionnaire was given to the students at the beginning of the next school year (Step 7). The results of statistical analysis proved the enhancement of the students’ linguistic proficiency. This verified the validity of developing the cross-curriculum of L1 and L2 and adapting it in elementary school. It was concluded that elementary school students do not distinguish between L1 and L2, and that they just try to understand others’ minds through physical movement or senses in any language.
Keywords: cross-curriculum of L1 and L2, elementary school education, language proficiency, sympathy with others
Procedia PDF Downloads 438
2879 Exoskeleton for Hemiplegic Patients: Mechatronic Approach to Move One Disabled Lower Limb
Authors: Alaoui Hamza, Moutacalli Mohamed Tarik, Chebak Ahmed
Abstract:
The number of people suffering from hemiplegia is growing each year. This lower limb disability affects all aspects of their lives by taking away their autonomy, and it also involves their close relatives, as well as the health system, in providing the necessary care they need. The integration of exoskeletons in the medical field has become a promising solution to this issue. This paper presents an exoskeleton designed to help hemiplegic people get back the sensation and ability of normal walking. For this purpose, three step models have been created. The first step model allows a simple forward movement of the leg. The second step model is designed to overcome some obstacles in the patient's path, and finally, the third step model gives the patient total control over the device. Each of the control methods was designed to offer a solution to the challenges that the patients may face during the walking process.
Keywords: ability of normal walking, exoskeleton, hemiplegic patients, lower limb motion, mechatronics
Procedia PDF Downloads 153
2878 Novel AdoMet Analogs as Tools for Nucleic Acids Labeling
Authors: Milda Nainyte, Viktoras Masevicius
Abstract:
Biological methylation is a methyl group transfer from S-adenosyl-L-methionine (AdoMet) onto N-, C-, O- or S-nucleophiles in DNA, RNA, proteins or small biomolecules. The reaction is catalyzed by enzymes called AdoMet-dependent methyltransferases (MTases), which represent more than 3 % of the proteins in the cell. As a general mechanism, the methyl group from AdoMet replaces a hydrogen atom of the nucleophilic center, producing methylated DNA and S-adenosyl-L-homocysteine (AdoHcy). Recently, DNA methyltransferases have been used for the sequence-specific, covalent labeling of biopolymers. Two types of MTase-catalyzed labeling of biopolymers are known, referred to as two-step and one-step. During two-step labeling, an alkylating fragment is transferred onto DNA in a sequence-specific manner, and then the reporter group, such as biotin, is attached for selective visualization using suitable coupling chemistries. This labeling approach is quite difficult, and the chemical coupling does not always proceed at 100 %, but in the second step a variety of reporter groups can be selected, which gives this labeling method its flexibility. In one-step labeling, the AdoMet analog is designed with the reporter group already attached to the functional group. Thus, the one-step labeling method would be a more convenient tool for labeling biopolymers, as it avoids additional chemical reactions and the selection of reaction conditions. Also, time costs would be reduced. However, an effective AdoMet analog appropriate for one-step labeling of biopolymers and containing a cleavable bond, required to reduce interference with PCR, is still not known. To expand the practical utility of this important enzymatic reaction, cofactors with activated sulfonium-bound side-chains have been produced; they can serve as surrogate cofactors for a variety of wild-type and mutant DNA and RNA MTases, enabling covalent attachment of these chains to their target sites in DNA, RNA or proteins (the approach named methyltransferase-directed Transfer of Activated Groups, mTAG). Compounds containing a hex-2-yn-1-yl moiety have proved to be efficient alkylating agents for labeling of DNA. Herein we describe synthetic procedures for the preparation of N-biotinoyl-N’-(pent-4-ynoyl)cystamine, starting from the coupling of cystamine with pentynoic acid and finally attaching the biotin as a reporter group. The synthesis of the first AdoMet-based cofactor containing a cleavable reporter group and appropriate for one-step labeling was thus developed.
Keywords: AdoMet analogs, DNA alkylation, cofactor, methyltransferases
Procedia PDF Downloads 195
2877 A Novel Geometrical Approach toward the Mechanical Properties of Particle Reinforced Composites
Authors: Hamed Khezrzadeh
Abstract:
Many investigations on the micromechanical structure of materials indicate that there exist fractal patterns at the micro scale in some of the main construction and industrial materials. A recently presented micro-fractal theory brings together the well-known periodic homogenization and fractal geometry to construct an appropriate model for determining the mechanical properties of particle-reinforced composite materials. The proposed multi-step homogenization scheme considers the mechanical properties of the different constituent phases in the composite, together with the interaction between these phases, through a step-by-step homogenization technique. In the proposed model, the interaction of the different phases is also investigated. By using this method, the effect of fiber grading on the mechanical properties could also be studied. The theoretical outcomes are compared to experimental data for different types of particle-reinforced composites, and very good agreement with the experimental data is observed.
Keywords: fractal geometry, homogenization, micromechanics, particulate composites
Procedia PDF Downloads 292
2876 Graph Planning Based Composition for Adaptable Semantic Web Services
Authors: Rihab Ben Lamine, Raoudha Ben Jemaa, Ikram Amous Ben Amor
Abstract:
This paper proposes a graph planning technique for adaptable semantic Web Service composition. First, we use an ontology-based context model for extending Web Service descriptions with information about the most suitable context for their use. Then, we transform the composition problem into a semantic context-aware graph planning problem to build the optimal service composition based on the user's context. The construction of the planning graph is based on semantic context-aware Web Service discovery, which allows, at each step, the addition of the most suitable Web Services in terms of semantic compatibility between the service parameters and their context similarity with the user's context. In the backward search step, semantic and contextual similarity scores are used to find the best list of composed Web Services. Finally, in the ranking step, a score is calculated for each best solution and a set of ranked solutions is returned to the user.
Keywords: semantic web service, web service composition, adaptation, context, graph planning
Procedia PDF Downloads 520
2875 Design and Implementation of a Counting and Differentiation System for Vehicles through Video Processing
Authors: Derlis Gregor, Kevin Cikel, Mario Arzamendia, Raúl Gregor
Abstract:
This paper presents a self-sustaining mobile system for the counting and classification of vehicles through video processing. It proposes a counting and classification algorithm divided into four steps that can be executed multiple times in parallel on an SBC (Single Board Computer), like the Raspberry Pi 2, in such a way that it can be implemented in real time. The first step of the proposed algorithm limits the zone of the image that will be processed. The second step performs the detection of the mobile objects using a BGS (Background Subtraction) algorithm based on the GMM (Gaussian Mixture Model), as well as a shadow removal algorithm using physical-based features, followed by morphological operations. In the third step, vehicle detection is performed by using edge detection algorithms and vehicle tracking through Kalman filters. The last step of the proposed algorithm registers the passing vehicles and classifies them according to their areas. An auto-sustainable system is proposed, powered by batteries and photovoltaic solar panels, with data transmission done through GPRS (General Packet Radio Service), eliminating the need for external cabling, which facilitates its deployment and relocation to any location where it could operate. The self-sustaining trailer will allow the counting and classification of vehicles in specific zones with difficult access.
Keywords: intelligent transportation system, object detection, vehicle counting, vehicle classification, video processing
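A minimal sketch of the detection and classification stages described above, assuming OpenCV's MOG2 background subtractor as the GMM-based BGS step; the zone of interest, morphological kernel, area thresholds, and class labels are illustrative assumptions, and the Kalman-filter tracking stage is omitted for brevity.

```python
import cv2

# GMM-based background subtraction with built-in shadow detection (assumed stand-in
# for the BGS + shadow removal steps); all parameter values are illustrative.
bgs = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16, detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

def process_frame(frame, roi, min_area=800, heavy_area=5000):
    x, y, w, h = roi                      # step 1: limit processing to the zone of interest
    zone = frame[y:y + h, x:x + w]
    mask = bgs.apply(zone)                # step 2: foreground mask from the GMM model
    mask[mask == 127] = 0                 # drop pixels flagged as shadow (value 127)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # morphological clean-up
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    detections = []
    for c in contours:                    # step 4: register blobs and classify them by area
        area = cv2.contourArea(c)
        if area < min_area:
            continue
        label = "heavy" if area > heavy_area else "light"
        detections.append((cv2.boundingRect(c), label))
    return detections
```

Several instances of such a routine, one per lane or camera zone, could run as parallel processes on the single-board computer, matching the parallel execution mentioned in the abstract.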
Procedia PDF Downloads 322
2874 Investigation of the Effect of Excavation Step in NATM on Surface Settlement by Finite Element Method
Authors: Seyed Mehrdad Gholami
Abstract:
Nowadays, the use of rail transport systems (metro) is increasing in most cities of the world, so the need for safe and economical ways of building tunnels and subway stations is felt more and more. One of the most commonly used methods for constructing underground structures in urban areas is NATM (New Austrian Tunneling Method). In this method, there are some key parameters, such as the excavation step and cross-sectional area, that have a significant effect on the surface settlement. Settlement is a very important control factor related to safe excavation. In this paper, the Finite Element Method is applied using Abaqus. The R6 station of Tehran Metro Line 6 is built by NATM, and its construction is studied and analyzed. Considering the outcomes obtained from numerical modeling and their comparison with the results of field instrumentation and monitoring, an excavation step of 1 meter and a longitudinal distance of 14 meters between side drifts are finally suggested to achieve safe tunneling with allowable settlement.
Keywords: excavation step, NATM, numerical modeling, settlement
Procedia PDF Downloads 139
2873 Synthesis and Characterization of Cyclic PNC-28 Peptide, Residues 17–26 (ETFSDLWKLL), A Binding Domain of p53
Authors: Deepshikha Verma, V. N. Rajasekharan Pillai
Abstract:
The present study reports the synthesis of cyclic PNC-28 peptides with the solid-phase peptide synthesis method. In the first step, we synthesize the linear PNC-28 peptide, and in the second step, we cyclize (N-to-C, or head-to-tail, cyclization) the linear PNC-28 peptide. The molecular formula of the cyclic PNC-28 peptide is C64H88N12O16 and its m/z mass is ≈1233.64. The elemental analysis of cyclic PNC-28 is C, 59.99; H, 6.92; N, 13.12; O, 19.98. Characterization by LC-MS, CD, FT-IR, and 1H NMR has been carried out to confirm the successful synthesis and cyclization of the linear PNC-28 peptides.
Keywords: CD, FT-IR, 1H NMR, cyclic peptide
Procedia PDF Downloads 130
2872 Comparing the Embodied Carbon Impacts of a Passive House with the BC Energy Step Code Using Life Cycle Assessment
Authors: Lorena Polovina, Maddy Kennedy-Parrott, Mohammad Fakoor
Abstract:
The construction industry accounts for approximately 40% of total GHG emissions worldwide. In order to limit global warming to 1.5 degrees Celsius, ambitious reductions in the carbon intensity of our buildings are crucial. Passive House presents an opportunity to reduce operational carbon by as much as 90% compared to a traditional building through improved thermal insulation, limited thermal bridging, increased airtightness, and heat recovery. Up until recently, Passive House design was mainly concerned with meeting the energy demands without considering embodied carbon. As buildings become more energy-efficient, embodied carbon becomes more significant. The main objective of this research is to calculate the embodied carbon impact of a Passive House and compare it with the BC Energy Step Code (ESC). British Columbia is committed to increasing the energy efficiency of buildings through the ESC, which is targeting net-zero energy-ready buildings by 2032. However, there is a knowledge gap in the embodied carbon impacts of more energy-efficient buildings, in particular Part 3 construction. In this case study, life cycle assessments (LCA) are performed on a Part 3 multi-unit residential building in Victoria, BC. The actual building is not constructed to the Passive House standard; however, the building envelope and mechanical systems are designed to comply with the Passive House criteria, as well as with Steps 1 and 4 of the ESC, for comparison. OneClick LCA is used to perform the LCA of the case studies. Several strategies are also proposed to minimize the total carbon emissions of the building. The assumption is that there will not be significant differences in embodied carbon between a Passive House and a Step 4 building due to the building envelope.
Keywords: embodied carbon, energy modeling, energy step code, life cycle assessment
Procedia PDF Downloads 148
2871 TRACE/FRAPTRAN Analysis of Kuosheng Nuclear Power Plant Dry-Storage System
Authors: J. R. Wang, Y. Chiang, W. Y. Li, H. T. Lin, H. C. Chen, C. Shih, S. W. Chen
Abstract:
The dry-storage systems of nuclear power plants (NPPs) in Taiwan have become one of the major safety concerns. Two steps are considered in this study. The first step is the verification of TRACE by using the VSC-17 experimental data. The results of TRACE were similar to the VSC-17 data, which indicates that TRACE has respectable accuracy in the simulation and analysis of dry-storage systems. The next step is the application of TRACE to the dry-storage system of Kuosheng NPP (BWR/6). Kuosheng NPP is the second BWR NPP of Taiwan Power Company. In order to solve the storage problem of the spent fuel, Taiwan Power Company developed a new dry-storage system for Kuosheng NPP. In this step, the dry-storage system model of Kuosheng NPP was established in TRACE. Then, the steady-state simulation of this model was performed, and the results of TRACE were compared with the Kuosheng NPP data. Finally, this model was used to perform the safety analysis of the Kuosheng NPP dry-storage system. In addition, FRAPTRAN was used to calculate the transient performance of the fuel rods.
Keywords: BWR, TRACE, FRAPTRAN, dry-storage
Procedia PDF Downloads 519
2870 On the Application and Comparison of Two Geostatistics Methods in the Parameterisation Step to Calibrate Groundwater Model: Grid-Based Pilot Point and Head-Zonation Based Pilot Point Methods
Authors: Dua K. S. Y. Klaas, Monzur A. Imteaz, Ika Sudiayem, Elkan M. E. Klaas, Eldav C. M. Klaas
Abstract:
Properly selecting the most suitable and effective geostatistics method in the parameterization step of groundwater modeling is critical to attaining a satisfactory model. In this paper, two geostatistics methods, i.e., the Grid-Based Pilot Point (GB-PP) and Head-Zonation Based Pilot Point (HZB-PP) methods, were applied in an eogenetic karst catchment and compared using model performance and computation time as the criteria. Overall, the results show that appropriate selection of the method is essential in the parameterization of physically-based groundwater models, as it influences both the accuracy and the simulation times. It was found that the GB-PP method performed comparatively better than the HZB-PP method. However, given its model performance, the HZB-PP method remains promising for further application in groundwater modeling.
Keywords: groundwater model, geostatistics, pilot point, parameterization step
Procedia PDF Downloads 166
2869 Acoustic Echo Cancellation Using Different Adaptive Algorithms
Authors: Hamid Sharif, Nazish Saleem Abbas, Muhammad Haris Jamil
Abstract:
An adaptive filter is a filter that self-adjusts its transfer function according to an optimization algorithm driven by an error signal. Because of the complexity of the optimization algorithms, most adaptive filters are digital filters. Adaptive filtering constitutes one of the core technologies in digital signal processing and finds numerous application areas in science as well as in industry. Adaptive filtering techniques are used in a wide range of applications, including adaptive noise cancellation and echo cancellation. Acoustic echo is a common occurrence in today’s telecommunication systems. The signal interference caused by acoustic echo is distracting to both users and causes a reduction in the quality of the communication. In this paper, we review different adaptive filtering techniques to reduce this unwanted echo and examine the behavior of algorithms such as Least Mean Square (LMS), Normalized Least Mean Square (NLMS), Variable Step-Size Least Mean Square (VSLMS), Variable Step-Size Normalized Least Mean Square (VSNLMS), the New Varying Step Size LMS Algorithm (NVSSLMS), and Recursive Least Square (RLS) in reducing this echo and increasing communication quality.
Keywords: adaptive acoustic, echo cancellation, LMS algorithm, adaptive filter, normalized least mean square (NLMS), variable step-size least mean square (VSLMS)
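As an illustration of how the normalized step size in NLMS works, the following is a minimal NumPy sketch of an NLMS-based echo canceller; the filter length, step size, and signal names are illustrative assumptions, not values from the paper.

```python
import numpy as np

def nlms_echo_canceller(x, d, filter_len=128, mu=0.5, eps=1e-6):
    """Normalized LMS: x is the far-end (loudspeaker) signal, d is the microphone
    signal containing the echo; returns the error signal e = d - y with the
    estimated echo removed."""
    w = np.zeros(filter_len)                       # adaptive filter taps
    e = np.zeros(len(d))
    for n in range(filter_len - 1, len(x)):
        u = x[n - filter_len + 1:n + 1][::-1]      # most recent filter_len input samples
        y = w @ u                                  # echo estimate
        e[n] = d[n] - y                            # error = desired - estimate
        w += (mu / (eps + u @ u)) * e[n] * u       # normalized step-size update
    return e
```

Dropping the normalization term (eps + u @ u) gives the plain LMS update, while the variable step-size variants listed above adapt mu at every iteration instead of keeping it fixed.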
Procedia PDF Downloads 80
2868 Hypergraph for System of Systems Modeling
Authors: Haffaf Hafid
Abstract:
Hypergraphs, after being used to model the structural organization of Systems of Systems (SoS) at the macroscopic level, have recently been generalized as a powerful representation at different stages of complex system modelling. In this paper, we first describe different applications of hypergraph theory and, step by step, introduce multilevel modeling of SoS by integrating Constraint Programming Languages (CSP) dealing with engineering system reconfiguration strategy. As an application, we present an A.C.T Terminal controlled by a set of Intelligent Automated Vehicles.
Keywords: hypergraph model, structural analysis, bipartite graph, monitoring, system of systems, reconfiguration analysis, hypernetwork
Procedia PDF Downloads 488
2867 Recovery of Wastewater Treated of Boumerdes Step for Irrigation
Authors: N. Ouslimani, M. T. Abadlia, S. Yakoub, F. Tebbani
Abstract:
Water has always been synonymous with life and growth. This blue gold is, first of all, essential to the survival of human beings, whose bodies consist of more than 65% water. With the development of industrialization and consumption patterns, the volumes of wastewater discharges, whether industrial or domestic, have increased considerably, and wastewater must be purified before discharge. Treatment, therefore, aims to reduce the pollution load it contains. Water resources in Algeria are limited and unevenly distributed. Thus, to meet all the water needs of the country and to preserve good-quality water for the drinking water supply, one solution would be to use waters according to their quality, directing treated wastewater to the irrigation of food crops, green areas, or sports complexes. The purification performance of this STEP (wastewater treatment plant) has been established, since the analyzed pollution criteria, namely pH (7.36), temperature (16 °C), suspended solids (MES, 10 mg/l), electrical conductivity (1122 µS/cm), BOD5 (6 mg/l), and COD (15 mg/l), meet the discharge standards. It can be argued that the purified water discharged from the Boumerdes STEP complies with Algerian regulations and can be reused in agriculture. The biodegradability coefficient COD/BOD5 is 2.5 (less than 3), which indicates that the effluents are biodegradable and hence of urban origin.
Keywords: irrigation, recovery, treated wastewater
Procedia PDF Downloads 253
2866 The Role of Personality Characteristics and Psychological Harassment Behaviors to Which Employees Are Exposed on Work Alienation
Authors: Hasan Serdar Öge, Esra Çiftçi, Kazım Karaboğa
Abstract:
The main purpose of the research is to address the role of the psychological harassment behaviors (mobbing) to which employees are exposed, and of personality characteristics, in work alienation. The research population was composed of the employees of the Provincial Special Administration. A survey with four sections was created to measure the variables and achieve the basic goals of the research. Correlation and step-wise regression analyses were performed to investigate the separate and overall effects of the sub-dimensions of psychological harassment behaviors and personality characteristics on the work alienation of employees. Correlation analysis revealed significant but weak relationships between work alienation and psychological harassment and personality characteristics. Step-wise regression analysis also revealed significant relationships between the work alienation variable and assault on personality and direct negative behaviors (sub-dimensions of mobbing) and openness (a sub-dimension of personality characteristics). Each variable was introduced into the model step by step to investigate the effects of the significant variables in explaining the variations in work alienation. While the explanation ratio of the first model was 13%, the last model, including three variables, had an explanation ratio of 24%.
Keywords: alienation, five-factor personality characteristics, mobbing, psychological harassment, work alienation
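As an illustration of the step-by-step model building mentioned above, here is a hedged sketch of forward step-wise OLS regression with statsmodels; the entry threshold, data frame, and variable names are assumptions for illustration and do not reproduce the authors' analysis.

```python
import statsmodels.api as sm

def forward_stepwise(df, response, candidates, threshold_in=0.05):
    """Forward step-wise OLS on a pandas DataFrame: at each step, add the candidate
    predictor with the smallest p-value below threshold_in and report the model's
    explanation ratio (R^2)."""
    selected, remaining = [], list(candidates)
    while remaining:
        pvalues = {}
        for c in remaining:
            X = sm.add_constant(df[selected + [c]])
            pvalues[c] = sm.OLS(df[response], X).fit().pvalues[c]
        best = min(pvalues, key=pvalues.get)
        if pvalues[best] >= threshold_in:
            break                                   # no remaining predictor is significant
        selected.append(best)
        remaining.remove(best)
        model = sm.OLS(df[response], sm.add_constant(df[selected])).fit()
        print(f"added {best}: R^2 = {model.rsquared:.2f}")
    return selected
```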
Procedia PDF Downloads 405
2865 Segmentation of the Liver and Spleen From Abdominal CT Images Using Watershed Approach
Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid
Abstract:
The segmentation phase is an important step in the processing and interpretation of medical images. In this paper, we focus on the segmentation of the liver and spleen from abdominal computed tomography (CT) images. The importance of our study comes from the fact that the segmentation of regions of interest (ROI) from CT images is usually a difficult task. This difficulty arises because the gray level of the ROI is similar to that of other organs, and because the ROI are connected to the ribs, heart, kidneys, etc. Our proposed method is based on anatomical information and the mathematical morphology tools used in the image processing field. At first, we try to remove the surrounding and connected organs and tissues by applying morphological filters. This first step makes the extraction of the regions of interest easier. The second step consists of improving the quality of the image gradient. In this step, we propose a method for improving the image gradient to reduce these deficiencies by applying spatial filters followed by morphological filters. Thereafter, we proceed to the segmentation of the liver and spleen. To validate the proposed segmentation technique, we have tested it on several images. Our segmentation approach is evaluated by comparing our results with a manual segmentation performed by an expert. The experimental results are described in the last part of this work. The system has been evaluated by computing the sensitivity and specificity between the semi-automatically segmented (liver and spleen) contours and the contours manually traced by radiological experts.
Keywords: CT images, liver and spleen segmentation, anisotropic diffusion filter, morphological filters, watershed algorithm
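A generic marker-controlled watershed sketch in OpenCV, standing in for the morphological pre-processing and watershed steps described above; the Gaussian smoothing (in place of anisotropic diffusion), Otsu threshold, and structuring elements are illustrative assumptions and do not reproduce the authors' anatomy-driven pipeline.

```python
import cv2
import numpy as np

def watershed_segment(ct_slice):
    """Marker-controlled watershed on a single 8-bit grayscale CT slice; returns the
    label image, with watershed (boundary) pixels marked as -1."""
    blurred = cv2.GaussianBlur(ct_slice, (5, 5), 0)               # smoothing before thresholding
    _, binary = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = np.ones((3, 3), np.uint8)
    opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel, iterations=2)
    sure_bg = cv2.dilate(opened, kernel, iterations=3)            # certain background
    dist = cv2.distanceTransform(opened, cv2.DIST_L2, 5)
    _, sure_fg = cv2.threshold(dist, 0.5 * dist.max(), 255, 0)    # certain foreground (organ core)
    sure_fg = sure_fg.astype(np.uint8)
    unknown = cv2.subtract(sure_bg, sure_fg)                      # pixels the watershed must decide
    _, markers = cv2.connectedComponents(sure_fg)
    markers = markers + 1                                         # background marker becomes 1
    markers[unknown == 255] = 0                                   # unknown region gets label 0
    color = cv2.cvtColor(ct_slice, cv2.COLOR_GRAY2BGR)            # watershed expects a 3-channel image
    return cv2.watershed(color, markers)
```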
Procedia PDF Downloads 495
2864 The Design of a Mixed Matrix Model for Activity Levels Extraction and Sub Processes Classification of a Work Project (Case: Great Tehran Electrical Distribution Company)
Authors: Elham Allahmoradi, Bahman Allahmoradi, Ali Bonyadi Naeini
Abstract:
Complex systems have many aspects. A variety of methods have been developed to analyze these systems. The most efficient of these methods should not only be simple but also provide useful and comprehensive information about many aspects of the system. Matrix methods are considered the most commonly used methods to analyze and design systems. Each matrix method can examine a particular aspect of the system. If these methods are combined, managers can access more comprehensive and broader information about the system. This study was conducted in four steps. In the first step, a process model of a real project has been extracted through IDEF3. In the second step, activity levels have been attained by writing the process model in the form of a design structure matrix (DSM) and sorting it through the triangulation algorithm (TA). In the third step, sub-processes have been obtained by writing the process model in the form of an interface structure matrix (ISM) and clustering it through the cluster identification algorithm (CIA). In the fourth step, a mixed model has been developed to provide a unified picture of the project structure through the simultaneous presentation of activities and sub-processes. Finally, the paper is completed with a conclusion.
Keywords: integrated definition for process description capture (IDEF3) method, design structure matrix (DSM), interface structure matrix (ISM), mixed matrix model, activity level, sub-process
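A simplified sketch of how activity levels can be read off a binary DSM by repeatedly peeling off activities whose predecessors have already been assigned; this stands in for the triangulation algorithm (TA) mentioned above, with an assumed dependency convention, and it collapses coupled (cyclic) activities into a single level rather than partitioning them as a full TA implementation would.

```python
import numpy as np

def dsm_activity_levels(dsm):
    """Assign activity levels from a binary DSM where dsm[i, j] = 1 means that
    activity i needs an output of activity j (illustrative convention)."""
    n = dsm.shape[0]
    remaining = set(range(n))
    levels, level = {}, 1
    while remaining:
        # activities whose remaining predecessors are all already levelled are 'ready'
        ready = [i for i in remaining
                 if not any(dsm[i, j] for j in remaining if j != i)]
        if not ready:                 # dependency cycle: treat the coupled block as one level
            ready = list(remaining)
        for i in ready:
            levels[i] = level
        remaining -= set(ready)
        level += 1
    return levels

# Example: activity 1 depends on 0, and activity 2 depends on 0 and 1.
example = np.array([[0, 0, 0],
                    [1, 0, 0],
                    [1, 1, 0]])
print(dsm_activity_levels(example))   # {0: 1, 1: 2, 2: 3}
```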
Procedia PDF Downloads 494
2863 Analysis of Exponential Distribution under Step Stress Partially Accelerated Life Testing Plan Using Adaptive Type-I Hybrid Progressive Censoring Schemes with Competing Risks Data
Authors: Ahmadur Rahman, Showkat Ahmad Lone, Ariful Islam
Abstract:
In this article, we estimate the parameters of the failure-time distribution of units based on the adaptive type-I progressive hybrid censoring sampling technique under step-stress partially accelerated life tests with competing risks. The failure times of the units are assumed to follow an exponential distribution. The maximum likelihood estimation technique is used to estimate the unknown parameters of the distribution and the tampered coefficient. Confidence intervals are also obtained for the parameters. A simulation study is performed by using the Monte Carlo simulation method to check the validity of the model and its assumptions.
Keywords: adaptive type-I hybrid progressive censoring, competing risks, exponential distribution, simulation, step-stress partially accelerated life tests
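For orientation only, a generic closed-form building block for exponential competing-risks data (hedged; the authors' full likelihood additionally involves the step-stress change point, the tampered coefficient, and the adaptive censoring scheme):

```latex
\[
\hat{\lambda}_j \;=\; \frac{d_j}{\sum_{i=1}^{n} t_i}, \qquad j = 1, 2,
\]
```

where \(d_j\) is the number of failures observed from cause \(j\) and \(\sum_i t_i\) is the total time on test accumulated by all units, failed or censored.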
Procedia PDF Downloads 343
2862 Failure Inference and Optimization for Step Stress Model Based on Bivariate Wiener Model
Authors: Soudabeh Shemehsavar
Abstract:
In this paper, we consider the situation under a life test in which the failure times of the test units are not related deterministically to an observable stochastic time-varying covariate. In such a case, the joint distribution of the failure time and a marker value would be useful for modeling the step-stress life test. The problem of accelerating such an experiment is considered as the main aim of this paper. We present a step-stress accelerated model based on a bivariate Wiener process, with one component as the latent (unobservable) degradation process, which determines the failure times, and the other as a marker process, whose degradation values are recorded at the times of failure. Parametric inference based on the proposed model is discussed, and the optimization procedure for obtaining the optimal time for changing the stress level is presented. The optimization criterion is to minimize the approximate variance of the maximum likelihood estimator of a percentile of the products’ lifetime distribution.
Keywords: bivariate normal, Fisher information matrix, inverse Gaussian distribution, Wiener process
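For context, a standard result that underlies Wiener-process degradation models (hedged; the authors' bivariate formulation with a marker component is more general): the first-passage time of a drifted Wiener path to a fixed failure threshold follows an inverse Gaussian law, which is why that distribution appears among the keywords.

```latex
\[
X(t) = \mu t + \sigma B(t), \qquad
T = \inf\{\, t \ge 0 : X(t) \ge \omega \,\} \sim
\mathrm{IG}\!\left(\frac{\omega}{\mu},\; \frac{\omega^{2}}{\sigma^{2}}\right), \quad \mu > 0,
\]
```

with mean \(\omega/\mu\) and shape parameter \(\omega^{2}/\sigma^{2}\).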
Procedia PDF Downloads 317
2861 Environmental Pb-Free Cu Front Electrode for Si-Base Solar Cell Application
Authors: Wen-Hsi Lee, C.G. Kao
Abstract:
In this study, a Cu paste was prepared and printed with a narrow-line screen printing process on a polycrystalline Si solar cell on which the back Al printing and the deposition of double anti-reflection coatings (DARCs) had already been completed. Then, a two-step firing process was applied to sinter the front electrode and obtain an ohmic contact between the front electrode and the solar cell. The first step was carried out in an air atmosphere. In this step, the PbO-based glass frit etched the DARCs and Ag recrystallized at the surface of the Si, establishing the preliminary contact. The second step was carried out in a reducing atmosphere. In this step, CuO was reduced to Cu and sintered. In addition, Ag nanoparticles recrystallized in the glass layer at the interface, due to the interactions between H2, Ag, and the PbO-based glass frit and the volatility of Pb, forming the ohmic contact between the electrode and the solar cell. Through experiment and analysis, the reaction mechanism in each stage was inferred, and it was also shown that an ohmic contact and a good sheet resistance for the front electrode could both be obtained by applying the newly invented paste and process.
Keywords: front electrode, solar cell, ohmic contact, screen printing, paste
Procedia PDF Downloads 332
2860 Real-Time Control of Grid-Connected Inverter Based on LabVIEW
Authors: L. Benbaouche, H. E. , F. Krim
Abstract:
In this paper, we propose a flexible and efficient real-time control of a grid-connected single-phase inverter. The first step is devoted to the study and design of the controller through simulation, conducted with the LabVIEW software on the 'host' computer. The second step is running the application on the PXI 'target'. The LabVIEW software, combined with NI-DAQmx, provides the tools to easily build applications that use the digital-to-analog converter to generate the PWM control signals. Experimental results show the effectiveness of LabVIEW software applied to power electronics.
Keywords: real-time control, LabVIEW, inverter, PWM
Procedia PDF Downloads 509
2859 Recognition of Tifinagh Characters with Missing Parts Using Neural Network
Authors: El Mahdi Barrah, Said Safi, Abdessamad Malaoui
Abstract:
In this paper, we present an algorithm for the reconstruction of Tifinagh characters from incomplete 2D scans. This algorithm is based on using the correlation between the lost block and its neighbors. The proposed system contains three main parts: pre-processing, feature extraction, and recognition. In the first step, we construct a database of Tifinagh characters. In the second step, we apply a shape analysis algorithm. In the classification part, we use a neural network. The simulation results demonstrate that the proposed method gives good results.
Keywords: Tifinagh character recognition, neural networks, local cost computation, ANN
Procedia PDF Downloads 334
2858 Moderate Electric Field Influence on Carotenoids Extraction Time from Heterochlorella luteoviridis
Authors: Débora P. Jaeschke, Eduardo A. Merlo, Rosane Rech, Giovana D. Mercali, Ligia D. F. Marczak
Abstract:
Carotenoids are high value added pigments that can be alternatively extracted from some microalgae species. However, the application of carotenoids synthesized by microalgae is still limited due to the use of toxic organic solvents. In this context, studies involving alternative extraction methods have been conducted with more sustainable solvents, with the aim of replacing them and reducing the solvent volume and the extraction time. The aim of the present work was to evaluate the extraction time of carotenoids from the microalga Heterochlorella luteoviridis using a moderate electric field (MEF) as a pre-treatment to the extraction. The extraction methodology consisted of a pre-treatment in the presence of MEF (180 V) and ethanol (25 %, v/v) for 10 min, followed by a diffusive step performed for 50 min using a higher ethanol concentration (75 %, v/v). The extraction experiments were conducted at 30 °C and, to keep the temperature at this value, an extraction cell with a water jacket connected to a water bath was used. Also, to enable the evaluation of the MEF effect on the extraction, control experiments were performed using the same cell and conditions without voltage application. During the extraction experiments, samples were withdrawn at 1, 5 and 10 min of the pre-treatment and at 1, 5, 30, 40 and 50 min of the diffusive step. Samples were then centrifuged, and carotenoid analyses were performed on the supernatant. Furthermore, an exhaustive extraction with ethyl acetate and methanol was performed, and the carotenoid content found in this analysis was considered as the total carotenoid content of the microalga. The results showed that the application of MEF as a pre-treatment to the extraction influenced the extraction yield and the extraction time during the diffusive step; after the MEF pre-treatment and 50 min of the diffusive step, it was possible to extract up to 60 % of the total carotenoid content. Also, the carotenoid concentrations of the extracts withdrawn at 5 and 30 min of the diffusive step did not present a statistical difference, meaning that carotenoid diffusion occurs mainly at the very beginning of the extraction. On the other hand, the results for the control experiments showed that carotenoid diffusion occurs mostly over the first 30 min of the diffusive step, which evidenced the MEF effect on the extraction time. Moreover, the carotenoid concentrations in the samples withdrawn during the pre-treatment (1, 5 and 10 min) were below the quantification limit of the analysis, indicating that the extraction occurred in the diffusive step, when ethanol (75 %, v/v) was added to the medium. It is possible that MEF promoted cell membrane permeabilization and, when ethanol (75 %) was added, carotenoids interacted with the solvent and diffusion occurred easily. Based on the results, it is possible to infer that MEF reduced the carotenoid extraction time by increasing the permeability of the cell membrane, which facilitates diffusion from the cell to the medium.
Keywords: moderate electric field (MEF), pigments, microalgae, ethanol
Procedia PDF Downloads 463
2857 Monodisperse Hollow Sandwich MOF for the Catalytic Oxidation of Benzene at Room Temperature
Authors: Srinivasapriyan Vijayan
Abstract:
Phenol is one of the most vital chemicals in industry. Nowadays, phenol production is based upon the three-step cumene process, which involves a hazardous cumene hydroperoxide intermediate and produces nearly equimolar amounts of acetone as a coproduct. An attractive route in phenol production is the direct one-step selective hydroxylation of benzene using eco-friendly oxidants such as O2, N2O, and H2O2. In particular, the direct hydroxylation of benzene to form phenol with O2 has recently attracted extensive research attention because this process is green, clean, and eco-friendly. However, most of the catalytic systems involving O2 have a low rate of hydroxylation because the direct introduction of hydroxyl functionality into benzene is challenging. Almost all the developed catalytic systems require an elevated temperature and suffer from low conversion because of the notoriously low reactivity of aromatic C–H bonds. Moreover, the increased reactivity of phenol relative to benzene makes the selective oxidation of benzene to phenol very difficult, especially under heating conditions. Hollow spheres are a very fascinating class of materials with good permeation and low density. Here, highly monodisperse MOF hollow sandwich spheres have been rationally synthesized using monodisperse polystyrene (PS) nanoparticles as templates through a versatile step-by-step self-assembly strategy. Our findings could pave the way toward highly efficient nonprecious catalysts for low-temperature oxidation reactions in heterogeneous catalysis, since the catalyst allows easy post-reaction separation and is cheap, green, and recyclable.
Keywords: benzene hydroxylation, Fe-based metal organic frameworks, molecular oxygen, phenol
Procedia PDF Downloads 214
2856 Fabrication of Durable and Regenerable Superhydrophobic Coatings on Metallic Surfaces for Potential Industrial Applications
Authors: Priya Varshney, Soumya S. Mohapatra
Abstract:
The fabrication of anti-corrosion and self-cleaning superhydrophobic coatings for metallic surfaces, which are regenerable and durable under aggressive conditions, has attracted tremendous interest in materials science. In this work, superhydrophobic coatings on metallic surfaces (aluminum, steel, copper) were prepared by two-step and one-step chemical etching processes. In the two-step process, surface roughness was created by chemical etching and the roughened surface was then passivated with low-surface-energy materials, whereas in the one-step process, roughening of the surface by chemical etching and passivation with low-surface-energy materials were done in a single step. Besides this, the effect of etchant concentration and etching time on wettability and morphology was also studied. The thermal, mechanical, and ultraviolet stability of these coatings was also tested. Along with this, the regeneration of the coatings and their self-cleaning, corrosion resistance, and water-repelling characteristics were studied. The surface morphology shows the presence of rough microstructures on the treated surfaces, and the contact angle measurements confirm the superhydrophobic nature. It is experimentally observed that the surface roughness and contact angle increase with etching time as well as with the concentration of the etchant. The superhydrophobic surfaces show excellent self-cleaning behaviour. The coatings are found to be stable and maintain their superhydrophobicity in acidic and alkaline solutions. Water jet impact, flotation on the water surface, and low-temperature condensation tests prove the water-repellent nature of the coatings. These coatings are found to be thermally, mechanically, and ultraviolet stable. These durable superhydrophobic metallic surfaces have potential industrial applications.
Keywords: superhydrophobic, water-repellent, anti-corrosion, self-cleaning
Procedia PDF Downloads 279
2855 Effects of Stirring Time and Reinforcement Preheating on the Porosity of Particulate Periwinkle Shell-Aluminium 6063 Metal Matrix Composite (PPS-AlMMC) Produced by Two-Step Casting
Authors: Reginald Umunakwe, Obinna Chibuzor Okoye, Uzoma Samuel Nwigwe, Damilare John Olaleye, Akinlabi Oyetunji
Abstract:
The potential for the development of PPS-AlMMCs as lightweight materials for industrial applications was investigated. Periwinkle shells were milled and the density of the particles was determined. Particulate periwinkle shell with a particle size of 75 µm was used to reinforce aluminium 6063 alloy at 10 wt% filler loading using a two-step stir casting technique. The composite materials were stirred for five minutes in a semi-solid state, and the stirring time above the liquidus temperature was varied as 3, 6 and 9 minutes. A specimen was also produced with pre-heated filler. The effect of the variation in stirring time and of reinforcement pre-heating on the porosity of the composite materials was investigated. The results of the analysis show that a combination of reinforcement pre-heating and stirring for 3 minutes produced a composite material with the lowest porosity of 1.05%.
Keywords: composites, periwinkle shell, two-step casting, porosity
Procedia PDF Downloads 349
2854 Extraction of Text Subtitles in Multimedia Systems
Authors: Amarjit Singh
Abstract:
In this paper, a method for the extraction of text subtitles from large videos is proposed. Video data needs to be annotated for many multimedia applications, and text is incorporated in digital video for the purpose of providing useful information about that video. Thus, there is a need to detect the text present in a video for video understanding and indexing. This is achieved in two steps. The first step is text localization and the second step is text verification. The method of text detection can be extended to text recognition, which finds applications in automatic video indexing, video annotation, and content-based video retrieval. The method has been tested on various types of videos.
Keywords: video, subtitles, extraction, annotation, frames
Procedia PDF Downloads 601
2853 Fuzzy Inference System for Risk Assessment Evaluation of Wheat Flour Product Manufacturing Systems
Authors: Yas Barzegaar, Atrin Barzegar
Abstract:
The aim of this research is to develop an intelligent system to analyze the risk level of a wheat flour product manufacturing system. The model consists of five Fuzzy Inference Systems arranged in two layers. The first layer of the model consists of four Fuzzy Inference Systems with three criteria. The outputs of the physical, chemical, biological, and environmental failure systems are the inputs of the final manufacturing-system FIS. The proposed model, based on Mamdani Fuzzy Inference Systems, gives a performance ranking of wheat flour product manufacturing systems. The first step is obtaining data to identify the failure modes from experts' opinions. The second step is the fuzzification process, which converts crisp inputs to fuzzy sets; the IF-THEN fuzzy rules are then applied through the inference engine, and in the final step, the defuzzification process is applied to convert the fuzzy output into real numbers.
Keywords: failure modes, fuzzy rules, fuzzy inference system, risk assessment
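A minimal sketch of one first-layer Mamdani FIS, assuming the scikit-fuzzy control API; the variable names, universes, membership functions, and rules are illustrative assumptions rather than the authors' actual rule base.

```python
import numpy as np
from skfuzzy import control as ctrl

# Three input criteria and one risk output on a 0-10 universe (illustrative).
severity = ctrl.Antecedent(np.arange(0, 11, 1), 'severity')
occurrence = ctrl.Antecedent(np.arange(0, 11, 1), 'occurrence')
detection = ctrl.Antecedent(np.arange(0, 11, 1), 'detection')
risk = ctrl.Consequent(np.arange(0, 11, 1), 'risk')

# Fuzzification: three triangular membership functions per variable.
for var in (severity, occurrence, detection, risk):
    var.automf(3, names=['low', 'medium', 'high'])

rules = [
    ctrl.Rule(severity['high'] | occurrence['high'], risk['high']),
    ctrl.Rule(severity['medium'] & detection['medium'], risk['medium']),
    ctrl.Rule(severity['low'] & occurrence['low'] & detection['high'], risk['low']),
]

# Mamdani inference followed by centroid defuzzification (the scikit-fuzzy default).
sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input['severity'], sim.input['occurrence'], sim.input['detection'] = 7, 4, 6
sim.compute()
print(sim.output['risk'])   # crisp risk score for this failure mode
```

In the layered model described above, the crisp outputs of the four first-layer systems would in turn feed the inputs of the final FIS.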
Procedia PDF Downloads 102
2852 Numerical Regularization of Ill-Posed Problems via Hybrid Feedback Controls
Authors: Eugene Stepanov, Arkadi Ponossov
Abstract:
Many mathematical models used in biological and other applications are ill-posed. The reason for that is the nature of the differential equations, where the nonlinearities are assumed to be step functions, which is done to simplify the analysis. Prominent examples are switched systems arising from gene regulatory networks and neural field equations. This simplification leads, however, to theoretical and numerical complications. In the presentation, it is proposed to apply the theory of hybrid feedback controls to regularize the problem. Roughly speaking, one attaches a finite state control (‘automaton’), which follows the trajectories of the original system and governs its dynamics at the points of ill-posedness. The construction of the automaton is based on the classification of the attractors of the specially designed adjoint dynamical system. This ‘hybridization’ is shown to regularize the original switched system and gives rise to efficient hybrid numerical schemes. Several examples are provided in the presentation, which support the suggested analysis. The method can be of interest in other applied fields, where differential equations contain step-like nonlinearities.
Keywords: hybrid feedback control, ill-posed problems, singular perturbation analysis, step-like nonlinearities
Procedia PDF Downloads 245
2851 A Method for Clinical Concept Extraction from Medical Text
Authors: Moshe Wasserblat, Jonathan Mamou, Oren Pereg
Abstract:
Natural Language Processing (NLP) has made a major leap in the last few years in its practical integration into medical solutions, for example, extracting clinical concepts such as medical conditions, medications, treatments, and symptoms from medical texts. However, training and deploying those models in real environments still demands a large amount of annotated data and NLP/Machine Learning (ML) expertise, which makes this process costly and time-consuming. We present a practical and efficient method for clinical concept extraction that requires neither costly labeled data nor ML expertise. The method includes three steps. Step 1- the user injects a large in-domain text corpus (e.g., PubMed). Then, the system builds a contextual model containing vector representations of concepts in the corpus, in an unsupervised manner (e.g., Phrase2Vec). Step 2- the user provides a seed set of terms representing a specific medical concept (e.g., for the concept of symptoms, the user may provide: ‘dry mouth,’ ‘itchy skin,’ and ‘blurred vision’). Then, the system matches the seed set against the contextual model and extracts the most semantically similar terms (e.g., additional symptoms). The result is a complete set of terms related to the medical concept. Step 3- in production, there is a need to extract medical concepts from unseen medical text. The system extracts key phrases from the new text, then matches them against the complete set of terms from Step 2, and the most semantically similar ones are annotated with the corresponding medical concept category. As an example, the seed symptom concepts would result in the following annotation: “The patient complains of fatigue [symptom], dry skin [symptom], and weight loss [symptom], which can be an early sign of diabetes.” Our evaluations show promising results for extracting concepts from medical corpora. The method allows medical analysts to easily and efficiently build taxonomies (in Step 2) representing their domain-specific concepts, and to automatically annotate a large number of texts (in Step 3) for classification/summarization of medical reports.
Keywords: clinical concepts, concept expansion, medical records annotation, medical records summarization
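A hedged sketch of the three steps, using gensim phrase detection and word embeddings as a stand-in for the Phrase2Vec model mentioned above; `corpus_sentences` (a tokenised in-domain corpus whose vocabulary contains the seed terms) and every parameter value are assumptions for illustration.

```python
from gensim.models import Word2Vec
from gensim.models.phrases import Phrases, Phraser

# Step 1: unsupervised contextual model over an in-domain corpus of token lists
# (corpus_sentences is a hypothetical placeholder, e.g. tokenised PubMed abstracts).
phraser = Phraser(Phrases(corpus_sentences, min_count=5, threshold=10.0))
phrased = [phraser[s] for s in corpus_sentences]           # merges collocations: 'dry_mouth'
model = Word2Vec(phrased, vector_size=200, window=5, min_count=5, workers=4)

# Step 2: expand a small seed set for the 'symptom' concept by embedding similarity.
seeds = ['dry_mouth', 'itchy_skin', 'blurred_vision']
expanded = {term for term, _ in model.wv.most_similar(positive=seeds, topn=50)}
symptom_terms = set(seeds) | expanded

# Step 3: annotate unseen text by matching its phrases against the expanded term set.
def annotate(tokens):
    return [(t, 'symptom') if t in symptom_terms else (t, None) for t in phraser[tokens]]
```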
Procedia PDF Downloads 135