Search results for: word processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4366

1786 Towards a Large Scale Deep Semantically Analyzed Corpus for Arabic: Annotation and Evaluation

Authors: S. Alansary, M. Nagi

Abstract:

This paper presents an approach to the semantic annotation of an Arabic corpus using the Universal Networking Language (UNL) framework. UNL is a promising strategy for providing a large collection of semantically annotated texts with formal, deep semantics rather than shallow ones. The result constitutes an editable semantic resource (semantic graphs) that integrates various phenomena, including predicate-argument structure, scope, tense, thematic roles and rhetorical relations, into a single semantic formalism for knowledge representation. The paper also presents the Interactive Analysis tool for automatic semantic annotation (IAN). In addition, the cornerstones of the proposed methodology, the disambiguation and transformation rules, are presented. Semantic annotation using UNL has been applied to a corpus of 20,000 Arabic sentences representing the most frequent structures in the Arabic Wikipedia. The representation at the different linguistic levels is illustrated, starting from the morphological level, passing through the syntactic level, until the semantic representation is reached. The output has been evaluated using the F-measure and is 90% accurate. This demonstrates how powerful the formal environment is, as it enables intelligent text processing and search.
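Since the evaluation relies on the F-measure, the metric can be sketched in a few lines; the counts below are hypothetical, chosen only to reproduce a 90% figure, and are not the paper's actual evaluation data.

```python
# Minimal sketch of the F-measure used to score annotation output.
def f_measure(true_positives, false_positives, false_negatives, beta=1.0):
    """Harmonic combination of precision and recall over annotation matches."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Illustrative counts: 90 correct semantic relations, 10 spurious, 10 missed.
print(f_measure(90, 10, 10))  # 0.9
```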

Keywords: semantic analysis, semantic annotation, Arabic, universal networking language

Procedia PDF Downloads 582
1785 Improving Monitoring and Fault Detection of Solar Panels Using Arduino Mega in WSN

Authors: Ali Al-Dahoud, Mohamed Fezari, Thamer Al-Rawashdeh, Ismail Jannoud

Abstract:

This paper presents our work on monitoring and detecting faults in a set of solar panels using a wireless sensor network (WSN); it is part of a project we are working on at Al-Zaytoonah University. The research problem was posed by the engineers, technicians and operators responsible for PV panel maintenance, who need to monitor and detect the faults that considerably reduce the energy produced by the panels. The proposed solution is based on installing WSN nodes, with sensors appropriate to the most frequently occurring faults, on the 45 solar panels installed on the roof of the IT faculty. A simulation of the node distribution was carried out, along with a study of the node design with appropriate sensors, taking into account the priorities of the fault-processing tasks. Finally, a graphical user interface was designed and adapted for the telemonitoring of the panels over the WSN. The first hardware implementation tests gave interesting results; the sensor calibration and transmission interference problems have been solved. A friendly GUI was developed in the high-level language Visual Basic to carry out the monitoring process and to save the data to an Excel file.
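One simple fault-detection rule for such a setup can be sketched gateway-side as follows; the median-deviation criterion, the panel IDs and the tolerance are assumptions for illustration, not the logic actually deployed on the 45 panels.

```python
# Illustrative sketch (not the authors' code): flag panels whose reported
# output deviates sharply from the array median, as a WSN gateway might.
def flag_faulty_panels(readings, tolerance=0.3):
    """readings: {panel_id: output_watts}; returns ids below (1-tolerance)*median."""
    values = sorted(readings.values())
    n = len(values)
    median = values[n // 2] if n % 2 else (values[n // 2 - 1] + values[n // 2]) / 2
    return sorted(pid for pid, w in readings.items() if w < (1 - tolerance) * median)

# Hypothetical readings from five panels; panel 3 underperforms.
readings = {1: 210, 2: 205, 3: 90, 4: 208, 5: 202}
print(flag_faulty_panels(readings))  # [3]
```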

Keywords: Arduino Mega microcontroller, solar panels, fault detection, simulation, node design

Procedia PDF Downloads 465
1784 Prevention of Biocompounds and Amino Acid Losses in Vernonia amygdalina during Post-Harvest Treatment Using Hot Oil-Aqueous Mixture

Authors: Nneka Nkechi Uchegbu, Temitope Omolayo Fasuan

Abstract:

This study investigated how to reduce the loss of bio-compounds and amino acids in V. amygdalina leaf during its processing as a functional food ingredient. Fresh V. amygdalina leaf was processed using thermal oil-aqueous mixtures (soybean oil:aqueous and palm oil:aqueous) at 1:40 and 1:30 (v/v), respectively. Results indicated that the hot soybean oil-aqueous mixture was the most effective in preserving the bio-compounds and amino acids, retaining 80.95% of the bio-compounds at a rate of 90-100%. The hot palm oil-aqueous mixture retained 61.90% of the bio-compounds at a rate of 90-100%, while hot water alone retained 9.52% of the bio-compounds at the same rate. During the debittering process, seven new bio-compounds were formed in the leaves treated with the hot soybean oil-aqueous mixture, six with the palm oil-aqueous mixture, and only four in the leaves treated with hot water. The bio-compounds in the treated leaves have potential functions as antitumor, antioxidant, antihistaminic, anti-ovarian-cancer, anti-inflammatory, antiarthritic, hepatoprotective, haemolytic, 5-α-reductase-inhibiting, immune-stimulant, diuretic, antiandrogenic, and anaemiagenic agents. Alkaloids and polyphenols were retained at a rate of 81.34-98.50% using the oil:aqueous mixtures, whereas water alone gave a rate of 33.47-41.46%. Most of the essential amino acids were retained at a rate above 90% with the aid of oil. The process is scalable and could be employed for domestic and industrial applications.

Keywords: V. amygdalina leaf, bio-compounds, oil-aqueous mixture, amino acids

Procedia PDF Downloads 146
1783 Logistic Model Tree and Expectation-Maximization for Pollen Recognition and Grouping

Authors: Endrick Barnacin, Jean-Luc Henry, Jack Molinié, Jimmy Nagau, Hélène Delatte, Gérard Lebreton

Abstract:

Palynology is a field of interest for many disciplines. It has multiple applications, such as chronological dating, climatology, allergy treatment, and even honey characterization. Unfortunately, the analysis of a pollen slide is a complicated and time-consuming task that requires the intervention of experts in the field, who are becoming increasingly rare due to economic and social conditions, so the automation of this task is a necessity. Pollen slide analysis is mainly a visual process, as it is carried out with the naked eye; that is why a primary way to automate palynology is digital image processing. This method has the lowest cost and relatively good accuracy in pollen retrieval. In this work, we propose a system combining the recognition and grouping of pollen. It uses a Logistic Model Tree to classify pollen species already known to the system while detecting any unknown species; the unknown pollen grains are then grouped using a cluster-based approach. Good success rates have been achieved for the recognition of known species, and the automated clustering appears to be a promising approach.
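The two-stage recognize-then-group pipeline can be sketched as below. Since scikit-learn provides no Logistic Model Tree, a logistic regression stands in for the supervised stage, a Gaussian-mixture density with an illustrative safety margin rejects unknown grains, and expectation-maximization (GaussianMixture) groups the rejected samples; all features are synthetic, not pollen descriptors.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two "known" pollen species as synthetic 2-D feature clusters.
known_X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
known_y = np.array([0] * 50 + [1] * 50)
classifier = LogisticRegression().fit(known_X, known_y)   # stand-in for the LMT

# Density model of the known data, used to reject out-of-distribution grains.
density = GaussianMixture(n_components=2, random_state=0).fit(known_X)
threshold = density.score_samples(known_X).min() - 5.0    # illustrative margin

new_X = np.vstack([rng.normal(0, 0.3, (10, 2)),   # resembles known species 0
                   rng.normal(8, 0.3, (15, 2))])  # an unseen species
is_unknown = density.score_samples(new_X) < threshold

known_pred = classifier.predict(new_X[~is_unknown])       # stage 1: recognition
groups = GaussianMixture(n_components=1, random_state=0).fit_predict(
    new_X[is_unknown])                                    # stage 2: EM grouping
print(is_unknown.sum(), "grains routed to clustering")
```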

Keywords: pollen recognition, logistic model tree, expectation-maximization, local binary pattern

Procedia PDF Downloads 182
1782 Effects of Bilingual Education on Teaching and Learning Practices in the Continuous Improvement and Development of the K-12 Program

Authors: Miriam Sebastian

Abstract:

This research focused on the effects of bilingual education as a medium of instruction on the academic performance of selected intermediate students of Miriam’s Academy of Valenzuela Inc. An experimental design was used, with the language of instruction as the independent variable and the different literacy skills as dependent variables. The sample consisted of 30 students exposed to bilingual education (Filipino and English). They were given pretests and were divided into three groups: monolingual Filipino, monolingual English, and bilingual. They were taught different literacy skills for eight weeks and were then administered the posttests. The data were analyzed and evaluated in the light of the central-processing and script-dependent hypotheses. Based on the data, it can be inferred that monolingual instruction in either Filipino or English had a stronger effect on the students’ literacy skills than bilingual instruction. Moreover, mother-tongue-based instruction had a stronger effect on the students’ literacy skills than second-language instruction. Such results have implications not only for mother-tongue-based (MTB) but also for English as a second language (ESL) instruction in the country.

Keywords: bilingualism, effects, monolingual, function, multilingual, mother tongue

Procedia PDF Downloads 127
1781 Processing and Characterization of Glass-Epoxy Composites Filled with Linz-Donawitz (LD) Slag

Authors: Pravat Ranjan Pati, Alok Satapathy

Abstract:

Linz-Donawitz (LD) slag is a major solid waste generated in huge quantities during steel making. It comes from slag formers such as burned lime/dolomite and from the oxidation of silica, iron, etc., while the iron is refined into steel in the LD furnace. Although a number of ways for its utilization have been suggested, its potential as a filler material in polymeric matrices has not yet been explored. The present work reports the possible use of this waste as a filler material in glass-fiber-reinforced epoxy composites. Hybrid composites consisting of bi-directional E-glass-fiber-reinforced epoxy filled with different LD slag contents (0, 7.5, 15, 22.5 wt%) are prepared by a simple hand lay-up technique. The composites are characterized with regard to their density, porosity, micro-hardness and strength properties. X-ray diffractography is carried out in order to ascertain the various phases present in the LD slag. This work shows that LD slag, in spite of being a waste, possesses fairly good filler characteristics, as it modifies the strength properties and improves the micro-hardness of the polymeric resin.
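The porosity figure in such a characterization is commonly derived from the gap between the theoretical (rule-of-mixtures) and measured densities; the weight fractions and densities in this sketch are hypothetical, not the paper's measurements.

```python
# Illustrative porosity calculation for a filled glass-epoxy composite.
def theoretical_density(weight_fractions, densities):
    """Rule-of-mixtures density: 1/rho_th = sum(w_i / rho_i)."""
    return 1.0 / sum(w / rho for w, rho in zip(weight_fractions, densities))

def porosity_percent(rho_theoretical, rho_measured):
    """Void fraction inferred from the density deficit, in percent."""
    return (rho_theoretical - rho_measured) / rho_theoretical * 100.0

# Hypothetical composition: 50 wt% glass fibre, 35 wt% epoxy, 15 wt% LD slag,
# with assumed constituent densities (g/cm^3) and a measured 1.75 g/cm^3.
rho_th = theoretical_density([0.50, 0.35, 0.15], [2.59, 1.15, 3.2])
print(round(porosity_percent(rho_th, 1.75), 2))
```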

Keywords: characterization, glass-epoxy composites, LD slag, waste utilization

Procedia PDF Downloads 392
1780 Hyperspectral Band Selection for Oil Spill Detection Using Deep Neural Network

Authors: Asmau Mukhtar Ahmed, Olga Duran

Abstract:

Hydrocarbon (HC) spills constitute a significant problem and cause great concern for the environment. With the latest technology (hyperspectral imaging) and state-of-the-art image processing tools, hydrocarbon spills can easily be detected at an early stage to mitigate their effects. In this study, a controlled laboratory experiment was used: clay soil was mixed and homogenized with different hydrocarbon types (diesel, bio-diesel, and petrol). The different mixtures were scanned with a HYSPEX hyperspectral camera under constant illumination to generate the hyperspectral datasets used in this experiment. So far, the short-wave infrared region (SWIR) has been exploited for detecting HC spills with excellent accuracy. However, the near-infrared region (NIR) is somewhat unexplored with regard to HC contamination and how it affects the spectra of soils. In this study, a deep neural network (DNN) was applied to the controlled datasets to detect and quantify the amount of HC spilled in soils using the near-infrared region. The initial results are extremely encouraging, because they indicate that the DNN was able to identify features of HC in the near-infrared region with a good level of accuracy.
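The classification step can be sketched with a small feed-forward network on synthetic "spectra"; the band count, the Gaussian absorption-dip model and the contamination levels below are invented stand-ins for the HYSPEX measurements.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n_bands = 40  # stand-in for a set of selected NIR bands

def fake_spectrum(level, n=60):
    """Baseline soil reflectance plus an absorption dip scaled by HC level."""
    base = np.linspace(0.6, 0.4, n_bands)
    dip = np.exp(-((np.arange(n_bands) - 25) ** 2) / 20.0)
    return base - 0.1 * level * dip + rng.normal(0, 0.01, (n, n_bands))

# Three illustrative classes: clean soil, low and high contamination.
X = np.vstack([fake_spectrum(level) for level in (0, 1, 2)])
y = np.repeat([0, 1, 2], 60)

net = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
net.fit(X, y)
print(net.score(X, y))  # training accuracy on the synthetic spectra
```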

Keywords: hydrocarbon, Deep Neural Network, short wave infrared region, near-infrared region, hyperspectral image

Procedia PDF Downloads 112
1779 Optical Flow Based System for Cross Traffic Alert

Authors: Giuseppe Spampinato, Salvatore Curti, Ivana Guarneri, Arcangelo Bruna

Abstract:

This document describes an advanced system and methodology for Cross Traffic Alert (CTA), able to detect vehicles that move into the vehicle's driving path from the left or right side. The camera is supposed to be not only on a stationary vehicle, e.g. at a traffic light or at an intersection, but also on one moving slowly, e.g. in a car park. In all of the aforementioned conditions, a driver’s short loss of concentration or distraction can easily lead to a serious accident; the proposed system is a valid support for avoiding these kinds of car crashes. It is an extension of our previous work on a clustering system, which only works with fixed cameras. Only a vanishing-point calculation and a simple optical flow filtering step, to eliminate motion vectors due to the car's own movement, are performed, letting the system achieve high performance with different scenarios, cameras and resolutions. The proposed system uses only the optical flow as input, which is implemented in hardware on the proposed platform, and since the processing of the whole system is very light in terms of speed and power consumption, it is inserted directly into the camera framework, allowing all the processing to be executed in real time.
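The ego-motion filtering step can be pictured with a short numpy sketch: flow vectors aligned with the radial direction from the vanishing point are attributed to the car's own movement and discarded, while vectors crossing the path survive as cross-traffic candidates. The points, flows and threshold here are illustrative.

```python
import numpy as np

def filter_cross_traffic(points, flows, vanishing_point, max_radial_cos=0.8):
    """Keep flow vectors not aligned with the radial ego-motion direction."""
    radial = points - vanishing_point
    radial = radial / np.linalg.norm(radial, axis=1, keepdims=True)
    unit_flow = flows / np.linalg.norm(flows, axis=1, keepdims=True)
    cos_sim = np.abs((radial * unit_flow).sum(axis=1))
    return cos_sim < max_radial_cos  # True -> candidate cross-traffic vector

vp = np.array([320.0, 200.0])  # assumed vanishing point in a 640x480 frame
pts = np.array([[100.0, 400.0], [500.0, 380.0], [350.0, 300.0]])
flw = np.array([[-11.0, 10.0],   # radial (ego-motion) -> filtered out
                [-15.0, 0.5],    # nearly horizontal crossing -> kept
                [14.0, -0.5]])   # nearly horizontal crossing -> kept
print(filter_cross_traffic(pts, flw, vp))
```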

Keywords: clustering, cross traffic alert, optical flow, real time, vanishing point

Procedia PDF Downloads 203
1778 Microarray Data Visualization and Preprocessing Using R and Bioconductor

Authors: Ruchi Yadav, Shivani Pandey, Prachi Srivastava

Abstract:

Microarrays provide a rich source of data on the molecular workings of cells: each microarray reports on the abundance of tens of thousands of mRNAs. Virtually every human disease is being studied using microarrays, with the hope of finding the molecular mechanisms of disease. Bioinformatics analysis plays an important part in processing the information embedded in large-scale expression profiling studies and lays the foundation for biological interpretation. A basic yet challenging task in the analysis of microarray gene expression data is the identification of changes in gene expression that are associated with particular biological conditions. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. One of the most popular platforms for microarray analysis is Bioconductor, an open-source and open-development software project based on the R programming language. This paper describes specific procedures for the quality assessment, visualization and preprocessing of Affymetrix GeneChip data, details the different Bioconductor packages used to analyze Affymetrix microarray data, and describes the analysis and outcome of each plot.

Keywords: microarray analysis, R language, affymetrix visualization, bioconductor

Procedia PDF Downloads 480
1777 Assessment of the Contribution of Geographic Information System Technology to Non-Revenue Water: Case Study of Dar es Salaam Water and Sewerage Authority, Kawe - Mzimuni Street

Authors: Victor Pesco Kassa

Abstract:

This research assesses the contribution of GIS technology to the management of non-revenue water (NRW). It was conducted in Dar es Salaam, at Mzimuni Street, Kawe. The data were collected from an existing source, the DAWASA headquarters, and were processed and interpreted using ArcGIS software. The collected data reveal good coverage of DAWASA’s water network at Mzimuni Street: most residents are connected to DAWASA’s water service. The data also show that, through GIS, DAWASA’s customer geodatabase has been improved. With GIS, a customer location map can be prepared for site surveying, showing the different types of customers connected to DAWASA’s water service. This is a valuable contribution of GIS technology to addressing and managing the problem of NRW in DAWASA. Finally, the study recommends that the same study be conducted in other DAWASA zones, such as Temeke, Boko and Bagamoyo, and not only at Kawe Mzimuni Street. The study shows that ArcGIS software offers powerful tools for managing and processing information geographically in water and sanitation authorities such as DAWASA.

Keywords: DAWASA, NRW, Esri, EURA, ArcGIS

Procedia PDF Downloads 83
1776 Improvement of Bone Scintigraphy Images Using Image Texture Analysis

Authors: Yousif Mohamed Y. Abdallah, Eltayeb Wagallah

Abstract:

Image enhancement allows the observer to see details in images that may not be immediately observable in the original image; it is the transformation or mapping of one image to another. The enhancement of certain features in images can, however, be accompanied by undesirable effects. To achieve maximum image quality after denoising, a new low-order, locally adaptive Gaussian scale mixture model and a median filter are presented, which remove the nonlinearities caused by scattering, together with a new nonlinear approach for the contrast enhancement of bones in bone scan images using both gamma correction and negative transform methods. The usual assumption of gamma- and Poisson-distributed statistics leads to an overestimation of the noise variance in regions of low intensity, but to an underestimation in regions of high intensity, and therefore to non-optimal results. The contrast enhancement results were obtained and evaluated using MATLAB on nuclear medicine images of the bones. The optimal number of bins, in particular the number of gray levels, is chosen automatically using the entropy and the average distance between the histogram of the original gray-level distribution and the contrast enhancement function’s curve.
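The two point operations combined here, gamma correction and the negative transform, can be sketched in a few lines of numpy; the tiny test image is illustrative.

```python
import numpy as np

def gamma_correct(img, gamma):
    """img: float array scaled to [0, 1]; gamma < 1 brightens, > 1 darkens."""
    return np.clip(img, 0.0, 1.0) ** gamma

def negative_transform(img):
    """Invert intensities so faint hot-spots appear dark on a light background."""
    return 1.0 - np.clip(img, 0.0, 1.0)

scan = np.array([[0.04, 0.25], [0.49, 0.81]])  # a toy 2x2 "bone scan"
enhanced = gamma_correct(scan, 0.5)            # stretch the dim bone regions
print(enhanced)
print(negative_transform(scan))
```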

Keywords: bone scan, nuclear medicine, MATLAB, image processing techniques

Procedia PDF Downloads 507
1775 Emotion Recognition Using Artificial Intelligence

Authors: Rahul Mohite, Lahcen Ouarbya

Abstract:

This paper focuses on the interplay between humans and computer systems and on the ability of these systems to understand and respond to human emotions, including non-verbal communication. Current emotion recognition systems are based solely on either facial or verbal expressions; their limitation is that they require large training data sets. This paper proposes a system for recognizing human emotions that combines both speech and facial recognition. The system utilizes advanced techniques such as deep learning and image recognition to identify facial expressions and comprehend emotions. The results show that the proposed system, based on the combination of facial expressions and speech, outperforms existing ones that rely on facial or verbal expressions alone. The proposed system detects human emotion with an accuracy of 86%, whereas the existing systems have an accuracy of 70% using verbal expressions only and 76% using facial expressions only. The increasing significance of and demand for facial recognition technology in emotion recognition are also discussed.
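The combination of the two modalities can be pictured as a late fusion of per-class probabilities; the class list, weights and scores below are illustrative, not the paper's trained models.

```python
import numpy as np

EMOTIONS = ["angry", "happy", "neutral", "sad"]  # illustrative label set

def fuse(face_probs, speech_probs, face_weight=0.6):
    """Weighted late fusion of two per-class probability vectors."""
    combined = (face_weight * np.asarray(face_probs)
                + (1 - face_weight) * np.asarray(speech_probs))
    return EMOTIONS[int(np.argmax(combined))], combined / combined.sum()

face = [0.10, 0.55, 0.30, 0.05]    # face model favours "happy"
speech = [0.05, 0.40, 0.20, 0.35]  # speech model is less certain
label, probs = fuse(face, speech)
print(label)  # happy
```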

Keywords: facial recognition, expression recognition, deep learning, image recognition, facial technology, signal processing, image classification

Procedia PDF Downloads 121
1774 Reading Knowledge Development and Its Phases with Generation Z

Authors: Onur Özdemir, M. Erhan Orhan

Abstract:

Knowledge development (KD) is just one of the important phases of knowledge management (KM). KD is the phase in which intelligence is used to see the big picture. In order to understand whether information is important or not, we have to use the intelligence cycle, which includes four main steps: aiming, collecting data, processing and utilizing; KD also needs these steps. To make a precise decision, the decision maker has to be aware of his subordinates’ ideas; if he ignores the ideas of his subordinates or of the other participants in the organization, it is not possible for him to reach the target. KD is a way of using wisdom to assemble the puzzle: if the decision maker does not bring the puzzle pieces together, he cannot see the big picture, and this shows its effects on the battlefield. In order to understand the battlefield, the decision maker has to use the intelligence cycle, and KD is the main means by which the intelligence cycle converts information into knowledge. On the other hand, Generation Z, born after the millennium, are real game changers. They have different attitudes from their elders; their understanding of life is different, and freedom and independence have different meanings for them than for others. Decision makers have to consider these factors and rethink their decisions accordingly. This article tries to explain the relation between KD and Generation Z. KD is the main method of managing targets, but if leaders neglect their people, the world will see many more movements like the Arab Spring and other insurgencies.

Keywords: knowledge development, knowledge management, generation Z, intelligence cycle

Procedia PDF Downloads 517
1773 Catalytic Decomposition of High Energy Materials Using Nanoparticles of Copper Chromite

Authors: M. Sneha Reddy, M. Arun Kumar, V. Kameswara Rao

Abstract:

Chromites are binary transition metal oxides with the general formula ACr₂O₄, where A = Mn²⁺, Fe²⁺, Co²⁺, Ni²⁺, or Cu²⁺. Chromites have a normal-type spinel structure with interesting applications in the areas of applied physics, materials science, and geophysics. They have attracted considerable attention because of their unique physicochemical properties and their technological applications in nanodevices, sensor elements, and high-temperature ceramics with useful optical properties. Copper chromite is one of the most efficient spinel oxides, with pronounced commercial application as a catalyst in various chemical reactions such as oxidation, hydrogenation, alkylation, dehydrogenation, the decomposition of organic compounds, and hydrogen production. Apart from its usage in the chemical industry, CuCr₂O₄ finds its major application worldwide as a burn-rate modifier in solid propellant processing for space launch vehicles. Herein, we synthesized nanoparticles of copper chromite using the co-precipitation method. The synthesized nanoparticles were characterized by XRD, TEM, SEM, BET, and TG-DTA and were used as a catalyst for the thermal decomposition of various high-energy materials.

Keywords: copper chromite, coprecipitation method, high energy materials, catalytic thermal decomposition

Procedia PDF Downloads 77
1772 A Systematic Review of Experimental, FEM Analysis and Simulation of the Metal Spinning Process

Authors: Amol M. Jadhav, Sharad S. Chudhari, S. S. Khedkar

Abstract:

This review presents a thorough survey of research work on the experimental analysis, FEM analysis and simulation of the metal spinning process. In this literature survey, all of the papers were taken from Elsevier publications, most of them from the Journal of Materials Processing Technology. In the last two decades or so, the metal spinning process has gradually come into use as a chipless forming method for the production of engineering components in small to medium batch quantities. The review aims to provide insight into the experimentation, the FEM analysis of various components and the simulation of the metal spinning process, and to act as a guide for researchers working on metal spinning processes. The review of existing work reveals several gaps in the current knowledge of metal spinning processes. The experimental work evaluates the thickness strain, the spinning force, the twisting angle and the surface roughness of the conventional and shear metal spinning processes; the FEM work evaluates the tool path definition with a sufficiently fine mesh to capture the behavior of the workpiece; and the simulations evaluate the feed rate, direction and type of the roller. The metal spinning process is flexible enough to produce a wider range of product shapes and to form more challenging materials.

Keywords: metal spinning, FEM analysis, simulation of metal spinning, mechanical engineering

Procedia PDF Downloads 387
1771 Computational Fluid Dynamics Simulations of Thermal and Flow Fields inside a Desktop Personal Computer Cabin

Authors: Mohammad Salehi, Mohammad Erfan Doraki

Abstract:

In this paper, the airflow inside a desktop computer case is analyzed using computational fluid dynamics simulations. The purpose is to investigate the cooling process of central processing units (CPUs) with thermal powers of 80 and 130 watts. The airflow inside the computer enclosure, chosen according to the microATX form factor, passes the main heat-producing components: the CPU, hard disk drive, CD drive, floppy drive, memory card and power supply unit. According to the thermal power produced by the CPU (80 or 130 watts), two different geometries have been used, for a direct and a radial heat sink. First, mesh independence was established and the solution was validated; after ensuring the correctness of the numerical solution, its results were analyzed. The simulation results showed that the temperatures of the CPU and the other components increase linearly with increasing CPU heat output. Also, the ambient air temperature has a significant effect on the maximum processor temperature.
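The reported linear trend can be checked with a one-line thermal model, T_cpu = T_ambient + R_th · P; the thermal resistance of 0.35 K/W used here is an assumption for illustration, not a value taken from the simulations.

```python
# Back-of-the-envelope model of steady-state CPU temperature vs power.
def cpu_temperature(power_w, t_ambient_c, r_th_k_per_w=0.35):
    """Linear heat-sink model: junction temp rises R_th kelvin per watt."""
    return t_ambient_c + r_th_k_per_w * power_w

# The two CPU thermal powers studied, at an assumed 25 C ambient.
for power in (80, 130):
    print(power, "W ->", cpu_temperature(power, 25.0), "C")
```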

Keywords: computational fluid dynamics, CPU cooling, computer case simulation, heat sink

Procedia PDF Downloads 122
1770 The Capacity of Mel Frequency Cepstral Coefficients for Speech Recognition

Authors: Fawaz S. Al-Anzi, Dia AbuZeina

Abstract:

Speech recognition makes an important contribution to promoting new technologies in human-computer interaction, and today there is a growing need to employ speech technology in daily life and business activities. However, speech recognition is a challenging task that requires several stages before the desired output is obtained. Among the components of automatic speech recognition (ASR) is the feature extraction process, which parameterizes the speech signal to produce the corresponding feature vectors, aiming to approximate the linguistic content conveyed by the input speech signal. In the speech processing field, there are several methods to extract speech features; however, Mel Frequency Cepstral Coefficients (MFCC) are the most popular technique. It has long been observed that MFCC features are dominantly used in well-known recognizers such as the Carnegie Mellon University (CMU) Sphinx and the Hidden Markov Model Toolkit (HTK). Hence, this paper focuses on the MFCC method as the standard choice for identifying the different speech segments in order to obtain the language phonemes for the subsequent training and decoding steps. Owing to the good performance of MFCC, previous studies show that it also dominates Arabic ASR research. In this paper, we demonstrate MFCC as well as the intermediate steps performed to obtain these coefficients using the HTK toolkit.
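The intermediate steps of the MFCC pipeline (framing, windowing, power spectrum, mel filterbank, log compression, DCT) can be sketched in plain numpy; the frame sizes and filterbank parameters below are common defaults, not necessarily HTK's exact settings.

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mfcc(signal, sr=16000, frame_len=400, hop=160, n_fft=512, n_mels=26, n_ceps=13):
    # 1. Split the signal into overlapping frames and apply a Hamming window.
    n_frames = 1 + (len(signal) - frame_len) // hop
    idx = np.arange(frame_len)[None, :] + hop * np.arange(n_frames)[:, None]
    frames = signal[idx] * np.hamming(frame_len)
    # 2. Power spectrum of each frame.
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft
    # 3. Triangular filters equally spaced on the mel scale.
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fbank = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        left, center, right = bins[m - 1], bins[m], bins[m + 1]
        fbank[m - 1, left:center] = (np.arange(left, center) - left) / max(center - left, 1)
        fbank[m - 1, center:right] = (right - np.arange(center, right)) / max(right - center, 1)
    # 4. Log of the mel filterbank energies.
    log_energy = np.log(power @ fbank.T + 1e-10)
    # 5. DCT-II (explicit cosine basis) decorrelates the log energies.
    k = np.arange(n_ceps)[:, None]
    basis = np.cos(np.pi * k * (np.arange(n_mels) + 0.5) / n_mels)
    return log_energy @ basis.T

tone = np.sin(2 * np.pi * 440.0 * np.arange(16000) / 16000.0)  # 1 s of A440
coeffs = mfcc(tone)
print(coeffs.shape)  # one 13-coefficient vector per 10 ms frame
```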

Keywords: speech recognition, acoustic features, mel frequency, cepstral coefficients

Procedia PDF Downloads 259
1769 Petra: Simplified, Scalable Verification Using an Object-Oriented, Compositional Process Calculus

Authors: Aran Hakki, Corina Cirstea, Julian Rathke

Abstract:

Formal methods are yet to be utilized in mainstream software development, due to issues of scaling and implementation cost. This work is about developing a scalable, simplified, pragmatic, formal software development method with strong correctness properties and guarantees that are easy to prove. The method aims to be easy to learn, use and apply without extensive training and experience in formal methods. Petra is proposed as an object-oriented process calculus with composable data types and sequential/parallel processes. Petra has a simple denotational semantics, which includes a definition of 'correct by construction'. The aim is for Petra to be a standard that can be implemented to execute on various mainstream programming platforms such as Java. Work towards an implementation of Petra as a Java EDSL (embedded domain-specific language) is also discussed.

Keywords: compositionality, formal method, software verification, Java, denotational semantics, rewriting systems, rewriting semantics, parallel processing, object-oriented programming, OOP, programming language, correct by construction

Procedia PDF Downloads 144
1768 SnSₓ, Cu₂ZnSnS₄ Nanostructured Thin Layers for Thin-Film Solar Cells

Authors: Elena A. Outkina, Marina V. Meledina, Aliaksandr A. Khodin

Abstract:

Nanostructured thin films of the semiconductors SnSₓ and Cu₂ZnSnS₄ (CZTS) were fabricated by chemical processing to produce thin-film photoactive layers for photocells, as a prospective lowest-cost and environment-friendly alternative to Si, Cu(In,Ga)Se₂, and other traditional solar cell materials. To produce the SnSₓ layers, a modified successive ionic layer adsorption and reaction (SILAR) technique was investigated, consisting of successive cyclic dipping into a Na₂S solution and an SnCl₂/NaCl/triethanolamine solution. To fabricate the CZTS layers, cyclic dipping into CuSO₄ with ZnSO₄, SnCl₂, and Na₂S solutions was used, with intermediate rinsing in distilled water. A nano-templated aluminum/alumina substrate was used to control the deposition processes. The micromorphology and optical characteristics of the fabricated layers have been investigated. An analysis of the 2D-like layer deposition features obtained using the nano-template substrate is presented, including the effect of the nanotips of the template on surface charge redistribution and transport.

Keywords: kesterite, nanotemplate, SILAR, solar cell, tin sulphide

Procedia PDF Downloads 142
1767 Active Development of Tacit Knowledge: Knowledge Management, High Impact Practices and Experiential Learning

Authors: John Zanetich

Abstract:

Due to their positive associations with student learning and retention, certain undergraduate opportunities are designated ‘high-impact.’ High-impact practices (HIPs), such as learning communities, community-based projects, research, internships, study abroad and the culminating senior experience, share several traits in common: they demand considerable time and effort; learning occurs outside of the classroom; they require meaningful interactions between faculty and students; they encourage collaboration with diverse others; and they provide frequent and substantive feedback. As a result of the experiential learning involved, participation in these practices can be life-changing. High-impact learning helps individuals locate tacit knowledge and build mental models that support the accumulation of knowledge. Ongoing learning from experience, together with knowledge conversion, provides the individual with a way to implicitly organize knowledge and to share knowledge over a lifetime. Knowledge conversion is a knowledge management component that focuses on the explication of the tacit knowledge that exists in the minds of students and of the knowledge embedded in the processes and relationships of the classroom educational experience. Knowledge conversion is required when working with tacit knowledge and when a learner must align deeply held beliefs with the cognitive dissonance created by new information. Knowledge conversion and tacit knowledge result from the fact that an individual's way of knowing, that is, their core belief structure, is generalized and tacit rather than explicit and specific. As a phenomenon, tacit knowledge is not readily available to the learner for explicit description unless evoked by an external source.
The development of knowledge-related capabilities such as the Active Development of Tacit Knowledge (ADTK) can be used in experiential educational programs to enhance knowledge, foster behavioral change, and improve decision making and overall performance. ADTK allows the student in HIPs to use their existing knowledge in a way that allows them to evaluate, and make any necessary modifications to, their core construct of reality in order to assimilate new information. Based on the Lewin/Schein change theory, the learner will reach for tacit knowledge as a stabilizing mechanism when challenged by new information that puts them slightly off balance. As in word-association drills, the important concept is the first thought: the reactionary response to an experience reflects the programmed, tacit memory and knowledge of the learner's core belief structure. ADTK is a way to help teachers design their own methods and activities to unfreeze, create new learning, and then refreeze the core constructs upon which future learning in a subject area is built. This paper explores the use of ADTK as a technique for knowledge conversion in the classroom in general and in HIP programs specifically. It focuses on knowledge conversion in curriculum development and proposes the use of one-time educational experiences, multi-session experiences and sequential program experiences focusing on tacit knowledge in educational programs.

Keywords: tacit knowledge, knowledge management, college programs, experiential learning

Procedia PDF Downloads 262
1766 Teachers' and Learners' Experiences of Learners' Writing in English First Additional Language

Authors: Jane-Francis A. Abongdia, Thandiswa Mpiti

Abstract:

There is an international concern to develop children’s literacy skills. In many parts of the world, becoming fluent in a second language is essential for gaining meaningful access to education, the labour market and broader social functioning. In spite of these efforts, the problem still continues: the level of English language proficiency is far from satisfactory, and for many learners these goals remain unattainable. The issue is more complex in South Africa, as learners are immersed in a second language (L2) curriculum. South Africa is a prime example of a country facing the dilemma of how to effectively equip a majority of its population with English as a second language or first additional language (FAL). Given the multilingual nature of South Africa, with eleven official languages, and the position and power of English, the study investigates teachers’ and learners’ experiences of isiXhosa- and Afrikaans-background learners’ writing in English First Additional Language (EFAL). Moreover, possible causes of writing difficulties and teachers’ practices for writing are explored. The theoretical and conceptual framework for the study is provided by constructivist and sociocultural theories. In exploring these issues, a qualitative approach through semi-structured interviews, classroom observations, and document analysis was adopted. The data are analysed using critical discourse analysis (CDA). The study identified a weak correlation between teachers’ beliefs and their actual teaching practices. Although the teachers believe that writing is as important as listening, speaking, reading, grammar and vocabulary, and that it needs regular practice, the data reveal that they fail to put their beliefs into practice. Moreover, the data revealed that learners were hindered by their home language: when they did not know a word, they would write either the isiXhosa or the Afrikaans equivalent.
Code-switching seems to have instilled a sense of “dependence on translations”, where some learners would not even try to answer English questions but would wait for the teacher to translate the questions into isiXhosa or Afrikaans before attempting to give answers. The findings of the study show a marked improvement in the writing performance of learners who used the process approach to writing. These findings demonstrate the need for assisting teachers to shift away from focusing only on learners’ performance (testing and grading) towards a stronger emphasis on the process of writing. The study concludes that the process approach to writing could enable teachers to focus on the various stages of the writing process, which gives learners more freedom to experiment with their language proficiency. It would require that teachers develop a deeper understanding of the process/genre approaches to teaching writing advocated by CAPS. All in all, the study shows that both learners and teachers face numerous challenges relating to writing, which means that more work still needs to be done in this area. The present study argues that teachers of EFAL learners should approach writing as a critical and core aspect of learners’ education. Learners should be exposed to intensive writing activities throughout their school years.

Keywords: constructivism, English second language, language of learning and teaching, writing

Procedia PDF Downloads 218
1765 Life Cycle-Based Analysis of Meat Production: Ecosystem Impacts

Authors: Michelle Zeyuan Ma, Hermann Heilmeier

Abstract:

Recently, the ecosystem impacts of meat production have initiated many heated discussions among researchers, and reducing such impacts is difficult given the demand for meat products. This calls for better management and control of ecosystem impacts at every stage of meat production. This article analyzes the ecosystem impacts of meat production based on the life cycle of meat products. The analysis shows that considerable ecosystem impacts are caused by the different steps of meat production: the initial establishment phase, animal raising, slaughterhouse processing, meat consumption, and waste management. Based on this analysis, the impacts are summarized as: a leading factor in biodiversity loss; water waste, wasteful land use and land degradation; greenhouse gas emissions; pollution of air, water, and soil; and related major diseases. The article also provides a discussion of a possible solution, a sustainable food system, which could help in reducing these ecosystem impacts. Because the analysis is conducted at the life cycle level, it provides an overview of the ecosystem impacts of the whole meat industry, and its results could be useful for managing or controlling the ecosystem impacts of meat production from the investor, producer and consumer sides.

Keywords: eutrophication, life cycle based analysis, sustainable food, waste management

Procedia PDF Downloads 220
1764 Multi-Channel Charge-Coupled Device Sensors Real-Time Cell Growth Monitor System

Authors: Han-Wei Shih, Yao-Nan Wang, Ko-Tung Chang, Lung-Ming Fu

Abstract:

A multi-channel real-time cell growth monitoring and evaluation system using charge-coupled device (CCD) sensors with 40X lenses, integrated with an NI LabVIEW image processing program, is proposed and demonstrated. The LED light source of the monitoring system is controlled by an 8051 microprocessor integrated with NI LabVIEW software. In this study, the growth rate and morphology of RAW264.7 cells at the same concentration were examined under four different culture conditions (DMEM, LPS, LPS+G1, LPS+G2). Real-time cell growth images were captured and analyzed by NI Vision Assistant every 10 minutes in the incubator, and an image binarization technique was applied to calculate the cell doubling time and cell division index. The doubling times of the four groups (DMEM, LPS, LPS+G1, LPS+G2) were 12.3 hr, 10.8 hr, 14.0 hr, and 15.2 hr, and the division indices were 74.20%, 78.63%, 69.53%, and 66.49%, respectively. The image magnification of the multi-channel CCD real-time cell monitoring system is about 100X–200X, comparable to that of a traditional microscope.
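Under an exponential-growth assumption, the doubling time and division index reported above reduce to simple formulas. The following Python sketch (the function names are illustrative assumptions, not part of the authors' LabVIEW program) shows one way to compute both from binarized cell counts:

```python
import math

def doubling_time(count_start, count_end, elapsed_hours):
    """Doubling time Td (hours) under exponential growth,
    N(t) = N0 * 2**(t / Td), estimated from two cell counts."""
    return elapsed_hours * math.log(2) / math.log(count_end / count_start)

def division_index(dividing_cells, total_cells):
    """Percentage of cells observed in division."""
    return 100.0 * dividing_cells / total_cells

# Example: a culture growing from 100 to 200 cells in 12.3 hr has,
# by construction, a doubling time of 12.3 hr.
```

In practice, the counts would come from the binarized frames captured every 10 minutes, with a regression over many frames giving a more robust estimate than a single pair of counts.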

Keywords: charge-coupled device (CCD), RAW264.7, doubling time, division index

Procedia PDF Downloads 358
1763 Enhancing Spatial Interpolation: A Multi-Layer Inverse Distance Weighting Model for Complex Regression and Classification Tasks in Spatial Data Analysis

Authors: Yakin Hajlaoui, Richard Labib, Jean-François Plante, Michel Gamache

Abstract:

This study introduces the Multi-Layer Inverse Distance Weighting Model (ML-IDW), inspired by the mathematical formulation of both multi-layer neural networks (ML-NNs) and the Inverse Distance Weighting (IDW) model. ML-IDW leverages the processing capabilities of ML-NNs, characterized by compositions of learnable non-linear functions applied to input features, and incorporates IDW's ability to learn anisotropic spatial dependencies, presenting a promising solution for nonlinear spatial interpolation and learning from complex spatial data. We employ gradient descent and backpropagation to train ML-IDW, comparing its performance against conventional spatial interpolation models such as Kriging and standard IDW on regression and classification tasks using simulated spatial datasets of varying complexity. The results highlight the efficacy of ML-IDW, particularly in handling complex spatial datasets, exhibiting lower mean square error in regression and higher F1 scores in classification.
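For reference, the standard IDW baseline against which ML-IDW is compared can be sketched in a few lines of Python. This is the textbook formulation, not the authors' ML-IDW code; the `power` and `eps` parameters are illustrative assumptions:

```python
import math

def idw_interpolate(known, values, query, power=2.0, eps=1e-12):
    """Classic inverse distance weighting (IDW): the value at a query
    point is a weighted average of known sample values, with weights
    proportional to 1 / distance**power."""
    weights = []
    for x, y in known:
        d = math.hypot(query[0] - x, query[1] - y)
        weights.append(1.0 / (d ** power + eps))
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, values)) / total
```

ML-IDW, by contrast, replaces these fixed distance-based weights with compositions of learnable non-linear functions trained by gradient descent, which is what lets it capture anisotropic spatial dependencies that the isotropic formula above cannot.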

Keywords: deep learning, multi-layer neural networks, gradient descent, spatial interpolation, inverse distance weighting

Procedia PDF Downloads 52
1762 Identification of Text Domains and Register Variation through the Analysis of Lexical Distribution in a Bangla Mass Media Text Corpus

Authors: Mahul Bhattacharyya, Niladri Sekhar Dash

Abstract:

The present research paper is an experimental attempt to investigate the nature of register variation in three major text domains, namely social, cultural, and political texts, collected from a corpus of Bangla printed mass media texts. The study uses a corpus of Bangla mass media text of moderate size, containing nearly one million words collected from different media sources such as newspapers, magazines, advertisements, and periodicals. The analysis of the corpus data reveals that each text has certain lexical properties that not only establish its identity but also mark its uniqueness across the domains. At first, the subject domains of the texts are classified along two parameters, namely ‘Genre’ and ‘Text Type’. Next, some empirical investigations are made to understand how the domains vary from each other in terms of lexical properties, covering both function and content words. Here the method of comparative-cum-contrastive matching of lexical load across domains is invoked through word frequency counts to track how domain-specific words and terms may serve as decisive indicators in the act of specifying textual contexts and subject domains. The study shows that the common lexical stock percolating across all text domains is of little diagnostic value, as its lexicological identity has no bearing on the act of specifying subject domains. Therefore, it becomes necessary for language users to rely on certain domain-specific lexical items to recognize a text as belonging to a specific text domain. The eventual findings of this study confirm that texts belonging to different subject domains in the Bangla news text corpus clearly differ on the parameters of lexical load, lexical choice, lexical clustering, and lexical collocation.
In fact, based on these parameters, along with some statistical calculations, it is possible to classify mass media texts into different types and to relate them to the domains to which they actually belong. The advantage of this analysis lies in the proper identification of the linguistic factors involved, which will give language users better insight into the methods they employ in text comprehension, as well as provide a systematic frame for designing text identification strategies for language learners. The availability of a large amount of Bangla media text data makes it possible to reach accurate conclusions with a fair degree of reliability and authenticity. This kind of corpus-based analysis is particularly relevant for a resource-poor language like Bangla, as no attempt has previously been made to understand how the structure and texture of Bangla mass media texts vary due to certain linguistic and extra-linguistic constraints that actively operate on specific text domains. Since mass media language is assumed to be the most ‘recent representation’ of the actual use of the language, this study is expected to show how Bangla news texts reflect the thoughts of the society and how they leave a strong impact on the thought process of the speech community.
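The comparative matching of lexical load described above can be illustrated with a minimal Python sketch. The helper names and toy token lists are illustrative assumptions, not part of the authors' corpus tooling:

```python
from collections import Counter

def lexical_load(tokens, top_n=5):
    """Rank the most frequent word forms in a domain sub-corpus."""
    return Counter(tokens).most_common(top_n)

def domain_indicators(domain_tokens, other_tokens):
    """Word forms attested in one domain sub-corpus but absent from
    another -- candidate indicators for specifying the text domain."""
    return set(domain_tokens) - set(other_tokens)
```

A real implementation would additionally separate function words from content words and normalize frequencies by sub-corpus size before comparing domains.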

Keywords: Bangla, corpus, discourse, domains, lexical choice, mass media, register, variation

Procedia PDF Downloads 174
1761 Exfoliation of Functionalized High Structural Integrity Graphene Nanoplatelets at Extremely Low Temperature

Authors: Mohannad N. H. Al-Malichi

Abstract:

Because of its exceptional properties, graphene has become the most promising nanomaterial for the development of a new generation of advanced materials, from battery electrodes to structural composites. However, current methods for the mass production of high-quality graphene are limited by harsh oxidation, high temperatures, and tedious processing steps. To extend the scope of bulk graphene production, a facile, reproducible and cost-effective approach has been developed herein. It involves heating a specific mixture of chemical materials at an extremely low temperature (70 °C) for a short period (7 minutes) to exfoliate functionalized graphene platelets with high structural integrity. The obtained graphene platelets have an average thickness of 3.86±0.71 nm and a lateral size below ~2 µm, with a low defect intensity of ID/IG ~0.06. A thin film (~2 µm thick) exhibited a low sheet resistance of ~0.63 Ω/sq, confirming its high electrical conductivity. Additionally, these nanoplatelets are decorated with polar functional groups (epoxy and carboxyl groups) and thus have the potential to toughen polymer matrices and provide multifunctional polymer nanocomposites. Moreover, such a simple method can be further exploited for the novel exfoliation of other layered two-dimensional materials such as MXenes.

Keywords: functionalized graphene nanoplatelets, high structural integrity graphene, low temperature exfoliation of graphene, functional graphene platelets

Procedia PDF Downloads 120
1760 Material Characterization of Medical Grade Woven Bio-Fabric for Use in ABAQUS *FABRIC Material Model

Authors: Lewis Wallace, William Dempster, David Nash, Alexandros Boukis, Craig Maclean

Abstract:

This paper, through traditional test methods and close adherence to international standards, presents a characterization study of a woven polyethylene terephthalate (PET) bio-fabric. Testing is undertaken in the axial, shear, and out-of-plane (bending) directions, and the results are fitted to the *FABRIC material model within ABAQUS FEA. The non-linear behaviors of the fabric in the axial and shear directions, observed at the macro scale, are explored at the meso scale. The medical-grade bio-fabric is tested in untreated and heat-treated forms, and deviations are closely analyzed at the micro, meso, and macro scales to determine the effects of the process. The heat-treatment process was found to increase the axial and bending stiffness of the fabric but had a negligible effect on the shear response. The ability of *FABRIC to capture behaviors unique to fabric deformation is discussed, whereby its phenomenological input can accurately represent the experimentally derived data.

Keywords: experimental techniques, FEA modelling, materials characterization, post-processing techniques

Procedia PDF Downloads 95
1759 Analyzing the Market Growth in Application Programming Interface Economy Using Time-Evolving Model

Authors: Hiroki Yoshikai, Shin’ichi Arakawa, Tetsuya Takine, Masayuki Murata

Abstract:

The API (Application Programming Interface) economy is expected to create new value by converting corporate services, such as information processing and data provision, into APIs and using these APIs to connect services. Understanding the dynamics of an API economy market under the strategies of its participants is crucial to fully maximizing the value of the API economy. To capture the behavior of a market in which the number of participants changes over time, we present a time-evolving market model for a platform in which API providers, who supply APIs to service providers, participate alongside service providers and consumers. We then use the market model to clarify the role API providers play in expanding market participation and forming ecosystems. The results show that the platform with API providers increased the number of market participants by 67% and decreased the cost of developing services by 25% compared to the platform without API providers. Furthermore, during the expansion phase of the market, the profits of participants are found to be mostly equal when 70% of the revenue from consumers is distributed to service providers and API providers. It is also found that once the market matures, the profits of the service providers and API providers decrease significantly due to competition between them, while the profit of the platform increases.

Keywords: API economy, ecosystem, platform, API providers

Procedia PDF Downloads 91
1758 The Staff Performance Efficiency of the Faculty of Management Science, Suan Sunandha Rajabhat University

Authors: Nipawan Tharasak, Ladda Hirunyava

Abstract:

The objective of the research was to study the factors affecting working efficiency and the relationship between the working environment, satisfaction with human resources management, and the working efficiency of operational employees of the Faculty of Management Science, Suan Sunandha Rajabhat University. The sample consisted of 33 employees of the Faculty of Management Science. The researcher classified the support employees into four divisions using stratified random sampling, and individual samples were then randomized using simple random sampling. Data were collected through the research instrument and processed with a statistical software package for Windows. Percentage, mean, standard deviation, the t-test, one-way ANOVA, and the Pearson product-moment correlation coefficient were applied. The results found that the support employees’ satisfaction with human resources management of the Faculty of Management Science in the following areas: remuneration; employee recruitment and selection; manpower planning; performance evaluation; staff training and development; and spirit and fairness, was overall at a good level.
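Of the statistics listed, the Pearson product-moment correlation coefficient is the one used to relate working environment and satisfaction to working efficiency. A minimal Python sketch of its textbook formula (not the authors' statistical package) is:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two
    equal-length samples: covariance divided by the product of the
    standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A value near +1 indicates that, for example, higher satisfaction scores move together with higher efficiency scores, while a value near 0 indicates no linear relationship.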

Keywords: faculty of management science, operational factors, practice performance, staff working

Procedia PDF Downloads 235
1757 Quaternary Ammonium Salts Based Algerian Petroleum Products: Synthesis and Characterization

Authors: Houria Hamitouche, Abdellah Khelifa

Abstract:

Quaternary ammonium salts (QACs) are the most common cationic surfactants, of either natural or synthetic origin. They possess one or more hydrophobic hydrocarbon chains and a hydrophilic cationic group. The hydrophobic groups are derived from three main sources: petrochemicals, vegetable oils, and animal fats. QACs have long attracted the attention of chemists due to their generally simple synthesis and their broad application in several fields. They are important as ingredients of cosmetic products and are also used as corrosion inhibitors, in emulsion polymerization, and in textile processing. Among biological applications, QACs show good antimicrobial activity and can be used as medicines, gene delivery agents, or in DNA extraction methods. The worldwide annual consumption of QACs in 2004 was reported as 500,000 tons. Petroleum products constitute a true reservoir of a variety of chemical species that can be used in the synthesis of quaternary ammonium salts. The purpose of the present contribution is to synthesize quaternary ammonium salts by the Menschutkin reaction, via chloromethylation/quaternization sequences, from the Algerian petroleum products reformate, light naphtha and kerosene, and to characterize the resulting products.

Keywords: quaternary ammonium salts, reformate, light naphtha, kerosene

Procedia PDF Downloads 335