Search results for: Language acquisition
505 Reduced Dynamic Time Warping for Handwriting Recognition Based on Multidimensional Time Series of a Novel Pen Device
Authors: Muzaffar Bashir, Jürgen Kempf
Abstract:
The purpose of this paper is to present a Dynamic Time Warping (DTW) technique which significantly reduces the data processing time and memory size of multi-dimensional time series sampled by the biometric smart pen device BiSP. The acquisition device is a novel ballpoint pen equipped with a diversity of sensors for monitoring the kinematics and dynamics of handwriting movement. The DTW algorithm has been applied for time series analysis of five different sensor channels providing pressure, acceleration and tilt data of the pen generated during handwriting on a paper pad. However, standard DTW has processing time and memory space problems which limit its practical use for online handwriting recognition. To address this problem, DTW has been applied to the sum of the five sensor signals after an adequate down-sampling of the data. Preliminary results have shown that processing time and memory size could be significantly reduced without deterioration of performance in single character and word recognition. Furthermore, excellent recognition accuracy was achieved, which is mainly due to the reduced dynamic time warping (RDTW) technique and the novel pen device BiSP.
Keywords: Biometric character recognition, biometric person authentication, biometric smart pen BiSP, dynamic time warping DTW, online handwriting recognition, multidimensional time series.
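A minimal sketch of the channel-summing, down-sampling and DTW-matching idea described above; the sampling factor, array shapes and names are illustrative, not taken from the paper.

    import numpy as np

    def dtw_distance(a, b):
        # Classic dynamic-programming DTW on two 1-D sequences, O(len(a)*len(b)).
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    def reduce_channels(samples, factor=4):
        # Collapse the five sensor channels (pressure, accelerations, tilts)
        # into one series by summation, then down-sample by 'factor'.
        summed = samples.sum(axis=1)          # shape (T, 5) -> (T,)
        return summed[::factor]

    # Usage: compare a written character against a stored template.
    template = reduce_channels(np.random.rand(400, 5))
    probe = reduce_channels(np.random.rand(380, 5))
    print(dtw_distance(template, probe))

Summing before warping shrinks the DTW cost matrix from five channels to one, and down-sampling by a factor f cuts its size by roughly f squared, which is where the reported time and memory savings come from.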
504 Dynamic Metadata Schemes in the Neutron and Photon Science Communities: A Case Study of X-Ray Photon Correlation Spectroscopy
Authors: Amir Tosson, Mohammad Reza, Christian Gutt
Abstract:
Metadata is one of the most important aspects for advancing data management practices within all research communities. Definitions and schemes of metadata are, inter alia, of particular significance in the domain of neutron and photon scattering experiments, which cover a broad area of different scientific disciplines. The demand for describing continuously evolving, highly non-standardized experiments, including the resulting processed and published data, constitutes a considerable challenge for a static definition of metadata. Here, we present the concept of dynamic metadata for the neutron and photon scientific community, which enriches a static set of defined basic metadata. We explore the idea of dynamic metadata with the help of the use case of X-ray Photon Correlation Spectroscopy (XPCS), a synchrotron-based scattering technique that allows the investigation of nanoscale dynamic processes. It serves here as a demonstrator of how dynamic metadata can improve data acquisition, sharing, and analysis workflows. Our approach enables researchers to tailor metadata definitions dynamically and adapt them to the evolving demands of describing data and results from a diverse set of experiments. We demonstrate that dynamic metadata standards yield advantages that enhance data reproducibility, interoperability, and the dissemination of knowledge.
Keywords: Big data, metadata, schemas, XPCS, X-ray Photon Correlation Spectroscopy.
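A toy sketch of the layering idea: a static core schema enriched with technique-specific keys. All field names here are hypothetical illustrations, not the schema proposed by the authors.

    import json

    # A static base schema every experiment must satisfy (illustrative fields).
    BASE_METADATA = {"facility": None, "instrument": None, "start_time": None}

    def enrich(base, dynamic_fields):
        # Dynamic metadata: technique-specific keys layered on the static core.
        record = dict(base)
        record.update(dynamic_fields)
        return record

    # Hypothetical XPCS-specific additions (names are illustrative).
    xpcs = enrich(BASE_METADATA, {
        "technique": "XPCS",
        "q_range_inverse_nm": [0.01, 0.5],
        "frame_rate_hz": 1000,
    })
    print(json.dumps(xpcs, indent=2))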
503 Effects of Cerium Oxide Nanoparticle Addition in Diesel and Diesel-Biodiesel Blends on the Performance Characteristics of a CI Engine
Authors: Abbas Alli Taghipoor Bafghi, Hosein Bakhoda, Fateme Khodaei Chegeni
Abstract:
An experimental investigation is carried out to establish the performance characteristics of a compression ignition engine while using cerium oxide nanoparticles as an additive in neat diesel and diesel-biodiesel blends. In the first phase of the experiments, the stability of neat diesel and diesel-biodiesel fuel blends with the addition of cerium oxide nanoparticles is analyzed. After a series of experiments, it is found that high-speed blending followed by ultrasonic bath stabilization improves the stability of the blends. In the second phase, performance characteristics are studied using the stable fuel blends in a single-cylinder four-stroke engine coupled with an electrical dynamometer and a data acquisition system. The cerium oxide acts as an oxygen-donating catalyst and provides oxygen for combustion. The activation energy of cerium oxide acts to burn off carbon deposits within the engine cylinder at the wall temperature and prevents the deposition of non-polar compounds on the cylinder wall, resulting in a reduction in HC emissions. The tests revealed that cerium oxide nanoparticles can be used as an additive in diesel and diesel-biodiesel blends to significantly improve complete combustion of the fuel.
Keywords: Diesel engine, cerium oxide, diesel-biodiesel blends, nanoparticles.
502 Greek Compounds: A Challenging Case for the Parsing Techniques of PC-KIMMO v.2
Authors: Angela Ralli, Eleni Galiotou
Abstract:
In this paper we describe the recognition process of Greek compound words using the PC-KIMMO software. We try to show certain limitations of the system with respect to the principles of compound formation in Greek. Moreover, we discuss the computational processing of phenomena such as stress and syllabification which are indispensable for the analysis of such constructions and we try to propose linguistically-acceptable solutions within the particular system.
Keywords: Morpho-phonological parsing, compound words, two-level morphology, natural language processing.
501 Evaluation of Pragmatic Information in an English Textbook: Focus on Requests
Authors: Israa A. Qari
Abstract:
Learning to request in a foreign language is a key ability within pragmatics-oriented language teaching. This paper examines how requests are taught in English Unlimited Book 3 (Cambridge University Press), an EFL textbook series employed by King Abdulaziz University in Jeddah, Saudi Arabia, to teach English to advanced foundation-year students. The focus of analysis is the evaluation of the request linguistic strategies present in the textbook, the frequency of use of these strategies, and the contextual information provided on the use of these linguistic forms. The researcher collected all the linguistic forms that constitute the request speech act and divided them into levels employing the CCSARP request coding manual. Findings demonstrated that simple and commonly employed request strategies are introduced. Looking closely at the exercises throughout the chapters, it was noticeable that the book exclusively employed the most direct form of requesting (the imperative) when giving learners instructions: e.g. listen, write, ask, answer, read, look, complete, choose, talk, think, etc. The book also made use of some other request strategies such as 'hedged performatives' and 'query preparatory'. However, it was also found that many strategies were not dealt with in the book, specifically strategies with combined functions (e.g. possibility, ability). On a sociopragmatic level, a strong focus was found to exist on standard situations in which relations between the requester and requestee are clear. In general, contextual information was communicated only implicitly. The textbook did not seem to differentiate between formal and informal request contexts (register), which might consequently impel students to overgeneralize. The paper closes with some recommendations for textbook and curriculum designers. Findings are also contrasted with previous results from a similar body of research on EFL requests.
Keywords: EFL, Requests, Saudi, speech acts, textbook evaluation.
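A simplified sketch of coding request head acts by CCSARP strategy type and directness level (after Blum-Kulka et al.); the keyword cues below are naive illustrations, not the coding manual's actual decision rules.

    # Strategy -> directness level (a subset of the CCSARP categories).
    STRATEGY_LEVELS = {
        "mood derivable (imperative)": "direct",
        "hedged performative": "direct",
        "query preparatory": "conventionally indirect",
        "strong hint": "non-conventionally indirect",
    }

    def classify(utterance):
        # Toy surface cues standing in for the manual's criteria.
        u = utterance.lower()
        if u.startswith(("listen", "write", "read", "complete", "choose")):
            return "mood derivable (imperative)"
        if "would like to ask" in u:
            return "hedged performative"
        if u.startswith(("could you", "can you", "would you")):
            return "query preparatory"
        return "strong hint"

    for s in ["Complete the table.", "Could you open the window?"]:
        strategy = classify(s)
        print(s, "->", strategy, "/", STRATEGY_LEVELS[strategy])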
500 Reading against the Grain: Transcodifying Stimulus Meaning
Authors: Aba-Carina Pârlog
Abstract:
The paper shows that, in transferring sense from the source language (SL) to the target language (TL), the translator's reading against the grain creates a faulty pattern of rendering the original meaning in the receiving culture, which reflects the use of misleading transformative codes. In this case, the translator is a writer per se who decides what goes in and out of the book, how the style is to be ciphered and what elements of ideology are to be highlighted. The paper also argues that figurative language must not be flattened for the sake of clarity or naturalness. The missing figurative elements make the translated text less interesting, less challenging and less vivid, which reflects poorly on the writer. There is a close connection between style and the writer's person. If the writer's style is very much altered in a translation, the translation is useless, as the original writer and his or her imaginative world can no longer be discovered. The purpose of the paper is to show that adaptation is a dangerous tool which leads to variants that sometimes reflect the original less than the reader would wish. It contradicts the very essence of the process of translation, which is that of making an original work available in a foreign language. If the adaptive transformative codes are so flexible that they encourage the translator to repeatedly leave out parts of the original work, then a subversive pattern emerges which changes the entire book. In conclusion, as a result of using adaptation, manipulative or subversive effects are created in the translated work. This is generally achieved by adding new words or connotations, creating new figures of speech or using explicitations. The additional meanings of the original work are neglected and the translator creates new meanings, implications, emphases and contexts. Again, he or she turns into a new author who enjoys the freedom of expressing his or her own ideas without the constraints of the original text. Reading against the grain is inadvisable during the process of translation, and consequently, following personal common sense becomes essential in the field of translation as well as everywhere else, so that translation does not become a source of fantasy.
Keywords: Speculative aesthetics, substance of expression, transformative code, translation.
499 A Cross-Cultural Approach for Communication with Biological and Non-Biological Intelligences
Authors: Thomas Schalow
Abstract:
This paper posits the need to take a cross-cultural approach to communication with non-human cultures and intelligences in order to meet the following three imminent contingencies: communicating with sentient biological intelligences, communicating with extraterrestrial intelligences, and communicating with artificial super-intelligences. The paper begins with a discussion of how intelligence emerges. It disputes some common assumptions we maintain about consciousness, intention, and language. The paper next explores cross-cultural communication among humans, including non-sapiens species. The next argument made is that we need to become much more serious about communicating with the non-human, intelligent life forms that already exist around us here on Earth. There is an urgent need to broaden our definition of communication and reach out to the other sentient life forms that inhabit our world. The paper next examines the science and philosophy behind CETI (communication with extraterrestrial intelligences) and how it has proven useful, even in the absence of contact with alien life. However, CETI’s assumptions and methodology need to be revised and based on the cross-cultural approach to communication proposed in this paper if we are truly serious about finding and communicating with life beyond Earth. The final theme explored in this paper is communication with non-biological super-intelligences using a cross-cultural communication approach. This will present a serious challenge for humanity, as we have never been truly compelled to converse with other species, and our failure to seriously consider such intercourse has left us largely unprepared to deal with communication in a future that will be mediated and controlled by computer algorithms. Fortunately, our experience dealing with other human cultures can provide us with a framework for this communication. The basic assumptions behind intercultural communication can be applied to the many types of communication envisioned in this paper if we are willing to recognize that we are in fact dealing with other cultures when we interact with other species, alien life, and artificial super-intelligence. The ideas considered in this paper will require a new mindset for humanity, but a new disposition will prepare us to face the challenges posed by a future dominated by artificial intelligence.
Keywords: Artificial intelligence, CETI, communication, culture, language.
498 An Analysis of Learners’ Reports for Measuring Co-Creational Education
Authors: Takatoshi Ishii, Koji Kimita, Keiichi Muramatsu, Yoshiki Shimomura
Abstract:
To increase the quality of learning, teachers and learners need to make a mutual effort toward the realization of educational value. For this purpose, we need to manage the co-creational education between teachers and learners. In this research, we try to identify features of co-creational education. To be more precise, we analyzed learners' reports by natural language processing and extracted features that describe the state of co-creational education.
Keywords: Co-creational education, e-portfolios, ICT integration, labeled Latent Dirichlet allocation.
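The paper uses labeled Latent Dirichlet allocation; scikit-learn only ships the unlabeled variant, so the sketch below substitutes it to show the general report-analysis pipeline. The sample reports and topic count are illustrative.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    reports = [
        "the lecture examples helped me solve the exercises",
        "group discussion with the teacher clarified the assignment",
        "I reviewed the e-portfolio feedback before the exam",
    ]
    # Bag-of-words counts, then a two-topic model over the reports.
    X = CountVectorizer(stop_words="english").fit_transform(reports)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
    print(lda.transform(X))   # per-report topic mixtures

In the labeled variant, topics would be tied to predefined labels (e.g. rubric categories) instead of being discovered freely.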
497 Knowledge Management in Academic: A Perspective of Academic Research Contribution to Economic Development of a Nation
Authors: Hilary J. Watsilla, Narasimha R. Vajjhala
Abstract:
Information and Communication Technology (ICT) has made information access easier and affordable. Academic research has also benefited from this, with online journals and academic resources readily available at the click of a button. However, there are limited ways of assessing and controlling the quality of academic research, mostly in public institutions. Nigeria is the most populous country in Africa, with a significant number of universities and a young population. The quality of knowledge created by academic researchers, however, needs to be evaluated due to the high number of predatory journals in academia. The purpose of this qualitative study is to look at the knowledge creation, acquisition, and assimilation process of academic researchers in public universities in Nigeria. Qualitative research will be carried out using in-depth interviews and observations. Academic researchers will be interviewed, and absorptive capacity theory will be used as the theoretical framework to guide the research. The findings from this study should help in understanding the impact of ICT on the knowledge creation process in academic research and how ICT can affect the quality of knowledge produced by researchers. They should also add value to the existing body of knowledge on the quality of academic research, especially in Africa, where quality academic research is in limited supply. As this study is limited to Nigerian universities, the outcome may not be generalizable to other developing countries.
Keywords: Knowledge creation, academic research, knowledge management, information and communication technology, research, university.
496 A Group Setting of IED in Microgrid Protection Management System
Authors: Jyh-Cherng Gu, Ming-Ta Yang, Chao-Fong Yan, Hsin-Yung Chung, Yung-Ruei Chang, Yih-Der Lee, Chen-Min Chan, Chia-Hao Hsu
Abstract:
There are a number of Distributed Generations (DGs) installed in a microgrid, which may produce diverse paths and directions of power flow or fault current. The overcurrent protection scheme of the traditional radial-type distribution system will no longer meet the needs of microgrid protection. Integrating Intelligent Electronic Devices (IEDs) and a Supervisory Control and Data Acquisition (SCADA) system with the IEC 61850 communication protocol, the paper proposes a Microgrid Protection Management System (MPMS) to protect the power system from faults. In the proposed method, the MPMS performs logic programming of each IED to coordinate their tripping sequence. The GOOSE message defined in IEC 61850 is used as the transmission medium among IEDs. Moreover, to cope with the difference in microgrid fault current between grid-connected mode and islanded mode, the proposed MPMS applies the group-setting feature of the IEDs, giving the protection system robust adaptability. Once the microgrid topology varies, the MPMS recalculates the fault current and updates the setting group of each IED. When a fault occurs, the IEDs isolate it at once. Finally, the Matlab/Simulink and Elipse Power Studio software are used to simulate and demonstrate the feasibility of the proposed method.
Keywords: IEC 61850, IED, group setting, microgrid.
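A conceptual sketch of the group-setting idea: on a switch between grid-connected and islanded mode, the management system recalculates fault levels and activates a matching setting group in each IED. Class names, method names and pickup thresholds are illustrative, not the authors' implementation or any vendor API.

    class IED:
        def __init__(self, name):
            self.name = name
            self.groups = {}          # group id -> overcurrent pickup (A)
            self.active = None

        def add_group(self, gid, pickup_a):
            self.groups[gid] = pickup_a

        def activate(self, gid):
            self.active = gid
            print(f"{self.name}: setting group {gid}, pickup {self.groups[gid]} A")

    def on_topology_change(ieds, islanded):
        # Islanded microgrids have far lower fault current, so a more
        # sensitive setting group is selected for every IED.
        gid = "islanded" if islanded else "grid-connected"
        for ied in ieds:
            ied.activate(gid)

    feeder = IED("feeder-IED")
    feeder.add_group("grid-connected", 2000)
    feeder.add_group("islanded", 300)
    on_topology_change([feeder], islanded=True)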
495 A Mark-Up Approach to Add Value
Authors: Ivaylo I. Atanasov, Evelina N. Pencheva
Abstract:
This paper presents a mark-up approach to service creation in Next Generation Networks. The approach allows deriving added value from network functions exposed by Parlay/OSA (Open Service Access) interfaces. With OSA interfaces, service logic scripts may be executed on both call-related and call-unrelated events. To illustrate the approach, XML-based language constructions for data and method definitions, flow control, time measuring and supervision, and database access are given, and an example of an OSA application is considered.
Keywords: Service creation, mark-up approach.
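A hypothetical illustration of the mark-up idea: service logic written as XML and executed by a tiny interpreter. The element names below are invented for illustration; they are not the paper's actual language constructions or the Parlay/OSA API.

    import xml.etree.ElementTree as ET

    SCRIPT = """
    <service>
      <data name="greeting" value="Welcome"/>
      <on event="call-related">
        <invoke method="play_announcement" arg="greeting"/>
      </on>
    </service>
    """

    def run(script, event, methods):
        # Parse data definitions, then dispatch handlers bound to the event.
        root = ET.fromstring(script)
        data = {d.get("name"): d.get("value") for d in root.findall("data")}
        for handler in root.findall("on"):
            if handler.get("event") == event:
                for step in handler.findall("invoke"):
                    methods[step.get("method")](data[step.get("arg")])

    run(SCRIPT, "call-related",
        {"play_announcement": lambda msg: print("announce:", msg)})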
494 Computational Modeling in Strategic Marketing
Authors: Petr Cernohorsky, Jan Voracek
Abstract:
Well-developed strategic marketing planning is the essential prerequisite for establishing the right and unique competitive advantage. A typical market, however, is a heterogeneous and decentralized structure with natural involvement of individual or group subjectivity and irrationality. These features cannot be fully expressed with one-shot rigorous formal models based on, e.g., mathematics, statistics or empirical formulas. We present an innovative solution, extending the domain of agent-based computational economics towards the concept of hybrid modeling in service-provider and consumer markets such as telecommunications. The behavior of the market is described by two classes of agents, consumer and service-provider agents, whose internal dynamics are fundamentally different. Customers are rather free multi-state structures, adjusting behavior and preferences quickly in accordance with time and a changing environment. Producers, on the contrary, are traditionally structured companies with comparable internal processes and specific managerial policies. Their business momentum is higher and their immediate reaction possibilities limited. This limitation underlines the importance of proper strategic planning as the main process advising managers in time whether to continue with more or less the same business or to consider the need for future structural changes that would ensure retention of existing customers or acquisition of new ones.
Keywords: Agent-based computational economics, hybrid modeling, strategic marketing, system dynamics.
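A minimal agent-based sketch of the asymmetry described above: consumers switch quickly on price, providers adjust slowly. All parameters are illustrative, not calibrated to the paper's model.

    import random

    class Consumer:
        def __init__(self):
            self.provider = random.choice([0, 1])
        def step(self, prices):
            cheaper = min((0, 1), key=lambda p: prices[p])
            if prices[self.provider] > prices[cheaper] * 1.1:
                self.provider = cheaper      # fast, price-driven switching

    class Provider:
        def __init__(self, price):
            self.price = price
        def step(self, my_share):
            # Slow structural adjustment: small price cut when losing share.
            if my_share < 0.5:
                self.price *= 0.98

    consumers = [Consumer() for _ in range(1000)]
    providers = [Provider(10.0), Provider(12.0)]
    for t in range(50):
        prices = [p.price for p in providers]
        for c in consumers:
            c.step(prices)
        shares = [sum(c.provider == i for c in consumers) / 1000 for i in (0, 1)]
        for i, p in enumerate(providers):
            p.step(shares[i])
    print(shares, [round(p.price, 2) for p in providers])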
493 Using Satellite Images Datasets for Road Intersection Detection in Route Planning
Authors: Fatma El-zahraa El-taher, Ayman Taha, Jane Courtney, Susan Mckeever
Abstract:
Understanding road networks plays an important role in navigation applications such as self-driving vehicles and route planning for individual journeys. Intersections of roads are essential components of road networks. Understanding the features of an intersection, from a simple T-junction to larger multi-road junctions, is critical to decisions such as crossing roads or selecting the safest routes. The identification and profiling of intersections from satellite images is a challenging task. While deep learning approaches offer the state of the art in image classification and detection, the availability of training datasets is a bottleneck in this approach. In this paper, a labelled satellite image dataset for the intersection recognition problem is presented. It consists of 14,692 satellite images of Washington DC, USA. To support other users of the dataset, an automated download and labelling script is provided for dataset replication. The challenges of construction and fine-grained feature labelling of a satellite image dataset are examined, including the issue of how to address features that are spread across multiple images. Finally, the accuracy of detection of intersections in satellite images is evaluated.
Keywords: Satellite images, remote sensing images, data acquisition, autonomous vehicles, robot navigation, route planning, road intersections.
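A sketch of the tiling-and-labelling step such a dataset needs: slice a large image array into fixed-size tiles and mark tiles containing a known intersection point. This is a pure-NumPy stand-in under assumed conventions; it does not reproduce the dataset's actual script or parameters.

    import numpy as np

    def tile_and_label(image, points, tile=256):
        # points: (row, col) pixel coordinates of known intersections.
        h, w = image.shape[:2]
        labels = {}
        for r0 in range(0, h - tile + 1, tile):
            for c0 in range(0, w - tile + 1, tile):
                hit = any(r0 <= r < r0 + tile and c0 <= c < c0 + tile
                          for r, c in points)
                labels[(r0, c0)] = int(hit)   # 1 = contains an intersection
        return labels

    img = np.zeros((1024, 1024), dtype=np.uint8)
    print(sum(tile_and_label(img, [(300, 700)]).values()), "positive tile(s)")

An intersection lying on a tile boundary would touch several tiles at once, which is exactly the multi-image feature problem the abstract mentions.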
492 An Enhanced SAR-Based Tsunami Detection System
Authors: Jean-Pierre Dubois, Jihad S. Daba, H. Karam, J. Abdallah
Abstract:
Tsunami early detection and warning systems have proved to be of ultimate importance, especially after the destructive tsunami that hit Japan in March 2011. Such systems are crucial to inform the authorities of any risk of a tsunami and of the degree of its danger, in order to make the right decision and notify the public of the actions they need to take to save their lives. The purpose of this research is to enhance existing tsunami detection and warning systems. We first propose an automated and miniaturized model of an early tsunami detection and warning system. The operation of the warning system is simulated using the data acquisition toolbox of Matlab, with measurements acquired from specified internet pages in the absence of the required real-life seismic and hydrologic sensors, and a graphical user interface is built for the system. In the second phase of this work, we implement various satellite image filtering schemes to enhance the acquired synthetic aperture radar images of the tsunami-affected region, which are masked by speckle noise. This enables us to conduct a post-tsunami damage extent study and calculate the percentage damage. We conclude by proposing improvements to the telecommunication infrastructure of existing tsunami warning systems through a migration to IP-based networks and fiber optic links.
Keywords: Detection, GIS, GSN, GTS, GPS, speckle noise, synthetic aperture radar, tsunami, Wiener filter.
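A sketch of the speckle-suppression step using SciPy's adaptive Wiener filter, followed by a crude damage percentage from thresholded pre/post differencing. The synthetic images and the threshold are illustrative, not the paper's data or calibration.

    import numpy as np
    from scipy.signal import wiener

    rng = np.random.default_rng(0)
    pre = rng.gamma(4.0, 25.0, (256, 256))    # speckled pre-event amplitude
    post = pre.copy()
    post[100:150, 100:150] *= 0.3             # simulated damaged patch

    # Adaptive Wiener filtering with a 5x5 local window suppresses speckle.
    pre_f, post_f = wiener(pre, 5), wiener(post, 5)
    changed = np.abs(post_f - pre_f) > 0.5 * pre_f.mean()
    print(f"damage extent: {100 * changed.mean():.1f} % of pixels")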
491 Using the Nerlovian Adjustment Model to Assess the Response of Farmers to Price and Other Related Factors: Evidence from Sierra Leone Rice Cultivation
Authors: Alhaji M. H. Conteh, Xiangbin Yan, Alfred V. Gborie
Abstract:
The goal of this study was to improve the description and assessment of rice acreage response and to offer mechanisms for agricultural policy scrutiny. The ordinary least squares (OLS) technique was utilized to determine the coefficients of acreage response models for the rice varieties. The magnitudes of the coefficients (λ) of both the ROK lagged and NERICA lagged acreages were found to be positive and highly significant, which indicates that the farmers' adjustment rate was very low. Regarding the lagged actual price for both the ROK and NERICA rice varieties, the short-run price elasticities were lower than the long-run ones, suggesting a long-term adjustment of the acreage under the crop.
The apparent recommendations for policy transformation are to open up farm gate prices and to decrease government involvement in the agricultural sector, especially in the acquisition of agricultural inputs. Future research should be centered on how this might best be realized. Necessary conditions should be made available to the private sector by minimizing price volatility. In line with structural reforms, it is necessary to convey output prices to farmers with minimum distortion. There is a need to eradicate price subsidies and controls, which generate market distortions in addition to huge financial costs.
Keywords: Acreage response, rate of adjustment, rice varieties, Sierra Leone.
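For reference, a standard reduced form of the Nerlovian partial-adjustment model estimated in studies of this kind reads (generic notation, not necessarily the authors' exact specification):

    A_t = \beta_0 + \beta_1 P_{t-1} + \lambda A_{t-1} + \varepsilon_t,
    \qquad E_{LR} = \frac{E_{SR}}{1 - \lambda}

Here A_t is the planted acreage, P_{t-1} the lagged price, and λ the coefficient on lagged acreage, so the adjustment rate is (1 − λ). A large, significant λ therefore implies slow adjustment, and short-run elasticities E_SR fall below their long-run counterparts E_LR, exactly the pattern reported above.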
490 Named Entity Recognition using Support Vector Machine: A Language Independent Approach
Authors: Asif Ekbal, Sivaji Bandyopadhyay
Abstract:
Named Entity Recognition (NER) aims to classify each word of a document into predefined target named entity classes and is nowadays considered fundamental for many Natural Language Processing (NLP) tasks such as information retrieval, machine translation, information extraction, question answering systems and others. This paper reports on the development of a NER system for Bengali and Hindi using Support Vector Machines (SVM). Though this state-of-the-art machine learning technique has been widely applied to NER in several well-studied languages, its use for Indian languages (ILs) is very new. The system makes use of the different contextual information of the words along with a variety of features that are helpful in predicting the four different named entity (NE) classes: Person name, Location name, Organization name and Miscellaneous name. We have used annotated corpora of 122,467 tokens of Bengali and 502,974 tokens of Hindi, tagged with the twelve different NE classes defined as part of the IJCNLP-08 NER Shared Task for South and South East Asian Languages (SSEAL). In addition, we have manually annotated 150K wordforms of the Bengali news corpus, developed from the web archive of a leading Bengali newspaper. We have also developed an unsupervised algorithm in order to generate lexical context patterns from a part of the unlabeled Bengali news corpus. Lexical patterns have been used as features of the SVM in order to improve system performance. The NER system has been tested with gold standard test sets of 35K and 60K tokens for Bengali and Hindi, respectively. Evaluation results have demonstrated recall, precision, and f-score values of 88.61%, 80.12%, and 84.15%, respectively, for Bengali and 80.23%, 74.34%, and 77.17%, respectively, for Hindi. Results show an improvement in f-score of 5.13% with the use of context patterns. A statistical analysis (ANOVA) is also performed to compare the performance of the proposed NER system with that of the existing HMM-based system for both languages.
Keywords: Named Entity (NE), Named Entity Recognition (NER), Support Vector Machine (SVM), Bengali, Hindi.
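A sketch of SVM-based NER with contextual window features: each token is represented by itself plus its neighbours and classified into an NE tag. The toy corpus and feature set are illustrative, far smaller than the paper's.

    from sklearn.feature_extraction import DictVectorizer
    from sklearn.svm import LinearSVC
    from sklearn.pipeline import make_pipeline

    sents = [(["Asif", "lives", "in", "Kolkata"], ["PER", "O", "O", "LOC"]),
             (["Sivaji", "visited", "Delhi"], ["PER", "O", "LOC"])]

    def features(tokens, i):
        # Contextual window: the word, its neighbours, and a shape cue.
        return {"w": tokens[i],
                "prev": tokens[i - 1] if i > 0 else "<s>",
                "next": tokens[i + 1] if i < len(tokens) - 1 else "</s>",
                "is_title": tokens[i].istitle()}

    X = [features(t, i) for t, _ in sents for i in range(len(t))]
    y = [tag for _, tags in sents for tag in tags]
    model = make_pipeline(DictVectorizer(), LinearSVC()).fit(X, y)
    print(model.predict([features(["Asif", "visited", "Kolkata"], 2)]))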
489 Evaluation of Food Safety Management Systems of Food Service Establishments within the Greater Accra Region
Authors: Benjamin Osei-Tutu
Abstract:
Food contaminated with biological, chemical and physical hazards usually leads to foodborne illnesses, which in turn increase the disease burden of developing and developed economies. Restaurants play a key role in the food service industry, and violations in the application of standardized food safety management systems in these establishments have been associated with foodborne disease outbreaks. This study was undertaken to assess the level of compliance with the Code of Practice that was developed and implemented after conducting a needs assessment of the food safety management systems employed by food service establishments in Ghana. Data on pre-licence inspections were reviewed to assess the compliance of the food service establishments. During the period under review (2012-2016), 74.52% of the food service facilities in the hospitality industry were in compliance with the FDA's Code of Practice. The main violations observed during the study concerned facility layout and fabrication (61.8%), because these facilities may not have been built for use as food service establishments. Another fact that came to the fore was that redesigning the facilities to bring them into compliance required capital-intensive investments, which some establishments are not prepared for. Other challenges faced by the industry concerned records and documentation, personnel facilities and hygiene, raw material acquisition, storage and control, and cold storage.
Keywords: Assessment, Accra, food safety management systems, restaurants, hotel.
488 Low Value Capacitance Measurement System with Adjustable Lead Capacitance Compensation
Authors: Gautam Sarkar, Anjan Rakshit, Amitava Chatterjee, Kesab Bhattacharya
Abstract:
The present paper describes the development of a low-cost, highly accurate low-capacitance measurement system that can be used over a range of 0-400 pF with a resolution of 1 pF. The range of capacitance may be easily altered by a simple resistance or capacitance variation of the measurement circuit. This capacitance measurement system uses the quad two-input NAND Schmitt trigger circuit CD4093B with hysteresis for the measurement, and it is integrated with a PIC18F2550 microcontroller for data acquisition. The microcontroller interacts with software developed on the PC end through USB, and an attractive graphical user interface (GUI) based system was developed on the PC end to provide the user with a real-time, online display of the capacitance under measurement. The system uses a differential mode of capacitance measurement, with reference to a trimmer capacitance, that effectively compensates lead capacitances, a notorious error encountered in usual low-capacitance measurements. The hysteresis provided in the Schmitt trigger circuits enables reliable operation of the system by greatly minimizing the possibility of false triggering due to stray interference, usually regarded as another source of significant error. Real-life testing showed that the system produces highly accurate capacitance measurements when compared to cutting-edge, high-end digital capacitance meters.
Keywords: Capacitance measurement, NAND Schmitt trigger, microcontroller, GUI, lead compensation, hysteresis.
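A sketch of the differential-measurement arithmetic, assuming a Schmitt-trigger RC relaxation oscillator whose period is proportional to R·C (T = k·R·C, with k set by the trigger thresholds). The unknown capacitance follows from the measured frequency, and subtracting a trimmer-reference channel cancels the lead capacitance. The values of k, R, the frequencies and the trimmer setting are all illustrative, not the paper's circuit constants.

    def capacitance_pf(freq_hz, k=1.4, r_ohm=1.0e6):
        # T = 1/f = k*R*C  ->  C = 1/(f*k*R); returned in picofarads.
        return 1.0 / (freq_hz * k * r_ohm) * 1e12

    f_meas = 5000.0     # oscillator frequency with unknown + leads (Hz)
    f_ref = 6500.0      # frequency with trimmer reference + same leads
    c_ref = 90.0        # trimmer setting in pF

    c_total = capacitance_pf(f_meas)          # unknown + lead capacitance
    c_leads = capacitance_pf(f_ref) - c_ref   # lead capacitance alone
    print(f"unknown capacitance ~ {c_total - c_leads:.0f} pF")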
487 Characteristics of Cognitive Functions among Polish Adolescence with Spelling Disorders
Authors: Izabela Pietras
Abstract:
The developmental level of visual abilities, language, memory processes and intellectual functioning affects the quality of a written text. The following analysis presents the results of diagnostic tests, indicating the most common criterion for the group and stating whether a person has been diagnosed with a cognitive developmental level below the group's average or not. The study's aim is to determine whether there are specific patterns of cognitive deficits which can be distinguished among the group of young people with spelling disorders.
Keywords: Cognitive deficits, cognitive functions, spelling disorders.
486 Effect of Flowrate and Coolant Temperature on the Efficiency of Progressive Freeze Concentration on Simulated Wastewater
Authors: M. Jusoh, R. Mohd Yunus, M. A. Abu Hassan
Abstract:
Freeze concentration freezes or crystallises the water molecules out as ice crystals and leaves behind a highly concentrated solution. In conventional suspension freeze concentration, where ice crystals are formed as a suspension in the mother liquor, separation of the ice is difficult. The size of the ice crystals is also very limited, which requires the use of scraped-surface heat exchangers; these are very expensive and account for approximately 30% of the capital cost. This research uses a newer method, progressive freeze concentration, in which ice crystals are formed as a layer on a designed heat exchanger surface. In this particular research, a helical structured copper crystallisation chamber was designed and fabricated. The effect of two operating conditions on the performance of the newly designed crystallisation chamber was investigated: circulation flowrate and coolant temperature. The performance of the design was evaluated by the effective partition constant, K, calculated from the volume and concentration of the solid and liquid phases. The system was also monitored by a data acquisition tool in order to observe the temperature profile throughout the process. On completing the experimental work, it was found that a higher flowrate resulted in a lower K, which translates into higher efficiency. The efficiency was highest at 1000 ml/min. It was also found that the process gives the highest efficiency at a coolant temperature of -6 °C.
Keywords: Freeze concentration, progressive freeze concentration, freeze wastewater treatment, ice crystals.
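For context, a standard definition of the effective partition constant used in the progressive freeze concentration literature (our gloss in generic notation, not necessarily the authors' exact formulation) is:

    K = \frac{C_S}{C_L}, \qquad
    \frac{d(C_L V_L)}{dV_L} = K C_L
    \;\Longrightarrow\;
    C_L = C_0 \left( \frac{V_L}{V_0} \right)^{K-1}

where C_S and C_L are the solute concentrations in the ice and liquid phases, V_L the remaining liquid volume, and C_0, V_0 the initial values. K → 0 means nearly pure ice is formed (high efficiency), while K = 1 means no concentration occurs, consistent with the lower-K-is-better reading above.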
485 Eliciting and Confirming Data, Information, Knowledge and Wisdom in a Specialist Health Care Setting: The WICKED Method
Authors: S. Impey, D. Berry, S. Furtado, M. Galvin, L. Grogan, O. Hardiman, L. Hederman, M. Heverin, V. Wade, L. Douris, D. O'Sullivan, G. Stephens
Abstract:
Healthcare is a knowledge-rich environment. This knowledge, while valuable, is not always accessible outside the borders of individual clinics. This research aims to address part of this problem (at a study site) by constructing a maximal data set (knowledge artefact) for motor neurone disease (MND). This data set is proposed as an initial knowledge base for a concurrent project to develop an MND patient data platform. It represents the domain knowledge at the study site for the duration of the research (12 months). A knowledge elicitation method was also developed from the lessons learned during this process: the WICKED method. WICKED is an anagram of the initials of the words eliciting and confirming data, information, knowledge, wisdom. But it is also a reference to the concept of wicked problems, which are complex and challenging, as is eliciting expert knowledge. The method was evaluated at a second site, and benefits and limitations were noted. Benefits include that the method provided a systematic way to manage data, information, knowledge and wisdom (DIKW) from various sources, including healthcare specialists and existing data sets. Limitations include the time required and the fact that the data set produced only represents the DIKW known during the research period. Future work is underway to address these limitations.
Keywords: Healthcare, knowledge acquisition, maximal data sets, action design science.
484 Fabrication and Characterization of Al2O3 Based Electrical Insulation Coatings Around SiC Fibers
Authors: S. Palaniyappan, P. K. Chennam, M. Trautmann, H. Ahmad, T. Mehner, T. Lampke, G. Wagner
Abstract:
In structural health monitoring of fiber-reinforced plastics (FRPs), every inorganic fiber sensor that is integrated into the bulk material requires electrical insulation around itself when the surrounding reinforcing fibers are electrically conductive. This allows more accurate data acquisition from the sensor fiber alone, without electrical interference. For this purpose, thin nano-films of aluminium oxide (Al2O3)-based electrical insulation coatings have been fabricated around silicon carbide (SiC) single-fiber sensors through the reactive DC magnetron sputtering technique. The sputtered coatings were amorphous in nature, and the thickness of the coatings increased with an increase in sputter time. Microstructural characterization of the coated fibers performed using scanning electron microscopy (SEM) confirmed a homogeneous circumferential coating with no detectable defects or cracks on the surface. X-ray diffraction (XRD) analyses of the as-sputtered and 2-hour annealed coatings (825 and 1125 °C) revealed the amorphous and crystalline phases of Al2O3, respectively. Raman spectroscopic analyses produced no characteristic bands of Al2O3, as the thickness of the films was in the nanometer (nm) range, too small to overcome the penetration depth of the laser used. In addition, the influence of the insulation coatings on the mechanical properties of the SiC sensor fibers has been analyzed.
Keywords: Al2O3 insulation coating, reactive sputtering, SiC single fiber sensor, single fiber tensile test.
483 A Formatting Method for Transforming XML Data into HTML
Authors: Zhe JIN, Motomichi TOYAMA
Abstract:
In this paper, we propose a fixed formatting method for PPX (Pretty Printer for XML). PPX is a query language for XML databases with extensive formatting capability that produces HTML as the result of a query. The fixed formatting method is to completely specify the combination of variables and layout specification operators within the layout expression of the GENERATE clause of PPX. In the experiment, a quick comparison shows that PPX requires far less description than XSLT or XQuery programs doing the same tasks.
Keywords: PPX, XML, HTML, XSLT, XQuery, fixed formatting method.
482 Challenges and Opportunities of Utilization of Social Media by Business Education Students in Nigeria Universities
Authors: Titus Amodu Umoru
Abstract:
Today's global economy is full of sophistication. All over the world, business and marketing practices are undergoing unprecedented transformation. In realization of this fact, the federal government of Nigeria has put in place a robust transformation agenda in order to put Nigeria in a better position to be a competitive player and, in the process, transform all sectors of its economy. New technologies, especially the Internet, are the driving force behind this transformation. However, technology has inadvertently affected the way businesses are done, thus necessitating the acquisition of new skills. In developing countries like Nigeria, citizens are still battling with the effective application of these technologies. Obviously, students of business education need to acquire relevant business knowledge to be able to transition into the world of work on graduation from school and compete favorably in the labor market. Therefore, effective utilization of social media by both teachers and students can help extensively in empowering students with the needed skills. Social media, a group of Internet-based applications built on the ideological foundations of Web 2.0 that allow the creation and exchange of user-generated content, may be the needed answer to unemployment and poverty in Nigeria if incorporated into the classroom experience, as beneficiaries can easily connect with existing and potential enterprises and customers, engage with them and reinforce mutual business benefits. The challenges and benefits of social media use in education in Nigerian universities are revealed in this study.
Keywords: Challenges, opportunities, utilization, social media, business education.
481 Design, Modeling and Fabrication of a Tactile Sensor and Display System for Application in Laparoscopic Surgery
Authors: M. Ramezanifard, J. Dargahi, S. Najarian, N. Narayanan
Abstract:
One of the major disadvantages of minimally invasive surgery (MIS) is the lack of tactile feedback to the surgeon. In order to identify and avoid damage to the complex tissue grasped by endoscopic graspers, it is important to measure the local softness of tissue during MIS. One way to display the measured softness to the surgeon is a graphical method. In this paper, a new tactile sensor is reported. The tactile sensor consists of an array of four softness sensors, which are integrated into the jaws of a modified commercial endoscopic grasper. Each individual softness sensor consists of two piezoelectric polymer polyvinylidene fluoride (PVDF) films, which are positioned below a rigid and a compliant cylinder. The compliant cylinder is fabricated using a micro-molding technique. The combination of output voltages from the PVDF films is used to determine the softness of the grasped object. A theoretical analysis of the sensor is also presented. A method has been developed with the aim of reproducing the tactile softness for the surgeon using a graphical method. In this approach, the proposed system, including the interfacing and the data acquisition card, receives signals from the array of softness sensors. After the signals are processed, the tactile information is displayed by means of a color-coding method. It is shown that the degrees of softness of the grasped objects/tissues can be visually differentiated and displayed on a monitor.
Keywords: Minimally invasive surgery, robotic surgery, sensor, softness, tactile.
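A sketch of the color-coding display idea: combine the two PVDF outputs into a softness estimate and map it to an RGB value for on-screen display. The ratio-based estimate and the color ramp are illustrative, not the authors' calibration.

    def softness(v_rigid, v_compliant):
        # Softer objects deflect the compliant cylinder more, shifting
        # the share of the total signal carried by its PVDF film.
        return v_compliant / (v_rigid + v_compliant)

    def to_rgb(s):
        # 0 = hard (red) ... 1 = soft (green), simple linear ramp.
        return (int(255 * (1 - s)), int(255 * s), 0)

    for vr, vc in [(0.9, 0.1), (0.5, 0.5), (0.2, 0.8)]:
        s = softness(vr, vc)
        print(f"softness {s:.2f} -> RGB {to_rgb(s)}")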
480 A Study of Combined Mechanical and Chemical Stabilisation of Fine Grained Dredge Soil of River Jhelum
Authors: Adnan F. Sheikh, Fayaz A. Mir
Abstract:
After the devastating flood in Kashmir in 2014, dredging of the local water bodies, especially the Jhelum River, has become a priority for the government. Under the project named 'Comprehensive Flood Management Programme', the local government plans to increase the discharge of existing flood channels through removal of encroachments, acquisition of additional land, dredging and other works on the water bodies. The total quantity of soil to be dredged will be 16.15 lakh cubic metres. The dredged soil is a major component resulting from the project that requires disposal or utilization. This study analyses the effect of cement and sand on the engineering properties of this soil. The tests were conducted with variable additions of sand (10%, 20% and 30%), whereas cement was added at 12%. Samples with the following compositions were tested: soil-cement (12%) and soil-sand (30%). Laboratory experiments were conducted to determine the engineering characteristics of the soil, i.e., compaction, strength, and CBR characteristics. The strength characteristics were determined by the unconfined compressive strength test and the direct shear test. Unconfined compressive strength was tested immediately and after a curing period of seven days. The CBR test was performed on unsoaked, soaked (worst condition, 4 days) and cured (4 days) samples.
Keywords: Comprehensive flood management programme, dredge soil, strength characteristics, flood.
479 Computational Design of Inhibitory Agents of BMP-Noggin Interaction to Promote Osteogenesis
Authors: Shaila Ahmed, Raghu Prasad Rao Metpally, Sreedhara Sangadala, Boojala Vijay B Reddy
Abstract:
Bone growth factors such as Bone Morphogenic Protein-2 (BMP-2) have been approved by the FDA to replace grafting in some surgical interventions, but the high dose required limits their use in patients. Noggin, an extracellular protein, blocks the effect of BMP-2 by binding to it. Preventing the BMP-2/noggin interaction will help increase the free concentration of BMP-2 and therefore should enhance its efficacy in inducing bone formation. The work presented here involves the computational design of novel small-molecule inhibitory agents of the BMP-2/noggin interaction, based on our current understanding of BMP-2 and its known putative ligands (receptors and antagonists). A successful acquisition of such an inhibitory agent would allow clinicians to reduce the dose of BMP-2 protein required in clinical applications to promote osteogenesis. The available crystal structures of the BMPs, their receptors, and the binding partner noggin were analyzed to identify the critical residues involved in their interaction. In this study, the LUDI de novo design method was utilized to perform virtual screening of a large number of compounds from a commercially available library against the binding sites of noggin, to identify lead chemical compounds that could potentially block the BMP-2/noggin interaction with high specificity.
Keywords: Transforming growth factor-beta, bone morphogenic proteins, noggin, LUDI de novo design method, CAP small molecules.
478 Making Computer Learn Color
Authors: Rinaldo Christian Tanumara, Ming Xie
Abstract:
Color categorization is shared among the members of a society. This allows the communication of color, especially when using a natural language such as English. Hence a sociable robot, to coexist with humans in human society, must also share this color categorization. To achieve this, many works have relied on modeling human color perception, with considerable mathematical complexity. In contrast, in this work, the computer, as the brain of the robot, learns color categorization through interaction with humans without much mathematical complexity.
Keywords: Color categorization, color learning, machine learning.
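A deliberately simple stand-in for interaction-driven color learning: each time a human names the color of an RGB sample, the category centroid is updated, and prediction is nearest centroid. The samples are illustrative; the paper's actual procedure is not reproduced here.

    import math

    centroids, counts = {}, {}

    def teach(rgb, name):
        # Running mean of all samples a human has labelled with 'name'.
        n = counts.get(name, 0)
        old = centroids.get(name, (0, 0, 0))
        centroids[name] = tuple((o * n + v) / (n + 1) for o, v in zip(old, rgb))
        counts[name] = n + 1

    def predict(rgb):
        # Assign the category whose centroid is closest in RGB space.
        return min(centroids, key=lambda name: math.dist(centroids[name], rgb))

    teach((250, 10, 10), "red"); teach((200, 30, 20), "red")
    teach((10, 10, 240), "blue")
    print(predict((220, 40, 40)))   # -> red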
477 Simulation and 40 Years of Object-Oriented Programming
Authors: Eugene Kindler
Abstract:
2007 is a jubilee year: in 1967, the programming language SIMULA 67 was presented, which contained all aspects of what was later called object-oriented programming. The present paper describes the development leading up to object-oriented programming, the role of simulation in this development, and other tools that appeared in SIMULA 67 that are nowadays called super-object-oriented programming.
Keywords: Simulation, super-object-oriented programming, object-oriented programming, SIMULA.
476 X-Ray Intensity Measurement Using Frequency Output Sensor for Computed Tomography
Authors: R. M. Siddiqui, D. Z. Moghaddam, T. R. Turlapati, S. H. Khan, I. Ul Ahad
Abstract:
The quality of the 2D and 3D cross-sectional images produced by computed tomography depends primarily on the precision of primary and secondary X-ray intensity detection. The traditional method of primary intensity detection is prone to errors. Our group recently developed an X-ray intensity measurement system with smart X-ray sensors that detects primary X-ray intensity accurately. In this study, a new smart X-ray sensor is developed using the light-to-frequency converter TSL230 from Texas Instruments, which has numerous advantages in terms of noiseless data acquisition and transmission. The TSL230 is built around a silicon photodiode, which converts incoming X-ray radiation into a proportional current signal. A current-to-frequency converter attached to this photodiode on a single monolithic CMOS integrated circuit provides a frequency count proportional to the incoming current signal, in the form of a pulse train. The frequency count is delivered to the PICDEM FS USB board with a PIC18F4550 microcontroller mounted on it. With highly compact electronic hardware, this demo board efficiently reads the smart sensor output data. The frequency-output approach overcomes the nonlinear behavior of sensors with analog output; thus, unattenuated X-ray intensities can be measured precisely and better normalization can be achieved in order to attain high resolution.
Keywords: Computed tomography, detector technology, X-ray intensity measurement.
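A sketch of the frequency-output readout on the PC side: a light-to-frequency sensor emits a pulse train whose frequency is proportional to irradiance, so software converts a gated pulse count to frequency and then to a normalized intensity. The gate time, full-scale frequency and counts below are illustrative values, not the TSL230's datasheet figures or the authors' settings.

    def intensity(pulse_count, gate_s=0.1, full_scale_hz=500_000.0):
        # pulses per gate window -> frequency in Hz -> normalized 0..1.
        freq_hz = pulse_count / gate_s
        return freq_hz / full_scale_hz

    for count in (1200, 25_000, 48_000):
        print(f"{count:>6} pulses -> I = {intensity(count):.4f}")

Because the quantity transmitted is a pulse count rather than an analog level, the link is inherently immune to amplitude noise, which is the noiseless-transmission advantage the abstract refers to.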