Search results for: semantic computing
394 The Importance of Working Memory, Executive and Attention Functions in Attention Deficit Hyperactivity Disorder and Learning Disabilities Diagnostics
Authors: Dorottya Horváth, Tímea Harmath-Tánczos
Abstract:
Attention deficit hyperactivity disorder (ADHD) and learning disabilities are common neurocognitive disorders that can have a significant impact on a child's academic performance. ADHD is characterized by inattention, hyperactivity, and impulsivity, while learning disabilities are characterized by difficulty with specific academic skills, such as reading, writing, or math. The aim of this study was to investigate the working memory, executive, and attention functions of neurotypical children and children with ADHD and learning disabilities in order to fill the gaps in the Hungarian mean test scores of these cognitive functions in children with neurocognitive disorders. Another aim was to specify the neuropsychological differential diagnostic toolkit in terms of the relationships and peculiarities between these cognitive functions. The research question addressed in this study was: How do the working memory, executive, and attention functions of neurotypical children compare to those of children with ADHD and learning disabilities? A self-administered test battery was used as a research tool. Working memory was measured with the Non-Word Repetition Test, the Listening Span Test, the Digit Span Test, and the Reverse Digit Span Test; executive function with the Letter Fluency, Semantic Fluency, and Verb Fluency Tests; and attentional concentration with the d2-R Test. The data for this study was collected from 115 children aged 9-14 years. The children were divided into three groups: neurotypical children (n = 44), children with ADHD without learning disabilities (n = 23), and children with ADHD with learning disabilities (n = 48). The data was analyzed using a variety of statistical methods, including t-tests, ANOVAs, and correlational analyses. The results showed that the performance of children with neurocognitive involvement in working memory, executive functions, and attention was significantly lower than the performance of neurotypical children. However, the results of children with ADHD and ADHD with learning disabilities did not show a significant difference. The findings of this study are important because they provide new insights into the cognitive profiles of children with ADHD and learning disabilities and suggest that working memory, executive functions, and attention are all impaired in children with neurocognitive involvement, regardless of whether they have ADHD or learning disabilities. This information can be used to develop more effective diagnostic and treatment strategies for these disorders.
Keywords: ADHD, attention functions, executive functions, learning disabilities, working memory
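For readers who want to reproduce this style of group comparison, a minimal sketch follows (not the authors' code; the scores are synthetic placeholders and the group means are invented): a one-way ANOVA across the three groups, followed by a t-test between the two clinical groups.

```python
# A minimal sketch of the reported analysis pattern; scores are illustrative
# placeholders, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
neurotypical = rng.normal(100, 15, 44)   # n = 44
adhd = rng.normal(88, 15, 23)            # n = 23
adhd_ld = rng.normal(86, 15, 48)         # n = 48 (ADHD with learning disabilities)

f_stat, p_anova = stats.f_oneway(neurotypical, adhd, adhd_ld)
t_stat, p_t = stats.ttest_ind(adhd, adhd_ld)  # ADHD vs. ADHD + LD

print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")
print(f"ADHD vs. ADHD+LD t-test: t = {t_stat:.2f}, p = {p_t:.4f}")
```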
Procedia PDF Downloads 95
393 An Analysis of Uncoupled Designs in Chicken Egg
Authors: Pratap Sriram Sundar, Chandan Chowdhury, Sagar Kamarthi
Abstract:
Nature has perfected her designs over 3.5 billion years of evolution. Research fields such as biomimicry, biomimetics, bionics, bio-inspired computing, and nature-inspired design have explored nature-made artifacts and systems to understand nature's mechanisms and intelligence. Learning from nature, researchers have generated sustainable designs and innovations in a variety of fields such as energy, architecture, agriculture, transportation, communication, and medicine. Axiomatic design offers a method to judge if a design is good. This paper analyzes design aspects of one of nature's amazing objects: the chicken egg. The functional requirements (FRs) of the components of the object are tabulated and mapped onto nature-chosen design parameters (DPs). The 'independence axiom' of the axiomatic design methodology is applied to analyze couplings and to evaluate whether the egg's design is good (i.e., an uncoupled design) or bad (i.e., a coupled design). The analysis revealed that the egg's design is a good, i.e., uncoupled, design. This approach can be applied to any of nature's artifacts to judge whether their design is good or bad. The methodology is valuable for biomimicry studies, and it can also be very useful in teaching design considerations in biology and bio-inspired innovation.
Keywords: uncoupled design, axiomatic design, nature design, design evaluation
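For illustration, the sketch below states the independence-axiom test in standard axiomatic design notation (this is the textbook formulation, not an excerpt from the paper): a design is uncoupled when the design matrix is diagonal, decoupled when triangular, and coupled otherwise.

```latex
% Independence axiom check on a two-FR design: {FR} = [A]{DP}
\{FR\} = [A]\{DP\}, \qquad
A_{\text{uncoupled}} = \begin{pmatrix} X & 0 \\ 0 & X \end{pmatrix}, \quad
A_{\text{decoupled}} = \begin{pmatrix} X & 0 \\ X & X \end{pmatrix}, \quad
A_{\text{coupled}} = \begin{pmatrix} X & X \\ X & X \end{pmatrix}
```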
Procedia PDF Downloads 173
392 Learning Physics Concepts through Language Syntagmatic Paradigmatic Relations
Authors: C. E. Laburu, M. A. Barros, A. F. Zompero, O. H. M. Silva
Abstract:
The work presents a teaching strategy that employs syntagmatic and paradigmatic linguistic relations in order to monitor students' understanding of physics concepts. Syntagmatic and paradigmatic relations are theoretical elements of semiotic studies, and our research situates and justifies them within the research program of multi-modal representations. Among the multi-modal representations for learning scientific knowledge, syntagmatic and paradigmatic relations operate on the discursive written form. The purpose of using such relations is to seek innovative didactic work with discourse representation in written form before translating it into a different representational form. The research was conducted with a sample of first-year high school students. The students were asked to produce syntagmatic and paradigmatic relations for the statement of Newton's first law. The statement was delivered on paper to each student, who individually wrote out the relations. The students' records were collected for analysis. In one student, used here as an example, it was possible to observe that the moneme replacements and rearrangements produced by paradigmatic and syntagmatic relations, respectively, kept the original meaning of the law. In the paradigmatic production, he specified relevant significant units of the linguistic signs, the monemes, which constitute the first articulation, and each substituted word kept equivalence to the original meaning of the original moneme. It was also noted that many diverse monemes were chosen, with a balanced combination of grammatical monemes (a grammatical moneme is what changes the meaning of a word in certain positions of the syntagma, along with a relatively small number of other monemes; it is the smallest linguistic unit that has grammatical meaning) and lexical monemes (a lexical moneme belongs to unlimited inventories and is endowed with lexical meaning). In the syntagmatic production, the ordering of monemes was syntactically coherent, semantically conservative, and preserved in number. In general, the results showed that the written representation mode based on paradigmatic and syntagmatic linguistic relations qualifies for use in the classroom as a potential identifier and tracker of the meanings students acquire in the process of scientific inquiry.
Keywords: semiotics, language, high school, physics teaching
Procedia PDF Downloads 131
391 Spontaneous and Posed Smile Detection: Deep Learning, Traditional Machine Learning, and Human Performance
Authors: Liang Wang, Beste F. Yuksel, David Guy Brizan
Abstract:
A computational model of affect that distinguishes between spontaneous and posed smiles with no errors on a large, popular data set using deep learning techniques is presented in this paper. A Long Short-Term Memory (LSTM) classifier, a type of Recurrent Neural Network, is utilized and compared to human classification. Results showed that while human classification (mean of 0.7133) was above chance, the LSTM model was more accurate than human classification and other comparable state-of-the-art systems. Additionally, a high accuracy rate was maintained with small amounts of training videos (70 instances). Important features were derived and analyzed to further understand the success of the computational model, and it was inferred that thousands of pairs of points within the eyes and mouth are important throughout all time segments in a smile. This suggests that distinguishing between a posed and a spontaneous smile is a complex task, one which may account for the difficulty and lower accuracy of human classification compared to machine learning models.
Keywords: affective computing, affect detection, computer vision, deep learning, human-computer interaction, machine learning, posed smile detection, spontaneous smile detection
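A minimal sketch of an LSTM sequence classifier of the kind described follows (not the authors' exact architecture; the input dimensions, layer sizes, and synthetic data are all illustrative assumptions).

```python
# Sketch of an LSTM classifier over per-frame facial features.
# Shapes are hypothetical: 70 videos, 120 frames each, 136 landmark features.
import numpy as np
import tensorflow as tf

n_videos, n_frames, n_features = 70, 120, 136
x = np.random.rand(n_videos, n_frames, n_features).astype("float32")
y = np.random.randint(0, 2, size=n_videos)  # 0 = posed, 1 = spontaneous

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(n_frames, n_features)),
    tf.keras.layers.LSTM(64),                       # summarizes the smile over time
    tf.keras.layers.Dense(1, activation="sigmoid")  # posed vs. spontaneous
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=5, batch_size=8, verbose=0)
```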
Procedia PDF Downloads 125
390 Estimation of Implicit Colebrook White Equation by Preferable Explicit Approximations in the Practical Turbulent Pipe Flow
Authors: Itissam Abuiziah
Abstract:
In several hydraulic systems, it is necessary to calculate the head losses, which depend on the flow resistance friction factor in the Darcy equation. Computing the resistance friction is based on the implicit Colebrook-White equation, which is considered the standard for friction calculation, but it carries a high computational cost; therefore, several explicit approximation methods are used to solve the implicit equation and overcome this issue. The relative error is then used to determine the most accurate among the approximation methods. Steel, cast iron, and polyethylene pipe materials were investigated, with practical diameters ranging from 0.1 m to 2.5 m and velocities between 0.6 m/s and 3 m/s. In short, the results obtained show that the method best suited for some cases may not be accurate for others. For example, for steel pipe materials, the Zigrang and Sylvester method proved the most precise at low velocities (0.6 m/s to 1.3 m/s). Comparatively, the Haaland method showed a lower relative error with the gradual increase in velocity. Accordingly, the simulation results of this study might be employed by hydraulic engineers, enabling them to decide which method is most applicable to their practical pipe systems.
Keywords: Colebrook–White, explicit equation, friction factor, hydraulic resistance, implicit equation, Reynolds numbers
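To make the implicit/explicit contrast concrete, here is a small sketch (our own illustration, not the authors' code) that solves the Colebrook-White equation by fixed-point iteration and compares it with the explicit Haaland approximation.

```python
# Implicit Colebrook-White vs. explicit Haaland; pipe values are an example.
import math

def colebrook(re, rel_rough, tol=1e-10):
    """Iterate 1/sqrt(f) = -2 log10(eps/(3.7 D) + 2.51/(Re sqrt(f)))."""
    x = 7.0  # initial guess for 1/sqrt(f)
    while True:
        x_new = -2.0 * math.log10(rel_rough / 3.7 + 2.51 * x / re)
        if abs(x_new - x) < tol:
            return 1.0 / x_new**2
        x = x_new

def haaland(re, rel_rough):
    """Explicit: 1/sqrt(f) = -1.8 log10((eps/D / 3.7)^1.11 + 6.9/Re)."""
    return (-1.8 * math.log10((rel_rough / 3.7) ** 1.11 + 6.9 / re)) ** -2

# Example: steel pipe, eps = 0.045 mm, D = 0.1 m, v = 1 m/s, water (nu ~ 1e-6)
re = 1.0 * 0.1 / 1.0e-6            # Re = v D / nu ~ 1e5
eps_d = 0.045e-3 / 0.1             # relative roughness eps/D
f_cw, f_h = colebrook(re, eps_d), haaland(re, eps_d)
print(f"Colebrook f = {f_cw:.5f}, Haaland f = {f_h:.5f}, "
      f"relative error = {abs(f_h - f_cw) / f_cw:.2%}")
```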
Procedia PDF Downloads 187
389 Hydroinformatics of Smart Cities: Real-Time Water Quality Prediction Model Using a Hybrid Approach
Authors: Elisa Coraggio, Dawei Han, Weiru Liu, Theo Tryfonas
Abstract:
Water is one of the most important resources for human society. The world is currently undergoing a wave of urban growth, and pollution problems have a great impact. Monitoring water quality is a key task for the future of the environment and the human species. In recent times, researchers using Smart City technologies have been trying to mitigate the problems generated by population growth in urban areas. The availability of huge amounts of data collected by a pervasive urban IoT can increase the transparency of decision making. Several services have already been implemented in Smart Cities, but more and more services will be involved in the future. Water quality monitoring can successfully be implemented in the urban IoT. The combination of water quality sensors, cloud computing, smart city infrastructure, and IoT technology can lead to a bright future for environmental monitoring. In the past decades, much effort has been put into monitoring and predicting water quality using traditional approaches based on manual collection and laboratory-based analysis, which are slow and laborious. The present study proposes a methodology for implementing a water quality prediction model using artificial intelligence techniques and comparing the results obtained with different algorithms. Furthermore, a 3D numerical model will be created using the software D-Water Quality, and simulation results will be used as a training dataset for the artificial intelligence algorithm. This study derives the methodology and demonstrates its implementation based on information and data collected at the floating harbour in the city of Bristol (UK). The city of Bristol is blessed with the Bristol-Is-Open infrastructure that includes a Wi-Fi network and virtual machines. It was also named the UK's smartest city in 2017.
Keywords: artificial intelligence, hydroinformatics, numerical modelling, smart cities, water quality
Procedia PDF Downloads 188
388 Consumer Perception of 3D Body Scanning While Online Shopping for Clothing
Authors: A. Grilec, S. Petrak, M. Mahnic Naglic
Abstract:
Technological development and globalization in the production and sales of clothing in the last decade have significantly changed consumers' relationship with industrially fashioned apparel and the way clothing is purchased. Internet sales of clothing are constantly and significantly increasing in the global market, but the possibilities offered by modern computing technologies in the customization segment are not yet fully exploited, especially with respect to individual customer requirements and body sizes. Considering the growing trend of online shopping, the main goal of this paper is to investigate customer perceptions of online apparel shopping and, in particular, to discover the main differences in perceptions between customers of three different body sizes. To meet the research goal, a quantitative study on a sample of 85 Croatian consumers was conducted in 2017 in Zagreb, Croatia. Respondents were asked to indicate their level of agreement on a five-point Likert scale ranging from strongly disagree (1) to strongly agree (5). To analyze respondents' attitudes, simple and descriptive statistics were used. The main findings highlight the differences in respondents' perceptions of 3D body scanning, of using 3D body scanning in Internet shopping, and of online apparel shopping habits with regard to their body sizes.
Keywords: consumer behavior, Internet, 3D body scanning, body types
Procedia PDF Downloads 164
387 Educational Data Mining: The Case of the Department of Mathematics and Computing in the Period 2009-2018
Authors: Mário Ernesto Sitoe, Orlando Zacarias
Abstract:
University education is influenced by several factors, ranging from the adoption of strategies to strengthen the whole process to the improvement of students' own academic performance. This work uses data mining techniques to develop a predictive model to identify students with a tendency toward evasion and retention. To this end, a database of real students' data from the Department of University Admission (DAU) and the Department of Mathematics and Informatics (DMI) was used. The data comprised 388 undergraduate students admitted in the years 2009 to 2014. The Weka tool was used for model building, using three different techniques, namely: k-nearest neighbor, random forest, and logistic regression. To allow for training on multiple train-test splits, a cross-validation approach was employed with a varying number of folds. To reduce bias and variance and improve the performance of the models, the ensemble methods of bagging and stacking were used. After comparing the results obtained by the three classifiers, logistic regression using bagging with seven folds obtained the best performance, showing results above 90% in all evaluated metrics: accuracy, true positive rate, and precision. Retention is the most common tendency.
Keywords: evasion and retention, cross-validation, bagging, stacking
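A hedged sketch of the winning configuration (logistic regression with bagging, evaluated with 7-fold cross-validation) follows; the study used Weka, so this scikit-learn version with synthetic data only illustrates the setup.

```python
# Bagged logistic regression with 7-fold CV; X and y stand in for the student data.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=388, n_features=12, random_state=42)

clf = BaggingClassifier(
    estimator=LogisticRegression(max_iter=1000),  # 'base_estimator' in older sklearn
    n_estimators=10,
    random_state=42,
)
scores = cross_val_score(clf, X, y, cv=7, scoring="accuracy")
print(f"7-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```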
Procedia PDF Downloads 82
386 Computing Transition Intensity Using Time-Homogeneous Markov Jump Process: Case of South African HIV/AIDS Disposition
Authors: A. Bayaga
Abstract:
This research provides a technical account of estimating transition probabilities using a time-homogeneous Markov jump process, applied to South African HIV/AIDS data from Statistics South Africa. It employs a maximum likelihood estimation (MLE) model to explore the possible influence of transition probabilities on mortality cases, with the data based on actual Statistics South Africa records. This was conducted via an integrated demographic and epidemiological model of the South African HIV/AIDS epidemic. The model was fitted to age-specific HIV prevalence data and recorded death data using the MLE model. Although previous model results suggest that HIV in South Africa has declined and AIDS mortality rates have declined over 2002-2013, our results differ evidently from the generally accepted HIV models (Spectrum/EPP and ASSA2008) in South Africa. There is therefore a need for supplementary research to enhance the demographic parameters in the model, as well as to apply it to each of the nine provinces of South Africa.
Keywords: AIDS mortality rates, epidemiological model, time-homogeneous Markov jump process, transition probability, Statistics South Africa
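The standard maximum likelihood estimator for the transition intensities of a time-homogeneous Markov jump process is q_ij = N_ij / T_i, where N_ij counts observed jumps from state i to j and T_i is the total time spent in state i. The sketch below illustrates this on an invented toy trajectory (the state space and numbers are placeholders, not the Statistics South Africa data).

```python
# MLE of a generator matrix Q from one observed trajectory (toy data).
import numpy as np

states = ["HIV-", "HIV+", "AIDS", "Dead"]        # hypothetical state space
path = [(0, 2.0), (1, 3.5), (2, 1.0), (3, 0.0)]  # (state, holding time in years)

n = len(states)
jumps = np.zeros((n, n))
time_in_state = np.zeros(n)
for (s, dt), (s_next, _) in zip(path[:-1], path[1:]):
    jumps[s, s_next] += 1
    time_in_state[s] += dt

Q = np.zeros((n, n))
occupied = time_in_state > 0
Q[occupied] = jumps[occupied] / time_in_state[occupied, None]  # q_ij = N_ij / T_i
np.fill_diagonal(Q, -Q.sum(axis=1))              # rows of a generator sum to zero
print(Q)
```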
Procedia PDF Downloads 497
385 Automatic Verification Technology of Virtual Machine Software Patch on IaaS Cloud
Authors: Yoji Yamato
Abstract:
In this paper, we propose an automatic verification technology for software patches in user virtual environments on IaaS Cloud, to decrease the verification costs of patches. Nowadays, IaaS services have spread, and many users can customize virtual machines on IaaS Cloud like their own private servers. Regarding software patches for the OS or middleware installed on virtual machines, users need to adopt and verify these patches by themselves. This task increases users' operating costs. Our proposed method replicates user virtual environments, extracts verification test cases for user virtual environments from a test case DB, distributes patches to virtual machines on the replicated environments, and conducts those test cases automatically on the replicated environments. We have implemented the proposed method on OpenStack using Jenkins and confirmed its feasibility. Using the implementation, we confirmed the effectiveness of our proposed idea of two-tier abstraction of software functions and test cases in reducing test case creation efforts. We also evaluated the automatic verification performance of environment replication, test case extraction, and test case execution.
Keywords: OpenStack, cloud computing, automatic verification, Jenkins
Procedia PDF Downloads 489
384 HcDD: The Hybrid Combination of Disk Drives in Active Storage Systems
Authors: Shu Yin, Zhiyang Ding, Jianzhong Huang, Xiaojun Ruan, Xiaomin Zhu, Xiao Qin
Abstract:
Since large-scale and data-intensive applications have been widely deployed, there is a growing demand for high-performance storage systems to support data-intensive applications. Compared with traditional storage systems, next-generation systems will embrace dedicated processors to reduce the computational load of host machines and will have hybrid combinations of different storage devices. The advent of the flash-memory-based solid state disk has been critical in revolutionizing the storage world. However, instead of simply replacing the traditional magnetic hard disk with the solid state disk, it is believed that finding a complementary approach to incorporate both of them is more challenging and attractive. This paper explores the idea of active storage, an emerging new storage configuration, in terms of its architecture and design, its parallel processing capability, the cooperation of other machines in a cluster computing environment, and a disk configuration: the hybrid combination of different types of disk drives. Experimental results indicate that the proposed HcDD achieves better I/O performance and a longer storage system lifespan.
Keywords: parallel storage system, hybrid storage system, data intensive, solid state disks, reliability
Procedia PDF Downloads 448
383 Enhancing Cultural Heritage Data Retrieval by Mapping COURAGE to CIDOC Conceptual Reference Model
Authors: Ghazal Faraj, Andras Micsik
Abstract:
The CIDOC Conceptual Reference Model (CRM) is an extensible ontology that provides integrated access to heterogeneous and digital datasets. The CIDOC-CRM offers a 'semantic glue' intended to promote accessibility to several diverse and dispersed sources of cultural heritage data. That is achieved by providing a formal structure for the implicit and explicit concepts and their relationships in the cultural heritage field. The COURAGE ('Cultural Opposition – Understanding the CultuRal HeritAGE of Dissent in the Former Socialist Countries') project aimed to explore methods of socialist-era cultural resistance during 1950-1990 and was planned to serve as a basis for further narratives and digital humanities (DH) research. The project highlights the diversity of the alternative cultural scenes that flourished in Eastern Europe before 1989. Moreover, the COURAGE dataset is an online RDF-based registry that consists of historical people, organizations, collections, and featured items. To increase the inter-links between different datasets and retrieve more relevant data from various data silos, a shared federated ontology for reconciled data is needed. As a first step towards these goals, a full understanding of the CIDOC CRM ontology (the target ontology), as well as of the COURAGE dataset, was required to start the work. Subsequently, the queries toward the ontology were determined, and a table of equivalent properties from COURAGE and CIDOC CRM was created. Structural diagrams that clarify the mapping process and the construction of queries are in progress for mapping person, organization, and collection entities to the ontology. Through mapping the COURAGE dataset to the CIDOC-CRM ontology, the dataset will have a common ontological foundation with several other datasets. Therefore, the expected results are: 1) retrieving more detailed data about existing entities, 2) retrieving new entities' data, 3) aligning the COURAGE dataset to a standard vocabulary, and 4) running distributed SPARQL queries over several CIDOC-CRM datasets and testing the potential of distributed query answering using SPARQL. The next plan is to map CIDOC-CRM to other upper-level ontologies or large datasets (e.g., DBpedia, Wikidata) and address similar questions on a wide variety of knowledge bases.
Keywords: CIDOC CRM, cultural heritage data, COURAGE dataset, ontology alignment
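As an illustration of the kind of query the mapping enables, the sketch below runs a SPARQL query for persons against a hypothetical endpoint (the endpoint URL is a placeholder; E21_Person, P1_is_identified_by, and P190_has_symbolic_content are real CIDOC CRM terms, though the exact modelling of names in COURAGE may differ).

```python
# Query a CIDOC-CRM-aligned dataset for persons and their names.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "http://example.org/courage/sparql"  # placeholder endpoint

query = """
PREFIX crm: <http://www.cidoc-crm.org/cidoc-crm/>
SELECT ?person ?name WHERE {
  ?person a crm:E21_Person ;
          crm:P1_is_identified_by ?appellation .
  ?appellation crm:P190_has_symbolic_content ?name .
} LIMIT 10
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(query)
sparql.setReturnFormat(JSON)
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["person"]["value"], row["name"]["value"])
```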
Procedia PDF Downloads 147
382 Meaning Interpretation of Persian Noun-Noun Compounds: A Conceptual Blending Approach
Authors: Bahareh Yousefian, Laurel Smith Stvan
Abstract:
Linguistic structures have two facades: form and meaning. These structures can have either literal or figurative meaning (although this can also depend on the context in which the structure appears). The literal meaning is more easily understood, but for the figurative meaning, a word or concept is understood through a different word or concept. In linguistic structures with a figurative meaning, it is more difficult to relate forms to meanings than in structures with a literal meaning. In these cases, the relationship between form and figurative meaning can be studied from different perspectives. Various linguists have been curious about what happens in someone's mind when understanding figurative meaning through forms; they have used different perspectives and theories to explain this process. It has been studied through cognitive linguistics as well, in which the mind and mental activities are central. In this viewpoint, meaning (in other words, conceptualization) is considered a mental process. In this descriptive-analytic study, 20 Persian compound nouns with figurative meanings were collected from the Persian-language Moeen Encyclopedic Dictionary and other sources. Examples include ["Sofreh Xaneh"] (traditional restaurant) and ["Dast Yar"] (assistant). These were studied in a cognitive semantics framework using Conceptual Blending Theory, which had not been tested on Persian compound nouns before. It was noted that Conceptual Blending Theory can explain the process of understanding the figurative meanings of Persian compound nouns. Many cognitive linguists believe that conceptual blending is not only a linguistic theory but also a basic human cognitive ability that plays important roles in thought, imagination, and even everyday life (though unconsciously). The ability to use mental spaces and conceptual blending (which is exclusive to humankind) is such a basic but unconscious ability that we are unaware of its existence and importance. What differentiates Conceptual Blending Theory from other ways of understanding figurative meaning is the emergence of new semantic aspects (the emergent structure) that lead to a more comprehensive and precise meaning. In this study, it was found that Conceptual Blending Theory can explain how the figurative meanings of Persian compound nouns are reached from their forms, such as [talkative for the compound word "Bolbol + Zabani" (nightingale + tongue)] and [wage for the compound word "Dast + Ranj" (hand + suffering)].
Keywords: cognitive linguistics, conceptual blending, figurative meaning, Persian compound nouns
Procedia PDF Downloads 77
381 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming
Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero
Abstract:
Tumor growth from a transformed cancer cell up to a clinically apparent mass spans a range of spatial and temporal magnitudes. Through computer simulations, Cellular Automata (CA) can accurately describe the complexity of the development of tumors. Tumor development prognosis can now be made, without making patients undergo annoying medical examinations or painful invasive procedures, if we develop appropriate CA-based software tools. In silico testing mainly refers to computational biology research studies applicable to clinical actions in medicine. Establishing sound computer-based models of cellular behavior certainly reduces costs and saves precious time with respect to carrying out experiments in vitro at labs or in vivo with living cells and organisms. These models aim to produce scientifically relevant results compared to traditional in vitro testing, which is slow, expensive, and does not generally have acceptable reproducibility under the same conditions. For speeding up computer simulations of cellular models, the specific literature shows recent proposals based on the CA approach that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic or stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is just beginning to be developed by the concerned research community. The use of stochastic cellular automata (SCA), whose parallel programming implementations are open to yielding high computational performance, is of much interest to be explored up to its computational limits. There have been some approaches based on optimizations to advance multiparadigm models of tumor growth, which mainly pursue improved performance of these models through guaranteed efficient memory accesses, or by considering the dynamic evolution of the memory space (grids, trees, ...) that holds crucial data in simulations. In our opinion, the different optimizations mentioned above are not decisive enough to achieve the high-performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a promising multiplatform framework to develop new programming techniques to speed up the computation time of simulations has only started to be explored in the last few years. This paper presents a model that incorporates parallel processing, identifying the synchronization necessary for speeding up tumor growth simulations implemented in Java and C++ programming environments; a sketch of the thread-pool idea appears after this abstract. The speed-up improvement obtained with specific parallel syntactic constructs, such as executors (thread pools) in Java, is studied. The new tumor growth parallel model is proved using implementations in the Java and C++ languages on two different platforms: an Intel Core i-X chipset and an HPC cluster of processors at our university. The parallelization of the Poleszczuk and Enderling model (normally used by researchers in mathematical oncology) proposed here is analyzed with respect to performance gain. We intend to apply the model and the overall parallelization technique presented here to solid tumors of specific affiliation such as prostate, breast, or colon. Our final objective is to set up a multiparadigm model capable of modelling angiogenesis, or the growth inhibition induced by chemotaxis, as well as the effect of therapies based on the presence of cytotoxic/cytostatic drugs.
Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up
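The paper's implementations are in Java (executors) and C++; purely as a language-neutral illustration of the thread-pool idea, the sketch below advances a toy growth CA by horizontal strips on a pool of workers, with synchronization at each generation (the growth rule and all parameters are invented, not the Poleszczuk-Enderling rules).

```python
# Strip-parallel CA step: each worker computes the next state of its strip;
# writes go to a separate "next" grid, so a generation is a synchronization point.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

N = 512
grid = np.zeros((N, N), dtype=np.int8)
grid[N // 2, N // 2] = 1                          # a single transformed cell

def step_strip(args):
    """Toy rule: an empty cell with an occupied neighbour divides into it
    with probability p (a stand-in for a real proliferation rule)."""
    r0, r1, p = args
    padded = np.pad(grid, 1)                      # zero border = empty outside
    occ = sum(np.roll(np.roll(padded, di, 0), dj, 1)
              for di in (-1, 0, 1) for dj in (-1, 0, 1)) - padded
    occ = occ[1:-1, 1:-1][r0:r1]                  # Moore-neighbour counts, this strip
    strip = grid[r0:r1].copy()
    births = (strip == 0) & (occ > 0) & (np.random.random(strip.shape) < p)
    strip[births] = 1                             # global RNG is fine for a sketch
    return r0, strip

with ThreadPoolExecutor(max_workers=4) as pool:   # mirrors a Java thread pool
    for _ in range(10):                           # ten synchronized generations
        bounds = [(i, min(i + N // 4, N), 0.3) for i in range(0, N, N // 4)]
        nxt = np.empty_like(grid)
        for r0, strip in pool.map(step_strip, bounds):
            nxt[r0:r0 + strip.shape[0]] = strip
        grid = nxt

print("tumor cells after 10 generations:", int(grid.sum()))
```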
Procedia PDF Downloads 244
380 The Beam Expansion Method, A Simplified and Efficient Approach of Field Propagation and Resonators Modes Study
Authors: Zaia Derrar Kaddour
Abstract:
The study of a beam throughout an optical path is generally achieved by means of the diffraction integral. Unfortunately, in some problems, this tool turns out to be unfriendly and hard to implement. Instead, the beam expansion method for computing field profiles appears to be an interesting alternative. The beam expansion method consists of expanding the field pattern as a series in a set of orthogonal functions. Propagating each individual component through a circuit and adding up the derived elements leads easily to the result. The problem is then reduced to finding how the expansion coefficients change in a circuit. The beam expansion method requires a systematic study of each type of optical element that can be met in the considered optical path. In this work, we analyze the following fundamental elements: first-order optical systems, hard apertures, and waveguides. We show that the former element type is completely defined thanks to the Gouy phase shift expression we provide, and that the latter two require a suitable mode conversion. To endorse the usefulness and relevance of the beam expansion approach, we show here some of its applications, such as the treatment of the thermal lens effect and the study of unstable resonators.
Keywords: Gouy phase shift, modes, optical resonators, unstable resonators
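In standard notation (ours, not necessarily the paper's, and up to sign convention), the method expands the field on an orthogonal basis and propagates the coefficients; for 1-D Hermite-Gauss modes through a first-order system, each coefficient is simply advanced by the Gouy phase, apart from the common on-axis phase.

```latex
% Expansion of the field and its coefficients on an orthogonal mode basis:
E(x,z) = \sum_n c_n(z)\,\psi_n(x,z), \qquad
c_n(z) = \int E(x,z)\,\psi_n^{*}(x,z)\,dx,
% Free propagation through a first-order system (1-D Hermite-Gauss modes):
c_n(z_2) = c_n(z_1)\, e^{\,i\left(n+\tfrac{1}{2}\right)\Delta\phi_G}, \qquad
\Delta\phi_G = \arctan\!\frac{z_2}{z_R} - \arctan\!\frac{z_1}{z_R}.
```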
Procedia PDF Downloads 62
379 Orientational Pair Correlation Functions Modelling of the LiCl6H2O by the Hybrid Reverse Monte Carlo: Using an Environment Dependence Interaction Potential
Authors: Mohammed Habchi, Sidi Mohammed Mesli, Rafik Benallal, Mohammed Kotbi
Abstract:
On the basis of four partial correlation functions and some geometric constraints obtained from neutron scattering experiments, a Reverse Monte Carlo (RMC) simulation has been performed to study the aqueous electrolyte LiCl6H2O in the glassy state. The obtained 3-dimensional model allows computing pair radial and orientational distribution functions in order to explore the structural features of the system. Unrealistic features appeared in some coordination peaks. To remedy this, we use the Hybrid Reverse Monte Carlo (HRMC) method, incorporating an energy constraint in addition to the usual constraints derived from experiments. The energy of the system is calculated using an Environment Dependence Interaction Potential (EDIP). The effects of ions are studied by comparing correlations between water molecules in the solution and in pure water at room temperature. Our results show a good agreement between experimental and computed partial distribution functions (PDFs) as well as a significant improvement in the orientational distribution curves.
Keywords: LiCl6H2O, glassy state, RMC, HRMC
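The hybrid step can be pictured as a Metropolis-style acceptance test on a combined cost: the usual RMC chi-square against the experimental data plus a weighted energy term from the potential. The sketch below is our schematic reading of that idea, with placeholder functions, weights, and numbers (EDIP itself is not implemented here).

```python
# Schematic HRMC acceptance: chi-square constraint plus an energy penalty.
import math, random

def chi2(model_gr, exp_gr, sigma=0.01):
    """Misfit of the model pair distribution against experiment."""
    return sum((m - e) ** 2 for m, e in zip(model_gr, exp_gr)) / sigma**2

def hrmc_accept(old_chi2, new_chi2, old_E, new_E, w_E=1.0, kT=0.025):
    """Accept a particle move on combined cost = chi2 + w_E * E / kT."""
    cost_old = old_chi2 + w_E * old_E / kT
    cost_new = new_chi2 + w_E * new_E / kT
    if cost_new <= cost_old:
        return True
    return random.random() < math.exp(-(cost_new - cost_old) / 2.0)

# A move that slightly worsens the fit but lowers the energy can be accepted:
print(hrmc_accept(old_chi2=120.0, new_chi2=121.0, old_E=-3.2, new_E=-3.4))
```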
Procedia PDF Downloads 471
378 Characteristics of GaAs/InGaP and AlGaAs/GaAs/InAlGaP Npn Heterostructural Optoelectronic Switches
Authors: Der-Feng Guo
Abstract:
Optoelectronic switches have attracted considerable attention in the semiconductor research field due to their potential applications in optical computing systems and optoelectronic integrated circuits (OEICs). With high gains and high-speed operation, npn heterostructures can be used to produce promising optoelectronic switches. It is known that the bulk barrier and the heterostructure-induced potential spike play important roles in the characteristics of npn heterostructures. To investigate the effects of the bulk barrier and potential spike heights on the optoelectronic switching of npn heterostructures, GaAs/InGaP and AlGaAs/GaAs/InAlGaP npn heterostructural optoelectronic switches (HSOSs) have been fabricated in this work. It is seen that illumination decreases the switching voltage Vs and increases the switching current Is, so that in the optical switching characteristics of the GaAs/InGaP HSOS the OFF state occurs under dark and the ON state under illumination. In the AlGaAs/GaAs/InAlGaP HSOS characteristics, however, Vs and Is show contrary trends, and the OFF state occurs under illumination and the ON state under dark. The studied HSOSs thus show quite different switching variations with incident light, which are mainly attributed to the effects of photogenerated carriers on the bulk barrier and potential spike heights.
Keywords: bulk barrier, heterostructure, optoelectronic switch, potential spike
Procedia PDF Downloads 238
377 The KAPSARC Energy Policy Database: Introducing a Quantified Library of China's Energy Policies
Authors: Philipp Galkin
Abstract:
Government policy is a critical factor in the understanding of energy markets. Regardless, it is rarely approached systematically from a research perspective. Gaining a precise understanding of what policies exist, their intended outcomes, geographical extent, duration, evolution, etc., would enable the research community to answer a variety of questions that, for now, are either oversimplified or ignored. Policy, on its surface, also seems a rather unstructured and qualitative undertaking. There may be quantitative components, but incorporating the concept of policy analysis into quantitative analysis remains a challenge. The KAPSARC Energy Policy Database (KEPD) is intended to address these two energy policy research limitations. Our approach is to represent policies within a quantitative library of the specific policy measures contained within a set of legal documents. Each of these measures is recorded in the database as a single entry characterized by a set of qualitative and quantitative attributes. Initially, we have focused on the major laws at the national level that regulate coal in China. However, KAPSARC is engaged in various efforts to apply this methodology to other energy policy domains. To ensure the scalability and sustainability of our project, we are exploring semantic processing using automated computer algorithms. Automated coding can provide more convenient input data for human coders and serve as a quality control option. Our initial findings suggest that the methodology utilized in KEPD could be applied to any set of energy policies. It also provides a convenient tool to facilitate understanding in the energy policy realm, enabling the researcher to quickly identify, summarize, and digest policy documents and specific policy measures. The KEPD captures a wide range of information about each individual policy contained within a single policy document. This enables a variety of analyses, such as structural comparison of policy documents, tracing policy evolution, stakeholder analysis, and exploring interdependencies of policies and their attributes with exogenous datasets using statistical tools. The usability and broad range of research implications suggest a need for the continued expansion of the KEPD to encompass a larger scope of policy documents across geographies and energy sectors.
Keywords: China, energy policy, policy analysis, policy database
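As a purely hypothetical illustration of the 'single entry' idea (the field names and values below are invented for illustration and are not the KEPD schema), one extracted policy measure might be recorded as:

```python
# Hypothetical record for one policy measure extracted from a legal document.
policy_measure = {
    "document": "Coal Law of the People's Republic of China",  # source document
    "measure_id": "CN-COAL-EXAMPLE-001",
    "instrument_type": "regulation",        # qualitative attributes
    "target_sector": "coal",
    "geographic_scope": "national",
    "effective_from": 2005,                 # quantitative attributes
    "effective_to": None,                   # open-ended duration
    "quantified_target": {"variable": "production cap",
                          "value": 3.9, "unit": "Gt/year"},  # invented numbers
}
```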
Procedia PDF Downloads 323
376 A Framework for Early Differential Diagnosis of Tropical Confusable Diseases Using the Fuzzy Cognitive Map Engine
Authors: Faith-Michael E. Uzoka, Boluwaji A. Akinnuwesi, Taiwo Amoo, Flora Aladi, Stephen Fashoto, Moses Olaniyan, Joseph Osuji
Abstract:
The overarching aim of this study is to develop a soft-computing system for the differential diagnosis of tropical diseases. These conditions are of concern to health bodies, physicians, and the community at large because of their mortality rates and the difficulties of early diagnosis, due to the fact that they present with overlapping symptoms and thus become 'confusable'. We report on the first phase of our study, which focuses on the development of a fuzzy cognitive map model for early differential diagnosis of tropical diseases. We used malaria as a case disease to show the effectiveness of FCM technology as an aid to the medical practitioner in the diagnosis of tropical diseases. Our model takes cognizance of manifested symptoms and other non-clinical factors that could contribute to symptom manifestation. The model showed 85% accuracy in diagnosis, as against the physicians' initial hypothesis, which stood at 55% accuracy. It is expected that the next stage of our study will provide a multi-disease, multi-symptom model that also improves efficiency by utilizing a decision support filter that works on an algorithm mimicking the physician's diagnostic process.
Keywords: medical diagnosis, tropical diseases, fuzzy cognitive map, decision support filters, malaria differential diagnosis
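A minimal sketch of fuzzy cognitive map inference follows: concept activations are iterated through A(t+1) = f(A(t) + A(t)W) with a sigmoid threshold function f. The concepts and weights below are invented for illustration and are not the study's actual map.

```python
# Toy FCM: symptoms and risk factors drive a "malaria" concept.
import numpy as np

concepts = ["fever", "chills", "headache", "rainy season", "malaria"]
W = np.array([                 # W[i, j]: causal influence of concept i on j
    [0, 0, 0, 0, 0.7],
    [0, 0, 0, 0, 0.6],
    [0, 0, 0, 0, 0.3],
    [0, 0, 0, 0, 0.4],
    [0, 0, 0, 0, 0.0],
])

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

A = np.array([0.9, 0.8, 0.5, 1.0, 0.0])   # observed activations, disease unknown
for _ in range(20):                        # iterate until (near) convergence
    A = sigmoid(A + A @ W)

print(f"malaria activation: {A[concepts.index('malaria')]:.2f}")
```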
Procedia PDF Downloads 320
375 Infinite Impulse Response Digital Filters Design
Authors: Phuoc Si Nguyen
Abstract:
Infinite impulse response (IIR) filters can be designed from an analogue low pass prototype by using frequency transformation in the s-domain and bilinear z-transformation with pre-warping frequency; this method is known as frequency transformation from the s-domain to the z-domain. This paper will introduce a new method to transform an IIR digital filter to another type of IIR digital filter (low pass, high pass, band pass, band stop or narrow band) using a technique based on inverse bilinear z-transformation and inverse matrices. First, a matrix equation is derived from inverse bilinear z-transformation and Pascal's triangle. This Low Pass Digital to Digital Filter Pascal Matrix Equation is used to transform a low pass digital filter to other digital filter types. From this equation and the inverse matrix, a Digital to Digital Filter Pascal Matrix Equation can be derived that is able to transform any IIR digital filter. This paper will also introduce some specific matrices to replace the inverse matrix, which is difficult to determine due to the larger size of the matrix in the current method. This will make computing and hand calculation easier when transforming from one IIR digital filter to another in the digital domain.
Keywords: bilinear z-transformation, frequency transformation, inverse bilinear z-transformation, IIR digital filters
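For orientation, the classical route the paper starts from (analogue low-pass prototype, frequency pre-warping, bilinear z-transformation) can be sketched with SciPy as below; the paper's Pascal-matrix equations themselves are not reproduced here.

```python
# Analogue Butterworth prototype -> pre-warp -> bilinear z-transformation.
import numpy as np
from scipy import signal

fs = 8000.0          # sampling frequency, Hz
fc = 1000.0          # desired digital cut-off, Hz
wc = 2 * fs * np.tan(np.pi * fc / fs)     # pre-warped analogue frequency, rad/s

b_a, a_a = signal.butter(4, wc, btype="low", analog=True)  # analogue prototype
b_z, a_z = signal.bilinear(b_a, a_a, fs)                   # to the z-domain

w, h = signal.freqz(b_z, a_z, fs=fs)
gain_db = 20 * np.log10(abs(h[np.argmin(abs(w - fc))]))
print(f"gain at {fc:.0f} Hz: {gain_db:.2f} dB")            # ~ -3 dB for Butterworth
```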
Procedia PDF Downloads 423
374 Employing Innovative Pedagogy: Collaborative (Online) Learning and Teaching In An International Setting
Authors: Sonja Gögele, Petra Kletzenbauer
Abstract:
International strategies rank among the core activities in the development plans of Austrian universities. This has led to numerous promising activities in terms of internationalization (i.e., the development of international degree programmes, increased staff and student mobility, and blended international projects). The latest innovative approach is the so-called Blended Intensive Programme (BIP), which combines jointly delivered teaching and learning elements of at least three participating ERASMUS universities in a virtual and short-term mobility setup. Students who participate in a BIP can maintain their study plans at their home institution and include the BIP as a parallel activity. This paper presents the experiences of such a programme on the topic of sustainable computing, hosted by the University of Applied Sciences FH JOANNEUM. By means of an online survey and face-to-face interviews with all stakeholders (20 students, 8 professors), the empirical study addresses the challenges of hosting an international blended learning programme (i.e., a virtual phase and an on-site intensive phase) and discusses the impact of such activities in terms of innovative pedagogy (i.e., virtual collaboration and research-based learning).
Keywords: internationalization, collaborative learning, blended intensive programme, pedagogy
Procedia PDF Downloads 132
373 The Relationship between Spanish Economic Variables: Evidence from the Wavelet Techniques
Authors: Concepcion Gonzalez-Concepcion, Maria Candelaria Gil-Fariña, Celina Pestano-Gabino
Abstract:
We analyze six relevant economic and financial variables for the period 2000M1-2015M3 in the context of the Spanish economy: a financial index (IBEX35), a commodity (crude oil price in euros), a foreign exchange index (EUR/USD), a bond (the Spanish 10-Year Bond), the Spanish National Debt, and the Consumer Price Index. The goal of this paper is to analyze the main relations between them by computing the Wavelet Power Spectrum and the Cross Wavelet Coherency associated with Morlet wavelets. Using a special toolbox in MATLAB, we focus our interest on the period variable. We decompose the time-frequency effects and improve the interpretation of the results for non-expert users of wavelet theory. The empirical evidence shows certain instability periods and reveals various changes and breaks in the causality relationships for the sample data. The variables were also individually analyzed with Daubechies wavelets to visualize high-frequency variance, seasonality, and trend. The results are included in the Proceedings of the 20th International Academic Conference, 2015, International Institute of Social and Economic Sciences (IISES), Madrid.
Keywords: economic and financial variables, Spain, time-frequency domain, wavelet coherency
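The paper uses a MATLAB toolbox; as a rough Python analogue (with a synthetic series standing in for the actual monthly data), the Morlet wavelet power spectrum can be sketched as below; cross-wavelet quantities would combine two such transforms.

```python
# Morlet wavelet power spectrum of a synthetic monthly series (183 months).
import numpy as np
import pywt

t = np.arange(183)                         # 2000M1-2015M3
series = np.sin(2 * np.pi * t / 12) + 0.5 * np.random.randn(t.size)

scales = np.arange(1, 64)
coeffs, freqs = pywt.cwt(series, scales, "morl", sampling_period=1.0)
power = np.abs(coeffs) ** 2                # wavelet power spectrum

# Scale relates to period via the wavelet's central frequency (freqs = cf/scale).
dominant = scales[np.argmax(power.mean(axis=1))]
print(f"dominant scale ~ {dominant} (annual cycle expected near scale ~10)")
```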
Procedia PDF Downloads 240
372 The Potential Threat of Cyberterrorism to the National Security: Theoretical Framework
Authors: Abdulrahman S. Alqahtani
Abstract:
The revolution in computing and networks could revolutionise terrorism in the same way that it has brought about changes in other aspects of life. The modern technological era has faced countries with a new set of security challenges. Many states and potential adversaries have the potential and capacity in cyberspace, which makes them able to carry out cyber-attacks in the future. Some of them are currently conducting surveillance, gathering and analysing technical information, and mapping the networks, nodes, and infrastructure of opponents, all of which may be exploited in future conflicts. This poster presents the results of a quantitative study (survey) testing the validity of the proposed theoretical framework for cyberterrorist threats. The theoretical framework will help in developing an in-depth understanding of these new digital terrorist threats. It may also be a practical guide for managers and technicians in critical infrastructure to understand and assess the threats they face, and it might serve as the foundation for building a national strategy to counter cyberterrorism. First, the poster provides basic information about the data. To purify the data, reliability analysis and exploratory factor analysis, as well as confirmatory factor analysis (CFA), were performed. Then, Structural Equation Modelling (SEM) was utilised to test the final model of the theory and to assess the overall goodness of fit between the proposed model and the collected data set.
Keywords: cyberterrorism, critical infrastructure, national security, theoretical framework, terrorism
Procedia PDF Downloads 405
371 Real-Time Path Planning for Unmanned Air Vehicles Using Improved Rapidly-Exploring Random Tree and Iterative Trajectory Optimization
Authors: A. Ramalho, L. Romeiro, R. Ventura, A. Suleman
Abstract:
A real-time path planning framework for Unmanned Air Vehicles, and in particular multi-rotors, is proposed. The framework is designed to provide feasible trajectories from the current UAV position to a goal state, taking into account constraints such as obstacle avoidance, problem kinematics, and vehicle limitations such as maximum speed and maximum acceleration. The framework computes feasible paths online, allowing new, unknown, dynamic obstacles to be avoided without fully re-computing the trajectory. These features are achieved using an iterative process in which the robot computes and optimizes the trajectory while performing the mission objectives. A first trajectory is computed using a modified Rapidly-Exploring Random Tree (RRT) algorithm that provides trajectories respecting a maximum curvature constraint. The trajectory optimization is accomplished using the Interior Point Optimizer (IPOPT) as a solver. The framework has proven able to compute a trajectory and optimize it to a local optimum with computational efficiency, making it feasible for real-time operations.
Keywords: interior point optimization, multi-rotors, online path planning, rapidly exploring random trees, trajectory optimization
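A minimal 2-D sketch of the RRT core follows (our own illustration: the paper's modified version additionally enforces the maximum-curvature constraint and feeds the result to IPOPT, neither of which is reproduced here).

```python
# Basic RRT on a 10 x 10 workspace with circular obstacles.
import math, random

def rrt(start, goal, obstacles, step=0.5, iters=2000, goal_tol=0.5):
    nodes, parent = [start], {0: None}
    for _ in range(iters):
        sample = goal if random.random() < 0.05 else (
            random.uniform(0, 10), random.uniform(0, 10))       # goal bias
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        if d == 0:
            continue
        new = (nx + step * (sample[0] - nx) / d, ny + step * (sample[1] - ny) / d)
        if any(math.dist(new, c) < r for c, r in obstacles):
            continue                                            # collision: reject
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < goal_tol:                     # reconstruct path
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k]); k = parent[k]
            return path[::-1]
    return None

path = rrt((0.5, 0.5), (9.5, 9.5), obstacles=[((5, 5), 1.5)])
print(f"path with {len(path)} waypoints" if path else "no path found")
```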
Procedia PDF Downloads 135
370 Theoretical Analysis of the Solid State and Optical Characteristics of Calcium Sulphide Thin Film
Authors: Emmanuel Ifeanyi Ugwu
Abstract:
Calcium sulphide, one of the chalcogenide group of thin films, has been analyzed in this work using a theoretical approach in which a scalar wave is propagated through the thin film medium deposited on a glass substrate. The dielectric medium is assumed to have a homogeneous reference dielectric constant term and a perturbed dielectric function representing the deposited thin film on the surface of the glass substrate. These were substituted into a defined scalar wave equation, which was solved first by transforming it into a Volterra equation of the second kind, using the method of separation of variables on the scalar wave. Subsequently, the Green's function technique was introduced to obtain a model equation for the wave propagating through the thin film, which was then used in computing the propagated field for different input wavelengths representing the UV, visible, and near-infrared regions, considering the influence of the dielectric constants of the thin film on the propagating field. The results obtained were used in turn to compute the band gaps and the solid state and optical properties of the thin film.
Keywords: scalar wave, dielectric constant, calcium sulphide, solid state, optical properties
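In standard notation (ours, not necessarily the author's symbols, and with signs depending on the chosen Green's function convention), the setup described above takes the following form: a scalar field in a medium with a reference dielectric constant plus a film perturbation, recast via the Green's function into an integral equation for the propagated field.

```latex
% Scalar wave equation with perturbed dielectric, and its Green's function form:
\nabla^2\psi(\mathbf{r}) + k_0^2\,\bigl[\varepsilon_{\mathrm{ref}}
  + \Delta\varepsilon(\mathbf{r})\bigr]\,\psi(\mathbf{r}) = 0
\;\;\Longrightarrow\;\;
\psi(\mathbf{r}) = \psi_0(\mathbf{r})
  - k_0^2 \int G(\mathbf{r},\mathbf{r}')\,\Delta\varepsilon(\mathbf{r}')\,
    \psi(\mathbf{r}')\, d^3 r'
```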
Procedia PDF Downloads 118
369 Measuring the Resilience of e-Governments Using an Ontology
Authors: Onyekachi Onwudike, Russell Lock, Iain Phillips
Abstract:
The variability that exists across governments, their departments, and the provisioning of services has been an area of concern in the E-Government domain. There is a need for reuse and integration across government departments, which is accompanied by varying degrees of risks and threats. There is also the need for assessment, prevention, preparation, response, and recovery when dealing with these risks or threats. The ability of a government to cope with the emerging changes that occur within it is known as resilience. In order to forge ahead with concerted efforts to manage reuse- and integration-induced risks or threats to governments, the ambiguities contained within resilience must be addressed. Enhancing resilience in the E-Government domain is synonymous with reducing the risks governments face in the provisioning of services as well as the reuse of components across departments. Therefore, it can be said that resilience is responsible for the reduction in a government's vulnerability to changes. In this paper, we present the use of an ontology to measure the resilience of governments. This ontology is made up of a well-defined construct for the taxonomy of resilience. A specific class known as 'Resilience Requirements' is added to the ontology; this class embraces the concept of resilience in the E-Government domain ontology. Considering that the E-Government domain is a highly complex one, made up of different departments offering different services, the reliability and resilience of the E-Government domain have become more complex and critical to understand. We present questions that can help a government assess how prepared it is in the face of risks and what steps can be taken to recover from them. These questions can be asked with the use of queries. The ontology includes a case study section that is used to explore ways in which government departments can become resilient to the different kinds of risks and threats they may face. A collection of resilience tools and resources has been developed in our ontology to encourage governments to take steps to prepare for emergencies and risks that a government may face with the integration of departments and the reuse of components across government departments. To achieve this, the ontology has been extended with rules. We present two tools for understanding resilience in the E-Government domain as a risk analysis target, and the output of these tools when applied to resilience in the E-Government domain. We introduce the classification of resilience using the defined taxonomy and the modelling of existing relationships based on it. The ontology is constructed on formal theory, and it provides a semantic reference framework for the concept of resilience. Key terms which fall under the purview of resilience with respect to E-Governments are defined, and the relationships that exist between risks and resilience are made explicit. The overall aim of the ontology is for it to be used within standards that would be followed by all governments for government-based resilience measures.
Keywords: E-Government, ontology, relationships, resilience, risks, threats
Procedia PDF Downloads 337
368 Review of the Legislative and Policy Issues in Promoting Infrastructure Development to Promote Automation in Telecom Industry
Authors: Marvin Ricardo Awarab
Abstract:
There has never been a greater need for telecom services. The Internet of Things (IoT), 5G networking, and edge computing are the driving forces behind this increased demand. The fierce demand offers communications service providers significant income opportunities. The telecom sector is centered on automation, and realizing a digital operation that functions as a real-time business will be crucial for the industry as a whole. Automation in telecom refers to the application of technology to create a more effective, quick, and scalable alternative to the conventional way of operating the telecom industry. With the promotion of 5G and the Internet of Things (IoT), telecom companies will continue to invest extensively in telecom automation technology. Automation offers benefits in the telecom industry; however, developing countries such as Namibia may not fully tap into such benefits because of the lack of funds and infrastructural resources to invest in automation. This paper fully investigates the benefits of automation in the telecom industry. Furthermore, the paper identifies the hiccups that developing countries such as Namibia face in their quest to fully introduce automation in the telecom industry. Additionally, the paper proposes possible avenues through which Namibia, as a developing country, can invest in automation infrastructural resources with the aim of reaping the full benefits of automation in the telecom industry.
Keywords: automation, development, internet, internet of things, network, telecom, telecommunications policy, 5G
Procedia PDF Downloads 63
367 A Study on Information Structure in the Vajrachedika-Prajna-paramita Sutra and Translation Aspect
Authors: Yoon-Cheol Park
Abstract:
This research focuses on examining the information structures in the old Chinese character-Korean translation of the Vajrachedika-prajna-paramita sutra. The background of this research is the fact that no previous research has looked into the information structures in the target text of the Vajrachedika-prajna-paramita sutra until now. Existing research on Buddhist scripture translation mainly puts weight on message conveyance by literal and semantic translation methods. But message conveyance from one language to another needs to be delivered with an equivalent information structure. Thus, this research investigates the flow of old and new information in the target text of the Buddhist scripture, compared with the source text. The Vajrachedika-prajna-paramita sutra, unlike other Buddhist scriptures, is composed of conversational structures between Buddha and his disciple, Suboli. This implies that the information flow can be changed by utterance context and certain propositions. So, this research analyzes the flow of old and new information within the source and target texts. As a result of the analysis, this research establishes the following facts. Firstly, there are differences in information flow in message conveyance between old Chinese characters and Korean owing to language features: old Chinese characters exhibit an old-new information flow, while Korean exhibits a new-old information flow because of word order. Secondly, the source text of the Vajrachedika-prajna-paramita sutra includes abstruse terminology, jargon, and abstract words. These influence the target text and cause changes in the information flow, but the repetitive expression of these words provides old information in the target text. Lastly, the Vajrachedika-prajna-paramita sutra offers an expository structure arising from the conversations between Buddha and Suboli; that is, the information flow is developed by explaining specific subjects and paraphrasing unfamiliar phrases and expressions. From the results of the analysis above, this research verifies that the information structures in the target text of the Vajrachedika-prajna-paramita sutra are changed by specific subjects and terminology, are developed with a new-old information flow through repetitive expressions or word order, and reveal information structures familiar to the target culture. It also implies that the translation of the Vajrachedika-prajna-paramita sutra, as a religious book, needs message conveyance that takes into account the information structures of the two languages.
Keywords: abstruse terminologies, the information structure, new and old information, old Chinese character-Korean translation
Procedia PDF Downloads 368
366 Stackelberg Security Game for Optimizing Security of Federated Internet of Things Platform Instances
Authors: Violeta Damjanovic-Behrendt
Abstract:
This paper presents an approach for optimal cyber security decisions to protect instances of a federated Internet of Things (IoT) platform in the cloud. The presented solution implements the repeated Stackelberg Security Game (SSG) and a model called the Stochastic Human behaviour model with AttRactiveness and Probability weighting (SHARP). SHARP employs the Subjective Utility Quantal Response (SUQR) for formulating a subjective utility function, which is based on the evaluation of alternative solutions during decision-making. We augment the repeated SSG (including SHARP and SUQR) with a reinforcement learning algorithm called Naïve Q-Learning. Naïve Q-Learning belongs to the category of active and model-free Machine Learning (ML) techniques in which the agent (either the defender or the attacker) attempts to find an optimal security solution. In this way, we combine game-theoretic and ML algorithms for discovering optimal cyber security policies. The proposed security optimization components will be validated in a collaborative cloud platform that is based on the Industrial Internet Reference Architecture (IIRA) and its recently published security model.
Keywords: security, internet of things, cloud computing, Stackelberg game, machine learning, naive Q-learning
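A minimal sketch of the tabular ('naive') Q-learning loop follows; the states, actions, and reward function are invented placeholders, since the SSG/SHARP environment is not modelled here.

```python
# Tabular Q-learning for a toy defender: learn which defence to pick per state.
import random

states, actions = range(4), range(3)       # e.g. threat levels x defence options
Q = {(s, a): 0.0 for s in states for a in actions}
alpha, gamma, eps = 0.1, 0.9, 0.2

def env_step(s, a):
    """Hypothetical environment: returns (next_state, defender_reward)."""
    return random.choice(list(states)), random.gauss(-abs(s - a), 1.0)

s = 0
for _ in range(10_000):
    a = (random.choice(list(actions)) if random.random() < eps
         else max(actions, key=lambda x: Q[(s, x)]))          # epsilon-greedy
    s2, r = env_step(s, a)
    best_next = max(Q[(s2, x)] for x in actions)
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])  # Q-learning update
    s = s2

policy = {s: max(actions, key=lambda a: Q[(s, a)]) for s in states}
print(policy)   # learned defence per threat level
```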
Procedia PDF Downloads 354
365 The Use of Building Energy Simulation Software in Case Studies: A Literature Review
Authors: Arman Ameen, Mathias Cehlin
Abstract:
The use of Building Energy Simulation (BES) software has increased in the last two decades, in parallel with the development of increased computing power and easy-to-use software applications. This type of software is primarily used to simulate the energy use and the indoor environment of a building. The rapid development of these types of software has raised their level of user-friendliness, improved parameter input options, and increased the possibilities for analysis, both for a single building component and for an entire building. This, in turn, has led many researchers to utilize BES software in their research to various degrees. The aim of this paper is to carry out a literature review concerning the use of the BES software IDA Indoor Climate and Energy (IDA ICE) in the scientific community. The focus of this paper is specifically the use of the software for whole-building energy simulation: the number and types of articles and their publication dates, the area of application, the types of parameters used, the location of the studied building, the type of building, the type of analysis, and the solution methodology. Another aspect examined, which is of great interest, is the method of validating the simulation results. The results show that there is an upward trend in the use of IDA ICE and that researchers use the software to various degrees depending on the case and aim of their research. The degree to which the simulations in these articles are satisfactorily validated varies depending on the type of article and type of analysis.
Keywords: building simulation, IDA ICE, literature review, validation
Procedia PDF Downloads 136