Search results for: Research information systems (RIS)
36443 Large-Scale Simulations of Turbulence Using Discontinuous Spectral Element Method
Authors: A. Peyvan, D. Li, J. Komperda, F. Mashayek
Abstract:
Turbulence can be observed in a variety of fluid motions in nature and industrial applications. Recent investment in high-speed aircraft and propulsion systems has revitalized fundamental research on turbulent flows. In these systems, capturing chaotic fluid structures with different length and time scales is accomplished through the Direct Numerical Simulation (DNS) approach, since it accurately simulates flows down to the smallest dissipative scales, i.e., the Kolmogorov scales. The discontinuous spectral element method (DSEM) is a high-order technique that uses spectral functions to approximate the solution. The DSEM code has been developed by our research group over the course of more than two decades. Recently, the code has been improved to run large cases on the order of billions of solution points. Running such large simulations requires a considerable amount of RAM. Therefore, the DSEM code must be highly parallelized and able to start on multiple computational nodes of an HPC cluster with distributed memory. However, some pre-processing procedures, such as determining global element information, creating a global face list, and assigning global partitioning and element connection information of the domain for communication, must be done sequentially on a single processing core. A separate code has been written to perform these pre-processing procedures on a local machine. It extracts from the mesh file the minimum amount of information the DSEM code requires to start in parallel and stores it in text files (pre-files). It packs integer-type information in a stream binary format, so the pre-files are portable between machines. The files are generated to ensure fast read performance on different file systems, such as Lustre and the General Parallel File System (GPFS). A new subroutine has been added to the DSEM code to read the startup files using parallel MPI I/O, for Lustre, such that each MPI rank acquires its information from the file in parallel. 
In the case of GPFS, on each computational node a single MPI rank reads the data from a file generated specifically for that node and sends it to the other ranks on the node using point-to-point non-blocking MPI communication. This way, communication takes place locally on each node and signals do not cross the switches of the cluster. The read subroutine has been tested on Argonne National Laboratory's Mira (GPFS), the National Center for Supercomputing Applications' Blue Waters (Lustre), the San Diego Supercomputer Center's Comet (Lustre), and UIC's Extreme (Lustre). The tests showed that one file per node is suited to GPFS, while parallel MPI I/O is the best choice for the Lustre file system. The DSEM code relies on heavily optimized linear algebra operations, such as matrix-matrix and matrix-vector products, to compute the solution at every time step. For this, the code can use either its own matrix math library, BLAS, Intel MKL, or ATLAS. This, together with the discontinuous nature of the method, makes the DSEM code run efficiently in parallel. The results of weak scaling tests performed on Blue Waters showed scalable and efficient parallel performance of the code.
Keywords: computational fluid dynamics, direct numerical simulation, spectral element, turbulent flow
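The core of a parallel startup read is offset arithmetic: each rank computes a disjoint byte window into the shared pre-file and reads it independently (e.g., via a collective call such as MPI_File_read_at_all on Lustre). The sketch below shows only that arithmetic, with illustrative names and record sizes not taken from the DSEM code:

```python
def rank_read_window(rank, n_ranks, n_points, record_bytes, header_bytes=0):
    """Compute the byte offset and length each MPI rank reads from a
    shared startup file, so all ranks read disjoint slices in parallel.

    Points are distributed as evenly as possible: the first
    (n_points % n_ranks) ranks each get one extra point.
    """
    base, extra = divmod(n_points, n_ranks)
    count = base + (1 if rank < extra else 0)
    start = rank * base + min(rank, extra)
    offset = header_bytes + start * record_bytes
    return offset, count * record_bytes

# Example: 10 solution-point records of 8 bytes each split across 4 ranks.
windows = [rank_read_window(r, 4, 10, 8) for r in range(4)]
```

Because the windows are contiguous and non-overlapping, no coordination between ranks is needed beyond knowing the global point count, which is exactly what the sequentially generated pre-files provide.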
Procedia PDF Downloads 133
36442 Conceptual Model for Logistics Information System
Authors: Ana María Rojas Chaparro, Cristian Camilo Sarmiento Chaves
Abstract:
Given the growing importance of logistics as a discipline for the efficient management of material and information flows, the adoption of tools that support decision-making from a global perspective of the system under study has become essential. The article shows how a concept-based model makes it possible to organize and represent reality in an appropriate way, presenting accurate and timely information. These features make this kind of model an ideal component to support an information system, recognizing that such information is essential for identifying the particularities that enable better performance in the sector under evaluation.
Keywords: system, information, conceptual model, logistics
Procedia PDF Downloads 496
36441 An Ontology for Semantic Enrichment of RFID Systems
Authors: Haitham S. Hamza, Mohamed Maher, Shourok Alaa, Aya Khattab, Hadeal Ismail, Kamilia Hosny
Abstract:
Radio Frequency Identification (RFID) has become a key technology in the emerging concept of the Internet of Things (IoT). Naturally, business applications require the deployment of various RFID systems that are developed by different vendors and use various data formats. This heterogeneity poses a real challenge in developing large-scale IoT systems, as integration becomes very complex and challenging. Semantic integration is a key approach to dealing with this challenge. To do so, an ontology for RFID systems needs to be developed in order to semantically annotate RFID systems and, hence, facilitate their integration. Accordingly, in this paper, we propose an ontology for RFID systems. The proposed ontology can be used to semantically enrich RFID systems and, hence, improve their usage and reasoning. The usage of the proposed ontology is explained through a simple scenario in the health care domain.
Keywords: RFID, semantic technology, ontology, SPARQL query language, heterogeneity
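The idea of semantic annotation can be illustrated with a toy triple store queried SPARQL-style. All class and property names below (rfid:Tag, rfid:readBy, etc.) are illustrative placeholders, not terms from the proposed ontology:

```python
# RFID readings annotated as subject-predicate-object triples; identifiers
# here are invented for illustration only.
triples = {
    ("tag:123",  "rdf:type",       "rfid:Tag"),
    ("tag:123",  "rfid:readBy",    "reader:A"),
    ("reader:A", "rdf:type",       "rfid:Reader"),
    ("reader:A", "rfid:locatedIn", "ward:ICU"),
}

def match(pattern, store):
    """Return all triples matching a pattern; None acts as a wildcard,
    like a variable in a one-pattern SPARQL basic graph pattern."""
    s, p, o = pattern
    return [(ts, tp, to) for ts, tp, to in sorted(store)
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Analogue of: SELECT ?x WHERE { ?x rdf:type rfid:Tag }
tags = [s for s, _, _ in match((None, "rdf:type", "rfid:Tag"), triples)]
```

Once readings from heterogeneous vendors are mapped onto shared ontology terms like these, the same query works across all of them, which is the integration benefit the paper targets.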
Procedia PDF Downloads 471
36440 E-Learning Recommender System Based on Collaborative Filtering and Ontology
Authors: John Tarus, Zhendong Niu, Bakhti Khadidja
Abstract:
In recent years, e-learning recommender systems have attracted great attention as a solution to the problem of information overload in e-learning environments and as a way of providing relevant recommendations to online learners. E-learning recommenders continue to play an increasing educational role in helping learners find appropriate learning materials to support the achievement of their learning goals. Although general recommender systems have recorded significant success in solving the problem of information overload in e-commerce domains and providing accurate recommendations, e-learning recommender systems still face issues arising from differences in learner characteristics such as learning style, skill level, and study level. Conventional recommendation techniques such as collaborative filtering and content-based filtering deal with only two types of entities, namely users and items with their ratings. These conventional recommender systems do not take the learner characteristics into account in their recommendation process and therefore cannot make accurate and personalized recommendations in an e-learning environment. In this paper, we propose a recommendation technique combining collaborative filtering and ontology to recommend personalized learning materials to online learners. The ontology is used to incorporate the learner characteristics into the recommendation process alongside the ratings, while collaborative filtering predicts ratings and generates recommendations. Furthermore, ontological knowledge is used by the recommender system at the initial stages, in the absence of ratings, to alleviate the cold-start problem. Evaluation results show that our proposed recommendation technique outperforms collaborative filtering on its own in terms of personalization and recommendation accuracy.
Keywords: collaborative filtering, e-learning, ontology, recommender system
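One simple way to combine learner characteristics with collaborative filtering, sketched under assumptions of our own (the paper does not specify this exact formula), is to weight each peer's rating by the overlap of ontology-derived profile attributes:

```python
def predict_rating(target, item, ratings, profiles):
    """Predict the target learner's rating for an item as a weighted
    mean of peer ratings, where the weight is the fraction of shared
    profile attributes (learning style, skill level, study level).
    Profile field names and values are illustrative.
    """
    num = den = 0.0
    for peer, peer_ratings in ratings.items():
        if peer == target or item not in peer_ratings:
            continue
        shared = sum(profiles[target][k] == profiles[peer][k]
                     for k in profiles[target])
        sim = shared / len(profiles[target])  # profile overlap in [0, 1]
        num += sim * peer_ratings[item]
        den += sim
    return num / den if den else None

profiles = {
    "ann": {"style": "visual", "skill": "beginner"},
    "bob": {"style": "visual", "skill": "beginner"},
    "cat": {"style": "verbal", "skill": "expert"},
}
ratings = {"ann": {}, "bob": {"course1": 5.0}, "cat": {"course1": 1.0}}
pred = predict_rating("ann", "course1", ratings, profiles)
```

Because the profile overlap is available even before the target learner has rated anything, the same weighting can drive initial recommendations, which is one way to read the paper's cold-start mitigation.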
Procedia PDF Downloads 379
36439 Monitoring and Evaluation of Web-Services Quality and Medium-Term Impact on E-Government Agencies' Efficiency
Authors: A. F. Huseynov, N. T. Mardanov, J. Y. Nakhchivanski
Abstract:
This practical research aims to improve the management quality and efficiency of public administration agencies providing e-services. The monitoring system developed will provide a continuous review of the websites' compliance with the selected indicators, their evaluation based on those indicators, and a ranking of services according to the quality criteria. The responsible departments in the government agencies were surveyed; the questionnaire covers issues of management and feedback, the e-services provided, and the application of information systems. By analyzing the main affecting factors and barriers, recommendations will be given that lead to the relevant decisions to strengthen the state agencies' competencies for the management and provision of their services. Component 1: E-services monitoring system. Three separate monitoring activities are proposed to be executed in parallel. First, continuous tracing of e-government sites using a built-in web-monitoring program; this program generates several quantitative values that mainly relate to the technical characteristics and the performance of websites. Second, expert assessment of e-government sites in accordance with two general criteria. Criterion 1: technical quality of the site. Criterion 2: usability/accessibility (load, see, use). Each high-level criterion is in turn subdivided into several sub-criteria, such as: the fonts and the color of the background (is it readable?), W3C coding standards, availability of robots.txt and the site map, the search engine, the feedback/contact mechanisms, and the security mechanisms. Third, an online survey of the users/citizens, i.e., a small group of questions embedded in the e-service websites. The questionnaires comprise information concerning navigation, users' experience with the website (whether it was positive or negative), etc. 
Automated monitoring of websites on its own cannot capture the whole evaluation process and should therefore be seen as a complement to experts' manual web evaluations. All of the separate results were integrated to provide the complete evaluation picture. Component 2: Assessment of the agencies'/departments' efficiency in providing e-government services. The relevant indicators to evaluate the efficiency and effectiveness of e-services were identified; the survey was conducted in all the governmental organizations (ministries, committees, and agencies) that provide electronic services for citizens or businesses; the quantitative and qualitative measures cover the following sections of activities: e-governance, e-services, the feedback from the users, and the information systems at the agencies' disposal. Main results: 1. The software program and the set of indicators for website evaluation have been developed, and the results of pilot monitoring have been presented. 2. The evaluation of the (internal) efficiency of the e-government agencies based on the survey results, with practical recommendations related to the human potential, the information systems used, and the e-services provided.
Keywords: e-government, web-sites monitoring, survey, internal efficiency
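Integrating sub-criteria such as readable fonts, W3C validity, or the presence of robots.txt into a single ranking reduces, in its simplest form, to a weighted aggregation of pass/fail checks. The weights and sub-criteria below are illustrative, not the paper's actual scoring scheme:

```python
def site_score(checks, weights):
    """Aggregate boolean sub-criterion checks into a 0-100 quality
    score: the weighted fraction of criteria the site passes."""
    total = sum(weights.values())
    passed = sum(w for name, w in weights.items() if checks.get(name))
    return round(100 * passed / total, 1)

# Hypothetical weights reflecting the two high-level criteria
# (technical quality and usability/accessibility).
weights = {"readable_fonts": 1, "w3c_valid": 2, "robots_txt": 1,
           "sitemap": 1, "search_engine": 2, "feedback_form": 2,
           "security_mechanisms": 3}

# Results of one hypothetical automated/expert pass over a site.
checks = {"readable_fonts": True, "w3c_valid": True, "robots_txt": True,
          "sitemap": False, "search_engine": True, "feedback_form": True,
          "security_mechanisms": True}

score = site_score(checks, weights)
```

Ranking the monitored services is then just sorting sites by this score, with the expert and survey components feeding additional weighted terms into the same aggregation.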
Procedia PDF Downloads 304
36438 Transfer of Information Heritage between Algerian Veterinarians and Breeders: Assessment of Information and Communication Technology Using Mobile Phone
Authors: R. Bernaoui, P. Ohly
Abstract:
Our research shows that the use of the mobile phone consolidates the relationships between veterinarians, and between breeders and veterinarians. It also shows that the tool in question is a means of economic development. The results of our survey reveal a positive return for the veterinary community, showing that the mobile phone has become an effective means of sustainable development through the rapid and timely transfer of an information heritage via social networks, including many Internet applications. Our results show that almost all veterinarians use the mobile phone for interprofessional communication. We therefore believe that the use of the mobile phone by livestock operators has greatly improved working conditions, and that the use of this tool contributes to better management of the farm, since it limits travel and also saves time. These results show that we are witnessing growth in the use of mobile telephony technologies whose impact extends to sustainable development. By providing access to information, especially technical information, the mobile phone, and Information and Communication Technology (ICT) in general, gives livestock sector players not only security, by limiting losses, but also an efficiency that allows them better production and productivity.
Keywords: Algeria, breeder-veterinarian, digital heritage, networking
Procedia PDF Downloads 120
36437 Bridging Healthcare Information Systems and Customer Relationship Management for Effective Pandemic Response
Authors: Sharda Kumari
Abstract:
As the Covid-19 pandemic continues to leave its mark on the global business landscape, companies have had to adapt to new realities and find ways to sustain their operations amid social distancing measures, government restrictions, and heightened public health concerns. This unprecedented situation has placed considerable stress on both employees and employers, underscoring the need for innovative approaches to manage the risks associated with Covid-19 transmission in the workplace. In response to these challenges, the pandemic has accelerated the adoption of digital technologies, with an increasing preference for remote interactions and virtual collaboration. Customer relationship management (CRM) systems have risen to prominence as a vital resource for organizations navigating the post-pandemic world, providing a range of benefits that include acquiring new customers, generating insightful consumer data, enhancing customer relationships, and growing market share. In the context of pandemic management, CRM systems offer three primary advantages: (1) integration features that streamline operations and reduce the need for multiple, costly software systems; (2) worldwide accessibility from any internet-enabled device, facilitating efficient remote workforce management during a pandemic; and (3) the capacity for rapid adaptation to changing business conditions, given that most CRM platforms boast a wide array of remotely deployable business growth solutions, a critical attribute when dealing with a dispersed workforce in a pandemic-impacted environment. 
These advantages highlight the pivotal role of CRM systems in helping organizations remain resilient and adaptive in the face of ongoing global challenges.
Keywords: healthcare, CRM, customer relationship management, customer experience, digital transformation, pandemic response, patient monitoring, patient management, healthcare automation, electronic health record, patient billing, healthcare information systems, remote workforce, virtual collaboration, resilience, adaptable business models, integration features, CRM in healthcare, telehealth, pandemic management
Procedia PDF Downloads 101
36436 Knowledge Creation Environment in the Iranian Universities: A Case Study
Authors: Mahdi Shaghaghi, Amir Ghaebi, Fariba Ahmadi
Abstract:
Purpose: The main purpose of the present research is to analyze the knowledge creation environment at an Iranian university (Alzahra University), as a typical university in Iran, using a combination of the i-System and Ba models. This study is necessary for understanding the determinants of knowledge creation at such a university. Methodology: To carry out the present research, which is an applied study in terms of purpose, a descriptive survey method was used. In this study, a combination of the i-System and Ba models has been used to analyze the knowledge creation environment at Alzahra University. The i-System consists of five constructs: intervention (input), intelligence (process), involvement (process), imagination (process), and integration (output). The Ba environment has three pillars, namely the infrastructure, the agent, and the information. The integration of these two models resulted in 11 constructs, as follows: intervention (input); infrastructure-intelligence, agent-intelligence, information-intelligence (process); infrastructure-involvement, agent-involvement, information-involvement (process); infrastructure-imagination, agent-imagination, information-imagination (process); and integration (output). These 11 constructs were incorporated into a 52-statement questionnaire, and the validity and reliability of the questionnaire were examined and confirmed. The statistical population included the faculty members of Alzahra University (344 people). A total of 181 participants were selected through the stratified random sampling technique. Descriptive statistics, the binomial test, regression analysis, and structural equation modeling (SEM) were utilized to analyze the data. Findings: The research findings indicated that among the 11 research constructs, the levels of the intervention, information-intelligence, infrastructure-involvement, and agent-imagination constructs were average and not acceptable. 
The levels of the infrastructure-intelligence and information-imagination constructs ranged from average to low. The levels of the agent-intelligence and information-involvement constructs were completely average. The level of the infrastructure-imagination construct was average to high and thus considered acceptable. The levels of the agent-involvement and integration constructs were above average and in a highly acceptable condition. Furthermore, the regression analysis results indicated that only two constructs, viz. information-imagination and agent-involvement, positively and significantly correlate with the integration construct. The results of the structural equation modeling also revealed that the intervention, intelligence, and involvement constructs are related to the integration construct with the complete mediation of imagination. Discussion and conclusion: The present research suggests that knowledge creation at Alzahra University complies relatively well with the combination of the i-System and Ba models. Unlike this model, the intervention, intelligence, and involvement constructs are not directly related to the integration construct, and this seems to have three implications: 1) information sources are not frequently used to assess and identify research biases; 2) problem finding is probably of less concern at the end of studies and at the time of assessment and validation; 3) the involvement of others plays a smaller role in the summarization, assessment, and validation of the research.
Keywords: i-System, Ba model, knowledge creation, knowledge management, knowledge creation environment, Iranian universities
Procedia PDF Downloads 101
36435 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets
Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe
Abstract:
Data are the primary asset of biomedical researchers and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process, and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools for analyzing such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems. They typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit the computing power of large-scale parallel architectures (e.g., supercomputers, GPU clusters), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain. Much attention has been given to systems that manage phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for the research subjects that reside in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research that leverages existing high-performance computing resources and analysis techniques currently available or under development. It builds these into The Ark, an open-source web-based system designed to manage medical data. 
SPARK provides a next-generation biomedical data management solution based on a novel micro-service architecture and Big Data technologies. The system serves to demonstrate the applicability of micro-service architectures to the development of high-performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from unfeasibly high execution times for basic operations such as insert (i.e., importing a GWAS dataset) and for the queries that are typical of the genomics research domain. SPARK resolves these problems by incorporating the non-relational NoSQL databases that have been driven by the emergence of Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets and enabling cutting-edge analysis approaches that have previously been out of reach for many medical researchers.
Keywords: biomedical research, genomics, information systems, software
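The relational-versus-document trade-off for genotype data can be sketched in a few lines: a normalized layout stores one row per (subject, SNP) pair, so importing a subject means millions of inserts, whereas a document layout stores one embedded record per subject. This toy contrast uses invented subject and SNP identifiers and plain Python dictionaries standing in for a NoSQL store, not SPARK's actual schema:

```python
# Toy genotype matrix: 0/1/2 minor-allele counts per SNP per subject.
snps = ["rs1", "rs2", "rs3"]
genotypes = {"subj1": [0, 1, 2], "subj2": [1, 1, 0]}

# Relational/normalized layout: one row per (subject, snp) pair, so
# row count scales as subjects x SNPs (the import bottleneck).
rows = [(subj, snp, g)
        for subj, calls in sorted(genotypes.items())
        for snp, g in zip(snps, calls)]

# Document-oriented layout: one record per subject with genotypes
# embedded, so a whole-subject import or fetch is a single operation.
documents = {subj: {"_id": subj, "calls": dict(zip(snps, calls))}
             for subj, calls in genotypes.items()}

def genotype(subj, snp):
    return documents[subj]["calls"][snp]
```

With a real chip the row table would hold hundreds of thousands of entries per subject, while the document store still performs one insert per subject, which is the execution-time argument the abstract makes.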
Procedia PDF Downloads 270
36434 The Effect of the Marketing Culture on Improving the E-service Quality: A Comparative Study of Foreign and Domestic Information Technology Companies in the Arab Republic of Egypt
Authors: E. Elgohary, R. Abdelazyz
Abstract:
The research aims to clarify the effect of the marketing culture on improving the e-service quality of foreign and domestic information technology companies in the Arab Republic of Egypt. The researcher therefore included the dimensions of the marketing culture (customer service, management style, sales mission, internal communications, technology, wages and rewards, and innovation) as measures of marketing culture, in order to assess their effect on improving the e-service quality. The research population consists of employees and customers of the companies under study. The research problem was the following question: What is the effect of the actual application of the marketing culture on improving the e-service quality? To answer this, three main hypotheses were adopted and tested by statistical means on data collected through a questionnaire prepared and distributed for this purpose. Accordingly, the research presents a set of results, the most important of which are: the need to pay attention to the dimensions of the marketing culture in order to improve the e-service quality, and that foreign companies applied the marketing culture more than local companies. The research also recommends designing a system to continuously measure the performance of electronic service providers, spreading the culture of innovation among employees, and linking reward programs to the degree of commitment to applying the elements of the marketing culture while doing business.
Keywords: marketing culture, e-service quality, measurement models, quality measurements
Procedia PDF Downloads 228
36433 Technology Futures in Global Militaries: A Forecasting Method Using Abstraction Hierarchies
Authors: Mark Andrew
Abstract:
Geopolitical tensions are at a thirty-year high, and the pace of technological innovation is driving asymmetry in force capabilities between nation states and between non-state actors. Technology futures are a vital component of defence capability growth, and investments in technology futures need to be informed by accurate and reliable forecasts of the options for 'systems of systems' innovation, development, and deployment. This paper describes a method for forecasting technology futures developed through an analysis of four key systems' development stages, namely: technology domain categorisation, scanning results examining novel systems' signals and signs, potential system-of-systems implications in warfare theatres, and political ramifications in terms of funding and development priorities. The method has been applied to several technology domains, including physical systems (e.g., nano weapons, loitering munitions, in-flight charging, and hypersonic missiles), biological systems (e.g., molecular virus weaponry, genetic engineering, brain-computer interfaces, and trans-human augmentation), and information systems (e.g., sensor technologies supporting situation awareness, cyber-driven social attacks, and goal-specification challenges to proliferation and alliance testing). Although the method has so far been applied in a team-centred way using paper-based rapid prototyping and iteration, the application of autonomous language models (such as GPT-3) is anticipated as a next-stage operating platform. Forecasting accuracy and reliability are considered vital in guiding technology development to afford stronger contingencies, as ideological changes are forecast to expand threats to ecology and earth systems, possibly eclipsing the traditional vulnerabilities of nation states. 
The early results from the method will be subjected to ground truthing using longitudinal investigation.
Keywords: forecasting, technology futures, uncertainty, complexity
Procedia PDF Downloads 114
36432 Numerical Study of UV Irradiation Effect on Air Disinfection Systems
Authors: H. Shokouhmand, M. Degheh, B. Sajadi, H. Sobhani
Abstract:
In-duct ultraviolet germicidal irradiation (UVGI) systems are broadly used nowadays, and their use widens every day. Even though these systems are not applicable on their own, they are very suitable supplements to traditional filtration systems. The amount of inactivated microorganisms depends on the air velocity, the lamp power, the fluence rate distribution, and the germicidal susceptibility of the microorganisms. In this paper, these factors are investigated using an air-microorganism two-phase numerical model. The Eulerian-Lagrangian method was used to obtain more detailed information on the history of each particle. The UVGI system was modeled in four steps: 1) modeling the air flow, 2) modeling the discrete phase of particles, 3) modeling the UV intensity field, and 4) modeling the particle inactivation. The results from modeling different lamp arrangements and powers showed that the system functions better with a more homogeneous irradiation distribution. Since increasing the air flow rate of the device increases the particle inactivation rate, the optimal air velocity should be adjusted in accordance with the microorganism production rate and the air quality requirement, using the curves presented in this paper.
Keywords: CFD, microorganism, two-phase flow, ultraviolet germicidal irradiation
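The coupling between air velocity, fluence rate, and germicidal susceptibility in step 4 is commonly captured by a first-order inactivation law: the surviving fraction is exp(-k·D), where the dose D is the fluence rate times the particle's residence time in the irradiated duct section. The sketch below uses this textbook form with illustrative parameter values, not the paper's model:

```python
import math

def surviving_fraction(fluence_rate, duct_length, air_velocity, k):
    """First-order UV inactivation: surviving fraction = exp(-k * D),
    where the dose D (J/m^2) is the fluence rate (W/m^2) times the
    residence time L/v (s). The rate constant k (m^2/J) encodes the
    microorganism's germicidal susceptibility. Values are illustrative.
    """
    dose = fluence_rate * duct_length / air_velocity
    return math.exp(-k * dose)

# Doubling the air velocity halves the residence time and hence the
# per-particle dose, which is why velocity, lamp power, and fluence
# distribution must be optimized together:
slow = surviving_fraction(10.0, 1.0, 2.0, k=0.1)  # dose = 5.0 J/m^2
fast = surviving_fraction(10.0, 1.0, 4.0, k=0.1)  # dose = 2.5 J/m^2
```

This per-particle trade-off is what makes the paper's optimal-velocity curves necessary: higher flow treats more air per unit time while delivering a smaller dose to each particle.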
Procedia PDF Downloads 329
36431 Geographic Information Systems as a Tool to Support the Sustainable Development Goals
Authors: Gulnara N. Nabiyeva, Stephen M. Wheeler
Abstract:
Geographic Information Systems (GIS) is a multipurpose computer-based tool that provides a sophisticated ability to map and analyze data on different spatial layers. However, GIS is far more easily applied in some policy areas than others. This paper seeks to determine the areas of sustainable development, including environmental, economic, and social dimensions, where GIS has been used to date to support efforts to implement the United Nations Sustainable Development Goals (SDGs), and to discuss potential areas where it might be used more. Based on an extensive analysis of published literature, we ranked the SDGs according to how frequently GIS has been used to study related policy. We found that SDG#15 "Life on Land" is most often addressed with GIS, followed by SDG#11 "Sustainable Cities and Communities" and SDG#13 "Climate Action". On the other hand, we determined that SDG#2 "Zero Hunger", SDG#8 "Decent Work and Economic Growth", and SDG#16 "Peace, Justice, and Strong Institutions" are least addressed with GIS. The paper outlines some specific ways in which GIS might be applied to the SDGs currently least linked to this tool.
Keywords: GIS, GIS application, sustainable community development, sustainable development goals
Procedia PDF Downloads 133
36430 Sensory Ethnography and Interaction Design in Immersive Higher Education
Authors: Anna-Kaisa Sjolund
Abstract:
The doctoral thesis examines interaction design and sensory ethnography as tools to create immersive education environments. In recent years, there has been increasing interest and discussion among researchers and educators in immersive education, such as augmented reality tools and virtual glasses, and in the possibilities of utilizing them in education at all levels. Using virtual devices as learning environments, it is possible to create multisensory learning environments. Sensory ethnography in this study refers to the way the senses are considered in terms of their impact on information dynamics in immersive learning environments. The past decade has seen the rapid development of virtual world research and virtual ethnography. Christine Hine's Virtual Ethnography offers an anthropological explanation of net behavior and communication change. Despite her groundbreaking work, time has changed users' communication styles and brought new solutions for ethnographic research. Virtual reality, with all its new potential, has come to the fore, engaging all the senses. Film and image have played an important role in cultural research for centuries; only the focus has changed across times and fields of research. According to Karin Becker, the role of the image in our society is information flow, and she identified two meanings of what visual culture research is. Images and pictures are the artifacts of visual culture. Images can be viewed as a symbolic language that allows digital storytelling. By combining the sense of sight with the other senses, such as hearing, touch, taste, smell, and balance, the use of a virtual learning environment offers students a way to absorb large amounts of information more easily. It also offers teachers different ways to produce study material. This article approaches the core question using sensory ethnography as a research tool. 
Sensory ethnography is used to describe information dynamics in an immersive environment through interaction design. An immersive education environment is understood here as a three-dimensional, interactive learning environment in which the audiovisual aspects are central, but all senses can be taken into consideration. When designing learning environments, or any digital service, interaction design is always needed. The question of what interaction design is remains justified, because there is no simple or consistent idea of what interaction design is, how it can be used as a research method, or whether it is only a description of practical actions. When discussing immersive learning environments or their construction, consideration should be given to interaction design and sensory ethnography.
Keywords: immersive education, sensory ethnography, interaction design, information dynamics
Procedia PDF Downloads 137
36429 Understanding Workplace Behavior through Organizational Culture and Complex Adaptive Systems Theory
Authors: Péter Restás, Andrea Czibor, Zsolt Péter Szabó
Abstract:
Purpose: This article aims to rethink the phenomenon of employee behavior as the product of a system. Both organizational culture and Complex Adaptive Systems (CAS) theory emphasize that individual behavior depends on the specific system and its unique organizational culture. Both of these major theories are represented in the field of organizational studies; however, they are rarely used together for a comprehensive understanding of workplace behavior. Methodology: By reviewing the literature, we use key concepts stemming from organizational culture and CAS theory to show the similarities between these theories and to create an enriched understanding of employee behavior. Findings: a) Workplace behavior is defined here as a social cognition issue. b) Organizations are discussed as complex systems, and cultures as what drive and dictate the cognitive processes of agents in the system. c) Culture gives CAS theory a context that lets us see organizations not just as ever-changing and unpredictable, but as systems that aim to create and maintain stability through recurring behavior. Conclusion: Applying knowledge from culture and CAS theory sheds light on our present understanding of employee behavior and emphasizes the importance of novel approaches in organizational research and management.Keywords: complex adaptive systems theory, employee behavior, organizational culture, stability
Procedia PDF Downloads 415
36428 Valuing Cultural Ecosystem Services of Natural Treatment Systems Using Crowdsourced Data
Authors: Andrea Ghermandi
Abstract:
Natural treatment systems such as constructed wetlands and waste stabilization ponds are increasingly used to treat water and wastewater from a variety of sources, including stormwater and polluted surface water. The provision of ancillary benefits in the form of cultural ecosystem services makes these systems unique among water and wastewater treatment technologies and greatly contributes to determining their potential role in promoting sustainable water management practices. A quantitative analysis of these benefits, however, has been lacking in the literature. Here, a critical assessment of the recreational and educational benefits of natural treatment systems is provided, which combines observed public use from a survey of managers and operators with estimated public use obtained by using geotagged photos from social media as a proxy for visitation rates. Geographic Information Systems (GIS) are used to characterize the spatial boundaries of 273 natural treatment systems worldwide. These boundaries are used as input to the Application Programming Interfaces (APIs) of two popular photo-sharing websites (Flickr and Panoramio) in order to derive the number of photo-user-days, i.e., the number of yearly visits by individual photo users at each site. The adequacy and predictive power of four univariate calibration models using the crowdsourced data as a proxy for visitation are evaluated. A high correlation is found between photo-user-days and observed annual visitors (Pearson's r = 0.811; p-value < 0.001; N = 62). Standardized Major Axis (SMA) regression is found to outperform Ordinary Least Squares regression and count data models in terms of predictive power insofar as standard verification statistics – such as the root mean square error of prediction (RMSEP), the mean absolute error of prediction (MAEP), the reduction of error (RE), and the coefficient of efficiency (CE) – are concerned.
The SMA regression model is used to estimate the intensity of public use in all 273 natural treatment systems. System type, influent water quality, and area are found to statistically affect public use, consistent with a priori expectations. Publicly available information regarding the home locations of the sampled visitors is derived from their social media profiles and used to infer the distance they are willing to travel to visit the natural treatment systems in the database. This information is analyzed using the travel cost method to derive monetary estimates of the recreational benefits of the investigated natural treatment systems. Overall, the findings confirm the opportunities arising from the integrated design and management of natural treatment systems, which combines the objectives of water quality enhancement and the provision of cultural ecosystem services through public use in a multi-functional approach, compatibly with the need to protect public health.Keywords: constructed wetlands, cultural ecosystem services, ecological engineering, waste stabilization ponds
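The SMA calibration used here reduces to a closed-form fit: the slope is the ratio of the standard deviations of the two variables, signed by their correlation, and the line passes through the centroid of the data. A minimal sketch (with hypothetical visitation numbers, not the study's data) might look like:

```python
import numpy as np

def sma_fit(x, y):
    """Standardized (reduced) major axis regression.

    The slope is the ratio of sample standard deviations, signed by
    the Pearson correlation; the intercept forces the line through
    the centroid (mean x, mean y).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Hypothetical example: photo-user-days vs. observed annual visitors,
# fitted on log scales since visitation counts are right-skewed
photo_user_days = np.array([3, 10, 25, 40, 80], float)
visitors = np.array([120, 400, 950, 1700, 3300], float)
b, a = sma_fit(np.log(photo_user_days), np.log(visitors))
predicted = np.exp(a) * photo_user_days ** b  # back-transformed estimates
```

Back-transforming the fitted log-log line then yields visitor estimates for sites where only photo-user-days are available.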
Procedia PDF Downloads 180
36427 Is There a Group of "Digital Natives" at Secondary Schools?
Authors: L. Janská, J. Kubrický
Abstract:
The article describes research focused on the influence of information and communication technology (ICT) on pupils' learning. The investigation deals with the influences that distinguish the group of pupils influenced by ICT from the group of pupils not influenced by ICT. The group influenced by ICT should evince a different approach in a number of areas (in managing two or more activities at once, in quick orientation and searching for information on the Internet, in the ability to quickly and effectively assess data sources, in the assessment of the attitudes and opinions of other users of the network, in critical thinking, in the preference for working in teams, in the sharing of information and personal data via virtual social networking, in insisting on an immediate reaction to their every action, etc.).Keywords: ICT influence, digital natives, pupil's learning
Procedia PDF Downloads 291
36426 An Ultrasonic Signal Processing System for Tomographic Imaging of Reinforced Concrete Structures
Authors: Edwin Forero-Garcia, Jaime Vitola, Brayan Cardenas, Johan Casagua
Abstract:
This research article presents an ultrasonic signal processing system, integrating electronic and computer systems, that performs the capture, conditioning, and analog-to-digital conversion of ultrasonic signals and then carries out their processing and visualization. Capture and conditioning of the signal were accomplished through the design and implementation of an analog electronic system organized in stages: 1. impedance coupling; 2. analog filtering; 3. signal amplification. After signal conditioning, the ultrasonic information was digitized using a microcontroller for subsequent processing. The digital processing of the signals was carried out in MATLAB to produce A-, B-, and D-scan ultrasonic images. Advanced processing was then performed using the Synthetic Aperture Focusing Technique (SAFT) to improve the resolution of the B-scan images. The information from the ultrasonic images was displayed in a user interface developed in .Net with Visual Studio. To validate the system, ultrasonic signals were acquired and used for the non-invasive inspection of reinforced concrete structures, making it possible to identify the existing pathologies in them.Keywords: acquisition, signal processing, ultrasound, SAFT, HMI
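The SAFT step is essentially delay-and-sum focusing: every image pixel accumulates, across all transducer positions, the A-scan sample whose arrival time matches the round-trip travel time to that pixel. A simplified sketch (idealized point-scatterer data; the actual system's aperture geometry and wave speed would differ) could be:

```python
import numpy as np

def saft_bscan(ascans, xs, dt, c, img_x, img_z):
    """Delay-and-sum SAFT: focus each pixel by summing every A-scan
    at the sample matching the round-trip time to that pixel.

    ascans : (n_pos, n_samples) A-scans, one per transducer position
    xs     : transducer x-positions (m); dt: sampling interval (s)
    c      : wave speed (m/s); img_x, img_z: pixel coordinates (m)
    """
    n_pos, n_samp = ascans.shape
    pos = np.arange(n_pos)
    img = np.zeros((len(img_z), len(img_x)))
    for j, z in enumerate(img_z):
        for i, x in enumerate(img_x):
            t = 2.0 * np.hypot(xs - x, z) / c   # round-trip travel times
            idx = np.round(t / dt).astype(int)
            ok = idx < n_samp
            img[j, i] = ascans[pos[ok], idx[ok]].sum()
    return img

# Synthetic demo: one point scatterer at (0.05 m, 0.05 m) in a medium
# with c = 4000 m/s (roughly the longitudinal speed in concrete)
c, dt = 4000.0, 1e-7
xs = np.linspace(0.0, 0.1, 11)          # transducer scan positions
ascans = np.zeros((11, 600))
for k, xk in enumerate(xs):
    ascans[k, int(round(2 * np.hypot(xk - 0.05, 0.05) / c / dt))] = 1.0
img = saft_bscan(ascans, xs, dt, c,
                 np.linspace(0.0, 0.1, 21), np.linspace(0.02, 0.08, 13))
```

Because the echoes from all aperture positions add coherently only at the true scatterer location, the reconstructed image peaks there, which is what sharpens the B-scan resolution.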
Procedia PDF Downloads 107
36425 An Ontology for Smart Learning Environments in Music Education
Authors: Konstantinos Sofianos, Michail Stefanidakis
Abstract:
Nowadays, despite the great advances in technology, most educational frameworks lack a strong educational design basis. E-learning has become prevalent, but it faces various challenges such as student isolation and lack of quality in the learning process. An intelligent learning system provides a student with educational material according to their learning background and learning preferences. It records full information about the student, such as demographic information, learning styles, and academic performance. This information allows the system to be fully adapted to the student’s needs. In this paper, we propose a framework and an ontology for music education, consisting of the learner model and all elements of the learning process (learning objects, teaching methods, learning activities, assessment). This framework can be integrated into an intelligent learning system and used for music education in schools for the development of professional skills and beyond.Keywords: intelligent learning systems, e-learning, music education, ontology, semantic web
Procedia PDF Downloads 139
36424 An Ensemble Learning Method for Applying Particle Swarm Optimization Algorithms to Systems Engineering Problems
Authors: Ken Hampshire, Thomas Mazzuchi, Shahram Sarkani
Abstract:
As a subset of metaheuristics, nature-inspired optimization algorithms such as particle swarm optimization (PSO) have shown promise both in solving intractable problems and in their extensibility to novel problem formulations due to their general approach requiring few assumptions. Unfortunately, single instantiations of algorithms require detailed tuning of parameters and cannot be proven to be best suited to a particular illustrative problem on account of the “no free lunch” (NFL) theorem. Using these algorithms in real-world problems requires exquisite knowledge of the many techniques and is not conducive to reconciling the various approaches to given classes of problems. This research aims to present a unified view of PSO-based approaches from the perspective of relevant systems engineering problems, with the express purpose of then eliciting the best solution for any problem formulation in an ensemble learning bucket of models approach. The central hypothesis of the research is that extending the PSO algorithms found in the literature to real-world optimization problems requires a general ensemble-based method for all problem formulations but a specific implementation and solution for any instance. The main results are a problem-based literature survey and a general method to find more globally optimal solutions for any systems engineering optimization problem.Keywords: particle swarm optimization, nature-inspired optimization, metaheuristics, systems engineering, ensemble learning
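A single PSO instantiation of the kind such an ensemble would draw from can be sketched in a few lines; the inertia and acceleration coefficients below are common defaults, not values from the paper:

```python
import numpy as np

def pso(f, bounds, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best particle swarm optimizer (a sketch, not tuned)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, (n_particles, lo.size))   # positions
    v = np.zeros_like(x)                              # velocities
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pbest_f.argmin()].copy()                # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # inertia + cognitive (personal best) + social (global best) terms
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, float(pbest_f.min())

# Illustrative use on the 2-D sphere function
best_x, best_f = pso(lambda p: float(np.sum(p**2)), ([-5, -5], [5, 5]))
```

In a bucket-of-models ensemble, several such instantiations with different parameterizations would be run on each problem instance and the best-performing solution retained, sidestepping the per-problem tuning the NFL theorem makes unavoidable for any single variant.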
Procedia PDF Downloads 98
36423 Research on Urban Point of Interest Generalization Method Based on Mapping Presentation
Authors: Chengming Li, Yong Yin, Peipei Guo, Xiaoli Liu
Abstract:
Existing point generalization algorithms focus merely on the overall information of point groups, without taking account of the attribute richness of POI (point of interest) data or of a spatial distribution constrained by roads. Against this background, a POI generalization method considering both attribute information and spatial distribution has been proposed. The hierarchical character of urban POI information expression is first analyzed to identify the measurement features of each hierarchy. On this basis, an urban POI generalization strategy is put forward: POIs are divided by the urban road network into three distribution patterns, and corresponding generalization methods are proposed according to the characteristics of the POI data in each distribution pattern. Experimental results showed that a method taking into account both the attribute information and the spatial distribution characteristics of POIs can better implement urban POI generalization in mapping presentation.Keywords: POI, road network, selection method, spatial information expression, distribution pattern
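One way to sketch the idea of combining attribute importance with a road-constrained distribution (a hypothetical scoring scheme, not the paper's exact algorithm) is to select the top-scoring POIs separately within each road-network cell, so that generalization thins every cell rather than collapsing the overall pattern:

```python
from collections import defaultdict

def generalize_pois(pois, keep_ratio=0.5):
    """Illustrative attribute-aware POI selection.

    pois: list of dicts with 'cell' (id of the road-bounded block the
    POI falls in), 'category_weight' and 'popularity' (both assumed
    attribute measures). Within each cell the top-scoring fraction is
    kept, preserving the road-constrained spatial distribution.
    """
    by_cell = defaultdict(list)
    for p in pois:
        by_cell[p["cell"]].append(p)
    kept = []
    for group in by_cell.values():
        group.sort(key=lambda p: p["category_weight"] * p["popularity"],
                   reverse=True)
        n_keep = max(1, round(keep_ratio * len(group)))  # never empty a cell
        kept.extend(group[:n_keep])
    return kept
```

Keeping at least one POI per cell is the simple device here for respecting spatial distribution; the attribute score decides which ones survive.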
Procedia PDF Downloads 410
36422 Basics of SCADA Security: A Technical Approach
Authors: Michał Witas
Abstract:
This paper presents a technical approach to the analysis of the security of SCADA systems. The main goal of the paper is to make SCADA administrators aware of the risks resulting from SCADA system usage and to familiarize them with methods that can be applied to an existing or planned system to increase its overall security level. Because SCADA-based systems have become an industrial standard, more attention should be paid to their security. Industrial Control Systems (ICS) such as SCADA are responsible for controlling crucial aspects of a wide range of industrial processes. Paired with that responsibility is a great deal of money that can be earned or lost, which is the main reason for the increased interest of attackers. Additionally, ICS are often responsible for maintaining resources that are strategic from the point of view of the national economy, such as electricity (including nuclear power plants), heating, water resources, or military facilities, so they can be targets of terrorist cyberattacks. Without proper risk analysis and management, vulnerabilities resulting from the usage of SCADA can be easily exploited by a potential attacker. The paper is based mostly on the author's own experience in systems security, gathered during academic studies and professional work in an international company. As the title suggests, it covers only the basics of the topic, because each of the points mentioned in the document can be the basis for additional research and papers.Keywords: denial of service, SCADA, security policy, distributed network
Procedia PDF Downloads 371
36421 Agro-Measures Influence Soil Physical Parameters in Alternative Farming
Authors: Laura Masilionyte, Danute Jablonskyte-Rasce, Kestutis Venslauskas, Zita Kriauciuniene
Abstract:
Alternative farming systems are used to cultivate high-quality food products and to sustain the viability and fertility of the soil. Plant nutrition in all ecosystems depends not only on fertilization intensity or on the soil's richness in organic matter but also on soil physical parameters – bulk density, structure, and pores with the optimum moisture and air ratio available to plants. Field experiments with alternative (sustainable and organic) farming systems were conducted at the Joniskelis Experimental Station of the Lithuanian Research Centre for Agriculture and Forestry in 2006–2016. The soil of the experimental site was Endocalcari-Endohypogleyic Cambisol (CMg-n-w-can). In the alternative farming systems, farmyard manure, straw, and catch crops for green manure were used for fertilization, both in soil with a low humus content and in soil with a moderate humus content. Fertilization had a more significant effect on soil moisture in the 0–20 cm depth layer than on the other soil physical properties. In the agricultural systems where catch crops were grown, the soil physical characteristics did not differ significantly before the incorporation of their biomass, except for the moisture content, which was lower in rainy periods and higher in drier periods than in the soil of the farming systems without catch crops. Soil bulk density and porosity in the topsoil layer were more dependent on the soil humus content than on the agricultural measures used: in the soil with a moderate humus content, compared with the soil with a low humus content, bulk density was 1.4% lower and porosity 1.8% higher. The research findings allow improvements to be made to alternative farming systems by choosing appropriate combinations of organic fertilizers and catch crops that have a sustainable effect on the soil and maintain the sustainability of soil productivity parameters.
Rational fertilization systems, securing the stability of soil productivity parameters and of crop rotation productivity, will promote the development of organic agriculture.Keywords: agro-measures, soil physical parameters, organic farming, sustainable farming
Procedia PDF Downloads 127
36420 Customers’ Intention to Use Electronic Payment System for Purchasing
Authors: Wanida Suwunniponth
Abstract:
The purpose of this research was to study how business characteristics, website quality, and trust affect the intention to use electronic payment systems for online purchasing. This survey research used a questionnaire to collect data from 300 customers who purchased products online and used an electronic payment system. Descriptive statistics and multiple regression analysis were used to analyze the data. The results revealed that customers had a good opinion of the business characteristics and website quality but a moderate opinion of trust and intention to repurchase. In addition, the business characteristics affected purchase intention the most, followed by website quality and trust, with statistical significance at the 0.05 level. In particular, reputation, communication, information quality, perceived risk, and word of mouth affected the intention to use the electronic payment system, whereas size, system quality, and service quality did not.Keywords: electronic payment, intention, online purchasing, trust
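The multiple regression step can be sketched with ordinary least squares on hypothetical Likert-scale survey scores; the predictor names and coefficients below are illustrative stand-ins, not the study's estimates:

```python
import numpy as np

def multiple_regression(X, y):
    """Ordinary least squares fit: returns (intercept, coefficients)."""
    A = np.column_stack([np.ones(len(X)), X])       # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta[0], beta[1:]

# Hypothetical 1-5 Likert data: columns stand for business
# characteristics, website quality, and trust; the response is
# intention to use an electronic payment system
rng = np.random.default_rng(1)
X = rng.uniform(1, 5, (300, 3))
y = 0.5 + 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] \
    + rng.normal(0, 0.1, 300)
intercept, coefs = multiple_regression(X, y)
```

With 300 respondents, the ordering of the recovered coefficients mirrors the reported finding that business characteristics matter most, followed by website quality and then trust.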
Procedia PDF Downloads 246
36419 Efficient Neural and Fuzzy Models for the Identification of Dynamical Systems
Authors: Aouiche Abdelaziz, Soudani Mouhamed Salah, Aouiche El Moundhe
Abstract:
The present paper addresses the use of Artificial Neural Networks (ANNs) and Fuzzy Inference Systems (FISs) for the identification and control of dynamical systems with some degree of uncertainty. Because ANNs and FISs have an inherent ability to approximate functions and to adapt to changes in inputs and parameters, they can be used to control systems too complex for linear controllers. In this work, we show how ANNs and FISs can be organized into networks that learn from external data. We then present input structures that can be used along with ANNs and FISs to model non-linear systems. Four systems were used to test the identification and control of the proposed structures. The results show that the ANNs (trained with the backpropagation algorithm) and FISs used were efficient in modeling and controlling the non-linear plants.Keywords: non-linear systems, fuzzy set models, neural network, control law
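A minimal illustration of neural identification (a one-hidden-layer network trained by plain backpropagation on a made-up nonlinear plant, not one of the four systems tested in the paper) might be:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nonlinear plant to identify: y[k+1] = 0.6*y[k] + tanh(u[k])
u = rng.uniform(-1, 1, 500)
y = np.zeros(501)
for k in range(500):
    y[k + 1] = 0.6 * y[k] + np.tanh(u[k])

X = np.column_stack([y[:-1], u])   # regressors: past output, past input
t = y[1:]                          # target: next output

# One-hidden-layer network, tanh hidden units, linear output
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)               # forward pass
    err = (h @ W2 + b2).ravel() - t
    d2 = err[:, None] / len(t)             # grad of 0.5 * mean squared error
    dW2, db2 = h.T @ d2, d2.sum(0)
    d1 = (d2 @ W2.T) * (1 - h**2)          # backpropagate through tanh
    dW1, db1 = X.T @ d1, d1.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

pred = (np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
mse = float(np.mean((pred - t) ** 2))
```

The same one-step-ahead regressor structure (past outputs and inputs as network inputs) is the kind of input arrangement the paper discusses for modeling non-linear systems.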
Procedia PDF Downloads 212
36418 Dynamic Ambulance Deployment to Reduce Ambulance Response Times Using Geographic Information Systems
Authors: Masoud Swalehe, Semra Günay
Abstract:
Developed countries are losing many lives to non-communicable diseases as compared to their developing counterparts. The effects of these diseases are mostly sudden and manifest only a very short time before death or a dangerous attack, and this has consolidated the significance of the emergency medical system (EMS) as one of the vital areas of healthcare service delivery. The primary objective of this research is to reduce the ambulance response times (RT) of the Eskişehir province EMS, since a number of studies have established a relationship between ambulance response times and the survival chances of patients, especially out-of-hospital cardiac arrest (OHCA) victims. It has been found that patients who receive out-of-hospital medical attention within a few (4) minutes of cardiac arrest, owing to low ambulance response times, stand a higher chance of survival than patients who wait longer (more than 12 minutes) for out-of-hospital medical care because of higher ambulance response times. The study will make use of geographic information systems (GIS) technology to dynamically reallocate ambulance resources according to demand and time so as to reduce ambulance response times. The geospatial-time distribution of ambulance calls (demand) will be used as a basis for optimal ambulance deployment under the system status management (SSM) strategy, achieving greater demand coverage with the same number of ambulance resources and thereby reducing response times. Drive-time polygons will be used to delineate time-specific facility coverage areas and to suggest additional candidate facility sites to which ambulance resources can be moved to serve higher demand, making use of network analysis techniques. Emergency ambulance call data from 1 January 2014 to 31 December 2014, obtained from the Eskişehir province health directorate, will be used in this study.
This study will focus on the reduction of ambulance response times, which is a key emergency medical services performance indicator.Keywords: emergency medical services, system status management, ambulance response times, geographic information system, geospatial-time distribution, out of hospital cardiac arrest
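A simplified sketch of demand-coverage-driven deployment (a greedy maximal-coverage heuristic over a hypothetical drive-time matrix, standing in for the GIS network analysis) could be:

```python
def greedy_deploy(travel_min, demand, n_ambulances, threshold=4.0):
    """Greedy maximal-coverage ambulance deployment sketch.

    travel_min[i][j]: drive time in minutes from candidate site i to
    demand zone j (in practice derived from drive-time polygons on the
    road network); demand[j]: expected calls in zone j for the current
    time slice. Sites are picked one at a time, each maximizing the
    newly covered demand within the response-time threshold.
    """
    n_sites, n_zones = len(travel_min), len(demand)
    chosen, covered = [], [False] * n_zones
    for _ in range(n_ambulances):
        best_site, best_gain = None, -1
        for i in range(n_sites):
            if i in chosen:
                continue
            gain = sum(demand[j] for j in range(n_zones)
                       if not covered[j] and travel_min[i][j] <= threshold)
            if gain > best_gain:
                best_site, best_gain = i, gain
        chosen.append(best_site)
        for j in range(n_zones):
            if travel_min[best_site][j] <= threshold:
                covered[j] = True
    return chosen
```

Under SSM, such a computation would be rerun for each time slice as the geospatial-time distribution of calls shifts, relocating the same fleet rather than adding vehicles.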
Procedia PDF Downloads 300
36417 Synthetic Classicism: A Machine Learning Approach to the Recognition and Design of Circular Pavilions
Authors: Federico Garrido, Mostafa El Hayani, Ahmed Shams
Abstract:
The exploration of the potential of artificial intelligence (AI) in architecture is still embryonic; however, its latent capacity to change design disciplines is significant. 'Synthetic Classicism' is a research project that questions the underlying aspects of classically organized architecture, not just in aesthetic terms but also from a geometrical and morphological point of view, intending to generate new architectural information using historical examples as source material. The main aim of this paper is to explore the uses of artificial intelligence and machine learning algorithms in architectural design while creating a coherent narrative to be contained within a design process. The purpose is twofold: on one hand, to develop and train machine learning algorithms to produce architectural information about small pavilions and, on the other, to synthesize new information from previous architectural drawings. These algorithms are intended to 'interpret' graphical information from each pavilion and then generate new information from it. Once these algorithms are trained, the procedure is the following: starting from a line profile, a synthetic 'front view' of a pavilion is generated; using it as source material, an isometric view is created from it; and finally, a top view is produced. Thanks to GAN algorithms, it is also possible to generate front and isometric views without any graphical input. The final intention of the research is to produce isometric views out of historical information, such as the pavilions of Sebastiano Serlio, James Gibbs, or John Soane. The idea is to create and interpret new information not just in terms of historical reconstruction but also to explore AI as a novel tool in the narrative of a creative design process.
This research also challenges the idea that algorithmic design is associated only with efficiency or fitness, embracing instead the possibility of a creative collaboration between artificial intelligence and a human designer. Hence the double feature of this research, both analytical and creative: first synthesizing images based on a given dataset and then generating new architectural information from historical references. We find that the possibility of creatively understanding and manipulating historic (and synthetic) information will be a key feature in future innovative design processes. Finally, the main question that we propose is whether an AI could be used not just to create an original and innovative group of simple buildings but also to explore the possibility of fostering a novel architectural sensibility grounded in the specificities of the architectural dataset, whether historic, human-made, or synthetic.Keywords: architecture, central pavilions, classicism, machine learning
Procedia PDF Downloads 140
36416 Analysis of Threats in Interoperability of Medical Devices
Authors: M. Sandhya, R. M. Madhumitha, Sharmila Sankar
Abstract:
Interoperable medical devices (IMDs) face threats due to the increased attack surface exposed by interoperability and the corresponding infrastructure. Introducing networking and coordination functionalities fundamentally modifies the security properties of medical systems. Understanding the threats is a vital first step in ultimately crafting security solutions for such systems. The key to this problem is identifying common types of threats and attacks, together with their security and privacy implications, and providing this information as a roadmap. This paper analyses the security issues in the interoperability of devices and presents the main types of threats that have to be considered to build a secure system.Keywords: interoperability, threats, attacks, medical devices
Procedia PDF Downloads 333
36415 A Comparative Study between Japan and the European Union on Software Vulnerability Public Policies
Authors: Stefano Fantin
Abstract:
The present analysis results from research undertaken in the course of the European-funded project EUNITY, which targets the gaps in research and development on cybersecurity and privacy between Europe and Japan. Under these auspices, the research presents a study of the policy approaches of Japan, the EU, and a number of Member States of the Union with regard to the handling and discovery of software vulnerabilities, with the aim of identifying methodological differences and similarities. This research builds upon a functional comparative analysis of both public policies and legal instruments from the identified jurisdictions. The result of this analysis is based on semi-structured interviews with EUNITY partners, as well as on the researcher's participation in a recent report from the Center for EU Policy Study on software vulnerability. The European Union presents a rather fragmented legal framework on software vulnerabilities. The presence of a number of different pieces of legislation at the EU level (including the Network and Information Security Directive, the Critical Infrastructure Directive, the Directive on Attacks against Information Systems, and the proposal for a Cybersecurity Act), with no clear focus on this subject, makes it difficult for both national governments and end-users (software owners, researchers, and private citizens) to gain a clear understanding of the Union's approach. Additionally, the current data protection reform package (the General Data Protection Regulation) seems to create legal uncertainty around security research. To date, at the Member State level, a few efforts towards transparent practices have been made, namely by the Netherlands, France, and Latvia. This research will explain what policy approaches these countries have taken. Japan started implementing a coordinated vulnerability disclosure policy in 2004; to date, the framework has been amended twice (2014 and 2017).
The framework is furthermore complemented by a series of instruments allowing researchers to responsibly disclose any new discovery. However, the policy has begun to lose efficiency due to a significant increase in reports made to the authority in charge. To conclude, the research reveals two asymmetric policy approaches, both time-wise and content-wise. The analysis will therefore conclude with a series of policy recommendations based on the lessons learned from both regions, working towards a common approach to the security of European and Japanese markets, industries, and citizens.Keywords: cybersecurity, vulnerability, European Union, Japan
Procedia PDF Downloads 156
36414 Investigation of Information Security Incident Management Based on International Standard ISO/IEC 27002 in Educational Hospitals in 2014
Authors: Nahid Tavakoli, Asghar Ehteshami, Akbar Hassanzadeh, Fatemeh Amini
Abstract:
Introduction: The information security incident management guidelines were developed to help hospitals meet their information security event and incident management requirements. The purpose of this study was to investigate information security incident management in Isfahan's educational hospitals with respect to the ISO/IEC 27002 standard. Methods: This was a cross-sectional study of the information security incident management of educational hospitals in 2014. Based on the ISO/IEC 27002 standard, two checklists were applied to check compliance with the requirements on Reporting Information Security Events and Weaknesses and on Management of Information Security Incidents and Improvements. One inspector was trained to carry out the assessments in the hospitals. The data were analyzed with SPSS. Findings: In general, the compliance score for the information security incident management requirements across the two areas, Reporting Information Security Events and Weaknesses and Management of Information Security Incidents and Improvements, was 60%. There was a significant difference in compliance levels among the hospitals (p-value