Abstracts | Computer and Systems Engineering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 417

World Academy of Science, Engineering and Technology

[Computer and Systems Engineering]

Online ISSN : 1307-6892

147 A Framework for Automating Software Testing: A Practical Approach

Authors: Ana Paula Cavalcanti Furtado, Silvio Meira

Abstract:

Context: The quality of a software product can be directly influenced by the quality of its development process. Immature or ad-hoc test processes are therefore unsuited for introducing systematic test automation and should not be relied upon to improve software quality. Objective: To assess the benefits, limitations, and gaps of automating software testing in order to identify best practices and to propose a strategy for systematically introducing test automation into software development processes. Method: An exploratory bibliographical survey was undertaken to ground the research in theory and the recent literature. After the proposal was defined, two case studies were conducted to analyze it in a real-world environment, and the proposal was also assessed through a focus group with specialists in the field. Results: The Framework for Automating Software Testing (FAST), a theoretical framework consisting of a hierarchical structure for introducing test automation. Conclusion: The findings of this research show that the absence of systematic processes is one of the factors that hinder the introduction of test automation. Based on the results of the case studies, FAST can be considered a satisfactory alternative for introducing and maintaining test automation in software development.

Keywords: software process improvement, software quality, software testing, test automation

Procedia PDF Downloads 142
146 Conflicts Identification Approach among Stakeholders in Goal-Oriented Requirements Analysis

Authors: Muhammad Suhaib

Abstract:

Research, development, and technology have converted the world into a global village, and requirements analysis is among the most important parts of software engineering, for both system application development and project requirements. Conflicts often arise during the requirements gathering and analysis phase. This research aims to identify such conflicts during the requirements gathering phase of the software development life cycle. During requirements elicitation, it is very difficult to understand the main objectives of stakeholders; once the elicitation task is completed, its final results are captured in the Software Requirements Specification (SRS). The SRS is the most important outcome of the requirements analysis phase and the foundation of the agreement between the developers and the stakeholders or customers. The proposed methodology helps to identify these conflicts easily during the initial phase of the project.

Keywords: goal oriented requirements analysis, conflicts identification model, requirements analysis, requirements engineering

Procedia PDF Downloads 131
145 A Machine Learning Based Method to Detect System Failure in Resource Constrained Environment

Authors: Payel Datta, Abhishek Das, Abhishek Roychoudhury, Dhiman Chattopadhyay, Tanushyam Chattopadhyay

Abstract:

Machine learning (ML) and deep learning (DL) are predominantly used in image/video processing, natural language processing (NLP), and audio and speech recognition, but far less in system performance evaluation. In this paper, the authors describe the architecture of an abstraction layer constructed using ML/DL to detect system failure. The proposed system detects failure by evaluating the performance metrics of an IoT service deployment under a constrained infrastructure environment. The system has been tested on a manually annotated data set containing different system metrics, such as number of threads, throughput, average response time, CPU usage, memory usage, and network input/output, captured in different hardware environments such as edge (Atom-based gateway) and cloud (AWS EC2). The main challenge in developing such a system is that the classification accuracy should be 100%, since errors in the system degrade service performance and consequently affect the reliability and high availability that are mandatory for an IoT system. The proposed ML/DL classifiers work with 100% accuracy on a data set of nearly 4,000 samples captured within the organization.
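
As a hedged illustration of this setup (not the authors' code), the sketch below trains a scikit-learn classifier on performance-metric vectors; the feature names follow the abstract, while the synthetic data, the toy failure rule, and the choice of a random forest are invented stand-ins:

```python
# Sketch only: classify system state from performance metrics.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 4000  # roughly the sample count reported in the abstract

# Columns: threads, throughput, avg response time, CPU %, memory %, net I/O
X = rng.normal(size=(n, 6))
y = (X[:, 3] + X[:, 4] > 1.5).astype(int)  # toy failure rule for illustration

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```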

Keywords: machine learning, system performance, performance metrics, IoT, edge

Procedia PDF Downloads 192
144 Forecasting Thermal Energy Demand in District Heating and Cooling Systems Using Long Short-Term Memory Neural Networks

Authors: Kostas Kouvaris, Anastasia Eleftheriou, Georgios A. Sarantitis, Apostolos Chondronasios

Abstract:

To achieve the objective of almost zero-carbon energy solutions by 2050, the EU needs to accelerate the development of integrated, highly efficient and environmentally friendly solutions. In this direction, district heating and cooling (DHC) emerges as a viable and more efficient alternative to conventional, decentralized heating and cooling systems, enabling a combination of more efficient renewable and competitive energy supplies. In this paper, we develop a forecasting tool for near real-time local weather and thermal energy demand predictions for an entire DHC network. In this fashion, we are able to extend the functionality and improve the energy efficiency of the DHC network by predicting and adjusting the heat load that is distributed from the heat generation plant to the connected buildings through the heat pipe network. Two case studies are considered: one for Vransko, Slovenia, and one for Montpellier, France. The data consists of i) local weather data, such as humidity, temperature, and precipitation, ii) weather forecast data, such as the outdoor temperature, and iii) DHC operational parameters, such as the mass flow rate and the supply and return temperatures. The external temperature is found to be the most important energy-related variable for space conditioning, and thus it is used as an external parameter for the energy demand models. For the development of the forecasting tool, we use state-of-the-art deep neural networks, and more specifically recurrent networks with long short-term memory cells, which are able to capture complex non-linear relations among temporal variables. Firstly, we develop models to forecast outdoor temperatures for the next 24 hours using local weather data for each case study. Subsequently, we develop models to forecast thermal demand for the same period, taking into consideration past energy demand values as well as the predicted temperature values from the weather forecasting models. The contributions to the scientific and industrial community are three-fold, and the empirical results are highly encouraging. First, we are able to predict future thermal demand levels for the two locations under consideration with minimal errors. Second, we examine the impact of the outdoor temperature on the predictive ability of the models and how the accuracy of the energy demand forecasts decreases with the forecast horizon. Third, we extend the relevant literature with a new dataset of thermal demand and examine the performance and applicability of machine learning techniques in solving real-world problems. Overall, the solution proposed in this paper is in accordance with EU targets, providing an automated smart energy management system, decreasing human errors and reducing excessive energy production.
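
A minimal sketch of such a demand model, assuming a Keras/TensorFlow stack; the window length, layer sizes, and random training data are placeholders, not the paper's settings:

```python
# Illustrative only: map a 24-step window of past demand plus forecast
# temperature to the next 24 hourly demand values.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

lookback, horizon, n_features = 24, 24, 2  # features: [demand, forecast temp]
X = np.random.rand(500, lookback, n_features).astype("float32")
y = np.random.rand(500, horizon).astype("float32")

model = Sequential([
    LSTM(64, input_shape=(lookback, n_features)),
    Dense(horizon),  # one output per forecast hour
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```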

Keywords: machine learning, LSTMs, district heating and cooling system, thermal demand

Procedia PDF Downloads 139
143 Text Similarity in Vector Space Models: A Comparative Study

Authors: Omid Shahmirzadi, Adam Lugowski, Kenneth Younge

Abstract:

Automatic measurement of semantic text similarity is an important task in natural language processing. In this paper, we evaluate the performance of different vector space models on this task. We address the real-world problem of modeling patent-to-patent similarity and compare TFIDF (and related extensions), topic models (e.g., latent semantic indexing), and neural models (e.g., paragraph vectors). Contrary to expectations, we find that the added computational cost of text embedding methods is justified only when 1) the target text is condensed and 2) the similarity comparison is trivial. In other cases, TFIDF performs surprisingly well, in particular for longer and more technical texts or for making finer-grained distinctions between nearest neighbors. Unexpectedly, extensions to the TFIDF method, such as adding noun phrases or calculating term weights incrementally, were not helpful in our context.
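
For reference, the TFIDF baseline the study finds so competitive takes only a few lines of scikit-learn (toy documents here, not the patent corpus):

```python
# Cosine similarity between documents in a TFIDF vector space.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "A method for measuring semantic text similarity.",
    "Patent claims describing a measurement method.",
    "An unrelated recipe for bread.",
]
tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)
print(cosine_similarity(tfidf[0], tfidf))  # similarity of doc 0 to all docs
```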

Keywords: big data, patent, text embedding, text similarity, vector space model

Procedia PDF Downloads 169
142 Stress and Coping Strategies: A Correlational Analysis to Profiling Maladaptive Behaviors at Work

Authors: Silvia Riva, Ezekiel Chinyio

Abstract:

Introduction: Workers in different sectors are prone to stress at varying levels, and they respond to stress in different ways. The inspiration was to study stress development amongst workers in a hazardous work setting (the construction industry), as well as how they cope with specific stress incidents. Objective: The overarching objective of the study was to examine and correlate stress and coping strategies. The research was conducted in an organizational industrial setting, and its findings on the coping actions of construction workers are reported in this article. Methods: An online cross-sectional survey was conducted with 80 participants aged 18-62, working for three different construction organizations in the West Midlands region of the UK. Their coping actions were assessed using the COPE Inventory (Carver, 2013), while the level of stress was assessed with the Perceived Stress Scale (Cohen, 1994). Results: Among the 80 workers (20 female, 25%; mean age 40.66), positive reinterpretation (M=4.15, SD=2.60) and active coping (M=4.18, SD=2.55) were the two most frequently reported adaptive strategies, while the most frequent maladaptive behavior was mental disengagement (M=3.62, SD=2.25). Among the maladaptive tactics, alcohol and drug abuse was a significant moderator of stress reactions (t=6.12, p<.001). Conclusion: Some maladaptive strategies are adopted by construction workers to cope with stress. Programs of stress prevention and control in the construction industry therefore have a basis on which to develop solutions that improve and strengthen interventions when workers are stressed or becoming stressed.

Keywords: coping, organization, strategies, stress

Procedia PDF Downloads 208
141 Adaptation of Projection Profile Algorithm for Skewed Handwritten Text Line Detection

Authors: Kayode A. Olaniyi, Tola. M. Osifeko, Adeola A. Ogunleye

Abstract:

Text line segmentation is an important step in document image processing. It is a labeling process that assigns the same label, using a distance-metric probability, to spatially aligned units. Text line detection techniques have been implemented successfully mainly for printed documents; processing handwritten texts, especially unconstrained documents, has remained a key problem. This is because unconstrained handwritten text lines are often not uniformly skewed, the spaces between text lines may not be obvious, and the task is complicated by the nature of handwriting and the overlapping ascenders and/or descenders of some characters. Text line detection and segmentation therefore represent a leading challenge in handwritten document image processing. Detection methods that rely on the traditional global projection profile of the text document cannot efficiently cope with variable skew angles between different text lines, so formulating a horizontal line as a separator is often inefficient. This paper presents a technique to segment a handwritten document into distinct lines of text. The proposed algorithm starts by partitioning the initial text image across its width into vertical strips of about 5% each. For each vertical strip, the histogram of horizontal runs is projected, under the assumption that text lines appearing within a single strip are almost parallel to each other. The algorithm slides a window through the first vertical strip on the left side of the page and runs through it to identify each new minimum corresponding to a valley in the projection profile. Each valley represents the starting point of an orientation line, whose ending point is the minimum point on the projection profile of the next vertical strip. The derived text lines traverse around any obstructing handwritten connected component by associating it with either the line above or the line below, a decision made using the probability obtained from a distance-metric decision. The technique outperforms the global projection profile for text line segmentation, and it is robust in handling skewed documents and those with lines running into each other.
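
The core of the strip-wise projection idea can be sketched in NumPy as follows; this is an illustrative reconstruction, not the authors' implementation, and it runs on a random stand-in page:

```python
# Split a binarized page into vertical strips and take each strip's
# horizontal projection profile; ink-free rows mark gaps between lines.
import numpy as np

def strip_profiles(binary_img, strip_frac=0.05):
    """binary_img: 2D array, 1 = ink, 0 = background."""
    h, w = binary_img.shape
    strip_w = max(1, int(w * strip_frac))
    profiles = []
    for x0 in range(0, w, strip_w):
        strip = binary_img[:, x0:x0 + strip_w]
        profiles.append(strip.sum(axis=1))  # horizontal projection of strip
    return profiles

def valleys(profile):
    """Rows with no ink: candidate separators between text lines."""
    return [i for i, v in enumerate(profile) if v == 0]

page = (np.random.rand(200, 100) > 0.9).astype(int)  # stand-in page
print(valleys(strip_profiles(page)[0])[:5])
```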

Keywords: connected-component, projection-profile, segmentation, text-line

Procedia PDF Downloads 121
140 Distributed Cost-Based Scheduling in Cloud Computing Environment

Authors: Rupali, Anil Kumar Jaiswal

Abstract:

Cloud computing can be defined as one of the prominent technologies that lets a user change, configure, and access services online. It can be described as a computing paradigm that saves users cost and time, and its use can be found in various fields such as education, health, and banking. Cloud computing is an internet-dependent technology, so it is a major responsibility of Cloud Service Providers (CSPs) to take care of the data stored by users at data centers. Scheduling in a cloud computing environment plays a vital role: to achieve maximum utilization and user satisfaction, cloud providers need to schedule resources effectively. Job scheduling for cloud computing is analyzed in the following work. To simulate task computation and distributed scheduling methods, CloudSim 3.0.3 is utilized. This research work discusses job scheduling for distributed processing environments and, by exploring this issue, finds that it works with minimum time and lower cost. In this work, two load balancing techniques have been employed, 'Throttled stack adjustment policy' and 'Active VM load balancing policy', with two brokerage services, 'Advanced Response Time' and 'Reconfigure Dynamically', to evaluate the VM_Cost, DC_Cost, Response Time, and Data Processing Time. The proposed techniques are compared with the Round Robin scheduling policy.
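
CloudSim itself is a Java toolkit, so the following Python toy is only a schematic re-creation of the comparison idea: assigning jobs to VMs by Round Robin versus a least-loaded ("throttled"-style) policy and comparing makespans. All job lengths are invented:

```python
# Toy comparison of two VM assignment policies; not CloudSim code.
import itertools, random

random.seed(1)
jobs = [random.randint(1, 10) for _ in range(20)]   # job lengths
n_vms = 4

def round_robin(jobs, n_vms):
    load = [0] * n_vms
    for vm, job in zip(itertools.cycle(range(n_vms)), jobs):
        load[vm] += job
    return max(load)  # makespan: time until the busiest VM finishes

def least_loaded(jobs, n_vms):
    load = [0] * n_vms
    for job in jobs:
        load[load.index(min(load))] += job  # send job to lightest VM
    return max(load)

print("round robin makespan:", round_robin(jobs, n_vms))
print("least-loaded makespan:", least_loaded(jobs, n_vms))
```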

Keywords: physical machines, virtual machines, support for repetition, self-healing, highly scalable programming model

Procedia PDF Downloads 165
139 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making capabilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested using predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis on large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured), and its variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow calculation to be parallelized and distributed. In this paper, we propose to extend a predictive analysis algorithm, Classification and Regression Trees (CART), to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined to make it applicable to huge quantities of data.
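
The paper's specific CART modifications are not reproduced here, but the general idea of distributing tree induction can be sketched as follows: fit one scikit-learn CART tree per data partition in parallel and aggregate their votes. The partitioning scheme and voting rule are our assumptions:

```python
# Distributed-style CART sketch: one tree per partition, majority vote.
import numpy as np
from concurrent.futures import ProcessPoolExecutor
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

def fit_partition(args):
    X_part, y_part = args
    return DecisionTreeClassifier(random_state=0).fit(X_part, y_part)

if __name__ == "__main__":
    X, y = make_classification(n_samples=10000, n_features=20, random_state=0)
    parts = [(X[i::4], y[i::4]) for i in range(4)]  # 4 disjoint partitions
    with ProcessPoolExecutor() as pool:
        trees = list(pool.map(fit_partition, parts))  # parallel induction
    votes = np.stack([t.predict(X[:100]) for t in trees])
    majority = (votes.mean(axis=0) > 0.5).astype(int)  # simple vote aggregation
    print("agreement with labels:", (majority == y[:100]).mean())
```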

Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm

Procedia PDF Downloads 137
138 Probabilistic Approach to Contrast Theoretical Predictions from a Public Corruption Game Using Bayesian Networks

Authors: Jaime E. Fernandez, Pablo J. Valverde

Abstract:

This paper presents a methodological approach that aims to contrast and validate theoretical results from a corruption network game through probabilistic analysis of simulated microdata using Bayesian Networks (BNs). The research develops a public corruption model in a game-theory framework. Theoretical results suggest a series of 'optimal settings' of the model's exogenous parameters that boost the emergence of corruption. The paper contrasts these outcomes with probabilistic inference results based on BNs fitted to simulated microdata. The principal finding is that probabilistic reasoning based on BNs significantly improves parameter specification and causal analysis in a public corruption game.
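
A hedged sketch of BN-based probabilistic reasoning over such a game, assuming a recent pgmpy release (which exposes BayesianNetwork); the variables and all conditional probabilities below are invented, not the paper's model:

```python
# Toy BN: does corruption emerge given oversight and penalty settings?
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("Oversight", "Corruption"), ("Penalty", "Corruption")])
cpd_o = TabularCPD("Oversight", 2, [[0.6], [0.4]])
cpd_p = TabularCPD("Penalty", 2, [[0.5], [0.5]])
cpd_c = TabularCPD("Corruption", 2,
                   [[0.9, 0.6, 0.7, 0.2],    # P(no corruption | O, P)
                    [0.1, 0.4, 0.3, 0.8]],   # P(corruption | O, P)
                   evidence=["Oversight", "Penalty"], evidence_card=[2, 2])
model.add_cpds(cpd_o, cpd_p, cpd_c)
assert model.check_model()

infer = VariableElimination(model)
print(infer.query(["Corruption"], evidence={"Oversight": 0}))
```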

Keywords: Bayesian networks, probabilistic reasoning, public corruption, theoretical games

Procedia PDF Downloads 204
137 Development of Tools for Multi Vehicles Simulation with Robot Operating System and ArduPilot

Authors: Pierre Kancir, Jean-Philippe Diguet, Marc Sevaux

Abstract:

One of the main difficulties in developing multi-robot systems (MRS) is related to the simulation and testing tools available. Indeed, if the differences between simulations and real robots are too significant, the transition from simulation to robot will not be possible without another long development phase and will not permit validation of the simulation. Moreover, testing different algorithmic solutions or modifications of robots requires strong knowledge of current tools and significant development time. The availability of tools for MRS, particularly with flying drones, is therefore crucial to enable the industrial emergence of these systems. This research presents the most commonly used tools for MRS simulation and their main shortcomings, and introduces complementary tools to improve the productivity of designers in the development of multi-vehicle solutions, focused on a fast learning curve and a rapid transition from simulation to real usage. The proposed contributions are based on existing open-source tools, namely the Gazebo simulator combined with ROS (Robot Operating System) and the open-source multi-platform autopilot ArduPilot, to bring them to a broad audience.
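
As a taste of the kind of glue code such tooling aims to simplify, the snippet below connects to several ArduPilot SITL instances over MAVLink with pymavlink; the port numbering assumes SITL instances launched with distinct --instance values and may need adapting to your setup:

```python
# Connect to multiple simulated ArduPilot vehicles over MAVLink.
from pymavlink import mavutil

# SITL instance i typically listens on UDP port 14550 + 10*i
links = [mavutil.mavlink_connection(f"udp:127.0.0.1:{14550 + 10 * i}")
         for i in range(3)]

for i, link in enumerate(links):
    link.wait_heartbeat()  # blocks until the vehicle is seen
    print(f"vehicle {i}: system {link.target_system} is alive")
```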

Keywords: ROS, ArduPilot, MRS, simulation, drones, Gazebo

Procedia PDF Downloads 205
136 Agile Software Development Implementation in Developing a Diet Tracker Mobile Application

Authors: Dwi Puspita Sari, Gulnur Baltabayeva, Nadia Salman, Maxut Toleuov, Vijay Kanabar

Abstract:

Technology era drives people to use mobile phone to support their daily life activities. Technology development has a rapid phase which pushes the IT company to adjust any technology changes in order to fulfill customer’s satisfaction. As a result of that, many companies in the USA emerged from systematics software development approach to agile software development approach in developing systems and applications to develop many mobile phone applications in a short phase to fulfill user’s needs. As a systematic approach is considered as time consuming, costly, and too risky, agile software development has become a more popular approach to use for developing software including mobile applications. This paper reflects a short-term project to develop a diet tracker mobile application using agile software development that focused on applying scrum framework in the development process.

Keywords: agile software development, scrum, diet tracker, mobile application

Procedia PDF Downloads 251
135 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring

Authors: Zheng Wang, Zhenhong Li, Jon Mills

Abstract:

Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring towards various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise from processing high temporal-resolution continuous GBSAR data, including the extreme cost of computational random-access-memory (RAM), the delay of displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study in order to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and it processes continuous GBSAR images unit by unit. Images within a window form a basic unit. By taking this strategy, the RAM requirement is reduced to only one unit of images and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected as it keeps temporarily-coherent pixels which are present only in some certain units but not in the whole observation period. The chain supports real-time processing of the continuous data and the delay of creating displacement maps can be shortened without waiting for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on a stack of images in a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence. The temporal-averaged images are then processed by a particular interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and selection of partially-coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the level of a few sub-millimetres are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring of a wide range of scientific and practical applications.
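
The core operation underlying both chains can be sketched in NumPy (an illustration, not the package itself): forming the interferometric phase between two co-registered complex images, with temporal averaging available for discontinuous campaigns. All data below is synthetic:

```python
# Interferometric phase and campaign averaging, in miniature.
import numpy as np

def interferogram(master, slave):
    """Interferometric phase (radians) between two co-registered complex images."""
    return np.angle(master * np.conj(slave))

def temporal_average(stack):
    """Average a campaign's image stack to raise SNR before interferometry."""
    return stack.mean(axis=0)

# Stand-in data: a reference acquisition and one shifted by 0.05 rad
master = np.exp(1j * 2 * np.pi * np.random.rand(100, 100))
slave = master * np.exp(-1j * 0.05)
print(interferogram(master, slave).mean())  # ~0.05 rad, i.e. sub-mm motion
```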

Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring

Procedia PDF Downloads 158
134 Modeling of Water Erosion in the M'Goun Watershed Using OpenGIS Software

Authors: M. Khal, Ab. Algouti, A. Algouti

Abstract:

Water erosion is the major form of erosion shaping the earth's surface. Modeling water erosion requires the use of GIS software and programs, whether commercial (closed source) or open source. The very high prices of commercial GIS licenses motivate users and researchers to find open-source software as relevant and applicable as proprietary GIS. The objective of this study is the modeling of water erosion and the hydrogeological and morphophysical characterization of the Oued M'Goun watershed (southern flank of the Central High Atlas), carried out with free GIS programs. Very pertinent results are obtained by executing tasks and algorithms in a simple and easy way. Various geoscientific and geostatistical analyses of a digital elevation model (SRTM, 30 m resolution), combined with the processing and interpretation of satellite imagery, allowed us to characterize the study region and to map the areas most vulnerable to water erosion.
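
A typical open-source building block in such a workflow (an illustration, not the authors' exact procedure) is deriving a slope raster from the SRTM DEM with GDAL's Python bindings; slope is a key input to most water-erosion models, and the file names here are placeholders:

```python
# Derive a slope raster from a DEM using GDAL.
from osgeo import gdal

dem = gdal.Open("srtm_dem_30m.tif")           # placeholder input path
gdal.DEMProcessing("slope.tif", dem, "slope",
                   slopeFormat="degree")       # writes slope in degrees
print("slope raster written to slope.tif")
```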

Keywords: central High-Atlas, hydrogeology, M’Goun watershed, OpenGis, water erosion

Procedia PDF Downloads 157
133 Virtualization of Production Using Digital Twin Technology

Authors: Bohuslava Juhasova, Igor Halenar, Martin Juhas

Abstract:

This contribution deals with the current situation in modern manufacturing enterprises, which is affected by the digital virtualization of different parts of the production process. The overview part of the article points to the fact that the wide informatization of all areas causes, in real practice, the substitution of real elements and the relationships between them with their digital, often virtual, images. Key characteristics of systems implemented using digital twin technology, along with essential conditions for the deployment of intelligent products, were identified across many published studies. The goal was to propose a template for realizing the production system using digital twin technology as a supplement to standardized concepts for Industry 4.0. The main resulting idea leads to the statement that the current trend of implementing new technologies and ways of communication between industrial facilities erases the boundaries between the real environment and the virtual world.

Keywords: communication, digital twin, Industry 4.0, simulation, virtualization

Procedia PDF Downloads 244
132 A Comparative Analysis Approach Based on Fuzzy AHP, TOPSIS and PROMETHEE for the Selection Problem of GSCM Solutions

Authors: Omar Boutkhoum, Mohamed Hanine, Abdessadek Bendarag

Abstract:

Sustainable economic growth is nowadays driving firms toward the adoption of many green supply chain management (GSCM) solutions. However, the evaluation and selection of these solutions is a matter of concern that requires very careful decisions, involving complexity owing to the presence of various associated factors. To resolve this problem, a comparative analysis approach based on multi-criteria decision-making methods is proposed for the adequate evaluation of sustainable supply chain management solutions. In the present paper, we propose an integrated decision-making model based on FAHP (Fuzzy Analytic Hierarchy Process), TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution), and PROMETHEE (Preference Ranking Organisation METHod for Enrichment Evaluations) to contribute to a better understanding and development of new sustainable strategies for industrial organizations. Owing to the varied importance of the selected criteria, FAHP is used to identify the evaluation criteria and assign the importance weight of each criterion, while the TOPSIS and PROMETHEE methods employ these weighted criteria as inputs to evaluate and rank the alternatives. The main objective is to provide a comparative analysis based on the TOPSIS and PROMETHEE processes to help make sound and reasoned decisions on the selection of GSCM solutions.
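
The TOPSIS stage of such a model can be written compactly in NumPy; the decision matrix and the weights (which FAHP would normally supply) are invented for illustration:

```python
# Compact TOPSIS: rank alternatives by closeness to the ideal solution.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j]=True if higher is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)       # vector normalization
    v = norm * weights                                   # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)  # closeness: higher = better

scores = topsis(np.array([[7., 9., 9.], [8., 7., 8.], [9., 6., 8.]]),
                weights=np.array([0.5, 0.3, 0.2]),       # e.g. from FAHP
                benefit=np.array([True, True, False]))   # 3rd criterion: cost
print("ranking (best first):", scores.argsort()[::-1])
```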

Keywords: GSCM solutions, multi-criteria analysis, decision support system, TOPSIS, FAHP, PROMETHEE

Procedia PDF Downloads 160
131 Metamodel for Artefacts in Service Engineering Analysis and Design

Authors: Purnomo Yustianto, Robin Doss

Abstract:

As a process of developing a service system, the term ‘service engineering’ evolves in scope and definition. To achieve an integrated understanding of the process, a general framework and an ontology are required. This paper extends a previously built service engineering framework by exploring metamodels for the framework artefacts based on a foundational ontology and a metamodel landscape. The first part of this paper presents a correlation map between the proposed framework with the ontology as a form of evaluation for the conceptual coverage of the framework. The mapping also serves to characterize the artefacts to be produced for each activity in the framework. The second part describes potential metamodels to be used, from the metamodel landscape, as alternative formats of the framework artefacts. The results suggest that the framework sufficiently covers the ontological concepts, both from general service context and software service context. The metamodel exploration enriches the suggested artefact format from the original eighteen formats to thirty metamodel alternatives.

Keywords: artefact, framework, service, metamodel

Procedia PDF Downloads 203
130 Framework for Socio-Technical Issues in Requirements Engineering for Developing Resilient Machine Vision Systems Using Levels of Automation through the Lifecycle

Authors: Ryan Messina, Mehedi Hasan

Abstract:

This research examines the impact of using data to generate performance requirements for automation in visual inspection using machine vision. The situations considered are intended for design, and for how projects can smooth the transfer of tacit knowledge into an algorithm. We have proposed a framework for specifying machine vision systems. This framework utilizes varying levels of automation as contingency planning to reduce data processing complexity. Using data assists in extracting tacit knowledge from those who can perform the manual tasks, to assist in designing the system; this means that real data from the system is always referenced, minimizing errors between participating parties. We propose three indicators for knowing whether a project has a high risk of failing to meet requirements related to accuracy and reliability. All systems tested achieved better integration into operations after applying the framework.

Keywords: automation, contingency planning, continuous engineering, control theory, machine vision, system requirements, system thinking

Procedia PDF Downloads 202
129 Usability in E-Commerce Websites: Results of Eye Tracking Evaluations

Authors: Beste Kaysı, Yasemin Topaloğlu

Abstract:

Usability is one of the most important quality attributes for web-based information systems, and for e-commerce applications specifically, usability becomes even more prominent. In this study, we aimed to explore the features that experienced users seek in e-commerce applications, using the eye tracking method in our evaluations. Eye movement data obtained from the eye tracking method are analyzed based on task completion time and number of fixations, as well as heat map and gaze plot measures. The results of the analysis show that participants' eye movements are too static in certain areas and that their areas of interest are scattered across many different places. It was determined that this causes users to fail to complete their transactions. Based on the findings, we outline the issues that impair the usability of e-commerce websites and propose solutions to address them. In this way, it is expected that e-commerce sites will be developed that make experienced users more satisfied.

Keywords: e-commerce websites, eye tracking method, usability, website evaluations

Procedia PDF Downloads 179
128 A Formal Property Verification for Aspect-Oriented Programs in Software Development

Authors: Moustapha Bande, Hakima Ould-Slimane, Hanifa Boucheneb

Abstract:

Software development for complex systems requires efficient and automatic tools that can be used to verify the satisfiability of some critical properties, such as security ones. With the emergence of Aspect-Oriented Programming (AOP), considerable work has been done to better modularize the separation of concerns in software design and implementation. The goal is to prevent cross-cutting concerns from being scattered across the multiple modules of the program and tangled with other modules. One of the key challenges in aspect-oriented programs is to ensure that all the pieces put together at weaving time satisfy the overall system requirements. Our paper focuses on this problem and proposes a formal verification approach for a given property of the woven program. The approach is based on the control flow graph (CFG) of the woven program and the use of a satisfiability modulo theories (SMT) solver to check whether each property (represented by one aspect) is satisfied or not once the weaving is done.
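
A toy version of the final SMT check, using the Z3 Python API: the real approach derives the formula from the woven program's CFG, whereas here a single candidate path and property are hand-written for illustration:

```python
# Ask Z3 whether a path violating the property exists.
from z3 import Solver, Bool, And, Not, sat

before_advice = Bool("before_advice_ran")   # aspect's advice executed
secure_call = Bool("secure_call")           # guarded join point reached

# Property: every secure call is preceded by the advice.
prop = Not(And(secure_call, Not(before_advice)))

s = Solver()
s.add(secure_call, Not(before_advice))      # encode one candidate CFG path
s.add(Not(prop))                            # look for a property violation
print("property violated on this path" if s.check() == sat
      else "property holds on this path")
```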

Keywords: aspect-oriented programming, control flow graph, property verification, satisfiability modulo theories

Procedia PDF Downloads 172
127 Building Information Modeling Acting as Protagonist and Link between the Virtual Environment and the Real-World for Efficiency in Building Production

Authors: Cristiane R. Magalhaes

Abstract:

Advances in Information and Communication Technologies (ICT) have led to changes in different sectors particularly in architecture, engineering, construction, and operation (AECO) industry. In this context, the advent of BIM (Building Information Modeling) has brought a number of opportunities in the field of the digital architectural design process bringing integrated design concepts that impact on the development, elaboration, coordination, and management of ventures. The project scope has begun to contemplate, from its original stage, the third dimension, by means of virtual environments (VEs), composed of models containing different specialties, substituting the two-dimensional products. The possibility to simulate the construction process of a venture in a VE starts at the beginning of the design process offering, through new technologies, many possibilities beyond geometrical digital modeling. This is a significant change and relates not only to form, but also to how information is appropriated in architectural and engineering models and exchanged among professionals. In order to achieve the main objective of this work, the Design Science Research Method will be adopted to elaborate an artifact containing strategies for the application and use of ICTs from BIM flows, with pre-construction cut-off to the execution of the building. This article intends to discuss and investigate how BIM can be extended to the site acting as a protagonist and link between the Virtual Environments and the Real-World, as well as its contribution to the integration of the value chain and the consequent increase of efficiency in the production of the building. The virtualization of the design process has reached high levels of development through the use of BIM. Therefore it is essential that the lessons learned with the virtual models be transposed to the actual building production increasing precision and efficiency. Thus, this paper discusses how the Fourth Industrial Revolution has impacted on property developments and how BIM could be the propellant acting as the main fuel and link between the virtual environment and the real production for the structuring of flows, information management and efficiency in this process. The results obtained are partial and not definite up to the date of this publication. This research is part of a doctoral thesis development, which focuses on the discussion of the impact of digital transformation in the construction of residential buildings in Brazil.

Keywords: building information modeling, building production, digital transformation, ICT

Procedia PDF Downloads 119
126 Computerized Scoring System: A Stethoscope to Understand Consumer's Emotion through His or Her Feedback

Authors: Chen Yang, Jun Hu, Ping Li, Lili Xue

Abstract:

Most companies pay careful attention to collecting consumer feedback, so the 'feedback' button is a common sight in all kinds of mobile apps. Yet it is much more challenging to analyze these feedback texts and to catch the true feelings of the consumer who hands in the feedback, regarding either a problem or a compliment. Especially for Chinese content, the same feedback may express a positive opinion in one context and a negative one in another. For example, in Chinese, the feedback 'operating with loudness' applies to both a refrigerator and a stereo system: toward a refrigerator, this feedback is negative; however, the same feedback is positive toward a stereo system. By introducing Bradley and Lang's Affective Norms for English Text (ANET) theory and Bucci's Referential Activity (RA) theory, we, usability researchers at Pingan, are able to decipher the feedback and find the hidden feelings behind the content. We take two dimensions, 'valence' and 'dominance', out of ANET's three, and two dimensions, 'concreteness' and 'specificity', out of RA's four, to organize our own rating system with a scale of 1 to 5 points. This rating system enables us to judge the feeling/emotion behind each feedback item, and it works well with both a single word or phrase and a whole paragraph. The rating reflects the strength of the consumer's feeling/emotion when he/she is typing the feedback. In our daily work, we first require a consumer to answer the net promoter score (NPS) before writing the feedback, so we can determine whether the feedback is positive or negative. Secondly, we code the feedback content against the company's problematic list, which contains 200 problematic items; in this way, we can count how many feedback items left by consumers belong to each typical problem. Thirdly, we rate each feedback item with the rating system mentioned above to capture the strength of the feeling/emotion expressed when the consumer writes the feedback. We thereby obtain two kinds of data: 1) the portion, meaning how many feedback items are ascribed to a given problematic item, and 2) the severity, meaning how strong the negative feeling/emotion is in the feedback regarding that problem. By crossing these two, placing the portion on the X-axis and the severity on the Y-axis, we are able to find which typical problems score high on both portion and severity. The higher a problem's scores, the more urgently it should be solved, as more people write stronger negative feelings in feedback regarding this problem. Moreover, by introducing a hidden Markov model to program our rating system, we are able to computerize the scoring and process thousands of feedback items in a short period of time, which is efficient and accurate enough for industrial purposes.

Keywords: computerized scoring system, feeling/emotion of consumer feedback, referential activity, text mining

Procedia PDF Downloads 173
125 Cloud Enterprise Application Provider Selection Model for the Small and Medium Enterprise: A Pilot Study

Authors: Rowland R. Ogunrinde, Yusmadi Y. Jusoh, Noraini Che Pa, Wan Nurhayati W. Rahman, Azizol B. Abdullah

Abstract:

Enterprise Applications (EAs) help organizations achieve operational excellence and competitive advantage. Over time, most Small and Medium Enterprises (SMEs), known to be the major drivers of most thriving global economies, have used the costly on-premise versions of these applications, making it difficult to thrive competitively in the same market environment as their large-enterprise counterparts. The advent of cloud computing presents SMEs with an affordable offer and great opportunities, as such EAs can be cloud-hosted and rented on a pay-per-use basis that does not require huge initial capital. However, as there are numerous Cloud Service Providers (CSPs) offering EAs as Software-as-a-Service (SaaS), there is a challenge in choosing a suitable provider whose Quality of Service (QoS) meets the organization's customized requirements. The proposed model addresses this and goes a step further to select the most affordable among a shortlist of CSPs. At an earlier stage, before developing the instrument and conducting the pilot test, the researchers conducted a structured interview with three experts to validate the proposed model. In conclusion, the validity and reliability of the instrument were tested through experts and typical respondents, and analyzed with SPSS 22. The results confirmed the validity of the proposed model and the validity and reliability of the instrument.

Keywords: cloud service provider, enterprise application, quality of service, selection criteria, small and medium enterprise

Procedia PDF Downloads 175
124 Modified InVEST for Whatsapp Messages Forensic Triage and Search through Visualization

Authors: Agria Rhamdhan

Abstract:

WhatsApp, the most popular mobile messaging app, has been used as evidence in many criminal cases. As the use of mobile messaging generates large amounts of data, forensic investigation faces the challenge of large-data problems. The hardest part of finding important evidence is that current practice utilizes tools and techniques that require manual analysis to check all messages, so analyzing large sets of mobile messaging data takes a great deal of time and effort. Our work offers a methodology based on forensic triage to reduce large data sets to manageable ones, making detailed reviews easier, and then shows the results through an interactive visualization of important terms, entities, and relationships, using intelligent ranking based on Term Frequency-Inverse Document Frequency (TF-IDF) and the Latent Dirichlet Allocation (LDA) model. By implementing this methodology, investigators can improve investigation processing time and the accuracy of results.
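
A minimal triage pass of this kind can be assembled from scikit-learn's TF-IDF and LDA implementations; the messages below are invented, whereas real input would come from a parsed WhatsApp database:

```python
# TF-IDF to weight distinctive terms, LDA to surface topics for triage.
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

messages = [
    "meet me at the warehouse tonight",
    "transfer the money to the usual account",
    "happy birthday! see you at dinner",
    "the shipment arrives at the warehouse friday",
]

# TF-IDF: terms with high weight in a message are candidates for review
tfidf = TfidfVectorizer(stop_words="english")
weights = tfidf.fit_transform(messages)

# LDA: group messages into latent topics to narrow the review set
counts = CountVectorizer(stop_words="english").fit_transform(messages)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
print(lda.transform(counts).round(2))  # message-to-topic proportions
```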

Keywords: forensics, triage, visualization, WhatsApp

Procedia PDF Downloads 166
123 The Suitability of Agile Practices in Healthcare Industry with Regard to Healthcare Regulations

Authors: Mahmood Alsaadi, Alexei Lisitsa

Abstract:

Nowadays, medical devices rely completely on software, whether as whole software or as embedded software; therefore, organizations that develop medical device software can benefit from adopting agile practices. Using agile practices in healthcare software development would bring benefits such as producing a high-quality product at low cost and in a short period. However, medical device software development companies have faced challenges in adopting agile practices, due to the gaps that exist between agile practices and the requirements of healthcare regulations, such as documentation, traceability, and formality. This research paper conducts a study to investigate the adoption rate of agile practices in medical device software development, and extracts and outlines the requirements of healthcare regulations such as those of the Food and Drug Administration (FDA), the Health Insurance Portability and Accountability Act (HIPAA), and the Medical Device Directive (MDD) that affect the software development life cycle directly or indirectly. Moreover, this paper evaluates the suitability of agile practices in healthcare industries by analyzing the most popular agile practices, such as eXtreme Programming (XP), Scrum, and Feature-Driven Development (FDD), from the healthcare industry's point of view and in comparison with the requirements of healthcare regulations. Finally, the authors propose an agile mixture model that consists of different practices from different agile methods. As a result, the adoption rate of agile practices in healthcare industries is still low, and agile practices should be enhanced with regard to the requirements of healthcare regulations in order to be used in healthcare software development organizations. The proposed agile mixture model may therefore assist in minimizing the gaps between healthcare regulations and agile practices and increase the adoption rate in the healthcare industry. As this paper is part of an ongoing project, an evaluation of the agile mixture model will be conducted in the near future.

Keywords: adoption of agile, agile gaps, agile mixture model, agile practices, healthcare regulations

Procedia PDF Downloads 232
122 Application of Deep Learning in Colorization of LiDAR-Derived Intensity Images

Authors: Edgardo V. Gubatanga Jr., Mark Joshua Salvacion

Abstract:

Most aerial LiDAR systems have accompanying aerial cameras in order to capture not only the terrain of the surveyed area but also its true-color appearance. However, the presence of atmospheric clouds, poor lighting conditions, and aerial camera problems during an aerial survey may cause absence of aerial photographs. These leave areas having terrain information but lacking aerial photographs. Intensity images can be derived from LiDAR data but they are only grayscale images. A deep learning model is developed to create a complex function in a form of a deep neural network relating the pixel values of LiDAR-derived intensity images and true-color images. This complex function can then be used to predict the true-color images of a certain area using intensity images from LiDAR data. The predicted true-color images do not necessarily need to be accurate compared to the real world. They are only intended to look realistic so that they can be used as base maps.
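
A deliberately small stand-in for such a network (the abstract does not specify the architecture): a fully convolutional Keras model mapping one-channel intensity patches to three RGB channels, trained here on random placeholder patches:

```python
# Minimal intensity-to-RGB colorization model, for illustration only.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D

model = Sequential([
    Conv2D(32, 3, padding="same", activation="relu", input_shape=(64, 64, 1)),
    Conv2D(32, 3, padding="same", activation="relu"),
    Conv2D(3, 1, activation="sigmoid"),   # predict normalized RGB per pixel
])
model.compile(optimizer="adam", loss="mse")

# Stand-in patches: intensity in, RGB out, both scaled to [0, 1]
gray = np.random.rand(16, 64, 64, 1).astype("float32")
rgb = np.random.rand(16, 64, 64, 3).astype("float32")
model.fit(gray, rgb, epochs=2, verbose=0)
```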

Keywords: aerial LiDAR, colorization, deep learning, intensity images

Procedia PDF Downloads 158
121 Measuring Banks’ Antifragility via Fuzzy Logic

Authors: Danielle Sandler dos Passos, Helder Coelho, Flávia Mori Sarti

Abstract:

Analysing the world banking sector, we realize that traditional risk measurement methodologies no longer reflect the actual scenario, with its uncertainty, and leave out events that can change the dynamics of markets. Considering this, regulators and financial institutions have begun to search for more realistic models. The aim is to include external influences and interdependencies between agents in order to describe and measure the operation of these complex systems and their risks in a more coherent and credible way. Within this context, X-Events are more frequent than assumed, and amid uncertainty and constant change, the concept of antifragility has gained great prominence in comparison with other risk management methodologies. It is very useful to analyse whether a system succumbs to (fragile), resists (robust), or benefits from (antifragile) disorder and stress. Thus, this work proposes the creation of the Banking Antifragility Index (BAI), which is based on the calculation of a triangular fuzzy number to 'quantify' qualitative criteria linked to antifragility.
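
The BAI's building block can be sketched in a few lines: aggregate expert ratings expressed as triangular fuzzy numbers and defuzzify them with the centroid. The criteria names and ratings below are invented for illustration:

```python
# Triangular fuzzy numbers: defuzzify and average into a toy index.
def centroid(tfn):
    a, b, c = tfn  # (low, mode, high)
    return (a + b + c) / 3.0

# Each criterion rated by experts as a triangular fuzzy number on a 0-10 scale
ratings = {
    "redundancy": (6, 7, 9),
    "optionality": (4, 6, 7),
    "stress response": (5, 8, 9),
}
bai = sum(centroid(t) for t in ratings.values()) / len(ratings)
print(f"Banking Antifragility Index (toy): {bai:.2f}")
```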

Keywords: adaptive complex systems, X-Events, risk management, antifragility, banking antifragility index, triangular fuzzy number

Procedia PDF Downloads 176
120 A Cloud-Based Federated Identity Management in Europe

Authors: Jesus Carretero, Mario Vasile, Guillermo Izquierdo, Javier Garcia-Blas

Abstract:

Currently, there is a so-called 'identity crisis' in cybersecurity, caused by the substantial security, privacy, and usability shortcomings of existing systems for identity management. Federated Identity Management (FIM) could be a solution to this crisis, as it is a method that facilitates the management of identity processes and policies among collaborating entities without enforcing global consistency, which is difficult to achieve when ID legacy systems exist. To cope with this problem, the Connecting Europe Facility (CEF) initiative proposed in 2014 a federated solution in anticipation of the adoption of Regulation (EU) No 910/2014, the so-called eIDAS Regulation. At present, a network of eIDAS Nodes is being deployed at the European level so that every citizen recognized by a member state can be recognized within the trust network at the European level, enabling the consumption of services in other member states that until now were not available or whose provision was tedious. This is a very ambitious approach, since it aims to enable cross-border authentication of member state citizens without the need to unify the authentication method (eID Scheme) of the member state in question. However, this federation is currently managed by member states, and it is initially applied only to citizens and public organizations. The goal of this paper is to present the results of a European project, named eID@Cloud, that focuses on the integration of eID into 5 cloud platforms belonging to authentication service providers of different EU member states, to act as Service Providers (SPs) for private entities. We propose an initiative based on a private eID Scheme for both natural and legal persons. The methodology followed in the eID@Cloud project is that each Identity Provider (IdP) is subscribed to an eIDAS Node Connector, requesting authentication, which is in turn subscribed to an eIDAS Node Proxy Service issuing authentication assertions. To cope with high loads, load balancing is supported in the eIDAS Node. The eID@Cloud project is still ongoing, but we already have some important outcomes. First, we have deployed the federated identity nodes and tested them from the security and performance points of view. The pilot prototype has shown the feasibility of deploying this kind of system, ensuring good performance due to the replication of the eIDAS nodes and the load-balancing mechanism. Second, our solution avoids the propagation of identity data outside the native domain of the user or entity being identified, which avoids well-known cybersecurity problems such as network interception and man-in-the-middle attacks. Last, but not least, this system makes it easy to connect any country or collectivity, providing incremental development of the network and avoiding difficult political negotiations to agree on a single authentication format (which would be a major stopper).

Keywords: cybersecurity, identity federation, trust, user authentication

Procedia PDF Downloads 164
119 An Intelligent Traffic Management System Based on the WiFi and Bluetooth Sensing

Authors: Hamed Hossein Afshari, Shahrzad Jalali, Amir Hossein Ghods, Bijan Raahemi

Abstract:

This paper introduces an automated clustering solution that applies to WiFi/Bluetooth sensing data and is later used for traffic management applications. The paper first summarizes a number of clustering approaches and then shows their performance for noise removal. In this context, clustering is used to recognize WiFi and Bluetooth MAC addresses that belong to passengers traveling on a public urban transit bus. The main objective is to build an intelligent system that automatically filters out MAC addresses belonging to persons located outside the bus, for different routes in the city of Ottawa. The proposed intelligent system alleviates the need to define restrictive thresholds, which would otherwise reduce the accuracy as well as the range of applicability of the solution across routes. The paper moreover discusses the performance benefits of the presented clustering approaches in terms of accuracy, time and space complexity, and ease of use. Note that the results of clustering can further be used for origin-destination estimation of individual passengers, traffic load prediction, and intelligent management of urban bus schedules.
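
A sketch of the filtering idea (the clustering algorithm and features here are our assumptions, not necessarily the paper's): cluster per-MAC sighting features with scikit-learn's DBSCAN so that on-board devices separate from passers-by:

```python
# Cluster MAC-address sighting features to separate riders from bystanders.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

# One row per MAC address: [minutes observed, number of detections,
# fraction of route stops at which it was seen] -- synthetic values
features = np.array([
    [22, 40, 0.90], [25, 45, 0.95], [18, 30, 0.80],   # likely on board
    [1, 2, 0.05], [0.5, 1, 0.02], [2, 3, 0.08],       # likely outside the bus
])
labels = DBSCAN(eps=0.8, min_samples=2).fit_predict(
    StandardScaler().fit_transform(features))
print(labels)  # same label = same cluster; -1 = noise
```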

Keywords: WiFi-Bluetooth sensing, cluster analysis, artificial intelligence, traffic management

Procedia PDF Downloads 238
118 Impacts of Urbanization on Forest and Agriculture Areas in Savannakhet Province, Lao People's Democratic Republic

Authors: Chittana Phompila

Abstract:

The current population increase pushes up demands for natural resources and living space. In Laos, urban areas have been expanding rapidly in recent years, and this rapid urbanization can have negative impacts on landscapes, including forest and agricultural lands. The primary objectives of this research were 1) to map current urban areas in a large city in Savannakhet province, Laos, 2) to compare changes in urbanization between 1990 and 2018, and 3) to estimate the forest and agriculture areas lost due to expansion of urban areas over the last twenty-plus years within the study area. Landsat 8 data was used, and existing GIS data was collected, including spatial data on rivers, lakes, roads, vegetated areas, and other land use/land covers, obtained from government sectors. An object-based classification (OBC) approach was applied in eCognition for image processing and analysis of urban areas. Historical data from other Landsat instruments (Landsat 5 and 7) allowed us to compare changes in urbanization in 1990, 2000, 2010, and 2018 in this study area. Only three main land cover classes were classified, namely forest, agriculture, and urban areas. A change detection approach was applied to illustrate changes in built-up areas across these periods. Our study shows that the overall accuracy of the map was 95% (kappa ≈ 0.8). It was found that there is ineffective control over forest and land-use conversion from forest and agriculture to urban areas in many main cities across the province, and a large area of agriculture and forest has been lost to this conversion. Uncontrolled urban expansion and inappropriate land-use planning can create pressure on resource utilisation and, as a consequence, can lead to food insecurity and a national economic downturn in the long term.

Keywords: urbanisation, forest cover, agriculture areas, Landsat 8 imagery

Procedia PDF Downloads 156