Search results for: smart software tools
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9313

7273 Proposed Solutions Based on Affective Computing

Authors: Diego Adrian Cardenas Jorge, Gerardo Mirando Guisado, Alfredo Barrientos Padilla

Abstract:

A system based on Affective Computing can detect and interpret human information such as voice, facial expressions, and body movement to detect emotions and execute a corresponding response. This data is important because a person can communicate more effectively through emotions than through words alone. This information can be processed through technological components like Facial Recognition, Gait Recognition, or Gesture Recognition. As of now, solutions proposed using this technology only consider one component at a given moment. This research proposes two solutions based on Affective Computing that take into account more than one component for emotion detection. The proposals reflect the levels of dependency between hardware devices and software, as well as the interaction process between the system and the user, which implies the development of scenarios where both proposals will be put to the test in a live environment. Both solutions are to be developed in code by software engineers to prove their feasibility. To validate the impact on society and business interest, interviews with stakeholders are conducted with an investment mindset, where each solution is rated on a scale of 1 to 5, 1 being the minimum possible investment and 5 the maximum.

Keywords: affective computing, emotions, emotion detection, face recognition, gait recognition

Procedia PDF Downloads 361
7272 Incidence and Prevalence of Dry Eye Syndrome in Different Occupational Sector of Society

Authors: Vergeena Varghese, G. Gajalakshmi, Jayarajini Vasanth

Abstract:

The present study reports the prevalence of dry eye and evaluates environmental risk factors attributed to dry eye in different occupational sectors. 240 subjects above 20 years and below 45 years of age were screened for dry eye. A McMonnies dry eye questionnaire-based history and Schirmer's test were used to diagnose dry eye. For Schirmer's test, a Whatman strip was used, with paracaine drops as an anesthetic. Subjects' demographics included age, sex, smoking, alcoholism, occupation history, and working environment. Out of a total of 240 subjects, 52 were positive for dry eye syndrome (21.7%). The highest prevalence of dry eye syndrome was in the software sector, with 14 subjects (26.9%) out of 40. In the construction sector, 12 subjects (23.1%) out of 40 had dry eye syndrome, and 9 subjects (17.3%) out of 40 in the agriculture sector. In the transport sector, 7 subjects (13.5%) out of 40 had dry eye, and in the industrial sector, 6 subjects (11.5%). In the normal sector, taken as the control group, 4 subjects (7.7%) out of 40 had dry eye. We also found that the prevalence of dry eye in OS was higher than in OD. Dry eye is a very common ocular condition, and the software sector showed the highest prevalence, with 14 affected subjects. There was a significant correlation between environmental and occupational factors and dry eye. Excessive exposure to sunlight, wind, high temperature, air pollution, and electromagnetic radiation affects the tear film and ocular surface, causing dry eye syndrome.

Keywords: DES – dry eye syndrome, McMonnies dry eye questionnaire, Schirmer's test, Whatman strip

Procedia PDF Downloads 458
7271 Automated Localization of Palpebral Conjunctiva and Hemoglobin Determination Using Smart Phone Camera

Authors: Faraz Tahir, M. Usman Akram, Albab Ahmad Khan, Mujahid Abbass, Ahmad Tariq, Nuzhat Qaiser

Abstract:

The objective of this study was to evaluate the degree of anemia by taking a picture of the palpebral conjunctiva using a smartphone camera. We first localized the region of interest in the image, extracted features from that region, and trained an SVM classifier on those features; as a result, our system classifies the image in real time by hemoglobin level. The classifier was trained on a locally gathered dataset of 30 patients, and the proposed system achieved an accuracy of 70%.
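
The abstract does not give implementation details; purely as an illustration of the classification step described above, the following sketch trains an SVM on pre-extracted conjunctiva features with scikit-learn. The feature dimensions, labels, and train/test split are assumptions for illustration, not the authors' actual pipeline.

```python
# Hedged sketch: SVM classification of conjunctiva features (illustrative only).
# Feature extraction from the localized region of interest is assumed to have
# already produced one numeric feature vector per patient.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((30, 4))            # 30 patients x 4 hypothetical colour/texture features
y = rng.integers(0, 2, size=30)    # hypothetical labels: 0 = normal Hb, 1 = low Hb

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```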

Keywords: anemia, palpebral conjunctiva, SVM, smartphone

Procedia PDF Downloads 494
7270 An Automatic Model Transformation Methodology Based on Semantic and Syntactic Comparisons and the Granularity Issue Involved

Authors: Tiexin Wang, Sebastien Truptil, Frederick Benaben

Abstract:

Model transformation, as a pivotal aspect of model-driven engineering, attracts increasing attention from both researchers and practitioners. Many domains (enterprise engineering, software engineering, knowledge engineering, etc.) use model transformation principles and practices to address their domain-specific problems; furthermore, model transformation can also be used to bridge the gap between different domains by sharing and exchanging knowledge. As model transformation has become widely used, a new requirement has emerged: to define the transformation process effectively and efficiently and to reduce the manual effort involved. This paper presents an automatic model transformation methodology based on semantic and syntactic comparisons and focuses particularly on the granularity issue in the transformation process. Compared to traditional model transformation methodologies, this methodology serves a general, cross-domain purpose. Semantic and syntactic checking measures are combined into a refined transformation process, which solves the granularity issue. Moreover, the semantic and syntactic comparisons are supported by a software tool, replacing manual effort.
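
The comparison measures themselves are not spelled out in the abstract; the toy sketch below shows one way a syntactic score and a semantic score over model element names could be blended. The synonym table, weighting, and example names are assumptions, not the authors' actual measures.

```python
# Hedged sketch: combining syntactic and semantic similarity of model element names.
from difflib import SequenceMatcher

# Hypothetical, hand-made synonym groups standing in for a real semantic resource.
SYNONYMS = [{"client", "customer"}, {"order", "purchase"}, {"employee", "staff"}]

def syntactic(a: str, b: str) -> float:
    # Character-level similarity of the two names.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic(a: str, b: str) -> float:
    # 1.0 if the names are identical or listed as synonyms, else 0.0.
    a, b = a.lower(), b.lower()
    if a == b:
        return 1.0
    return 1.0 if any(a in group and b in group for group in SYNONYMS) else 0.0

def combined(a: str, b: str, w_sem: float = 0.6) -> float:
    # Weighted blend; the weight is an assumption for illustration.
    return w_sem * semantic(a, b) + (1 - w_sem) * syntactic(a, b)

print(combined("Customer", "Client"))      # high: semantically equivalent
print(combined("OrderItem", "OrderLine"))  # moderate: syntactically close
```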

Keywords: automatic model transformation, granularity issue, model-driven engineering, semantic and syntactic comparisons

Procedia PDF Downloads 385
7269 Democracy in Gaming: An Artificial Neural Network Based Approach towards Rule Evolution

Authors: Nelvin Joseph, K. Krishna Milan Rao, Praveen Dwarakanath

Abstract:

The explosive growth of smartphones around the world has led to the shift of the primary engagement tool for entertainment from traditional consoles and music players to a single integrated device. Augmented Reality is the next big shift, bringing a new dimension to play. The paper explores the construct and working of the community engine in Delta T, an Augmented Reality game that allows users to evolve the game's rules through collective bargaining, mirroring democracy even in a gaming world.

Keywords: augmented reality, artificial neural networks, mobile application, human computer interaction, community engine

Procedia PDF Downloads 326
7268 Using AI Based Software as an Assessment Aid for University Engineering Assignments

Authors: Waleed Al-Nuaimy, Luke Anastassiou, Manjinder Kainth

Abstract:

As the process of teaching has evolved with the advent of new technologies over the ages, so has the process of learning. Educators have perpetually been on the lookout for new technology-enhanced methods of teaching in order to increase learning efficiency and decrease ever-expanding workloads. Shortly after the invention of the internet, web-based learning started to pick up in the late 1990s, and educators quickly found that the process of providing learning material and marking assignments could change thanks to the connectivity offered by the internet. With the creation of early web-based virtual learning environments (VLEs) such as SPIDER and Blackboard, it soon became apparent that VLEs resulted in higher reported computer self-efficacy among students, but at the cost of students being less satisfied with the learning process. It may be argued that the impersonal nature of VLEs and their limited functionality were the leading factors contributing to this reported dissatisfaction. To this day, often faced with the prospect of assigning colossal engineering cohorts their homework and assessments, educators may frequently choose optimally curated assessment formats, such as multiple-choice quizzes and numerical answer input boxes, so that automated grading software embedded in the VLEs can save time and mark student submissions instantaneously. A crucial skill that is meant to be learnt during most science and engineering undergraduate degrees is gaining confidence in using, solving, and deriving mathematical equations. Equations underpin a significant portion of the topics taught in many STEM subjects, and it is in homework assignments and assessments that this understanding is tested. It is not hard to see that this becomes challenging if the majority of assignment formats students engage with are multiple-choice questions, and educators end up with a reduced perspective of their students' ability to manipulate equations. Artificial intelligence (AI) has in recent times been shown to be an important consideration for many technologies. In our paper, we explore the use of new AI-based software designed to work in conjunction with current VLEs. Using our experience with the software, we discuss its potential to solve a selection of problems ranging from impersonality to the reduction of educator workloads by speeding up the marking process. We examine the software's potential to increase learning efficiency through its features, which claim to allow more customized and higher-quality feedback. We investigate the usability of features allowing students to input equation derivations in a range of different forms, and discuss relevant observations associated with these input methods. Furthermore, we make ethical considerations and discuss potential drawbacks of the software, including the extent to which optical character recognition (OCR) could play a part in the perpetuation of errors and create disagreements between student intent and their submitted assignment answers. It is the intention of the authors that this study will be useful as an example of the implementation of AI in a practical assessment scenario, serving as a springboard for further considerations and studies that utilise AI in the setting and marking of science and engineering assignments.
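
The software under evaluation is not described at the implementation level; purely as an illustration of how a marking aid could accept equation derivations in different written forms, the sketch below uses SymPy to test whether a submitted expression is algebraically equivalent to a reference answer. The expressions are hypothetical, and this is not the evaluated product's actual method.

```python
# Hedged sketch: checking whether a student's expression is algebraically
# equivalent to a reference answer, independent of its written form.
import sympy as sp

reference = sp.sympify("(x + 1)**2")
submitted = sp.sympify("x**2 + 2*x + 1")   # e.g., parsed from typed or OCR'd input

# If the simplified difference is zero, the two forms are equivalent.
equivalent = sp.simplify(reference - submitted) == 0
print("equivalent:", equivalent)
```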

Keywords: engineering education, assessment, artificial intelligence, optical character recognition (OCR)

Procedia PDF Downloads 118
7267 Usability Testing on Information Design through Single-Lens Wearable Device

Authors: Jae-Hyun Choi, Sung-Soo Bae, Sangyoung Yoon, Hong-Ku Yun, Jiyoung Kwahk

Abstract:

This study was conducted to investigate the effect of ocular dominance on recognition performance using a single-lens smart display designed for cycling. A total of 36 bicycle riders who have been cycling consistently were recruited and participated in the experiment. The participants were asked to perform tasks while riding a bicycle on a stationary stand for safety reasons. Independent variables of interest included ocular dominance, bike usage, age group, and information layout. Recognition time (i.e., the time required to identify specific information, measured with an eye tracker), error rate (i.e., a false answer or failure to identify the information within 5 seconds), and user preference scores were measured, and statistical tests were conducted to identify significant results. Recognition time and error ratio showed significant differences by the ocular dominance factor, while the preference score did not. Recognition time was faster when the single-lens see-through display was worn on the dominant eye (average 1.12 s) than on the non-dominant eye (average 1.38 s). The error ratio of the information recognition task was significantly lower when the see-through display was worn on the dominant eye (average 4.86%) than on the non-dominant eye (average 14.04%). The interaction effect of ocular dominance and age group was significant with respect to recognition time and error ratio. The recognition time of users in their 40s was significantly longer than that of the other age groups when the display was placed on the non-dominant eye, while no difference was observed on the dominant eye. Error ratio showed the same pattern. Although no difference was observed for the main effects of ocular dominance and bike usage, the interaction effect between the two variables was significant with respect to preference score. The preference score of daily bike users was higher when the display was placed on the dominant eye, whereas participants who use bikes for leisure purposes showed the opposite preference pattern. It was found to be more effective and efficient to wear a see-through display on the dominant eye than on the non-dominant eye, although user preference was not affected by ocular dominance. It is recommended to wear a see-through display on the dominant eye, since it is safer, helping the user recognize the presented information faster and more accurately, even if the user may not notice the difference.
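
As a rough illustration of how the reported recognition-time difference between display placements could be tested, the sketch below runs a Welch t-test on synthetic data centred on the reported means; the study's actual design (e.g., within-subject comparisons and interaction analyses) may differ.

```python
# Hedged sketch: comparing recognition times for dominant vs. non-dominant eye
# placement using synthetic data centred on the reported means (1.12 s vs. 1.38 s).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
dominant = rng.normal(loc=1.12, scale=0.20, size=36)        # assumed spread
non_dominant = rng.normal(loc=1.38, scale=0.25, size=36)    # assumed spread

t, p = stats.ttest_ind(dominant, non_dominant, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```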

Keywords: eye tracking, information recognition, ocular dominance, smart headwear, wearable device

Procedia PDF Downloads 267
7266 Passive Seismic in Hydrogeological Prospecting: The Case Study from Hard Rock and Alluvium Plain

Authors: Prarabdh Tiwari, M. Vidya Sagar, K. Bhima Raju, Joy Choudhury, Subash Chandra, E. Nagaiah, Shakeel Ahmed

Abstract:

Passive seismic, a wavefield interferometric imaging technique, is a low-cost and rapid tool for subsurface investigation used for various geotechnical purposes such as hydrocarbon exploration and seismic microzonation. With recent advancements, its application has also been extended to groundwater exploration by means of finding the bedrock depth. The Council of Scientific & Industrial Research (CSIR)-National Geophysical Research Institute (NGRI) has carried out passive seismic studies along with electrical resistivity tomography for groundwater in hard rock (Choutuppal, Hyderabad). Passive seismic combined with electrical resistivity tomography (ERT) can give a clearer 2-D subsurface image for groundwater exploration in hard rock areas. Passive seismic data were collected using a Tromino, a three-component broadband seismometer, to measure background ambient noise, and were processed using GRILLA software. The passive seismic results are found to corroborate the ERT results. For data acquisition, the Tromino was deployed at 30 locations, with a 20-minute recording at each station. These locations show strong resonance frequency peaks, suggesting good impedance contrast between different subsurface layers (e.g., mica-rich laminated layer, weathered layer, granite). This paper presents the passive seismic signature of hard rock terrain. It has been found that passive seismic has potential application for formation characterization and can be used as an alternative tool for delineating litho-stratification in urban conditions where electrical and electromagnetic tools cannot be applied due to high cultural noise. In addition, its use in combination with electrical and electromagnetic methods can improve the interpreted subsurface model.
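
The GRILLA processing chain is not detailed in the abstract; the sketch below illustrates the horizontal-to-vertical spectral ratio (H/V) computation commonly applied to three-component ambient noise, from which the resonance frequency peak is read. The sampling rate and signals are placeholders, not survey data.

```python
# Hedged sketch: H/V spectral ratio from three-component ambient noise.
import numpy as np
from scipy.signal import welch

fs = 128.0                          # assumed sampling rate (Hz)
t = np.arange(0, 20 * 60, 1 / fs)   # a 20-minute record, as in the survey
rng = np.random.default_rng(2)
# Placeholder signals; real data would come from the three seismometer channels.
z = rng.normal(size=t.size)
n = rng.normal(size=t.size) + 0.3 * np.sin(2 * np.pi * 2.5 * t)
e = rng.normal(size=t.size) + 0.3 * np.sin(2 * np.pi * 2.5 * t)

f, pzz = welch(z, fs=fs, nperseg=4096)
_, pnn = welch(n, fs=fs, nperseg=4096)
_, pee = welch(e, fs=fs, nperseg=4096)

# Amplitude ratio: geometric mean of the horizontal spectra over the vertical spectrum.
hv = np.sqrt(np.sqrt(pnn * pee) / pzz)
print("resonance frequency (Hz):", f[np.argmax(hv[1:]) + 1])
```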

Keywords: passive seismic, resonant frequency, Tromino, GRILLA

Procedia PDF Downloads 181
7265 The Role of Industrial Design in Fashion

Authors: Rojean Ghafariasar, Leili Nosrati

Abstract:

The article introduces the categories and characteristics of cross-design between industries, designers, artists, brands, science, technology, and fashion. It focuses on cross-design methods combining technology and fashion, with corresponding case studies on the combination of new technology fabrics, fashion design, smart devices, and 3D printing technology, emphasizing the integration and application value of technology and fashion. The article also introduces design elements into fashion design through scientific and technological intelligence, promoting fashion innovation as well as research and development of new materials and functions, and incubating an ecosystem for the fashion industry through science and technology.

Keywords: fashion, design, industrial design, crossover design

Procedia PDF Downloads 77
7264 Trial Version of a Systematic Material Selection Tool in Building Element Design

Authors: Mine Koyaz, M. Cem Altun

Abstract:

Selecting materials that satisfy the expected performance is critically important for any design. Today, with constantly evolving and developing technologies, the material options are so wide that support tools are becoming necessary in the selection process. Therefore, as a sub-process of building element design, a systematic material selection tool is developed that defines four main steps of material selection: definition, research, comparison, and decision. The main purpose of the tool is to serve as an educational instrument that shows architecture students a methodical way of selecting materials in architectural detailing. Hence, it is to be used as guidance for designers, especially those with limited material knowledge and experience. The tool predefines the possible uses of various material databases and other sources of information on material properties. The material selection tool embraces not only the technical properties of materials related to building elements' functional requirements, but also their sensory properties related to the identity of the design and their environmental impacts with respect to the sustainability of the design. The method followed in the development of the tool has two main sections: first, the examination and application of existing methods, and second, the development of trial versions and their application. Within the scope of existing methods, design support tools, methodical approaches for building element design and the material selection process, material properties, material databases, and methodical approaches for decision making are examined. The existing methods were applied by architecture students and newly graduated architects to different design problems. With respect to the results of these applications, the strong and weak sides of existing material selection tools are presented. A main flow chart of the material selection tool has been developed with the objective of applying the strong aspects of the existing methods and improving their weak sides. At each stage, a different aspect of the material selection process was investigated, and the tool took its final form. The systematic material selection tool, within the building element design process, guides users with minimal background information to practically and accurately determine the ideal material satisfying the needs of their design. The tool has a flexible structure that answers the different needs of different designs and designers. The trial version issued in this paper shows one of the paths that could be followed and illustrates its application on a design problem.
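
The tool is described methodologically rather than computationally; as a minimal sketch of what its comparison and decision steps could look like, the example below scores candidate materials against weighted criteria. The materials, criteria, weights, and scores are purely hypothetical.

```python
# Hedged sketch: weighted scoring for the comparison/decision steps of material selection.
# All names and numbers are hypothetical placeholders.
criteria_weights = {"thermal_performance": 0.4, "durability": 0.3,
                    "aesthetics": 0.2, "embodied_carbon": 0.1}

# Candidate scores on a 1-5 scale against each criterion (higher is better).
candidates = {
    "timber cladding": {"thermal_performance": 4, "durability": 3,
                        "aesthetics": 5, "embodied_carbon": 5},
    "fibre cement":    {"thermal_performance": 3, "durability": 5,
                        "aesthetics": 3, "embodied_carbon": 3},
}

def weighted_score(scores: dict) -> float:
    return sum(criteria_weights[c] * s for c, s in scores.items())

# Rank candidates from best to worst total weighted score.
for material in sorted(candidates, key=lambda m: weighted_score(candidates[m]), reverse=True):
    print(material, round(weighted_score(candidates[material]), 2))
```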

Keywords: architectural education, building element design, material selection tool, systematic approach

Procedia PDF Downloads 339
7263 Outsourcing the Front End of Innovation

Authors: B. Likar, K. Širok

Abstract:

The paper presents a new method for efficient innovation process management. Even though innovation management methods, tools, and knowledge are well established and documented in the literature, most companies still do not manage innovation efficiently. Especially in SMEs, the front end of innovation - problem identification, idea creation and selection - is often not optimally performed. Our eMIPS methodology represents a sort of "umbrella methodology" - a well-defined set of procedures that can be dynamically adapted to the concrete case in a company. In daily practice, various methods (e.g. for problem identification and idea creation) can be applied, depending on the company's needs. It is based on the proactive involvement of the company's employees, supported by the appropriate methodology and external experts. The presented phases are performed via a mixture of face-to-face activities (workshops) and online (eLearning) activities taking place in the Moodle eLearning environment and using other e-communication channels. One part of the outcomes is an identified set of opportunities and concrete solutions ready for implementation. The other, also very important, result concerns the innovation competences of the participating employees related to concrete tools and methods for idea management. In addition, the employees gain strong experience in dynamic, efficient, and solution-oriented management of the invention process. The eMIPS also represents a way of establishing or improving the innovation culture in the organization. The first application in a pilot company showed excellent results regarding the motivation of participants as well as the outcomes achieved.

Keywords: creativity, distance learning, front end, innovation, problem

Procedia PDF Downloads 324
7262 Design and Development of the Force Plate for the Study of Driving-Point Biodynamic Responses

Authors: Vikas Kumar, V. H. Saran, Arpit Mathur, Avik Kathuria

Abstract:

The evaluation of biodynamic responses of the human body to whole body vibration exposure is necessary to quantify the exposure effects. A force plate model was designed with the help of CAD software and investigated by performing modal, stress, and strain analyses using a finite element approach in the software. The results of the modal, stress, and strain analyses were within the limits required for measuring biodynamic responses to whole body vibration. The physical model of the force plate was manufactured, fixed to the vibration simulator, and used in the experiments to evaluate the apparent mass responses of ten recruited subjects standing in an erect posture and exposed to vertical whole body vibration. The platform was excited with sinusoidal vibration at magnitudes of 1.0 and 1.5 m/s² rms and at frequencies of 2, 3, 4, 5, 6, 8, 10, 12.5, 16, and 20 Hz. The magnitude of the normalised apparent mass showed the trend observed in many past studies, with a peak at 4 and 5 Hz of vertical whole body vibration. Nonlinearity with respect to vibration magnitude was also observed in the normalised apparent mass responses.
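
The abstract does not state how apparent mass was computed from the measured signals; a common estimate, sketched below with synthetic signals, takes the ratio of the force-acceleration cross-spectrum to the acceleration auto-spectrum and normalises by the subject's static mass. This is an illustrative assumption, not the authors' documented procedure.

```python
# Hedged sketch: apparent mass M(f) = S_fa(f) / S_aa(f), normalised by static mass.
import numpy as np
from scipy.signal import csd, welch

fs = 512.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(3)

acc = 1.0 * np.sin(2 * np.pi * 5.0 * t) + 0.1 * rng.normal(size=t.size)  # platform acceleration (m/s^2)
static_mass = 75.0                                                        # hypothetical subject mass (kg)
force = static_mass * acc + 5.0 * rng.normal(size=t.size)                 # placeholder measured force (N)

f, s_fa = csd(force, acc, fs=fs, nperseg=4096)   # cross-spectrum of force and acceleration
_, s_aa = welch(acc, fs=fs, nperseg=4096)        # auto-spectrum of acceleration

normalised_apparent_mass = np.abs(s_fa / s_aa) / static_mass
band = (f >= 2) & (f <= 20)
print("peak of normalised apparent mass at %.1f Hz" % f[band][np.argmax(normalised_apparent_mass[band])])
```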

Keywords: whole body vibration, apparent mass, modeling, force plate

Procedia PDF Downloads 403
7261 Walkability with the Use of Mobile Apps

Authors: Dimitra Riza

Abstract:

This paper examines different ways of exploring a city by using smartphone applications while walking, and the way this new attitude changes our perception of the urban environment. By referring to various examples of such applications, we consider the options and possibilities that open up with new technologies, their advantages and disadvantages, as well as ways of experiencing and interpreting the urban environment. The widespread use of smartphones gives access to information, maps, knowledge, etc. at all times and places. City tourism marketing takes advantage of this and promotes the city's attractions through technology. Mobile-mediated walking tours provide new possibilities and modify the way we explore cities, for instance by giving directions so that destinations are easily found, by displaying our exact location on the map, or by creating our own tours through picking points of interest and interconnecting them into a route. These apps are interactive, as they filter the user's interests, movements, etc. Discovering a city on foot and visiting interesting sites and landmarks has become very easy, revolutionized by navigational and other applications. In contrast to the re-invention of the city suggested by Baudelaire's Flâneur in the 19th century, or to the construction of situations by the Situationists in the 1960s, the new technological means do not allow people to "get lost", as they follow and record our moves. In the case of strolling or drifting around the city, the option of "getting lost" is desired, as the goal is not "wayfinding" or the destination, but the experience of walking itself. Getting lost is not always about dislocation; it is about getting a feeling of freedom in the urban environment while experiencing it. So, on the one hand, walking is considered a physical and embodied experience, as the observer becomes an actor and participates with all his senses in the city's activities. On the other hand, the use of a screen turns out to be a disembodied experience of the urban environment, as we perceive it in a fragmented and distanced way. Relations with the city become similar to those of Alberti's isolated viewer, detached from any urban stage. Contrary to the Flâneur and the Situationists, who discovered the city with their own bodies, today the body itself is being detached from that experience. While contemporary cities are becoming more walkable, the new technological applications tend to open up all possibilities to explore them by suggesting multiple routes. Exploration becomes easier, but perception changes.

Keywords: body, experience, mobile apps, walking

Procedia PDF Downloads 405
7260 Data and Spatial Analysis for Economy and Education of 28 E.U. Member-States for 2014

Authors: Alexiou Dimitra, Fragkaki Maria

Abstract:

The objective of the paper is the study of geographic, economic, and educational variables and their contribution to determining the position of each member state among the EU-28 countries, based on the values of seven variables as given by Eurostat. The data analysis methods of Multiple Factorial Correspondence Analysis (MFCA), Principal Component Analysis, and Factor Analysis have been used. The cross-tabulation tables of data consist of the values of the seven variables for the 28 countries for 2014. The data are manipulated using the CHIC Analysis V 1.1 software package. The results of this program using MFCA and Ascending Hierarchical Classification are given in numerical and graphical form. For comparison, the Factor procedure of the statistical package IBM SPSS 20 has been applied to the same data. The numerical and graphical results, presented with tables and graphs, demonstrate the agreement between the two methods. The most important result is the study of the relations between the 28 countries and the position of each country in groups or clouds, which are formed according to the values of the corresponding variables.
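
The analyses in the paper were run in CHIC Analysis and IBM SPSS; purely as an illustration of the Principal Component Analysis step on a countries-by-variables table, the sketch below uses scikit-learn on standardized placeholder data rather than the Eurostat values.

```python
# Hedged sketch: PCA on a (28 countries x 7 variables) table, standardized first.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
X = rng.random((28, 7))        # placeholder values standing in for the Eurostat variables

X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)    # coordinates of each country on the first two components

print("explained variance ratio:", pca.explained_variance_ratio_)
print("first country's coordinates:", scores[0])
```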

Keywords: Multiple Factorial Correspondence Analysis, Principal Component Analysis, Factor Analysis, E.U.-28 countries, Statistical package IBM SPSS 20, CHIC Analysis V 1.1 Software, Eurostat.eu Statistics

Procedia PDF Downloads 504
7259 Comparative Study of Accuracy of Land Cover/Land Use Mapping Using Medium Resolution Satellite Imagery: A Case Study

Authors: M. C. Paliwal, A. K. Jain, S. K. Katiyar

Abstract:

Accuracy assessment of classified satellite imagery is very important. In order to determine the accuracy of the classified image, the assumed-true data are usually derived from ground truth data collected using the Global Positioning System. The data from the satellite imagery and the ground truth data are then compared to find out the accuracy of the classification, and error matrices are prepared. Overall and individual accuracies are calculated using different methods. The study illustrates advanced classification and accuracy assessment of land use/land cover mapping using satellite imagery. IRS-1C LISS-IV data were used for the classification. The satellite image was classified into fourteen classes, including water bodies, agricultural fields, forest land, urban settlement, barren land, and unclassified area. Classification of the satellite imagery and calculation of accuracy were done using ERDAS Imagine software to find out the best method. This study is based on data collected within the Bhopal city boundaries of Madhya Pradesh State, India.
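
As a brief illustration of the accuracy assessment described above, the sketch below builds an error (confusion) matrix from classified and reference labels and derives overall accuracy and the kappa coefficient; the class labels and sample values are hypothetical, and the study itself used ERDAS Imagine.

```python
# Hedged sketch: error matrix, overall accuracy, and kappa for a classified image.
from sklearn.metrics import accuracy_score, cohen_kappa_score, confusion_matrix

# Hypothetical reference (ground truth) and classified labels for sampled pixels.
reference  = ["water", "forest", "urban", "forest", "agri", "urban", "water", "agri"]
classified = ["water", "forest", "urban", "agri",   "agri", "urban", "water", "forest"]

labels = ["water", "forest", "agri", "urban"]
print(confusion_matrix(reference, classified, labels=labels))
print("overall accuracy:", accuracy_score(reference, classified))
print("kappa:", round(cohen_kappa_score(reference, classified), 3))
```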

Keywords: resolution, accuracy assessment, land use mapping, satellite imagery, ground truth data, error matrices

Procedia PDF Downloads 497
7258 System and Method for Providing Web-Based Remote Application Service

Authors: Shuen-Tai Wang, Yu-Ching Lin, Hsi-Ya Chang

Abstract:

With the development of virtualization technologies, a new type of service, the cloud computing service, has emerged. Cloud users usually encounter the problem of how to use the virtualized platform easily over the web without requiring plug-ins or the installation of special software. The objective of this paper is to develop a system and a method enabling process interfacing within an automation scenario for accessing remote applications using the web browser. To meet this challenge, we have devised a web-based interface that allows the GUI application to be shifted from the traditional local environment to the cloud platform, where it is hosted on a remote virtual machine. We designed the sketch of the web interface following the cloud virtualization concept, seeking to enable communication and collaboration among users. We describe the design requirements of the remote application technology and present implementation details of the web application and its associated components. We conclude that this effort has the potential to provide an elastic and resilient environment for several application services. Users no longer have to carry the burden of system maintenance, and the overall cost of software licenses and hardware is reduced. Moreover, this remote application service represents the next step towards the mobile workplace, and it lets users use the remote application virtually from anywhere.

Keywords: virtualization technology, virtualized platform, web interface, remote application

Procedia PDF Downloads 281
7257 A Paradigm Shift towards Personalized and Scalable Product Development and Lifecycle Management Systems in the Aerospace Industry

Authors: David E. Culler, Noah D. Anderson

Abstract:

Integrated systems for product design, manufacturing, and lifecycle management are difficult to implement and customize. Commercial software vendors, including CAD/CAM and third-party PDM/PLM developers, create user interfaces and functionality that allow their products to be applied across many industries. The result is that systems become overloaded with functionality, difficult to navigate, and use terminology that is unfamiliar to engineers and production personnel. For example, manufacturers of automotive, aeronautical, electronics, and household products use similar but distinct methods and processes. Furthermore, each company tends to have its own preferred tools and programs for controlling work and information flow and for connecting design, planning, and manufacturing processes to business applications. This paper presents a methodology and a case study that address these issues and suggests that in the future more companies will develop personalized applications that fit the natural way their business operates. A functioning system has been implemented at a highly competitive U.S. aerospace tooling and component supplier that works with many prominent aircraft manufacturers around the world, including The Boeing Company, Airbus, Embraer, and Bombardier Aerospace. During the last three years, the program has produced significant benefits such as the automatic creation and management of component and assembly designs (parametric models and drawings), the extensive use of lightweight 3D data, and changes to the way projects are executed from beginning to end. CATIA (CAD/CAE/CAM) and a variety of programs developed in C#, VB.Net, HTML, and SQL make up the current system. The web-based platform facilitates collaborative work across multiple sites around the world and improves communication with customers and suppliers. This work demonstrates that the creative use of Application Programming Interface (API) utilities, libraries, and methods is key to automating many time-consuming tasks and linking applications together.

Keywords: PDM, PLM, collaboration, CAD/CAM, scalable systems

Procedia PDF Downloads 169
7256 Decision Analysis Module for Excel

Authors: Radomir Perzina, Jaroslav Ramik

Abstract:

The Analytic Hierarchy Process (AHP) is a frequently used approach for solving decision-making problems. There is a wide range of software programs utilizing this approach. Their main disadvantage is that they are relatively expensive and do not show intermediate calculations. This work introduces a Microsoft Excel add-in called DAME – Decision Analysis Module for Excel. Compared to other computer programs, DAME is free, can work with scenarios or multiple decision makers, and displays intermediate calculations. Users can structure their decision models into three levels – scenarios/users, criteria, and variants. Items on all levels can be evaluated either by weights or by pair-wise comparisons. Three different methods are provided for the evaluation of the weights of the criteria, the variants, and the scenarios – Saaty's Method, the Geometric Mean Method, and Fuller's Triangle Method. Multiplicative and additive syntheses are supported. The proposed software package is demonstrated on a couple of illustrative examples of real-life decision problems.
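
As a small illustration of one of the weighting methods the add-in supports, the sketch below derives priority weights from a pairwise comparison matrix using the Geometric Mean Method; the comparison values are hypothetical, and the add-in's own implementation may differ.

```python
# Hedged sketch: criteria weights from a pairwise comparison matrix (Geometric Mean Method).
import numpy as np

# Hypothetical Saaty-scale comparisons among three criteria.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric mean of each row, then normalise so the weights sum to 1.
geo_means = A.prod(axis=1) ** (1.0 / A.shape[1])
weights = geo_means / geo_means.sum()
print("weights:", np.round(weights, 3))
```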

Keywords: analytic hierarchy process, multi-criteria decision making, pair-wise comparisons, Microsoft Excel, scenarios

Procedia PDF Downloads 444
7255 Framework for Decision Support Tool for Quality Control and Management in Botswana Manufacturing Companies

Authors: Mogale Sabone, Thabiso Ntlole

Abstract:

The pressure from globalization has made manufacturing organizations move towards three major competitive arenas: quality, cost, and responsiveness. Quality is a universal value and has become a global issue. In order to survive and be able to provide customers with good products, the supporting systems, tools, and structures that a manufacturing organization uses must grow or evolve. The majority of quality management concepts and strategies practiced today are aimed at detecting and correcting problems that already exist and serve to limit losses. In an agile manufacturing environment there is no room for defects and errors, so quality management is needed that is proactively directed at problem prevention. This proactive quality management avoids losses by focusing on failure prevention, virtual elimination of the possibility of premature failure, mistake-proofing, and assuring consistently high quality in the definition and design of creation processes. To achieve this, a decision support tool for quality control and management is suggested. The decision support tools and methods currently used by most manufacturing companies in Botswana for quality management and control are not integrated; for example, they are not consistent, since some test results are recorded only manually whilst others are recorded electronically. They are only a set of procedures, not a tool, and such procedures cannot offer interactive decision support. This brings to light the aim of this research, which is to develop a framework that will help manufacturing companies in Botswana build a decision support tool for quality control and management.

Keywords: decision support tool, manufacturing, quality control, quality management

Procedia PDF Downloads 560
7254 An Effective Route to Control of the Safety of Accessing and Storing Data in the Cloud-Based Data Base

Authors: Omid Khodabakhshi, Amir Rozdel

Abstract:

Cloud computing security research involves a number of challenges because the data center contains complex private information and always faces various risks of information disclosure through hacker attacks or internal adversaries. Accordingly, the security of virtual machines in the cloud computing infrastructure layer is very important. So far, there are many software solutions for improving security in virtual machines, but using software alone is not enough to solve security problems. The purpose of this article is to examine the challenges and security requirements for accessing and storing data in an insecure cloud environment. In other words, this article proposes a structure for the implementation of highly isolated security-sensitive code using secure computing hardware in virtual environments. It also allows remote code validation with inputs and outputs. We provide these security features even in situations where the BIOS, the operating system, and even the hypervisor are infected. To achieve these goals, we use the hardware support provided by the new Intel and AMD processors, as well as the TPM security chip. In conclusion, the use of these technologies ultimately creates a dynamic root of trust and reduces the TCB to the security-sensitive code.

Keywords: code, cloud computing, security, virtual machines

Procedia PDF Downloads 184
7253 Exploration of RFID in Healthcare: A Data Mining Approach

Authors: Shilpa Balan

Abstract:

Radio Frequency Identification, popularly known as RFID, is used to automatically identify and track tags attached to items. This study focuses on the application of RFID in healthcare. The adoption of RFID in healthcare is a crucial technology for patient safety and inventory management. Data from RFID tags are used to identify the locations of patients and inventory in real time. Medical errors are thought to be a prominent cause of loss of life and injury. The major advantage of RFID application in the healthcare industry is the reduction of medical errors. The healthcare industry has generated huge amounts of data. By discovering patterns and trends within the data, big data analytics can help improve patient care and lower healthcare costs. The increasing number of research publications leading to innovations in RFID applications shows the importance of this technology. This study explores the current state of RFID research in healthcare using a text mining approach; no study has yet examined the current state of RFID research in healthcare using a data mining approach. In this study, related articles on RFID were collected from healthcare journals and news articles, covering the years 2000 to 2015. Significant keywords on the topic of focus are identified and analyzed using open-source data analytics software such as RapidMiner. These analytical tools help extract pertinent information from massive volumes of data. The main benefits of adopting RFID technology in healthcare are found to include tracking medicines and equipment, upholding patient safety, and improving security. The real-time tracking features of RFID allow for enhanced supply chain management. By using big data productively, healthcare organizations can gain significant benefits. Big data analytics in healthcare enables improved decisions by extracting insights from large volumes of data.
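
The study itself used RapidMiner; as a minimal illustration of the keyword-frequency step in Python, the sketch below counts terms across a few placeholder sentences standing in for the collected articles.

```python
# Hedged sketch: term-frequency extraction over a small placeholder corpus.
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "RFID tags track medicines and equipment to reduce medical errors",
    "real-time RFID tracking improves patient safety and inventory management",
    "big data analytics on RFID logs lowers healthcare costs",
]

vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(docs).toarray().sum(axis=0)
terms = vectorizer.get_feature_names_out()

# Print the five most frequent terms.
for term, count in sorted(zip(terms, counts), key=lambda p: -p[1])[:5]:
    print(term, count)
```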

Keywords: RFID, data mining, data analysis, healthcare

Procedia PDF Downloads 227
7252 Smart Trust Management for Vehicular Networks

Authors: Amel Ltifi, Ahmed Zouinkhi, Med Salim Bouhlel

Abstract:

Spontaneous networks such as VANETs are generally deployed in an open and thus easily accessible environment; therefore, they are vulnerable to attacks. Trust management is one of a set of security solutions dedicated to this type of network. Moreover, the strong mobility of the nodes (in the case of VANETs) makes the establishment of a trust management system complex. In this paper, we present the concept of an 'Active Vehicle', meaning an autonomous vehicle that is able to make decisions about the trustworthiness of alert messages transmitted about road accidents. The behavior of an 'Active Vehicle' is modeled using Petri nets.

Keywords: active vehicle, cooperation, petri nets, trust management, VANET

Procedia PDF Downloads 393
7251 Evaluation of the Notifiable Diseases Surveillance System, South, Haiti, 2022

Authors: Djeamsly Salomon

Abstract:

Background: Epidemiological surveillance is a dynamic national system used to observe all aspects of the evolution of priority health problems through the collection, analysis, and systematic interpretation of information, and the dissemination of results with the necessary recommendations. The study was conducted to assess the mandatory disease surveillance system in the Sud Department. Methods: A study was conducted from March to May 2021 with key players involved in surveillance at the level of health institutions in the department. The CDC's 2021 updated guideline was used to evaluate the system. We collected information about the operation, attributes, and usefulness of the surveillance system using interviewer-administered questionnaires. Epi Info 7.2 and Excel 2016 were used to generate means, frequencies, and proportions. Results: Of 30 participants, 23 (77%) were women. The average age was 39 years [range 30-56]. Twenty-five (83%) had training in epidemiological surveillance. Half (50%) of the forms checked were signed by the supervisor. Collection tools were available in 80% of cases. Knowledge of at least 7 notifiable diseases was high (100%). Among the respondents, 29 declared that the collection tools were simple, and 27 had already filled in a notification form. The maximum time taken to fill out a form was 10 minutes. Feedback between the different levels occurred in 60% of cases. Conclusion: The surveillance system is useful, simple, acceptable, representative, flexible, stable, and responsive. The data generated were of high quality. However, the system is threatened by the lack of supervision of sentinel sites, the lack of investigation, and weak feedback. This evaluation demonstrated the urgent need to improve supervision of the sites, strengthen feedback of information, and reinforce epidemiological surveillance.

Keywords: evaluation, notifiable diseases, surveillance, system

Procedia PDF Downloads 73
7250 Design and Implementation of Remote Application Virtualization in Cloud Environments

Authors: Shuen-Tai Wang, Ying-Chuan Chen, Hsi-Ya Chang

Abstract:

Cloud computing is a paradigm that shifts the way computing has been done in the past. Users can use cloud resources, such as application software or storage space, without needing to own them. This paper focuses on solutions that introduce the IaaS idea to build cloud-based services and enable individual remote users' applications in cloud environments, so that they appear as if they were running on the end user's local computer. The features of the application delivery solution have been developed based on our previous research on virtualization technology to offer applications independent of location, so that users can work online or offline, anywhere, with an appropriate device, and at any time. This proposed effort has the potential to provide an efficient, resilient, and elastic environment for cloud services. Users no longer need to burden the system managers, and the overall cost of hardware and software licenses is drastically reduced. Moreover, this flexible remote application virtualization service represents the next significant step towards the mobile workplace, and it lets users access their applications remotely through cloud services from anywhere. This is also made possible by low administrative costs, relatively inexpensive end-user terminals, and reduced energy expenses.

Keywords: cloud computing, IaaS, virtualization, application delivery

Procedia PDF Downloads 274
7249 An Investigative Study on the Use of Online Marketing Methods in Hungary

Authors: E. Happ, Zs. Ivancsone Horvath

Abstract:

With the development of the information technology (IT) sector, every industry in the world is on a new path, dealing with digitalisation. Tourism is the most rapidly growing industry in the world. Without digitalisation, tourism operators would not be competitive enough with foreign destinations or other experience-based service providers. Digitalisation is also necessary to enable organizations interested in tourism to meet the growing expectations of consumers. With the help of digitalisation, tourism providers can also obtain information about tourists, changes in consumer behaviour, and the use of online services. The degree of digitalisation in tourism differs between services. The research is based on a questionnaire survey conducted in 2018 in Hungary. The sample of more than 500 respondents was processed with the SPSS program, using a variety of analysis methods. The following two variables were observed from several aspects: frequency of travel and the importance of services related to online travel. With the help of these variables, a cluster analysis was performed among the participants. The sample can be divided into two groups using K-means cluster analysis. Cluster '1' is a positive group; its members can be called the "most digital tourists." They agree on most things, with low standard deviation, and for them, digitalisation is a starting point. To the members of Cluster '2', digitalisation is important, too. The results show what is important to them (accommodation, information gathering), but also what they are not interested in at all within the digital world (e.g., car rental or online sharing). Interestingly, there is no third, negative cluster. This absence shows that tourism uses digitalisation, and the question is only the extent of the use of online tools and methods. With the help of the designed consumer groups, the characteristics of digital tourism segments can be identified, and different variables characterise these groups. One of them is the frequency of travel, where there is a significant correlation between travel frequency and cluster membership. The shift is clearly towards Cluster '1', which means that those who find services related to online travel more important are also more likely to travel. By learning more about digital tourists' consumer behaviour, the results of this research can help providers decide which marketing tools could be used to influence the consumer choices of the different consumer groups identified using digital devices, and how to conduct more detailed and effective marketing activities. The main finding of the research was that most people have the digital tools needed to participate in e-tourism. Of these, mobile devices are increasingly preferred. This means the challenge for service providers is no longer digital presence itself but having applications optimised for different devices.
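
The clustering in the study was performed in SPSS; the sketch below shows an equivalent two-cluster K-means step in Python on standardized placeholder survey responses, only to illustrate the workflow.

```python
# Hedged sketch: two-cluster K-means on standardized survey responses (placeholder data).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# Placeholder: 500 respondents rating the importance of 6 online travel services (1-5).
X = rng.integers(1, 6, size=(500, 6)).astype(float)

X_std = StandardScaler().fit_transform(X)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_std)

print("cluster sizes:", np.bincount(km.labels_))
print("cluster centres (standardized):", np.round(km.cluster_centers_, 2))
```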

Keywords: cluster analysis, digital tourism, marketing tool, tourist behaviour

Procedia PDF Downloads 122
7248 Assessing the Effect of Underground Tunnel Diameter on Structure-Foundation-Soil Performance under the Kobe Earthquake

Authors: Masoud Mahdavi

Abstract:

Today, developed and industrial cities have all kinds of sewage and water transfer canals, subway tunnels, infrastructure facilities, etc., which create underground cavities beneath buildings. The presence of these cavities causes changes in structural behavior that must be fully evaluated. In the present study, using the Abaqus finite element software, the effect of cavities of 0.5 and 1.5 meters in diameter (with a circular cross-section) at a depth of 2.5 meters below the ground surface on the performance of the foundation and the soil has been evaluated. For this purpose, the Kobe earthquake record was applied to the models for 10 seconds. Pore water pressure and self-weight were also considered in the models to obtain complete results. The results showed that, with the creation and increasing diameter of circular cavities in the soil, three indicators, namely 1) von Mises stress, 2) displacement, and 3) plastic strain, followed oscillating, ascending, and ascending trends, respectively, which shows the relationship between the diameter of underground cavities and the structural indicators of the structure-foundation-soil system.

Keywords: underground excavations, foundation, structural substrates, Abaqus software, Kobe earthquake, time history analysis

Procedia PDF Downloads 112
7247 Specification of Requirements to Ensure Proper Implementation of Security Policies in Cloud-Based Multi-Tenant Systems

Authors: Rebecca Zahra, Joseph G. Vella, Ernest Cachia

Abstract:

The notion of cloud computing is rapidly gaining ground in the IT industry and is appealing mostly because it makes computing more adaptable and expedient whilst diminishing the total cost of ownership. This paper focuses on the software as a service (SaaS) architecture of cloud computing, which is used for the outsourcing of databases with their associated business processes. One approach for offering SaaS is basing the system's architecture on multi-tenancy. Multi-tenancy allows multiple tenants (users) to make use of the same single application instance. Their requests and configurations might then differ according to specific requirements met through tenant customisation of the software. Despite the known advantages, companies still feel uneasy about opting for multi-tenancy, with data security being a principal concern. The fact that multiple tenants, possibly competitors, would have their data located on the same server process and share the same database tables heightens the fear of unauthorised access. Security is a vital aspect which needs to be considered by application developers, database administrators, data owners, and end users. This is further complicated in cloud-based multi-tenant systems, where boundaries must be established between tenants and additional access control models must be in place to prevent unauthorised cross-tenant access to data. Moreover, when altering the database state, the transactions need to strictly adhere to the tenant's known business processes. This paper argues that security in cloud databases should not be considered an isolated issue; rather, it should be included in the initial phases of the database design and monitored continuously throughout the whole development process. This paper aims to identify a number of the most common security risks and threats specifically in the area of multi-tenant cloud systems. Issues and bottlenecks relating to security risks in cloud databases are surveyed. Some techniques which might be utilised to overcome them are then listed and evaluated. After a description and evaluation of the main security threats, this paper produces a list of software requirements to ensure that proper security policies are implemented by a software development team when designing and implementing a multi-tenant-based SaaS. This would then assist cloud service providers in defining, implementing, and managing security policies according to tenant customisation requirements whilst assuring security for the customers' data.
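
One access-control pattern implied by such requirements is strict row-level tenant isolation, where every query is scoped to the caller's tenant. The sketch below illustrates that idea with SQLite; it is a simplified assumption for illustration, not a requirement taken verbatim from the paper.

```python
# Hedged sketch: row-level tenant isolation; every read is scoped to the caller's tenant.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, tenant_id TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "tenantA", 10.0), (2, "tenantB", 99.0), (3, "tenantA", 25.5)])

def fetch_orders(conn, tenant_id: str):
    # The tenant filter is applied unconditionally so no query can cross tenant boundaries.
    return conn.execute(
        "SELECT id, amount FROM orders WHERE tenant_id = ?", (tenant_id,)
    ).fetchall()

print(fetch_orders(conn, "tenantA"))   # only tenantA's rows are visible
```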

Keywords: cloud computing, data management, multi-tenancy, requirements, security

Procedia PDF Downloads 148
7246 Using Probe Person Data for Travel Mode Detection

Authors: Muhammad Awais Shafique, Eiji Hato, Hideki Yaginuma

Abstract:

Recently, GPS data have been used in many studies to automatically reconstruct travel patterns for trip surveys. The aim is to minimize the use of questionnaire surveys and travel diaries so as to reduce their negative effects. In this paper, data acquired from the GPS and accelerometer embedded in smartphones are utilized to predict the mode of transportation used by the phone carrier. For prediction, Support Vector Machine (SVM) and Adaptive Boosting (AdaBoost) are employed. Moreover, a unique method to improve the prediction results from these algorithms is also proposed. Results suggest that the prediction accuracy of AdaBoost after improvement is better than that of the other approaches.
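
The abstract names SVM and AdaBoost but not the exact features; the sketch below trains both classifiers on hypothetical trip features (mean speed and acceleration variance) and compares their held-out accuracy, only to illustrate the workflow.

```python
# Hedged sketch: travel-mode prediction from hypothetical GPS/accelerometer features.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(6)
n = 300
mean_speed = rng.uniform(0.5, 30.0, n)      # m/s, placeholder trip-level feature
acc_variance = rng.uniform(0.0, 3.0, n)     # (m/s^2)^2, placeholder trip-level feature
X = np.column_stack([mean_speed, acc_variance])
# Hypothetical labels: 0 = walk, 1 = bicycle, 2 = car, assigned by speed thresholds.
y = np.digitize(mean_speed, [2.0, 7.0])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

svm = SVC(kernel="rbf").fit(X_tr, y_tr)
ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

print("SVM accuracy:     ", round(svm.score(X_te, y_te), 3))
print("AdaBoost accuracy:", round(ada.score(X_te, y_te), 3))
```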

Keywords: accelerometer, AdaBoost, GPS, mode prediction, support vector machine

Procedia PDF Downloads 352
7245 Assertion-Driven Test Repair Based on Priority Criteria

Authors: Ruilian Zhao, Shukai Zhang, Yan Wang, Weiwei Wang

Abstract:

Repairing broken test cases is an expensive and challenging task in evolving software systems. Although an automated repair technique with intent preservation has been proposed, it does not take into account the association between test repairs and assertions, leading to a large number of irrelevant candidates and decreased repair capability. This paper proposes an assertion-driven test repair approach. Furthermore, an intent-oriented priority criterion is introduced to guide repair candidate generation, making the repairs closer to the intent of the test. In more detail, repair targets are determined through post-dominance relations between assertions and the methods that directly cause compilation errors. Then, test repairs are generated from the target in a bottom-up way, guided by the intent-oriented priority criteria. Finally, the generated repair candidates are prioritized to match the original test intent. The approach is implemented and evaluated on a benchmark of 4 open-source programs and 91 broken test cases. The results show that the approach can fix 89% (81/91) of the broken test cases, which is more effective than the existing intent-preserving test repair approach, and that our intent-oriented priority criteria work well.

Keywords: test repair, test intent, software test, test case evolution

Procedia PDF Downloads 121
7244 Cross-Cultural Collaboration Shaping Co-Creation Methodology to Enhance Disaster Risk Management Approaches

Authors: Jeannette Anniés, Panagiotis Michalis, Chrysoula Papathanasiou, Selby Knudsen

Abstract:

The RiskPACC project aims to bring together researchers, practitioners, and first responders from nine European countries, following a co-creation approach to develop customised solutions that meet the needs of end users. The co-creation workshops aim to enhance the communication pathways between local civil protection authorities (CPAs) and citizens, in an effort to close the risk perception-action gap (RPAG). The participants in the workshops include a variety of stakeholders as well as citizens, fostering the dialogue between the groups and supporting citizen participation in disaster risk management (DRM). The co-creation methodology in place implements co-design elements through the integration of four ICT tools. These ICT tools include web-based and mobile application solutions in different development stages, ranging from the formulation and validation of concepts to pilot demonstrations. In total, seven case studies are foreseen in RiskPACC. The workflow of the workshops is designed to be adaptive to each of the seven case study countries and their cultures' particular needs. This work aims to provide an overview of the preparation and conduct of the workshops, in which researchers and practitioners focused on mapping the different needs of the end users. The latter included first responders but also volunteers and citizens who actively participated in the co-creation workshops. The strategies to improve communication between CPAs and citizens differ between the countries, and the modules of the co-creation methodology are adapted in response to such differences. Moreover, the project partners experienced how the structure of such workshops is perceived differently in the seven case studies. Therefore, the co-creation methodology itself is a design method undergoing several iterations, which are eventually shaped by cross-cultural collaboration. For example, some case studies applied different modules according to the participant group recruited. The participants were technical experts, teachers, citizens, first responders, or volunteers, among others. This work aspires to present the divergent approaches of the seven case studies in implementing the proposed co-creation methodology, in response to different perceptions of the modules. An analysis of the adaptations and implications will also be provided to assess where the case studies' objective of improving disaster resilience has been achieved.

Keywords: citizen participation, co-creation, disaster resilience, risk perception, ICT tools

Procedia PDF Downloads 71