Search results for: adaptive user interfaces
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3362

452 The Singapore Innovation Web and Facilitation of Knowledge Processes

Authors: Ola Jon Mork, Irina Emily Hansen

Abstract:

The European Growth Strategy Program calls for more efficient methods for knowledge creation and innovation. This study contributes new insights into the Singapore Innovation System, more precisely how knowledge processes are facilitated. The research material was collected by visiting the different innovation locations in Singapore and through in-depth interviews with key persons. The websites and brochures of the different innovation actors have been studied, as have governmental reports and figures. The findings show that facilitation of knowledge processes in the Singapore Innovation System has a basic structure with three processes: 1) idea capturing, 2) technology and business execution, and 3) idea realization. Dedicated innovation parks work with the most promising entrepreneurs, more precisely by finding the persons with the motivation to 'change the world'. The innovation park facilitates these entrepreneurs for 100 days, during which they are also connected to a global network of venture capital and have access to mentors from these venture companies. Research institute parks work on the development of world-leading technology. To facilitate knowledge development, they connect with the industrial companies that are the most promising applicators of their technology. Knowledge facilitation is the main purpose, but this cooperation and testing also serves as a platform for funding, and the cooperation is probably also attractive for world-leading companies. Other dedicated innovation parks facilitate innovators of new applications and the perfection of products for the end-user. These parks can be specialized in particular areas, such as health and life science products. Another example is automotive companies issuing research calls for these parks to develop and innovate new products and services upon their technology.
Common characteristics of knowledge facilitation in the Singapore Innovation System are a short trial period for promising actors, normally 100 days, and a strong focus on training the entrepreneurs. Presentation and diffusion of knowledge are an important part of the facilitation. Funding is made available for the most successful entrepreneurs and innovators.

Keywords: knowledge processes, facilitation, innovation, Singapore innovation web

Procedia PDF Downloads 274
451 A 3D Numerical Environmental Modeling Approach For Assessing Transport of Spilled Oil in Porous Beach Conditions under a Meso-Scale Tank Design

Authors: J. X. Dong, C. J. An, Z. Chen, E. H. Owens, M. C. Boufadel, E. Taylor, K. Lee

Abstract:

Shorelines are vulnerable to significant environmental impacts from oil spills. Stranded oil can cause short- to long-term detrimental effects along beaches, including injury to ecosystems and to socio-economic and cultural resources. In this study, a three-dimensional (3D) numerical modeling approach is developed to evaluate the fate and transport of spilled oil for hypothetical oiled-shoreline cases under various combinations of beach geomorphology and environmental conditions. The developed model estimates the spatial and temporal distribution of spilled oil for the various test conditions, using the finite volume method and considering physical transport (dispersion and advection), sinks, and sorption processes. The model includes a user-friendly interface for data input on variables such as beach properties, environmental conditions, and the physical-chemical properties of the spilled oil. An experimental meso-scale tank design was used to test the developed model for dissolved petroleum hydrocarbons within shorelines. The simulated results for the effects of different sediment substrates, oil types, and shoreline features on the transport of spilled oil are comparable to those obtained with a commercially available model. Results show that the properties of the substrates and oil removal by shoreline effects have significant impacts on oil transport in the beach area. Sensitivity analysis, through application of the one-at-a-time (OAT) method, identified hydraulic conductivity as the most sensitive parameter of the 3D model. The 3D numerical model allows users to examine the behavior of oil on and within beaches, assess potential environmental impacts, and provide technical support for decisions related to shoreline clean-up operations.
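The one-at-a-time (OAT) sensitivity analysis mentioned above can be sketched in a few lines. The transport model itself is not reproduced here, so the snippet below stands in a toy response function; the parameter names, baseline values, and the exponent giving hydraulic conductivity its stronger influence are illustrative assumptions, not the paper's actual inputs.

```python
# One-at-a-time (OAT) sensitivity sketch: perturb each parameter in
# turn, holding the others at baseline, and rank by output change.
# model() is a hypothetical stand-in for the 3D transport model.

def model(params):
    # Toy response: penetration grows superlinearly with hydraulic
    # conductivity, linearly with porosity, shrinks with viscosity.
    return (params["hydraulic_conductivity"] ** 1.5
            * params["porosity"] / params["viscosity"])

def oat_sensitivity(model, baseline, perturbation=0.10):
    """Perturb each parameter by +/-10% and report the largest
    relative change in model output."""
    base_out = model(baseline)
    sensitivity = {}
    for name in baseline:
        changes = []
        for factor in (1 - perturbation, 1 + perturbation):
            trial = dict(baseline)
            trial[name] = baseline[name] * factor
            changes.append(abs(model(trial) - base_out) / base_out)
        sensitivity[name] = max(changes)
    return sensitivity

baseline = {"hydraulic_conductivity": 1e-4, "porosity": 0.35, "viscosity": 50.0}
ranking = sorted(oat_sensitivity(model, baseline).items(), key=lambda kv: -kv[1])
```

With these toy values the ranking places hydraulic conductivity first, mirroring the paper's finding, but only because the stand-in model was built that way.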

Keywords: dissolved petroleum hydrocarbons, environmental multimedia model, finite volume method, sensitivity analysis, total petroleum hydrocarbons

Procedia PDF Downloads 191
450 Designing Functional Knitted and Woven Upholstery Textiles with SCOBY Film

Authors: Manar Y. Abd El-Aziz, Alyaa E. Morgham, Amira A. El-Fallal, Heba Tolla E. Abo El Naga

Abstract:

Different textile materials are usually used in upholstery. However, upholstered parts may become unhealthy when dust accumulates and bacteria grow on the surface, which negatively affects the user's health. Leather and artificial leather have also been used in upholstery, but leather has a high cost and artificial leather poses a potential chemical risk for users. Researchers have therefore developed vegan leather made from bacterial cellulose produced by a symbiotic culture of bacteria and yeast (SCOBY). The SCOBY forms as a gelatinous cellulose biofilm floating at the air-liquid interface of the culture container, but this leather still needs some enhancement of its mechanical properties. This study aimed to prepare SCOBY, produce bamboo rib-knitted fabrics with two different stitch densities and a cotton woven fabric, and then laminate these fabrics with the prepared SCOBY film to enhance the mechanical properties of the SCOBY leather and, at the same time, add an anti-microbial function to the prepared fabrics. Laboratory tests were conducted on the produced samples, including tests for functional properties (anti-microbial activity, thermal conductivity, and light transparency), physical properties (thickness and mass per unit area), and mechanical properties (elongation, tensile strength, Young's modulus, and peel force). The results showed that the type of fabric significantly affected the SCOBY properties. According to the test results, the bamboo knitted fabric with the higher stitch density laminated with SCOBY was chosen, for its tensile strength and elongation, as the upholstery of a bed model with antimicrobial properties and comfort in the headrest design. A single layer of SCOBY was chosen, on account of its light transparency and lower thermal conductivity, for a lighting unit built into the bed headboard.

Keywords: anti-microbial, bamboo, rib, SCOBY, upholstery

Procedia PDF Downloads 47
449 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company

Authors: Rahma Al Balushi

Abstract:

Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving this vision is obtaining Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOPs). Hence, a documented SOP for the geographic information system (GIS) data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed by the Asset Information & GIS department. This paper describes in detail the GIS data submission process and the journey taken to develop the current process. The methodology used to develop the process is based on three main pillars: system and end-user requirements; risk evaluation; and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, owing to this process, GIS is ready to be integrated with other systems and to serve as the source of data for all OETC users. Some decisions related to issuing no-objection certificates (NOC) and scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) have consequently been made upon GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data system updates, data release/reporting, and data alterations also helped to reduce the missing attributes in GIS transmission data. A considerable difference in Geodatabase (GDB) completeness percentage was observed between 2017 and 2021. Overall, it is concluded that through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with or related to GIS systems.

Keywords: asset management ISO55001, standard procedures process, governance, geodatabase, NOC, CMMS

Procedia PDF Downloads 178
448 Project Marayum: Creating a Community Built Mobile Phone Based, Online Web Dictionary for Endangered Philippine Languages

Authors: Samantha Jade Sadural, Kathleen Gay Figueroa, Noel Nicanor Sison II, Francis Miguel Quilab, Samuel Edric Solis, Kiel Gonzales, Alain Andrew Boquiren, Janelle Tan, Mario Carreon

Abstract:

Of the 185 languages in the Philippines, 28 are endangered, 11 are dying off, and 4 are extinct. Language documentation, as a prerequisite to language education, can be one of the ways languages can be preserved. Project Marayum is envisioned to be a collaboratively built, mobile phone-based, online dictionary platform for Philippine languages. Although there are many online language dictionaries available on the Internet, Project Marayum aims to give a sense of ownership to the language community, as its dictionary is built and maintained by the community for the community. Starting from a seed dictionary, members of a language community can suggest changes, add new entries, and provide language examples. Going beyond word definitions, the platform can be used to gather sample sentences and even audio samples of word usage. These changes are reviewed by language experts of the community, sourced from the local state universities or local government units. Approved changes are then added to the dictionary and can be viewed instantly through the Marayum website. A companion mobile phone application allows users to browse the dictionary in remote areas where Internet connectivity is nonexistent; the dictionary is automatically updated once the user regains Internet access. Project Marayum is still a work in progress. At the time of this abstract's writing, the project has just entered its second year. Prototypes are currently being tested with the Asi language of Romblon island as the initial language testbed. In October 2020, Project Marayum will have both a webpage and a mobile application with Asi, Ilocano, and Cebuano language dictionaries available for use online or for download. In addition, the Marayum platform will then be easily expandable for use by more endangered language communities. Project Marayum is funded by the Philippine Department of Science and Technology.

Keywords: collaborative language dictionary, community-centered lexicography, content management system, software engineering

Procedia PDF Downloads 139
447 A Temporal QoS Ontology For ERTMS/ETCS

Authors: Marc Sango, Olimpia Hoinaru, Christophe Gransart, Laurence Duchien

Abstract:

Ontologies offer a means for representing and sharing information in many domains, particularly complex domains. For example, they can be used to represent and share information from the System Requirement Specification (SRS) of complex systems, such as the SRS of ERTMS/ETCS, which is written in natural language. Since this is a real-time, critical system, generic ontologies, such as OWL-based and generic ERTMS ontologies, provide minimal support for modeling the temporal information omnipresent in these SRS documents. To support the modeling of temporal information, one of the challenges is to enable the representation of dynamic features evolving in time within a generic ontology with minimal redesign of it. Separating temporal information from other information can help to predict system runtime operation and to properly design and implement it. In addition, it is helpful to provide reasoning and querying techniques over the temporal information represented in the ontology in order to detect potential temporal inconsistencies. Indeed, a user operation, such as adding a new constraint to existing planning constraints, can cause temporal inconsistencies, which can lead to system failures. To address this challenge, we propose a lightweight three-layer temporal Quality of Service (QoS) ontology for representing, reasoning over, and querying temporal and non-temporal information in a complex domain ontology. Representing QoS entities in separate layers clarifies the distinction between non-QoS entities and QoS entities in an ontology. The upper generic layer of the proposed ontology provides an intuitive knowledge of domain components, especially ERTMS/ETCS components. The separation of the intermediate QoS layer from the lower QoS layer allows us to focus on specific QoS characteristics, such as temporal or integrity characteristics. In this paper, we focus on temporal information that can be used to predict system runtime operation.
To evaluate our approach, an example of the proposed domain ontology for the handover operation is given, together with a reasoning rule over temporal relations in this domain-specific ontology.
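As a minimal illustration of the kind of temporal-consistency check the reasoning layer supports, the sketch below tests whether a newly added interval constraint conflicts with existing ones. The interval representation and the ERTMS-flavored constraint names are illustrative assumptions, not taken from the actual ontology or the ERTMS/ETCS SRS.

```python
# Sketch of a temporal consistency check over interval constraints.
# Intervals are (start, end) pairs in seconds; constraint names are
# hypothetical, not from the ERTMS/ETCS SRS.

def overlaps(a, b):
    """Allen-style test: True if intervals a and b share any time."""
    return a[0] < b[1] and b[0] < a[1]

def conflicts_with(existing, new_interval):
    """Return the names of existing exclusive-use constraints that a
    new exclusive-use constraint would overlap (i.e., be inconsistent
    with)."""
    return [name for name, iv in existing.items() if overlaps(iv, new_interval)]

planning = {
    "handover_phase": (10.0, 15.0),       # hypothetical handover window
    "braking_curve_check": (20.0, 25.0),  # hypothetical check window
}
clash = conflicts_with(planning, (14.0, 21.0))  # spans both windows
```

A real ontology-backed check would express the same test as a query or rule over the temporal layer rather than over Python tuples.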

Keywords: system requirement specification, ERTMS/ETCS, temporal ontologies, domain ontologies

Procedia PDF Downloads 389
446 Heroic Villains: An Exploration of the Use of Narrative Plotlines and Emerging Identities within Recovery Stories of Former Substance Abusers

Authors: Tria Moore, Aimee Walker-Clarke

Abstract:

The purpose of the study was to develop a deeper understanding of how self-identity is negotiated and reconstructed by people in recovery from substance abuse. The approach draws on the notion that self-identity is constructed through stories. Specifically, dominant narratives of substance abuse involve the 'addict identity', in which the meaning of being an addict is constructed through social interaction and informed by broader social meanings of substance misuse, which are considered deviant. The addict is typically understood as out of control, weak, and feckless. Users may unconsciously embody this addict identity, which makes recovery less likely. Typical approaches to treatment employ the notion that recovery is much more likely when users change the way they think and feel about themselves by assembling a new identity. Recovery, therefore, involves a reconstruction of the self in a new light, which may mean rejecting a part of the self (the addict identity). One limitation is that previous research on this topic has been quantitative, which, while useful, tells us little about how this process is best managed. Should one, for example, reject the past addict identity completely and move on to the new identity, or is it more effective to accept the past identity and use it in the formation of the new non-user identity? The purpose of this research, then, is to explore how addicts in recovery have managed the transition between their past and current selves and whether this may inform therapeutic practice. Using a narrative approach, data were analyzed from five in-depth interviews with former addicts who had been abstinent for at least a year and who were in some form of volunteering role at substance treatment services in the UK. Although participants identified with a previous 'addict identity' and made efforts to disassociate themselves from it, they also recognized that acceptance was an important part of reconstructing their new identity.
The participants' narratives used familiar plot lines to structure their stories, in which they positioned themselves as the heroes of their own stories rather than as victims of circumstance. Instead of rejecting their former addict identity, which would mean rejecting a part of the self, participants used their experience in a reconstructive and restorative way. The findings suggest that encouraging people to tell their story and accept their addict identity are important factors in successful recovery.

Keywords: addiction, identity, narrative, recovery, substance abuse

Procedia PDF Downloads 284
445 Free and Open Source Software for BIM Workflow of Steel Structure Design

Authors: Danilo Di Donato

Abstract:

The continuous new releases of free and open source software (FOSS) and the high costs of proprietary software (whose monopoly is characterized by closed code and a low level of implementation and customization by end-users) prompt a reflection on the tools that can be chosen and adopted for the design and representation of new steel constructions. The paper aims to show experimentation carried out to verify the actual potential and effective applicability of FOSS tools for the BIM modeling of steel structures, particularly considering the goals of a possible workflow: to achieve a high level of development (LOD) and to allow effective interchange between different software packages. To this end, the examined software packages are those with open source or freeware licenses, in order to evaluate their use in architectural praxis. The test primarily involved experimentation with FreeCAD, the only open source software that allows a complete and integrated BIM workflow, and the results were then compared with those of two proprietary packages, SketchUp and Tekla BIMsight, which are released in free versions that are not usable for commercial purposes. The experiments carried out on open source and freeware software were then compared with the outcomes obtained with two proprietary packages, SketchUp Pro and Tekla Structures, the latter of which has special modules particularly addressed to the design of steel structures. This evaluation concerned different comparative criteria, defined on the basis of categories related to the reliability, efficiency, potentiality, achievable LOD, and user-friendliness of the analyzed software packages. In order to verify the actual outcomes of FOSS BIM for steel structure projects, these results were compared with a simulation related to a real case study carried out with proprietary BIM modeling software.
The same design theme, the project of a shelter for a public space, was therefore developed using the different software packages. The purpose of the contribution is thus to assess the developments and potentialities inherent in FOSS BIM, in order to estimate their effective applicability to professional practice, their limits, and the new fields of research they propose.

Keywords: BIM, steel buildings, FOSS, LOD

Procedia PDF Downloads 152
444 Contrasting Infrastructure Sharing and Resource Substitution Synergies Business Models

Authors: Robin Molinier

Abstract:

Industrial symbiosis (IS) relies on two modes of cooperation, infrastructure sharing and resource substitution, to obtain economic and environmental benefits. The former consists of intensifying the use of an asset, while the latter is based on the use of waste, fatal energy, and utilities as alternatives to standard inputs. Both modes, in fact, rely on a shift from business-as-usual functioning towards an alternative production system structure, so that from a business point of view the distinction is not clear. In order to investigate how these cooperation modes can be distinguished, we consider the stakeholders' interplay in the business model structure with regard to their resources and requirements. For infrastructure sharing, following the economic engineering literature, the cost function of capacity induces economies of scale, so that demand pooling reduces global expenses. The grassroots investment sizing decision and ex-post pricing depend strongly on the design optimization phase for capacity sizing, whereas ex-post operational cost-sharing arrangements that minimize budgets are less dependent upon production rates. Value is then mainly design-driven. For resource substitution, the synergy's value stems from availability and is at risk with regard to both supplier and user load profiles and the market price of the standard input. The reduction of baseline input purchasing costs is thus driven more by the operational phase of the symbiosis and must be analyzed within the whole sourcing policy (including diversification strategies and expensive back-up replacement). Moreover, while resource substitution involves a chain of intermediate processors to match quality requirements, the infrastructure model relies on a single operator whose competencies allow the production of non-rival goods. Transaction costs appear higher in resource substitution synergies due to the high level of customization, which induces asset specificity and non-homogeneity, following transaction cost economics arguments.
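The economies-of-scale argument for infrastructure sharing can be made concrete with the classic power-law capacity cost function: with a cost exponent below one, a single asset sized for pooled demand is cheaper than two separate assets. The coefficient, exponent, and demand figures below are illustrative assumptions, not values from the paper.

```python
# Capacity cost with economies of scale: cost = a * capacity**b, b < 1.
# Coefficients and demands are illustrative, not from the paper.

def capacity_cost(capacity, a=1000.0, b=0.6):
    return a * capacity ** b

demand_firm1, demand_firm2 = 40.0, 60.0
separate = capacity_cost(demand_firm1) + capacity_cost(demand_firm2)
shared = capacity_cost(demand_firm1 + demand_firm2)
savings = 1 - shared / separate  # relative saving from demand pooling
```

With b = 0.6 the pooled asset costs roughly a quarter less than two separate ones; the closer b is to 1, the weaker the case for sharing.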

Keywords: business model, capacity, sourcing, synergies

Procedia PDF Downloads 150
443 Toward Indoor and Outdoor Surveillance using an Improved Fast Background Subtraction Algorithm

Authors: El Harraj Abdeslam, Raissouni Naoufal

Abstract:

The detection of moving objects from video image sequences is very important for object tracking, activity recognition, and behavior understanding in video surveillance. The most widely used approach for moving object detection and tracking is background subtraction. Many background subtraction approaches have been suggested, but these are sensitive to illumination changes, and the solutions proposed to bypass this problem are time-consuming. In this paper, we propose a robust yet computationally efficient background subtraction approach and mainly focus on the ability to detect moving objects in dynamic scenes, for possible applications in monitoring complex and restricted-access areas, where moving and motionless persons must be reliably detected. It consists of three main phases: establishing illumination invariance, background/foreground modeling, and morphological analysis for noise removal. We handle illumination changes using Contrast Limited Adaptive Histogram Equalization (CLAHE), which limits the intensity of each pixel to a user-determined maximum. Thus, it mitigates the degradation due to scene illumination changes and improves the visibility of the video signal. Initially, the background and foreground images are extracted from the video sequence. Then, the background and foreground images are separately enhanced by applying CLAHE. In order to form multi-modal backgrounds, we model each channel of a pixel as a mixture of K Gaussians (K = 5) using a Gaussian Mixture Model (GMM). Finally, we post-process the resulting binary foreground mask using morphological erosion and dilation transformations to remove possible noise. For experimental testing, we used a standard dataset to challenge the efficiency and accuracy of the proposed method on a diverse set of dynamic scenes.
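The pipeline above (CLAHE, a per-pixel K = 5 Gaussian mixture, then morphological cleanup) is typically built on a library such as OpenCV. As a dependency-free sketch of the core background-modeling idea, the snippet below maintains a single running Gaussian per pixel and flags pixels more than a few standard deviations from the mean; the K = 5 mixture, the CLAHE step, and the morphology are deliberately omitted.

```python
# Minimal per-pixel running-Gaussian background model (one Gaussian
# per pixel, not the full K=5 GMM used in the paper).

import math

class RunningGaussianBG:
    def __init__(self, first_frame, alpha=0.05, k=2.5):
        self.alpha = alpha  # learning rate for the background update
        self.k = k          # foreground threshold, in standard deviations
        self.mean = [[float(v) for v in row] for row in first_frame]
        self.var = [[225.0] * len(row) for row in first_frame]  # initial variance

    def apply(self, frame):
        """Update the model with `frame` (a 2D list of grayscale values)
        and return a binary foreground mask of the same shape."""
        mask = []
        for y, row in enumerate(frame):
            mask_row = []
            for x, v in enumerate(row):
                m, var = self.mean[y][x], self.var[y][x]
                d = v - m
                fg = 1 if abs(d) > self.k * math.sqrt(var) else 0
                if not fg:  # only absorb background pixels into the model
                    self.mean[y][x] = m + self.alpha * d
                    self.var[y][x] = (1 - self.alpha) * var + self.alpha * d * d
                mask_row.append(fg)
            mask.append(mask_row)
        return mask
```

Seeding the model with a static background frame and then feeding a frame containing a bright object yields a mask that is set only where the object appears; a mixture of K Gaussians generalizes this to backgrounds that themselves alternate between several modes.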

Keywords: video surveillance, background subtraction, contrast limited histogram equalization, illumination invariance, object tracking, object detection, behavior understanding, dynamic scenes

Procedia PDF Downloads 236
442 Evaluation of Traffic Noise Level: A Case Study in a Residential Area of Ishbiliyah, Kuwait

Authors: Jamal Almatawah, Hamad Matar, Abdulsalam Altemeemi

Abstract:

The World Health Organization (WHO) has recognized environmental noise as a harmful pollutant that causes adverse psychosocial and physiological effects on human health. The motor vehicle is considered one of the main sources of noise pollution. It is a universal phenomenon, and it has grown to the point of becoming a major concern for both the public and policymakers. The aim of this paper, therefore, is to investigate traffic noise levels and the contributing factors that affect them, such as traffic volume, heavy-vehicle speed, and meteorological factors, in Ishbiliyah as a sample residential area in Kuwait. Three types of roads were selected in Ishbiliyah: an expressway, a major arterial, and a collector street. Other sources of noise that interfere with traffic noise have also been considered in this study. The traffic noise level was measured and analyzed using the Bruel & Kjaer outdoor sound level meter 2250-L (2250 Light). A Count-Cam2 video camera was used to collect peak and off-peak traffic counts. An Ambient Weather WM-5 handheld weather station was used for meteorological factors such as temperature, humidity, and wind speed. The spot speed was obtained using a Decatur Genesis model GHD-KPH radar. All measurements were taken simultaneously. The results showed that the traffic noise level is over the allowable limit on all types of roads. The average equivalent noise level (LAeq) for the expressway, major arterial, and collector street was 74.3 dB(A), 70.47 dB(A), and 60.84 dB(A), respectively. In addition, positive correlation coefficients were obtained between traffic noise and traffic volume, and between traffic noise and 85th percentile speed. However, there was no significant relationship with meteorological factors. Abnormal vehicle noise due to poor maintenance or user-enhanced exhausts was found to be one of the factors that most affected the overall traffic noise readings.
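The equivalent continuous level LAeq reported above is an energy average rather than an arithmetic one: the A-weighted levels are converted to relative energies, averaged, and converted back to decibels. A minimal sketch of that computation, using illustrative interval levels rather than the survey data:

```python
# Equivalent continuous sound level (LAeq) from a series of
# short-interval A-weighted levels in dB(A).

import math

def laeq(levels_db):
    """Energy-average the A-weighted levels, then convert back to dB."""
    energies = [10 ** (l / 10) for l in levels_db]
    return 10 * math.log10(sum(energies) / len(energies))

# Illustrative interval samples, not the Ishbiliyah measurements:
samples = [68.0, 72.0, 75.0, 70.0]
corridor_laeq = laeq(samples)
```

Because loud intervals dominate the energy sum, the LAeq sits above the arithmetic mean of the samples, which is why short bursts of abnormal vehicle noise weigh so heavily on the reading.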

Keywords: traffic noise, residential area, pollution, vehicle noise

Procedia PDF Downloads 40
441 Urban Corridor Management Strategy Based on Intelligent Transportation System

Authors: Sourabh Jain, Sukhvir Singh Jain, Gaurav V. Jain

Abstract:

Intelligent Transportation Systems (ITS) are the application of technology for developing a user-friendly transportation system for urban areas in developing countries. The goal of urban corridor management using ITS in road transport is to achieve improvements in mobility, safety, and the productivity of the transportation system within the available facilities, through the integrated application of advanced monitoring, communications, computer, display, and control process technologies, both in the vehicle and on the road. This paper presents past studies on the several ITS deployments that have been successful in urban corridors in India and abroad, to establish the current scenario and the methodology considered for the planning, design, and operation of traffic management systems. It also presents the effort made to interpret and assess the performance of the 27.4 km long study corridor, which has eight intersections and four flyovers and consists of six-lane and eight-lane divided road sections. Two categories of data were collected in February 2016: traffic data (traffic volume, spot speed, delay) and road characteristics data (number of lanes, lane width, bus stops, mid-block sections, intersections, flyovers). The instruments used for collecting the data were a video camera, a radar gun, mobile GPS, and a stopwatch. The performance interpretations from the analysis included identification of peak and off-peak hours, congestion and level of service (LOS) at mid-blocks, and delay, followed by plotting speed contours and recommending urban corridor management strategies. The analysis shows that ITS-based urban corridor management strategies will be useful for reducing congestion, fuel consumption, and pollution, so as to provide comfort and efficiency to users. The paper presents urban corridor management strategies based on sensors incorporated in both vehicles and the roads.
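One of the corridor performance measures named above, level of service (LOS), is commonly assigned from the volume-to-capacity (v/c) ratio. The banding below follows the familiar HCM-style A to F grading, but the exact cut-offs vary by facility type and design manual, so the thresholds and traffic figures here are illustrative assumptions rather than the study's values.

```python
# Level-of-service (LOS) banding from the volume/capacity (v/c) ratio.
# Thresholds are illustrative HCM-style values; actual cut-offs depend
# on the facility type and the design manual in force.

LOS_THRESHOLDS = [(0.35, "A"), (0.54, "B"), (0.77, "C"),
                  (0.93, "D"), (1.00, "E")]

def level_of_service(volume_vph, capacity_vph):
    vc = volume_vph / capacity_vph
    for limit, grade in LOS_THRESHOLDS:
        if vc <= limit:
            return grade
    return "F"  # over capacity: forced or breakdown flow

peak = level_of_service(5200, 5700)      # hypothetical peak-hour counts
off_peak = level_of_service(1900, 5700)  # hypothetical off-peak counts
```

Comparing peak and off-peak grades at each mid-block section is one way to locate where corridor management strategies would pay off first.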

Keywords: congestion, ITS strategies, mobility, safety

Procedia PDF Downloads 421
440 A Location-Based Search Approach According to Users’ Application Scenario

Authors: Shih-Ting Yang, Chih-Yun Lin, Ming-Yu Li, Jhong-Ting Syue, Wei-Ming Huang

Abstract:

The global positioning system (GPS) has become increasingly precise in recent years, and location-based services (LBS) have developed rapidly. Take, for example, finding a parking lot with a parking app: the location-based service can offer immediate information about a nearby parking lot, including the number of remaining parking spaces. However, it cannot provide search results matched to the user's situational requirements. For that reason, this paper develops a location-based search approach according to users' application scenarios, combining location-based search with demand determination to help users obtain information consistent with their requirements. The approach consists of one mechanism and three kernel modules. First, in the Information Pre-processing Mechanism (IPM), the cosine theorem is used to categorize the locations of users. Then, in the Information Category Evaluation Module (ICEM), kNN (k-Nearest Neighbors) is employed to classify the browsing records of users. After that, in the Information Volume Level Determination Module (IVLDM), a comparison is made between the number of users clicking on information at different locations and the average number of users clicking on information at a specific location, so as to evaluate the urgency of demand; a two-dimensional space is then used to estimate the users' application situations. Finally, in the Location-based Search Module (LBSM), all search results are compared against the average number of characters of the search results, the results are categorized with the Manhattan distance, and results are selected according to the user's application scenario. Additionally, a Web-based system was developed according to this methodology to demonstrate its practical application.
The scenario-based estimate and the location-based search are used together to evaluate the type and abundance of the information expected by the public at a specific location, so that information demanders can obtain information consistent with their application situations at that location.
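Two of the building blocks named above, kNN classification (ICEM) and the Manhattan distance used to group search results (LBSM), can be sketched together. The feature encoding and category labels below are illustrative stand-ins, not the system's actual browsing-record features.

```python
# kNN with the Manhattan distance, as used conceptually in the
# ICEM/LBSM modules. Vectors and labels are illustrative stand-ins
# for encoded browsing records.

from collections import Counter

def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def knn_classify(records, query, k=3):
    """records: list of (feature_vector, label) pairs. Vote among the
    k records nearest to `query` under the Manhattan distance."""
    nearest = sorted(records, key=lambda r: manhattan(r[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

records = [([1, 1], "parking"), ([1, 2], "parking"), ([2, 1], "parking"),
           ([8, 8], "dining"), ([9, 8], "dining"), ([8, 9], "dining")]
```

A query near the first cluster is labeled "parking" by majority vote; swapping in a different distance function changes only the `key` used for sorting.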

Keywords: data mining, knowledge management, location-based service, user application scenario

Procedia PDF Downloads 96
439 American Sign Language Recognition System

Authors: Rishabh Nagpal, Riya Uchagaonkar, Venkata Naga Narasimha Ashish Mernedi, Ahmed Hambaba

Abstract:

The rapid evolution of technology in the communication sector continually seeks to bridge the gap between different communities, notably between the deaf community and the hearing world. This project develops a comprehensive American Sign Language (ASL) recognition system, leveraging the advanced capabilities of convolutional neural networks (CNNs) and vision transformers (ViTs) to interpret and translate ASL in real time. The primary objective of this system is to provide an effective communication tool that enables seamless interaction through accurate sign language interpretation. The architecture of the proposed system integrates dual networks: VGG16 for precise spatial feature extraction and a vision transformer for contextual understanding of the sign language gestures. The system processes live input, extracting critical features through these neural network models, and combines them to enhance gesture recognition accuracy. This integration facilitates a robust understanding of ASL by capturing detailed nuances as well as broader gesture dynamics. The system is evaluated through a series of tests that measure its efficiency and accuracy in real-world scenarios. Results indicate a high level of precision in recognizing diverse ASL signs, substantiating the potential of this technology in practical applications. Challenges such as enhancing the system's ability to operate in varied environmental conditions and further expanding the training dataset were identified and discussed. Future work will refine the model's adaptability and incorporate haptic feedback to enhance the interactivity and richness of the user experience. This project demonstrates the feasibility of an advanced ASL recognition system and lays the groundwork for future innovations in assistive communication technologies.

Keywords: sign language, computer vision, vision transformer, VGG16, CNN

Procedia PDF Downloads 15
438 Portable System for the Acquisition and Processing of Electrocardiographic Signals to Obtain Different Metrics of Heart Rate Variability

Authors: Daniel F. Bohorquez, Luis M. Agudelo, Henry H. León

Abstract:

Heart rate variability (HRV) is defined as the temporal variation between heartbeats, or RR intervals (the distance between R waves in an electrocardiographic signal), and is a recognized biomarker. Analysis of this variation makes it possible to assess the sympathetic and parasympathetic nervous systems, which regulate the cardiac muscle, and allows health specialists and researchers to diagnose various pathologies. For the acquisition and analysis of HRV from a cardiac electrical signal, electronic equipment and analysis software that work independently are currently used. This complicates and delays interpretation and diagnosis; the delay can put patients at greater risk and lead to untimely treatment. This document presents a single portable device capable of acquiring electrocardiographic signals and calculating a total of 19 HRV metrics, reducing the time required and enabling timelier intervention. The device has an electrocardiographic signal acquisition card attached to a microcontroller capable of transmitting the cardiac signal wirelessly to a mobile device. In addition, a mobile application was designed to analyze the cardiac waveform. The device calculates the RR intervals and the different metrics, and the application allows a user to visualize the cardiac signal and the 19 metrics in real time. The information is exported to a cloud database for remote analysis. The study was performed under controlled conditions in the simulated hospital of the Universidad de la Sabana, Colombia. A total of 60 signals were acquired and analyzed, and the device was compared against two reference systems. The results show a strong correlation (r > 0.95, p < 0.05) between the 19 metrics compared. Therefore, the portable system evaluated is recommended for assessing the condition of the cardiac system in clinical scenarios supervised by medical specialists and researchers.
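Several standard time-domain HRV metrics can be computed directly from the RR intervals. The sketch below is a minimal illustration, not the device firmware: the metric subset (SDNN, RMSSD, pNN50, mean heart rate) and the sample intervals are assumptions.

```python
import numpy as np

def hrv_metrics(rr_ms):
    """Compute common time-domain HRV metrics from RR intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "mean_rr": rr.mean(),
        "sdnn": rr.std(ddof=1),                        # overall variability
        "rmssd": np.sqrt(np.mean(diff ** 2)),          # short-term variability
        "pnn50": 100.0 * np.mean(np.abs(diff) > 50),   # % successive diffs > 50 ms
        "mean_hr": 60000.0 / rr.mean(),                # beats per minute
    }

rr = [800, 810, 790, 850, 820, 795, 805]   # example RR series in ms
m = hrv_metrics(rr)
```

The full 19-metric set would add further time-domain, frequency-domain, and nonlinear measures on the same RR series.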

Keywords: biological signal analysis, heart rate variability (HRV), HRV metrics, mobile app, portable device

Procedia PDF Downloads 164
437 Development of a Computer Aided Diagnosis Tool for Brain Tumor Extraction and Classification

Authors: Fathi Kallel, Abdulelah Alabd Uljabbar, Abdulrahman Aldukhail, Abdulaziz Alomran

Abstract:

The brain is an important organ since it is responsible for the majority of our actions, such as vision and memory. However, different diseases such as Alzheimer’s disease and tumors can affect the brain and lead to partial or full dysfunction. Regular diagnosis is necessary as a preventive measure and can help doctors detect a possible problem early and apply the appropriate treatment, especially in the case of brain tumors. Different imaging modalities are proposed for the diagnosis of brain tumors; the most powerful and widely used modality is Magnetic Resonance Imaging (MRI). MRI images are analyzed by doctors in order to locate a possible tumor in the brain and prescribe the appropriate treatment. Diverse image processing methods are also proposed to help doctors identify and analyze the tumor. In fact, a large number of Computer Aided Diagnosis (CAD) tools built on image processing algorithms are exploited by doctors as a second opinion to analyze and identify brain tumors. In this paper, we propose a new advanced CAD for brain tumor identification, classification and feature extraction. Our proposed CAD includes three main parts. Firstly, we load the brain MRI. Secondly, a robust technique for brain tumor extraction is applied, based on both the Discrete Wavelet Transform (DWT) and Principal Component Analysis (PCA). DWT is characterized by its multiresolution analytic property, which is why it is applied to the MRI images at different decomposition levels for feature extraction. Nevertheless, this step suffers from a main drawback: it requires large storage and is computationally expensive. To decrease the dimension of the feature vector and the computing time, the PCA technique is applied. In the last stage, according to the extracted features, the brain tumor is classified as either benign or malignant using the Support Vector Machine (SVM) algorithm. A CAD tool for brain tumor detection and classification, including all the above-mentioned stages, was designed and developed using the MATLAB GUIDE user interface.
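The DWT-then-PCA feature pipeline described can be sketched as follows. This is a minimal numpy illustration, not the authors' MATLAB code: the one-level Haar transform, the random stand-in "MRI slices", and the choice of the LL sub-band as the feature source are assumptions; the reduced vectors would then feed the SVM classification stage.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar wavelet transform: returns the approximation (LL)
    and detail (LH, HL, HH) sub-bands of an even-sized grayscale image."""
    a = (img[0::2] + img[1::2]) / 2.0      # row averages
    d = (img[0::2] - img[1::2]) / 2.0      # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def pca_reduce(X, k):
    """Project feature vectors (rows of X) onto their top-k principal components."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:k].T

rng = np.random.default_rng(1)
imgs = rng.random((10, 64, 64))                               # stand-in MRI slices
feats = np.stack([haar_dwt2(im)[0].ravel() for im in imgs])   # LL sub-band features
reduced = pca_reduce(feats, 5)   # compact feature vectors for the SVM stage
```

Multiple decomposition levels would simply iterate `haar_dwt2` on the LL sub-band.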

Keywords: MRI, brain tumor, CAD, feature extraction, DWT, PCA, classification, SVM

Procedia PDF Downloads 224
436 Conflict Resolution in Fuzzy Rule Base Systems Using Temporal Modalities Inference

Authors: Nasser S. Shebka

Abstract:

Fuzzy logic is used in complex adaptive systems where classical tools for representing knowledge are unproductive. Nevertheless, the incorporation of fuzzy logic, as with all artificial intelligence tools, raises inconsistencies and limitations when dealing with increasingly complex systems and rules that apply to real-life situations; it hinders the inference process of such systems and produces inconsistencies between the inferences generated by the fuzzy rules of complex or imprecise knowledge-based systems. Fuzzy logic enhances knowledge representation in applications that require fuzzy truth values or similar multi-valued constant parameters derived from multi-valued logic. This sets the basis for the three basic t-norms and the connectives based on them; these t-norms are continuous functions, and any other continuous t-norm can be described as an ordinal sum of these three basic ones. Some attempts to solve this dilemma alter fuzzy logic by means of non-monotonic logic, which is used for the defeasible inference of expert-system reasoning, for example, to allow inference retraction upon additional data. However, even the introduction of non-monotonic fuzzy reasoning faces a major issue of conflict resolution, for which many principles have been introduced, such as the specificity principle and the weakest-link principle. The aim of our work is to improve the logical representation and functional modelling of AI systems by presenting a method for resolving existing and potential rule conflicts through the representation of temporal modalities within defeasible inference rule-based systems. Our paper investigates the possibility of resolving fuzzy rule conflicts in a non-monotonic fuzzy reasoning-based system by introducing temporal modalities and Kripke's general weak modal logic operators, expanding its knowledge representation capabilities through flexibility in classifying newly generated rules and, hence, resolving potential conflicts between these fuzzy rules. We address this problem by restructuring the inference process of the fuzzy rule-based system. This is achieved by using time-branching temporal logic in combination with restricted first-order logic quantifiers, as well as propositional logic, to represent classical temporal modality operators. The resulting findings not only enhance the flexibility of the inference process of complex rule-based systems but also contribute to the fundamental methods of building rule bases in a manner that allows for a wider range of applicable real-life situations from a quantitative and qualitative knowledge representation perspective.
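The three basic continuous t-norms referred to above are commonly taken to be the minimum (Gödel), product, and Łukasiewicz t-norms. A minimal sketch, with an assumed pair of fuzzy antecedent truth values for illustration:

```python
def t_min(a, b):
    """Gödel (minimum) t-norm."""
    return min(a, b)

def t_prod(a, b):
    """Product t-norm."""
    return a * b

def t_luka(a, b):
    """Łukasiewicz t-norm."""
    return max(0.0, a + b - 1.0)

# conjunction of two fuzzy antecedents, e.g. "pressure is high" AND "rate is rising"
high, rising = 0.7, 0.4
firing = {"min": t_min(high, rising),
          "prod": t_prod(high, rising),
          "luka": t_luka(high, rising)}
```

All three agree on the boundary condition t(a, 1) = a; they differ in how strongly partial truths attenuate each other.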

Keywords: fuzzy rule-based systems, fuzzy tense inference, intelligent systems, temporal modalities

Procedia PDF Downloads 66
435 Satellite Derived Evapotranspiration and Turbulent Heat Fluxes Using Surface Energy Balance System (SEBS)

Authors: Muhammad Tayyab Afzal, Muhammad Arslan, Mirza Muhammad Waqar

Abstract:

One of the key components of the water cycle is evapotranspiration (ET), which represents water consumption by vegetated and non-vegetated surfaces. Conventional techniques for measuring ET are point-based and representative of the local scale only. Satellite remote sensing data, with large area coverage and high temporal frequency, provide representative measurements of several relevant biophysical parameters required for the estimation of ET at regional scales. The objective of this research is to exploit satellite data in order to estimate evapotranspiration. This study uses the Surface Energy Balance System (SEBS) model to calculate daily actual evapotranspiration (ETa) in Larkana District, Sindh, Pakistan, using Landsat TM data for cloud-free days. As there is no flux tower in the study area for direct measurement of latent heat flux or evapotranspiration and sensible heat flux, the model-estimated values of ET were compared with reference evapotranspiration (ETo) computed by the FAO-56 Penman-Monteith method using meteorological data. For a country like Pakistan, irrigated agriculture in the river basins is the largest user of fresh water. For better assessment and management of irrigation water requirements, estimating the consumptive use of water for agriculture is very important, because agriculture is the main consumer of water. Since a large amount of irrigation water is lost through ET, its accurate estimation can be helpful for efficient management of irrigation water. The results of this study can be used to analyse surface conditions, i.e. temperature, energy budgets and related characteristics. With this information, vegetation health and suitable agricultural conditions can be monitored, and steps can be taken to increase agricultural production.
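The reference evapotranspiration used for comparison follows the FAO-56 Penman-Monteith formulation. The sketch below is a simplified daily ETo computation under assumed meteorological inputs (net radiation, soil heat flux, temperature, wind speed, humidity); it is illustrative, not the study's processing chain.

```python
import math

def eto_fao56(t_mean, rn, g, u2, rh, pressure=101.3):
    """Daily FAO-56 Penman-Monteith reference evapotranspiration (mm/day).
    t_mean: air temperature (deg C); rn: net radiation (MJ/m^2/day);
    g: soil heat flux (MJ/m^2/day); u2: wind speed at 2 m (m/s);
    rh: relative humidity (%); pressure: atmospheric pressure (kPa)."""
    es = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))  # saturation vapour pressure
    ea = es * rh / 100.0                                       # actual vapour pressure
    delta = 4098.0 * es / (t_mean + 237.3) ** 2                # slope of es curve
    gamma = 0.000665 * pressure                                # psychrometric constant
    num = 0.408 * delta * (rn - g) + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea)
    return num / (delta + gamma * (1.0 + 0.34 * u2))

# assumed meteorological inputs for one cloud-free day
eto = eto_fao56(t_mean=28.0, rn=15.0, g=0.5, u2=2.0, rh=45.0)
```

SEBS, in contrast, closes the surface energy balance per pixel to estimate actual rather than reference ET; ETo here serves only as the ground benchmark.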

Keywords: SEBS, remote sensing, evapotranspiration, ETa

Procedia PDF Downloads 311
434 The Tramway in French Cities: Complication of Public Spaces and Complexity of the Design Process

Authors: Elisa Maître

Abstract:

The redeployment of tram networks in French cities has considerably modified public spaces and the way citizens use them. Beyond the image of trams as contributing to sustainable urban development, the question of user safety in these spaces has been little studied. This study analyses the use of public spaces laid out for trams from the standpoint of legibility and safety. It also examines to what extent the complexity of the design process, with its many interactions between numerous and varied players, has a role in the genesis of these problems. The work is mainly based on the analysis of links between the uses of these redesigned public spaces (through observations, user interviews and accident studies) and the analysis of the design conditions and processes of the projects studied (mainly based on interviews with the actors of these projects). The practical analyses were based on three points of view: that of the planner, that of the user (based on observations and interviews) and that of the road safety expert. The cities of Montpellier, Marseille and Nice are the three fields of study on which the demonstration of this thesis is based. On the one hand, the results show that the insertion of trams complicates the public spaces of French cities: the restructuring of public spaces for trams creates difficulties of use and safety concerns. On the other hand, in-depth analyses of the fully transcribed interviews have led us to develop scenarios of particular dysfunctions in the design process. These elements call into question the way the legibility and safety of these new forms of public space are taken into account. An in-depth analysis of the design processes of public spaces with tram systems would also be a way of better understanding the choices made, the compromises accepted, and the conflicts and constraints at work weighing on the layout of these spaces. The results presented, concerning the impact that spaces laid out for trams have on difficulty of use, suggest different possibilities for improving the way in which safety for all users is taken into account when designing public spaces.

Keywords: public spaces, road layout, users, design process of urban projects

Procedia PDF Downloads 210
433 Level Set Based Extraction and Update of Lake Contours Using Multi-Temporal Satellite Images

Authors: Yindi Zhao, Yun Zhang, Silu Xia, Lixin Wu

Abstract:

The contours and areas of water surfaces, especially lakes, often change due to natural disasters and construction activities. Extracting and updating water contours from satellite images using image processing algorithms is an effective approach. However, producing optimal water surface contours that are close to the true boundaries is still a challenging task. This paper compares the performance of three different level set models for extracting lake contours: the Chan-Vese (CV) model, the signed pressure force (SPF) model, and the region-scalable fitting (RSF) energy model. Experiments indicate that the RSF model, in which a region-scalable fitting energy functional is defined and incorporated into a variational level set formulation, is superior to CV and SPF and can obtain desirable contour lines when there are “holes” in the water regions, such as islands in a lake. Therefore, the RSF model is applied to extract lake contours from Landsat satellite images. Four temporal Landsat satellite images, from the years 2000, 2005, 2010, and 2014, are used in our study. All of them were acquired in May, with the same path/row (121/036) covering Xuzhou City, Jiangsu Province, China. Firstly, the near infrared (NIR) band is selected for water extraction. Image registration is conducted on the NIR bands of the different temporal images for information updating, and linear stretching is also applied in order to distinguish water from other land cover types. Then, for the first temporal image, acquired in 2000, lake contours are extracted via the RSF model with user-defined rectangles as initialization. Afterwards, using the lake contours extracted from the previous temporal image as the initial values, lake contours are updated for the current temporal image by means of the RSF model, and the changed and unchanged lakes are detected. The results show that great changes have taken place in two lakes, i.e. Dalong Lake and Panan Lake, and that the RSF model can effectively extract and update lake contours using multi-temporal satellite images.
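The region-based level-set idea can be illustrated with the simpler piecewise-constant Chan-Vese model that the paper compares against; the RSF model replaces the two global means below with spatially varying fitted functions. The synthetic "lake" image, the initialization, and the bare gradient step (no curvature regularization) are assumptions for illustration only.

```python
import numpy as np

def chan_vese_step(phi, img, dt=0.5, lam1=1.0, lam2=1.0):
    """One gradient step of a simplified piecewise-constant Chan-Vese model
    (no curvature term). RSF would replace the global means c1/c2 with
    region-scalable, spatially varying fitting functions."""
    inside = phi > 0
    c1 = img[inside].mean() if inside.any() else 0.0      # mean inside the contour
    c2 = img[~inside].mean() if (~inside).any() else 0.0  # mean outside
    force = -lam1 * (img - c1) ** 2 + lam2 * (img - c2) ** 2
    return phi + dt * force

# synthetic "lake": a bright disc on a dark background
y, x = np.mgrid[:64, :64]
img = (((x - 32) ** 2 + (y - 32) ** 2) < 15 ** 2).astype(float)
phi = np.where(((x - 30) ** 2 + (y - 30) ** 2) < 20 ** 2, 1.0, -1.0)  # rough init
for _ in range(20):
    phi = chan_vese_step(phi, img)
mask = phi > 0   # extracted water region
```

The contour-update scheme in the paper corresponds to reusing the converged `phi` of one year's image as the initialization for the next.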

Keywords: level set model, multi-temporal image, lake contour extraction, contour update

Procedia PDF Downloads 341
432 Role of Community Based Forest Management to Address Climate Change Problem: A Case of Nepalese Community Forestry

Authors: Bikram Jung Kunwar

Abstract:

Forests play a central role in climate change: their conservation sequesters carbon from the atmosphere and regulates the carbon cycle. However, knowingly and unknowingly, the world's forests are being deforested and degraded at an annual rate of 0.18%, emitting carbon to the atmosphere. IPCC reports claim that deforestation and forest degradation account for one-fifth of total carbon emissions, second only to fossil fuels. Since 1.6 billion people depend to varying degrees on forests for their daily livelihood, not all deforestation is undesirable. Therefore, conserving forests while finding livelihood opportunities for the people who live around them is a prerequisite for addressing climate change problems, especially in developing countries, and a growing concern for forestry researchers, planners and policy makers. The study examines the role of community-based forest management in carbon mitigation and adaptation, taking the example of Nepal's community forestry program. In the program, the government hands over part of the national forests to local communities with full forest management authority, while retaining ownership rights over the forestland. Local communities, organized through a local institution called the Community Forest User Group (CFUG), manage the forests. They also formulate an operational plan with technical prescriptions and a constitution with forest management rules and regulations. The implementation results show that the CFUGs are not only effective at organizing local people and building a local institution for forest conservation and management activities, but are also able to collect a community fund from the sale of forest products and carry out various community development activities. These development activities play a decisive role in improving the livelihood of the people around the forests and, eventually, in addressing climate change problems.

Keywords: climate change, community forestry, local institution, Nepal

Procedia PDF Downloads 279
431 Safety Validation of Black-Box Autonomous Systems: A Multi-Fidelity Reinforcement Learning Approach

Authors: Jared Beard, Ali Baheri

Abstract:

As autonomous systems become more prominent in society, ensuring their safe application becomes increasingly important. This is clearly demonstrated with autonomous cars traveling through a crowded city or robots traversing a warehouse with heavy equipment. Human environments can be complex, having high-dimensional state and action spaces. This gives rise to two problems: analytic solutions may not be possible, and in simulation-based approaches, searching the entirety of the problem space could be computationally intractable, ruling out formal methods. To overcome this, approximate solutions may seek to find failures or estimate their likelihood of occurrence. One such approach is adaptive stress testing (AST), which uses reinforcement learning to induce failures in the system; the premise is that a learned model can be used to help find new failure scenarios, making better use of simulations. Despite these strengths, AST fails to find particularly sparse failures and can be inclined to find solutions similar to those found previously. Multi-fidelity learning can alleviate this overuse of information: information from lower-fidelity simulations can be used to build up samples less expensively and to cover the solution space more effectively, finding a broader set of failures. Recent work in multi-fidelity learning has passed information bidirectionally using “knows what it knows” (KWIK) reinforcement learners to minimize the number of samples in high-fidelity simulators (thereby reducing computation time and load). The contribution of this work, then, is the development of a bidirectional multi-fidelity AST framework, which uses multi-fidelity KWIK learners in an adversarial context to find failure modes. Thus far, a KWIK learner has been used to train an adversary in a grid world to prevent an agent from reaching its goal, demonstrating the utility of KWIK learners in an AST framework. The next step is the implementation of the bidirectional multi-fidelity AST framework described. Testing will be conducted in a grid world containing an agent attempting to reach a goal position and an adversary tasked with intercepting the agent, as demonstrated previously. Fidelities will be modified by adjusting the size of a time step, with higher fidelity effectively allowing for more responsive closed-loop feedback. Results will compare the single-fidelity KWIK AST learner with the multi-fidelity algorithm with respect to the number of samples, the distinct failure modes found, and the relative effect of learning after a number of trials.
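The core of AST, searching for the most likely disturbance trajectory that drives a system to failure, can be sketched with Monte Carlo search on a toy one-dimensional system. The dynamics, disturbance probability, and failure condition below are invented for illustration; the paper's grid world, KWIK learners, and multi-fidelity coupling are not reproduced.

```python
import math
import random

def simulate(disturbances, start=3, goal=8):
    """Toy system: the agent takes one step toward the goal each tick;
    a disturbance d = -1 turns that step into a net -1 move. Failure is
    reaching position 0 before the goal."""
    pos = start
    for d in disturbances:
        pos += 1 + 2 * d          # d = 0 -> +1 ; d = -1 -> -1
        if pos <= 0:
            return True           # failure reached
        if pos >= goal:
            return False          # goal reached safely
    return False

def ast_search(n_rollouts=2000, horizon=10, p_disturb=0.2, seed=0):
    """Monte Carlo adaptive-stress-testing sketch: sample disturbance
    sequences and keep the failing one with the highest log-likelihood,
    i.e. the most plausible failure."""
    rng = random.Random(seed)
    best, best_ll = None, -math.inf
    for _ in range(n_rollouts):
        ds = [-1 if rng.random() < p_disturb else 0 for _ in range(horizon)]
        ll = sum(math.log(p_disturb if d == -1 else 1.0 - p_disturb) for d in ds)
        if simulate(ds) and ll > best_ll:
            best, best_ll = ds, ll
    return best, best_ll

best, best_ll = ast_search()
```

AST proper replaces the blind sampling loop with a reinforcement learner that is rewarded for likely disturbances and for reaching failure, which is where the KWIK and multi-fidelity machinery enters.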

Keywords: multi-fidelity reinforcement learning, multi-fidelity simulation, safety validation, falsification

Procedia PDF Downloads 130
430 Design of Traffic Counting Android Application with Database Management System and Its Comparative Analysis with Traditional Counting Methods

Authors: Muhammad Nouman, Fahad Tiwana, Muhammad Irfan, Mohsin Tiwana

Abstract:

Traffic congestion has been increasing significantly in major metropolitan areas as a result of increased motorization, urbanization, population growth and changes in urban density. Traffic congestion compromises the efficiency of transport infrastructure and causes multiple traffic concerns, including but not limited to increased travel time, safety hazards, air pollution, and fuel consumption. Traffic management has become a serious challenge for federal and provincial governments, as well as for exasperated commuters. Effective, flexible, efficient and user-friendly traffic information/database management systems characterize traffic conditions by making use of traffic counts for storage, processing, and visualization. While emerging data collection technologies continue to proliferate, their accuracy can be verified through comparison of observed data with manual handheld counters. This paper presents the design of a tablet-based manual traffic counting application and a framework for the development of a traffic database management system for Pakistan. The database management system comprises three components: a traffic counting Android application, an online database, and its visualization using Google Maps. An Oracle relational database was chosen to develop the data structure, whereas structured query language (SQL) was adopted to program the system architecture. The GIS application links the data from the database and projects it onto a dynamic map for the visualization of traffic conditions. The traffic counting device and the example of a database application for a real-world problem provided a creative outlet to visualize the uses and advantages of a database management system in real time. Traffic counts collected by means of a handheld tablet/mobile application can also be used for transportation planning and forecasting.
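The count-record data structure described can be sketched as a relational table. The sketch below uses Python's built-in sqlite3 as a stand-in for the Oracle database mentioned; the column names and sample rows are assumptions, not the paper's schema.

```python
import sqlite3

# In-memory stand-in for the relational schema described: each record is
# one manual count observation tagged with station, time and vehicle class.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE traffic_count (
        station_id   TEXT NOT NULL,
        counted_at   TEXT NOT NULL,      -- ISO timestamp from the tablet app
        vehicle_type TEXT NOT NULL,      -- car, bus, truck, motorcycle, ...
        lat REAL, lon REAL,              -- position projected onto the map
        n  INTEGER NOT NULL              -- vehicles counted in the interval
    )""")
rows = [("ST-01", "2018-05-01T08:00", "car", 31.52, 74.35, 120),
        ("ST-01", "2018-05-01T08:00", "bus", 31.52, 74.35, 14),
        ("ST-01", "2018-05-01T09:00", "car", 31.52, 74.35, 95)]
conn.executemany("INSERT INTO traffic_count VALUES (?,?,?,?,?,?)", rows)
total = conn.execute(
    "SELECT SUM(n) FROM traffic_count WHERE station_id = 'ST-01'").fetchone()[0]
```

The lat/lon columns are what the GIS layer would join on to place counts on the dynamic map.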

Keywords: manual count, emerging data sources, traffic information quality, traffic surveillance, traffic counting device, android, data visualization, traffic management

Procedia PDF Downloads 173
429 Using Lysosomal Immunogenic Cell Death to Target Breast Cancer via Xanthine Oxidase/Micro-Antibody Fusion Protein

Authors: Iulianna Taritsa, Kuldeep Neote, Eric Fossel

Abstract:

Lysosome-induced immunogenic cell death (LIICD) is a powerful mechanism for targeting cancer cells that kills circulating malignant cells and primes the host’s immune cells against future relapse. Current immunotherapies for cancer are limited in preventing recurrence – a gap that can be bridged by training the immune system to recognize cancer neoantigens. Lysosomal leakage can be induced therapeutically to traffic antigens from dying cells to dendritic cells, which can later present those tumorigenic antigens to T cells. Previous research has shown that oxidative agents administered in the tumor microenvironment can initiate LIICD. We generated a fusion protein between the oxidant-generating enzyme xanthine oxidase (XO) and a micro-antibody specific for EGFR/HER2-sensitive breast tumor cells. The anti-EGFR single-domain antibody fragment is uniquely sourced from llama and is functional without the presence of a light chain. These llama micro-antibodies have been shown to penetrate tissues better and to have improved physicochemical stability compared to traditional monoclonal antibodies. We demonstrate that the fusion protein created is stable and can induce early markers of immunogenic cell death in an in vitro human breast cancer cell line (SkBr3). Specifically, we measured overall cell death, as well as surface-expressed calreticulin, extracellular ATP release, and HMGB1 production; these markers are consensus indicators of ICD. Flow cytometry, luminescence assays, and ELISA were used, respectively, to quantify biomarker levels in treated versus untreated cells. We also included a positive control group of SkBr3 cells dosed with doxorubicin (a known inducer of LIICD) and a negative control dosed with cisplatin (a known inducer of cell death, but not of the immunogenic variety). We examined each marker at various time points after the cancer cells were treated with the XO/antibody fusion protein, doxorubicin, and cisplatin.
Upregulated biomarkers after treatment with the fusion protein indicate an immunogenic response. We thus show the potential for this fusion protein to induce an anticancer effect paired with an adaptive immune response against EGFR/HER2+ cells. Our research in human cell lines here provides evidence for the success of the same therapeutic method for patients and serves as the gateway to developing a new treatment approach against breast cancer.

Keywords: apoptosis, breast cancer, immunogenic cell death, lysosome

Procedia PDF Downloads 180
428 A Systematic Review of the Predictors, Mediators and Moderators of the Uncanny Valley Effect in Human-Embodied Conversational Agent Interaction

Authors: Stefanache Stefania, Ioana R. Podina

Abstract:

Background: Embodied Conversational Agents (ECAs) are revolutionizing education and healthcare by offering cost-effective, adaptable, and portable solutions. Research on the Uncanny Valley effect (UVE) involves various embodied agents, including ECAs, yet there is no consensus on the optimal level of anthropomorphism or on how to overcome the uncanniness problem. Objectives: This systematic review aims to identify the user characteristics, agent features, and context factors that influence the UVE. Additionally, this review provides recommendations for creating effective ECAs and conducting proper experimental studies. Methods: We conducted a systematic review following the PRISMA 2020 guidelines. We included quantitative, peer-reviewed studies that examined human-ECA interaction. We identified 17,122 relevant records from the ACM Digital Library, IEEE Xplore, Scopus, ProQuest, and Web of Science. The quality assessment of the predictors, mediators, and moderators adhered to guidelines set by prior systematic reviews. Results: Based on the included studies, it can be concluded that females and younger people perceive ECAs as more attractive, although inconsistent findings exist in the literature. ECAs characterized by extraversion, emotional stability, and agreeableness are considered more attractive. Facial expressions also play a role in the UVE, with some studies indicating that ECAs with more facial expressions are considered more attractive, although this effect is not consistent across all studies. Few studies have explored contextual factors, but they are nonetheless crucial: the interaction scenario and exposure time are important circumstances in human-ECA interaction. Conclusions: The findings highlight a growing interest in ECAs, which have seen significant developments in recent years. Given this evolving landscape, investigating the risk of the UVE is a promising line of research.

Keywords: human-computer interaction, uncanny valley effect, embodied conversational agent, systematic review

Procedia PDF Downloads 47
427 Economic Life of Iranians on Instagram and the Disturbance in Politics

Authors: Mohammad Zaeimzade

Abstract:

The development of communication technologies is clearly and rapidly reducing the distance between the virtual and real worlds. Of course, living in a two-spatial or two-globalized world, or any other formulation that means the mixing of real and virtual life, is still relevant and debatable. In the present age of communication, social networks have transformed the message equation, turning the audience from passive recipients into users. Platforms have penetrated widely into various aspects of human life, from culture and education to the economy. Among messengers, Instagram, one of the most extensive image-based interactive networks, plays a significant role in this new economic life. It needs little explanation that the era of treating every messenger as a neutral conduit has passed: every messenger has its own economic, political and, of course, security background. Instagram is no exception to this rule, and it leaves its mark on economic life as well. Iran, as the 19th largest economy in the world, has not been unaffected by new platforms, including Instagram, and their consequences for the economy. Generally, in the policy-making space, there are two simple and inflexible views on this issue, one pessimistic and one optimistic, and the holders of each view usually have their own one-dimensional policy recommendations for how to deal with Instagram: prescriptions that are usually very different and sometimes contradictory. In this article, we show that this confusion among policymakers results from failing to describe accurately the reality of Instagram's effect, and that the reason for this inaccurate description is a conflict of interest on the part of describers and researchers. We first look at the main indicators of the Iranian economy and estimate the role of the digital economy in Iran's economic growth. We then study the conflicting descriptions of the Instagram-based digital economy; estimates of the number of economic users of Instagram in Iran range from 300 thousand to 9 million. Finally, we examine the government's actions in this matter, especially in the context of the street riots of October and November 2022, and suggest an intermediate idea.

Keywords: digital economy, instagram, conflict of interest, social networks

Procedia PDF Downloads 55
426 A Study on How to Influence Players Interactive Behavior of Victory or Defeat in Party Games

Authors: Shih-Chieh Liao, Cheng-Yan Shuai

Abstract:

"Party game" is a game mode that lets players maintain a good social and interactive experience. Common modes include teamwork, team competitive, independent competitive, and battle royale. Party games are defined as games with easy rules that are easy to play, quickly spice up a party, and support four to six players. They also need to leave players feeling satisfied regardless of victory or defeat. However, players may feel negative or angry when the game is imbalanced, especially when they play with teammates. Some players care about winning or losing and blame the game mechanics; more seriously, they may start unnecessary arguments. The behaviors that trigger quarrels and negative emotions often originate from how players judge the outcome and from the win ratio during the competition. In view of this, our research invited a group of subjects to an experiment in which players' emotions were monitored via electromyography (EMG) and electrodermal activity (EDA) while they played party games with others. The negative and positive feelings associated with winning or losing were recorded from the beginning of the game to the end, and physiological and emotional reactions were recorded in each part of the game. The game was designed to elicit interaction while players pursue the quests of a party game. The experiment compares the emotional changes, as reflected in physiological values, produced by victory and defeat between the "player against friend" and "player against stranger" conditions, examining how the balance between winners and losers underlies good game interaction and exploring the positive and negative emotions caused by the result of a party game.
The results show that "player against friend" elicits significantly more negative emotion, while "player against stranger" elicits significantly more positive emotion. Accordingly, the player's experience is affected by the winning rate and the match format in party games. We suggest that developers balance their games using our experimental method to give players a better experience.
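The kind of outcome-by-outcome comparison described above can be sketched as follows. This is a hypothetical illustration, not the authors' analysis pipeline: the EDA trace, the segment boundaries, and the condition labels are all invented for demonstration.

```python
# Hypothetical sketch: comparing skin-conductance (EDA) amplitude between
# labelled game segments (outcome x opponent type). All values are invented.
from statistics import mean

def mean_response(samples, segments):
    """Average EDA amplitude inside each labelled segment (start, end, label)."""
    by_label = {}
    for start, end, label in segments:
        by_label.setdefault(label, []).extend(samples[start:end])
    return {label: mean(vals) for label, vals in by_label.items()}

# Toy EDA trace (microsiemens), sampled during one play session.
eda = [0.2, 0.3, 0.9, 1.1, 0.4, 0.3, 1.4, 1.6, 0.5, 0.4]
segments = [(2, 4, "win_vs_stranger"), (6, 8, "loss_vs_friend")]

responses = mean_response(eda, segments)
# A larger mean amplitude indicates stronger arousal for that outcome.
```

In a real analysis, the per-condition averages would then be compared statistically across subjects to test for the positive/negative emotion differences the study reports.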

Keywords: party games, biofeedback, emotional responses, user experience, game design

Procedia PDF Downloads 145
425 Method for Auto-Calibrating Projector and Color-Depth Systems for Spatial Augmented Reality Applications

Authors: R. Estrada, A. Henriquez, R. Becerra, C. Laguna

Abstract:

Spatial Augmented Reality is a variation of Augmented Reality in which a Head-Mounted Display is not required. This variation of Augmented Reality is useful in cases where the Head-Mounted Display itself is a limitation. To achieve this, Spatial Augmented Reality techniques substitute the technological elements of Augmented Reality; the virtual world is projected onto a physical surface. To create an interactive spatial augmented experience, the application must be aware of the spatial relations that exist between its core elements, referred to here as the projection system and the input system; the process of achieving this spatial awareness is called system calibration. The Spatial Augmented Reality system is considered calibrated if the projected virtual world's scale matches the real-world scale, meaning that a virtual object maintains its perceived dimensions when projected into the real world. Likewise, the input system is calibrated if the application knows the position of a point in the projection plane relative to the RGB-depth sensor's origin. Any kind of projection technology can be used (light-based projectors, close-range projectors, or screens) as long as it complies with the defined constraints; the method was tested on different configurations. The proposed procedure does not rely on a physical marker, minimizing human intervention in the process. The tests were made using a Kinect V2 as the input sensor and several projection devices. To test the method, the defined constraints were applied to a variety of physical configurations; once the method was executed, several variables were obtained to measure its performance. The results demonstrate that the method can handle different arrangements, giving the user a wide range of setup possibilities.
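The core of such a calibration is recovering a mapping between points on the projection plane and the sensor's coordinate frame. As a minimal sketch (not the authors' procedure), a planar homography can be estimated from point correspondences with the standard direct linear transform (DLT); the correspondence values below are invented.

```python
# Minimal DLT sketch: estimate the homography mapping sensor-plane points
# to projector pixels from N >= 4 correspondences. Values are illustrative.
import numpy as np

def fit_homography(src, dst):
    """src, dst: (N, 2) arrays of matching points, N >= 4, non-collinear."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two rows of the DLT system A h = 0.
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    H = vt[-1].reshape(3, 3)       # null-space vector = homography entries
    return H / H[2, 2]             # normalize scale

def apply_h(H, pt):
    """Map a 2-D point through the homography (projective division)."""
    x, y, w = H @ [pt[0], pt[1], 1.0]
    return x / w, y / w

# Four invented sensor-plane / projector-pixel correspondences.
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dst = np.array([[1.0, 2.0], [3.0, 2.0], [1.0, 4.0], [3.0, 4.0]])
H = fit_homography(src, dst)
```

A full auto-calibration would obtain such correspondences automatically, e.g. by projecting structured patterns and detecting them in the depth sensor's color stream, rather than from a physical marker.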

Keywords: color depth sensor, human computer interface, interactive surface, spatial augmented reality

Procedia PDF Downloads 100
424 A Prospective Study of a Clinically Significant Anatomical Change in Head and Neck Intensity-Modulated Radiation Therapy Using Transit Electronic Portal Imaging Device Images

Authors: Wilai Masanga, Chirapha Tannanonta, Sangutid Thongsawad, Sasikarn Chamchod, Todsaporn Fuangrod

Abstract:

The major factors in radiotherapy for head and neck (HN) cancers include the patient's anatomical changes and tumour shrinkage. These changes can significantly alter the planned dose distribution and degrade the treatment plan. Comparing measured transit EPID images to predicted EPID images using gamma analysis has been clinically implemented to verify dose accuracy as part of an adaptive radiotherapy protocol. However, a global gamma analysis is not sensitive to some critical organ changes, because the entire treatment field is compared. The objective of this feasibility study is to evaluate the dosimetric response to patient anatomical changes during the treatment course in HN intensity-modulated radiation therapy (IMRT) using a novel comparison method, organ-of-interest gamma analysis, which is more sensitive to changes in specific organs. Five replanned HN IMRT patients, whose tumour shrinkage and weight loss critically affected parotid size, were randomly selected, and their transit dosimetry was evaluated. A comprehensive physics-based model was used to generate a series of predicted transit EPID images for each gantry angle from the original computed tomography (CT) and replan CT datasets. The patient structures, including the left and right parotids, spinal cord, and planning target volume (PTV56), were projected to the EPID level. The agreement between the transit images generated from the original CT and the replan CT was quantified using gamma analysis with 3%, 3 mm criteria, with the gamma pass-rate calculated within each projected structure. The gamma pass-rates in the right parotid and PTV56 between the predicted transit images of the original CT and the replan CT were 42.8% (±17.2%) and 54.7% (±21.5%), while the gamma pass-rate for the other projected organs was greater than 80%.
Additionally, the results of the organ-of-interest gamma analysis were compared with 3-dimensional cone-beam computed tomography (3D-CBCT) and the radiation oncologists' rationale for replanning. This showed that registration of 3D-CBCT to the original CT alone does not reveal the dosimetric impact of anatomical changes, whereas transit EPID images with organ-of-interest gamma analysis provide additional information for assessing treatment plan suitability.
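A gamma pass-rate restricted to a projected organ can be sketched as below. This is a simplified, one-dimensional toy version (the clinical analysis operates on 2-D EPID images per gantry angle, and production gamma implementations interpolate and optimize the search); it only illustrates the 3%/3 mm computation inside a mask.

```python
# Simplified organ-of-interest gamma pass-rate on a 1-D dose profile with
# 3% / 3 mm criteria and global dose normalisation. Illustrative only.
import numpy as np

def gamma_pass_rate(ref, meas, mask, spacing_mm=1.0, dd=0.03, dta_mm=3.0):
    """Fraction of masked points whose gamma index is <= 1."""
    norm = dd * ref.max()                    # global 3% dose criterion
    xs = np.arange(len(ref)) * spacing_mm    # point positions in mm
    idx = np.flatnonzero(mask)               # restrict to the organ mask
    passed = 0
    for i in idx:
        # gamma_i = min over j of sqrt(dist^2/dta^2 + dose_diff^2/norm^2)
        dist2 = ((xs - xs[i]) / dta_mm) ** 2
        dose2 = ((meas - ref[i]) / norm) ** 2
        if np.min(dist2 + dose2) <= 1.0:
            passed += 1
    return passed / len(idx)
```

Computing this separately per projected structure (parotids, spinal cord, PTV56) rather than over the whole field is what makes the organ-of-interest variant sensitive to localized anatomical change.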

Keywords: re-plan, anatomical change, transit electronic portal imaging device, EPID, head and neck

Procedia PDF Downloads 198
423 Strategic Interventions to Combat Socio-economic Impacts of Drought in Thar - A Case Study of Nagarparkar

Authors: Anila Hayat

Abstract:

Pakistan is a developing country that contributes little to global emissions yet faces highly vulnerable environmental conditions; it was ranked eighth among the countries most affected by climate change on the Climate Risk Index for 1992-2011. Pakistan faces severe water shortages and flooding as a result of changing rainfall patterns, particularly in its least developed areas such as Tharparkar. Nagarparkar, once an attractive tourist spot in Tharparkar because of its tropical desert climate, has faced severe drought conditions for the last few decades. This study investigates the present socio-economic situation of local communities, the major impacts of droughts and their underlying causes, and the mitigation strategies currently adopted by local communities. The study uses both secondary (quantitative) and primary (qualitative) methods to understand the impacts of drought on the socio-economic life of local communities and to explore its causes. The relevant data were collected through household surveys using structured questionnaires, focus groups, and in-depth interviews with key personnel from local and international NGOs to explore the sensitivity of impacts and adaptation to droughts in the study area. The investigation is limited to four rural communities in the union council Pilu of Nagarparkar district: the Bheel, BhojaBhoon, Mohd Rahan Ji Dhani, and Yaqub Ji Dhani villages. The results indicate that drought has caused significant economic and social hardship for the local communities, as more than 60% of the population depends on rainfall, which has been disturbed by irregular rainfall patterns. The decline in crop yields has forced local people to migrate to nearby areas in search of livelihood opportunities.
Communities have not undertaken appropriate adaptive actions to counteract the adverse effects of drought; they depend entirely on government support and external aid for survival. Respondents also reported that poverty is a major cause of their vulnerability to drought: population growth, limited livelihood opportunities, the caste system, lack of interest from the government sector, and low awareness shape their vulnerability to drought and other social issues. Based on the findings of this study, it is recommended that local authorities raise awareness about drought hazards and improve the resilience of communities against drought. It is further suggested to develop, introduce, and implement water-harvesting practices at the community level and to promote drought-resistant crops.

Keywords: migration, vulnerability, awareness, drought

Procedia PDF Downloads 114