Search results for: motion data acquisition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26753

26273 A Simple and Empirical Refraction Correction Method for UAV-Based Shallow-Water Photogrammetry

Authors: I GD Yudha Partama, A. Kanno, Y. Akamatsu, R. Inui, M. Goto, M. Sekine

Abstract:

The aerial photogrammetry of shallow water bottoms has the potential to be an efficient high-resolution survey technique for shallow water topography, thanks to the advent of convenient UAVs and automatic image processing techniques (Structure-from-Motion (SfM) and Multi-View Stereo (MVS)). However, it suffers from systematic overestimation of the bottom elevation due to light refraction at the air-water interface. In this study, we present an empirical method to correct for the effect of refraction after the usual SfM-MVS processing, using common software. The presented method utilizes the empirical relation between the measured true depth and the estimated apparent depth to generate an empirical correction factor. This correction factor is then used to convert the apparent water depth into a refraction-corrected (real-scale) water depth. To examine its effectiveness, we applied the method to two river sites and compared the RMS errors in the corrected bottom elevations with those obtained by three existing methods. The results show that the presented method is more effective than two of the existing methods: the method that applies no correction factor and the method that uses the refractive index of water (1.34) as the correction factor. In comparison with the remaining existing method, which adds an offset term after calculating the correction factor, the presented method performs better at Site 2 and worse at Site 1. However, we found this linear regression method to be unstable when the training data used for calibration are limited. It also suffers from a large negative bias in the correction factor when the estimated apparent water depth is affected by noise, according to our numerical experiment. Overall, the accuracy of a refraction correction method depends on various factors such as the site, the image acquisition, and the GPS measurement conditions. The most effective method can be selected by statistical model selection (e.g., leave-one-out cross-validation).
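As a rough illustration of the empirical approach described above (not the authors' exact procedure), the correction factor can be obtained by regressing surveyed true depths against SfM-MVS apparent depths at calibration points and then rescaling the apparent depth map; the variable names, the no-intercept fit, and the numbers below are illustrative assumptions.

```python
import numpy as np

# Calibration points: apparent depths from SfM-MVS vs. surveyed true depths (illustrative values)
apparent = np.array([0.21, 0.35, 0.48, 0.62, 0.80])   # m, estimated by photogrammetry
true     = np.array([0.28, 0.47, 0.65, 0.83, 1.07])   # m, measured in the field

# Empirical correction factor from a least-squares fit through the origin: true ~ k * apparent
k = np.sum(apparent * true) / np.sum(apparent ** 2)

def correct_depth(apparent_depth):
    """Convert apparent (refraction-biased) depths to refraction-corrected depths."""
    return k * np.asarray(apparent_depth)

print(f"empirical correction factor k = {k:.2f}")   # ~1.34 would coincide with the refractive-index method
print(correct_depth([0.30, 0.55]))
```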

Keywords: bottom elevation, MVS, river, SfM

Procedia PDF Downloads 300
26272 Video Text Information Detection and Localization in Lecture Videos Using Moments

Authors: Belkacem Soundes, Guezouli Larbi

Abstract:

This paper presents a robust and accurate method for text detection and localization in lecture videos. Frame regions are classified into text or background based on visual feature analysis. However, lecture videos show significant degradation, related mainly to acquisition conditions, camera motion and environmental changes, resulting in low-quality videos and reducing the efficiency of feature extraction and description. Moreover, traditional text detection methods cannot be applied directly to lecture videos. Therefore, robust feature extraction methods dedicated to this specific video genre are required for robust and accurate text detection and extraction. The method consists of a three-step process: slide region detection and segmentation, feature extraction, and non-text filtering. Moment functions are used for robust and effective feature extraction. Two distinct types of moments are used, orthogonal and non-orthogonal: Zernike and pseudo-Zernike moments are used as orthogonal moments, whereas Hu moments are used as non-orthogonal ones. Their expressivity and description efficiency are presented and discussed. The proposed approach shows that, in general, orthogonal moments achieve higher accuracy than non-orthogonal ones, and pseudo-Zernike moments are more effective than Zernike moments with better computation time.
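For the non-orthogonal descriptors mentioned above, a minimal sketch of extracting Hu moment invariants from a candidate frame region with OpenCV might look like the following; the thresholding step and the synthetic test region are assumptions for illustration, not the paper's pipeline.

```python
import cv2
import numpy as np

def hu_features(region_gray):
    """Return the 7 Hu moment invariants (log-scaled) for a grayscale region."""
    _, binary = cv2.threshold(region_gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    m = cv2.moments(binary)
    hu = cv2.HuMoments(m).flatten()
    # Log scaling is common because the raw invariants span many orders of magnitude
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

# Hypothetical candidate region (in practice, a cropped slide region from a lecture frame)
region = np.zeros((64, 128), dtype=np.uint8)
cv2.putText(region, "Text", (5, 45), cv2.FONT_HERSHEY_SIMPLEX, 1.5, 255, 2)
print(hu_features(region))
```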

Keywords: text detection, text localization, lecture videos, pseudo zernike moments

Procedia PDF Downloads 153
26271 Microarray Data Visualization and Preprocessing Using R and Bioconductor

Authors: Ruchi Yadav, Shivani Pandey, Prachi Srivastava

Abstract:

Microarrays provide a rich source of data on the molecular workings of cells. Each microarray reports on the abundance of tens of thousands of mRNAs. Virtually every human disease is being studied using microarrays in the hope of finding the molecular mechanisms of disease. Bioinformatics analysis plays an important part in processing the information embedded in large-scale expression profiling studies and in laying the foundation for biological interpretation. A basic, yet challenging, task in the analysis of microarray gene expression data is the identification of changes in gene expression that are associated with particular biological conditions. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. One of the most popular platforms for microarray analysis is Bioconductor, an open-source and open-development software project based on the R programming language. This paper describes specific procedures for conducting quality assessment, visualization and preprocessing of Affymetrix GeneChip data, details the different Bioconductor packages used to analyze Affymetrix microarray data, and describes the analysis and outcome of each plot.

Keywords: microarray analysis, R language, affymetrix visualization, bioconductor

Procedia PDF Downloads 480
26270 Internet-Of-Things and Ergonomics, Increasing Productivity and Reducing Waste: A Case Study

Authors: V. Jaime Contreras, S. Iliana Nunez, S. Mario Sanchez

Abstract:

Inside a manufacturing facility, we can find innumerable automatic and manual operations, all of which are relevant to the production process. Some of these processes add more value to the products than others. Manual operations tend to add value to the product since they are found in the final assembly area or the final operations of the process, areas where a mistake or accident can increase the cost of waste exponentially. To reduce or mitigate these costly mistakes, one approach is to rely on automation to remove the operator from the production line, which requires a hefty investment and the development of specialized machinery. In our approach, the operator is at the center of the solution, supported by sufficient and adequate instrumentation, real-time reporting and ergonomics. Efficiency and reduced cycle time can be achieved through the integration of Internet-of-Things (IoT) ready technologies into assembly operations to enhance the ergonomics of the workstations. Augmented reality visual aids, RFID-triggered personalized workstation dimensions, and real-time data transfer and reporting can help achieve these goals. In this case study, a standard work cell is used for real-life data acquisition, and simulation software extends the data points beyond the test cycle. Three comparison scenarios are run in the work cell; each scenario introduces one dimension of the ergonomics to measure its impact independently. Furthermore, the separate tests determine the limitations of the technology and provide a reference for operating costs and the investment required. With the ability to monitor costs, productivity, cycle time and scrap/waste in real time, the ROI (return on investment) can be determined at the different levels of integration. This case study helps to show that ergonomics in assembly lines can make a significant impact when IoT technologies are introduced, and that ergonomics can effectively reduce waste and increase productivity with minimal investment compared with setting up a custom machine.
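As a rough sketch of how real-time cycle-time and scrap data could feed a simple annual ROI estimate at a given level of integration; the formula and all figures below are illustrative assumptions, not the case-study data.

```python
def simple_roi(baseline_cycle_s, improved_cycle_s, scrap_saving_per_year,
               value_per_second_saved, units_per_year, investment):
    """Very simplified annual ROI estimate from cycle-time and scrap improvements."""
    time_saving = (baseline_cycle_s - improved_cycle_s) * units_per_year * value_per_second_saved
    annual_benefit = time_saving + scrap_saving_per_year
    return (annual_benefit - investment) / investment

# Hypothetical numbers for one IoT-instrumented work cell
print(f"ROI = {simple_roi(95, 88, 12000, 0.02, 150000, 40000):.0%}")
```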

Keywords: augmented reality visual aids, ergonomics, real-time data acquisition and reporting, RFID triggered workstation dimensions

Procedia PDF Downloads 215
26269 Photovoltaic Modules Fault Diagnosis Using Low-Cost Integrated Sensors

Authors: Marjila Burhanzoi, Kenta Onohara, Tomoaki Ikegami

Abstract:

Faults in photovoltaic (PV) modules should be detected as early and as completely as possible. Conventional fault detection methods such as electrical characterization, visual inspection, infrared (IR) imaging, ultraviolet fluorescence and electroluminescence (EL) imaging are used for this purpose, but they either fail to detect the location or category of the fault, or they require expensive equipment and are not convenient for on-site application. Hence, these methods are not convenient for monitoring small-scale PV systems, and low-cost, efficient inspection techniques that can be applied on site are indispensable for PV modules. In this study, in order to establish an efficient inspection technique, the correlation between faults and the magnetic flux density on the surface of crystalline PV modules is investigated. The magnetic flux on the surface of normal and faulted PV modules is measured under short-circuit and illuminated conditions using two different sensor devices. One device is made of small integrated sensors, namely a 9-axis motion tracking sensor with an embedded 3-axis electronic compass, an IR temperature sensor, an optical laser position sensor and a microcontroller. This device measures the X, Y and Z components of the magnetic flux density (Bx, By and Bz) a few mm above the surface of a PV module and outputs the data as line graphs in a LabVIEW program. The second device is made of a laser optical sensor and two magnetic line sensor modules consisting of 16 magnetic sensors each. This device scans the magnetic field on the surface of the PV module and outputs the data as a 3D surface plot of the magnetic flux intensity in a LabVIEW program. A PC equipped with LabVIEW software is used for data acquisition and analysis for both devices. To show the effectiveness of this method, the measured results are compared to those of a normal reference module and to their EL images. The experiments confirmed that the magnetic field in the faulted areas has a distinct profile that can be clearly identified in the measured plots. The measurement results showed a perfect correlation with the EL images, and using the position sensors the exact location of faults was identified. The method was applied to different modules, and various faults were detected with it. The proposed method allows on-site measurement and real-time diagnosis. Since simple sensors are used to make the device, it is low cost and convenient for use by owners of small-scale or residential PV systems.
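A minimal sketch of the kind of comparison described (flagging grid locations where a scanned flux-density map deviates from a reference module's profile); the file names, array layout, and the 3-sigma threshold are assumptions, not the device's actual LabVIEW output.

```python
import numpy as np

# Bz scans a few mm above the module surface, on the same spatial grid (rows x cols), in microtesla
bz_test = np.load("bz_test_module.npy")        # hypothetical measured module
bz_ref  = np.load("bz_reference_module.npy")   # hypothetical normal reference module

deviation = bz_test - bz_ref
threshold = 3 * deviation.std()                # simple statistical threshold (assumption)
fault_mask = np.abs(deviation) > threshold

for r, c in zip(*np.nonzero(fault_mask)):
    print(f"possible fault near grid cell ({r}, {c}), deviation {deviation[r, c]:.1f} uT")
```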

Keywords: fault diagnosis, fault location, integrated sensors, PV modules

Procedia PDF Downloads 224
26268 Understanding Knowledge Sharing and Its Effect on Creative Performance from a Dyadic Relationship Perspective

Authors: Fan Wei, Tang Yipeng

Abstract:

Knowledge sharing is of great value to organizational performance and innovation ability. However, mainstream research has focused largely on the impact of team-level knowledge sharing on individuals and teams; there is a lack of empirical studies on how employees interact in the exchange of knowledge and how this affects employees' own creative performance. Based on communication accommodation theory and social exchange theory, this article explores the construction of an employee knowledge interaction mechanism under the moderation of social status and introduces the leader's creativity expectation as a moderating variable to explore its cross-level moderating effect on employee knowledge sharing and creative performance. An empirical test was conducted on 36 teaching and research teams in two primary schools, and the results showed that: (1) employees' explicit/tacit knowledge is positively correlated with the acquisition of explicit/tacit knowledge; (2) colleagues' evaluations of employees' social status play a moderating role between employees' explicit/tacit knowledge and the acquisition of explicit/tacit knowledge; (3) leaders' creativity expectations positively moderate the relationship between employees' explicit knowledge acquisition and creative performance. This research helps to open the "black box" of the interpersonal interaction mechanism of knowledge sharing and also provides an important theoretical basis and practical guidance for organizational managers to effectively stimulate employee knowledge sharing and creative performance.

Keywords: knowledge sharing, knowledge interaction, social status, leadership creativity expectations, creative performance

Procedia PDF Downloads 121
26267 Content Analysis of Depictions of Terrorism in U.S. Major Motion Pictures: A Social Constructionist Perspective

Authors: Raleigh Blasdell, Amanda M. Sharp Parker, Lauren Waldrop, Brigid Toney

Abstract:

It has been demonstrated that fictional media sources have persuasive effects on public beliefs; this study contributes to the social constructionist literature by conducting a content analysis of U.S. major motion pictures involving terrorism. Using the Unified Film Population Sampling Methodology, the top-grossing films were identified to examine the frequency and context of several constructs of terrorism, including terrorist demographics, type of terrorism, country of origin, organizational affiliation, crime typology, and victim demographics. Comparisons of these constructs, as depicted in the films, were then made with the extant academic literature on terrorism. The data provide notable information regarding the representation of terrorism by the film industry, as well as the discrepancies between the scholarly literature and depictions in popular films. The results indicate vast differences between fiction and reality, emphasizing a 'Middle Eastern Islamic male' terrorist stereotype. Using the theoretical foundation of social constructionism, the findings provide insight into how inaccurate depictions in film can influence society's beliefs about terrorism and terrorists, which can subsequently translate into public support for legislation and policies that are often fueled by misinformation.

Keywords: film, media, social constructionism, terrorism

Procedia PDF Downloads 170
26266 Integrating of Multi-Criteria Decision Making and Spatial Data Warehouse in Geographic Information System

Authors: Zohra Mekranfar, Ahmed Saidi, Abdellah Mebrek

Abstract:

This work aims to develop multi-criteria decision making (MCDM) and spatial data warehouse (SDW) methods, which will be integrated into a GIS according to a 'GIS dominant' approach, with the GIS operating tools used to operate the SDW. MCDM methods can provide many solutions to a set of problems with various and multiple criteria. When the problem is so complex that it integrates a spatial dimension, it makes sense to combine the MCDM process with other approaches such as data mining and ascending analyses. In this paper, we present an experiment showing a geo-decisional methodology for SDW construction. On-line analytical processing (OLAP) technology, which combines basic multidimensional analysis with the concepts of data mining, provides powerful tools to highlight inductions and information that are not obvious with traditional tools. However, these OLAP tools become more complex in the presence of the spatial dimension. The integration of OLAP with a GIS is the future geographic and spatial information solution. GIS offers advanced functions for the acquisition, storage, analysis, and display of geographic information. However, their effectiveness for complex spatial analysis is questionable due to their determinism and their decisional rigor. A prerequisite for the implementation of any analysis or exploration of spatial data is the construction and structuring of a spatial data warehouse (SDW). This SDW must be easily usable by the GIS and by the tools offered by an OLAP system.
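As one minimal example of an MCDM step that such a GIS/SDW integration could expose (a simple weighted-sum ranking); the criteria, weights and scores are assumptions, and real workflows may use more elaborate methods such as AHP or ELECTRE.

```python
import numpy as np

# Rows: candidate zones (e.g., retrieved from the SDW); columns: normalized criterion scores in [0, 1]
scores = np.array([
    [0.8, 0.4, 0.9],   # zone A
    [0.6, 0.7, 0.5],   # zone B
    [0.9, 0.2, 0.6],   # zone C
])
weights = np.array([0.5, 0.3, 0.2])   # decision-maker preferences, assumed to sum to 1

ranking = scores @ weights             # weighted-sum aggregation
print("weighted scores:", ranking, "-> best zone index:", ranking.argmax())
```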

Keywords: data warehouse, GIS, MCDM, SOLAP

Procedia PDF Downloads 178
26265 PET Image Resolution Enhancement

Authors: Krzysztof Malczewski

Abstract:

PET is a widely applied scanning procedure in medical-imaging-based research. It delivers measurements of functioning in distinct areas of the human brain while the patient is comfortable, conscious and alert. This article presents a new compressed-sensing-based super-resolution algorithm for improving the image resolution of clinical Positron Emission Tomography (PET) scanners. Motion artifacts are a well-known side effect in PET studies: PET images are acquired over a limited period of time, and since patients cannot hold their breath during data gathering, spatial blurring and motion artifacts are the usual result and may lead to wrong diagnoses. It is shown that the presented approach improves PET spatial resolution in cases where Compressed Sensing (CS) sequences are used. Compressed Sensing aims at reconstructing signals and images from significantly fewer measurements than were traditionally thought necessary. The application of CS to PET has the potential for significant scan time reductions, with visible benefits for patients and health care economics. In this study, the goal is to combine a super-resolution image enhancement algorithm with the CS framework to achieve high-resolution PET output. Both methods emphasize maximizing image sparsity in a known sparse transform domain while minimizing the data-fidelity error.
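A generic sketch of the kind of sparse recovery underlying CS (iterative soft-thresholding for min ||y - Ax||^2 + lambda*||x||_1); this is a textbook illustration, not the authors' PET super-resolution algorithm, and the measurement operator, lambda, and step size are assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam=0.1, n_iter=200):
    """Iterative soft-thresholding for the l1-regularized least-squares problem."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of the data-fidelity term
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 256))          # undersampled measurement operator (assumption)
x_true = np.zeros(256); x_true[[10, 80, 200]] = [1.0, -0.7, 0.5]   # sparse test signal
y = A @ x_true
x_hat = ista(A, y)
print("largest recovered coefficients at indices:", np.sort(np.argsort(np.abs(x_hat))[-3:]))
```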

Keywords: PET, super-resolution, image reconstruction, pattern recognition

Procedia PDF Downloads 373
26264 Quantization of Damped Systems Based on the Doubling of Degrees of Freedom

Authors: Khaled I. Nawafleh

Abstract:

In this paper, we provide a canonical approach for studying dissipative oscillators based on the doubling of degrees of freedom. Expressions for the Lagrangians of the elementary modes of the system are given, which lead to the familiar classical equations of motion for the dissipative oscillator; the equation of motion for one variable is the time-reversed motion of the second variable. We discuss in detail the extended Bateman Lagrangian, specifically for a time-dependent dual extended damped oscillator. A Hamilton-Jacobi analysis showing the equivalence with the Lagrangian approach is also presented. For that purpose, the technique of separation of variables is applied, and the quantization procedure is carried out.
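For reference, a standard form of the Bateman dual Lagrangian for the damped oscillator and the resulting pair of equations of motion is given below (written for the time-independent case; the time-dependent extension discussed in the paper generalizes these coefficients):

```latex
L(x, y, \dot{x}, \dot{y}) \;=\; m\,\dot{x}\dot{y}
  + \frac{\gamma}{2}\left(x\dot{y} - \dot{x}y\right) - k\,xy,
\qquad
m\ddot{x} + \gamma\dot{x} + kx = 0,
\qquad
m\ddot{y} - \gamma\dot{y} + ky = 0 .
```

The first equation describes the damped (dissipative) mode, while the second, time-reversed equation describes the amplified mirror mode that absorbs the dissipated energy.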

Keywords: doubling of degrees of freedom, dissipated harmonic oscillator, Hamilton-Jacobi, time-dependent lagrangians, quantization

Procedia PDF Downloads 69
26263 Kinematic Modelling and Task-Based Synthesis of a Passive Architecture for an Upper Limb Rehabilitation Exoskeleton

Authors: Sakshi Gupta, Anupam Agrawal, Ekta Singla

Abstract:

An exoskeleton design for rehabilitation purposes encounters many challenges, including ergonomically acceptable wearing technology, an architectural design compatible with human motion, actuation type, human-robot interaction, etc. In this paper, a passive architecture for an upper limb exoskeleton is proposed for assisting in rehabilitation tasks. Kinematic modelling is detailed for task-based kinematic synthesis of the wearable exoskeleton for self-feeding tasks. The exoskeleton architecture possesses extension and torsional springs which are able to store and redistribute energy over the human arm joints. The elastic characteristics of the springs have been optimized to minimize the mechanical work of the human arm joints. A hybrid combination of a 4-bar parallelogram linkage and a serial linkage was chosen: the 4-bar parallelogram linkage with an extension spring acts as a rigid structure providing the rotational degree of freedom (DOF) required for lowering and raising the arm, while the single linkage with a torsional spring provides the rotational DOF required for elbow movement. The focus of the paper is the kinematic modelling, analysis and task-based synthesis framework for the proposed architecture, keeping in consideration the essential tasks of self-feeding and self-exercising during rehabilitation of a partially healthy person. Rehabilitation targets primary functional movements (activities of daily living, ADL), the routine activities people tend to every day such as cleaning, dressing and feeding; we focus on the feeding process in order to make people independent with respect to feeding tasks. The tasks are aimed at post-surgery patients under rehabilitation with less than 40% weakness. The challenge addressed in this work is to emulate the natural movement of the human arm. Human motion data are extracted through motion sensors for the targeted tasks of feeding and specific exercises. A task-based synthesis procedure framework is discussed for the proposed architecture. The results include the simulation of the architectural concept for tracking human-arm movements while displaying the kinematic and static study parameters for a standard human weight. D-H parameters are used for kinematic modelling of the hybrid mechanism, and the model is used while performing task-based optimal synthesis utilizing an evolutionary algorithm.
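A minimal sketch of forward kinematics from D-H parameters of the kind used in such modelling; the two-joint table below contains placeholder values, not the exoskeleton's actual parameters.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_table):
    """Chain the joint transforms; returns the end-effector pose in the base frame."""
    T = np.eye(4)
    for theta, d, a, alpha in dh_table:
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Placeholder 2-DOF planar chain (shoulder elevation + elbow), link lengths in metres
dh_table = [(np.deg2rad(30), 0.0, 0.30, 0.0),
            (np.deg2rad(45), 0.0, 0.25, 0.0)]
print(forward_kinematics(dh_table)[:3, 3])   # end-effector position
```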

Keywords: passive mechanism, task-based synthesis, emulating human-motion, exoskeleton

Procedia PDF Downloads 138
26262 Augmented ADRC for Trajectory Tracking of a Novel Hydraulic Spherical Motion Mechanism

Authors: Bin Bian, Liang Wang

Abstract:

A hydraulic spherical motion mechanism (HSMM) is proposed. Unlike traditional systems using serial or parallel mechanisms for multi-DOF rotations, the HSMM is capable of implementing continuous 2-DOF rotational motions in a single joint without intermediate transmission mechanisms. It has the advantages of compact structure, low inertia and high stiffness. However, as the HSMM is a nonlinear and multivariable system, it is very complicated to realize accurate control. Therefore, an augmented active disturbance rejection controller (ADRC) is proposed in this paper. Compared with the traditional PD control method, three compensation terms, i.e., a dynamics compensation term, a disturbance compensation term and a nonlinear error elimination term, are added to the proposed algorithm to improve the control performance. The ADRC algorithm aims at offsetting the effects of external disturbances and realizing accurate control. Euler angles are applied to describe the orientation of the rotor. Lagrange equations are utilized to establish the dynamic model of the HSMM. The stability of this algorithm is validated with a detailed derivation. The simulation model is formulated in Matlab/Simulink. The results show that the proposed control algorithm achieves better trajectory tracking in the presence of uncertainties.
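For a single control channel, a generic discrete-time linear ADRC loop (extended state observer plus state-error feedback) looks roughly like the sketch below; this illustrates the general ADRC idea only, not the augmented controller or HSMM model of the paper, and all gains and the toy plant are assumptions.

```python
import numpy as np

def run_adrc(plant_step, r, dt=0.001, steps=5000, b0=1.0, wo=60.0, kp=100.0, kd=20.0):
    """Linear ADRC for a second-order channel: the ESO estimates y, y_dot and the total disturbance."""
    beta1, beta2, beta3 = 3 * wo, 3 * wo ** 2, wo ** 3   # observer gains at bandwidth wo
    z = np.zeros(3)                                      # [y_hat, ydot_hat, disturbance_hat]
    u = 0.0
    history = []
    for _ in range(steps):
        y = plant_step(u, dt)                            # advance the (black-box) plant one step
        e = y - z[0]                                     # observer output error
        z_dot = np.array([z[1] + beta1 * e,
                          z[2] + beta2 * e + b0 * u,
                          beta3 * e])
        z = z + dt * z_dot                               # Euler update of the ESO
        u = (kp * (r - z[0]) - kd * z[1] - z[2]) / b0    # disturbance-compensated control law
        history.append(y)
    return np.array(history)

# Example plant: a double integrator with an unknown constant disturbance (for illustration only)
state = {"x": 0.0, "v": 0.0}
def plant_step(u, dt, d=-2.0):
    state["v"] += dt * (u + d)
    state["x"] += dt * state["v"]
    return state["x"]

print("final output:", run_adrc(plant_step, r=1.0)[-1])
```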

Keywords: hydraulic spherical motion mechanism, dynamic model, active disturbance rejection control, trajectory tracking

Procedia PDF Downloads 106
26261 Neural Network Motion Control of VTAV by NARMA-L2 Controller for Enhanced Situational Awareness

Authors: Igor Astrov, Natalya Berezovski

Abstract:

This paper focuses on a critical component of situational awareness (SA): the control of autonomous vertical flight for a vectored thrust aerial vehicle (VTAV). Within the SA strategy, we propose a neural network motion control procedure to address the dynamics variation and the differing performance requirements of the flight trajectory for a VTAV. This control strategy, using a NARMA-L2 neurocontroller for the chosen VTAV model, has been verified by simulation of take-off and forward maneuvers using the software package Simulink and demonstrated good performance for fast stabilization of the motors; consequently, fast SA with economy in energy can be asserted during search-and-rescue operations.

Keywords: NARMA-L2 neurocontroller, situational awareness, vectored thrust aerial vehicle, aviation

Procedia PDF Downloads 421
26260 Hydrodynamic Performance of a Moored Barge in Irregular Wave

Authors: Srinivasan Chandrasekaran, Shihas A. Khader

Abstract:

The motion response of floating structures is of great concern in marine engineering. Nonlinearity is an inherent property of any floating body subjected to irregular waves. These floating structures are continuously subjected to environmental loading from waves, current, wind, etc., which can result in undesirable vessel motions that may challenge operability. For a floating body to remain in position, it should be able to induce a restoring force when displaced; mooring is provided to supply this restoring force. This paper discusses the hydrodynamic performance and motion characteristics of an 8-point spread mooring system applied to a pipe-laying barge operating in the West African sea. The modelling of the barge is done using the computer-aided design (CAD) software RHINOCEROS. Irregular waves are generated using a suitable wave spectrum. Both frequency-domain and time-domain analyses are performed. Numerical simulations based on potential theory are carried out to find the responses and hydrodynamic performance of the barge in both the free-floating and moored conditions. Initially, a potential-flow frequency-domain analysis is done to obtain the Response Amplitude Operators (RAOs), which give an idea of the structural motion in the free-floating state; RAOs for different wave headings are analyzed. In the following step, a time-domain analysis is carried out to obtain the responses of the structure in the moored condition. In this study, only wave-induced motions are taken into consideration; wind and current loads are ruled out and shall be included in future studies. For the current study, a 5000-second simulation is used. The results represent the wave-induced motion responses and mooring line tensions, and identify the critical mooring lines.
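A minimal sketch of how an irregular wave elevation record can be synthesized from a spectrum by superposing harmonic components with random phases (here a Pierson-Moskowitz/Bretschneider-type spectrum); the Hs, Tp, frequency band and discretization are assumptions, not the site data used in the paper.

```python
import numpy as np

def pm_spectrum(omega, hs, tp):
    """Pierson-Moskowitz (Bretschneider) spectral density S(omega) in m^2 s."""
    wp = 2 * np.pi / tp
    return (5.0 / 16.0) * hs**2 * wp**4 / omega**5 * np.exp(-1.25 * (wp / omega) ** 4)

def irregular_wave(hs=2.5, tp=9.0, duration=5000.0, dt=0.5, n_comp=400, seed=0):
    rng = np.random.default_rng(seed)
    omega = np.linspace(0.2, 2.5, n_comp)            # rad/s, frequency band (assumption)
    d_omega = omega[1] - omega[0]
    amp = np.sqrt(2.0 * pm_spectrum(omega, hs, tp) * d_omega)
    phase = rng.uniform(0.0, 2.0 * np.pi, n_comp)
    t = np.arange(0.0, duration, dt)
    eta = (amp[None, :] * np.cos(omega[None, :] * t[:, None] + phase[None, :])).sum(axis=1)
    return t, eta

t, eta = irregular_wave()
print(f"significant wave height check: {4 * eta.std():.2f} m")   # ~Hs for a long record
```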

Keywords: irregular wave, moored barge, time domain analysis, numerical simulation

Procedia PDF Downloads 254
26259 A Forearm-Wrist Rehabilitation Module for Stroke and Spinal Cord Injuries

Authors: Vahid Mehrabi, Iman Sharifi, H. A. Talebi

Abstract:

The automation of rehabilitation procedures through the implementation of robotic devices can overcome the limitations of conventional physiotherapy methods by increasing the number of training sessions and the duration of the process. In this paper, the design of a simple rehabilitation robot for forearm-wrist therapy in stroke and spinal cord injuries is presented. The wrist's biological joint motion is modeled by a gimbal-like mechanism which resembles the human arm anatomy. The presented device is an exoskeleton robot with rotation axes corresponding to the human skeletal anatomy. The mechanical structure, actuator and sensor selection, and system kinematics are illustrated, together with a comparison between the device's range of motion and the active range of motion required for activities of daily living.

Keywords: rehabilitation, robotic devices, physiotherapy, forearm-wrist

Procedia PDF Downloads 285
26258 The Acquisition of Case in Biological Domain Based on Text Mining

Authors: Shen Jian, Hu Jie, Qi Jin, Liu Wei Jie, Chen Ji Yi, Peng Ying Hong

Abstract:

In order to address the problem of acquiring cases in the biological domain related to design problems, a biological instance acquisition method based on text mining is presented. Through the construction of a corpus text vector space and knowledge mining, the feature selection, similarity measure and case retrieval methods for text in the field of biology are studied. First, we establish a vector space model of the corpus in the biological field and complete the preprocessing steps. Then, the corpus is retrieved by using the vector space model combined with functional keywords to obtain biological domain examples related to the design problems. Finally, we verify the validity of this method with a textual example.
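A minimal sketch of the vector-space retrieval step described above, using TF-IDF weighting and cosine similarity (scikit-learn); the tiny corpus and query are placeholders, not the biological corpus of the study.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "gecko foot setae provide reversible dry adhesion",
    "lotus leaf surface shows self-cleaning superhydrophobicity",
    "shark skin riblets reduce drag in turbulent flow",
]
query = ["reduce friction drag on a surface"]     # functional keywords from the design problem

vectorizer = TfidfVectorizer(stop_words="english")
doc_vecs = vectorizer.fit_transform(corpus)        # corpus vector space model
query_vec = vectorizer.transform(query)

sims = cosine_similarity(query_vec, doc_vecs).ravel()
best = sims.argmax()
print(f"best matching case ({sims[best]:.2f}): {corpus[best]}")
```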

Keywords: text mining, vector space model, feature selection, biologically inspired design

Procedia PDF Downloads 262
26257 Improving Rural Access to Specialist Emergency Mental Health Care: Using a Time and Motion Study in the Evaluation of a Telepsychiatry Program

Authors: Emily Saurman, David Lyle

Abstract:

In Australia, a well-serviced rural town might have a psychiatrist visit once a month, with more frequent visits from a psychiatric nurse, but many towns have no resident access to mental health specialists. Access to specialist care would not only reduce patient distress and improve outcomes but would also facilitate the effective use of limited resources. The Mental Health Emergency Care-Rural Access Program (MHEC-RAP) was developed to improve access to specialist emergency mental health care in rural and remote communities using telehealth technologies. However, there has been no benchmark to gauge program efficiency or capacity, or to determine whether program activity is justifiably sufficient. The evaluation of MHEC-RAP used multiple methods and applied a modified theory of access to assess the program and its aim of improved access to emergency mental health care. This was the first evaluation of a telepsychiatry service to include a time and motion study design examining program time expenditure, efficiency, and capacity. The time and motion study analysis was combined with an observational study of the program structure and function to assess the balance between program responsiveness and efficiency. Previous program studies have demonstrated that MHEC-RAP has improved access and is used and effective. The findings from the time and motion study suggest that MHEC-RAP has the capacity to manage increased activity within the current model structure without loss of responsiveness or efficiency in the provision of care. Enhancing program responsiveness and efficiency will also support a claim of the program's value for money. MHEC-RAP is a practical telehealth solution for improving access to specialist emergency mental health care. The findings from this evaluation have already attracted the attention of other regions in Australia interested in implementing emergency telepsychiatry programs and are now informing the progressive establishment of mental health resource centres in rural New South Wales. Like MHEC-RAP, these centres will provide rapid, safe, and contextually relevant assessments and advice to support local health professionals in managing mental health emergencies in smaller rural emergency departments. Sharing the application of this methodology and research activity may help to improve access to, and future evaluations of, telehealth and telepsychiatry services for others around the globe.

Keywords: access, emergency, mental health, rural, time and motion

Procedia PDF Downloads 235
26256 A Case Study on Performance of Isolated Bridges under Near-Fault Ground Motion

Authors: Daniele Losanno, H. A. Hadad, Giorgio Serino

Abstract:

This paper presents a numerical investigation of the seismic performance of a benchmark bridge with different optimal isolation systems under near-fault ground motion. Very large displacements often make seismic isolation an unfeasible solution due to boundary conditions, especially in the case of existing bridges or high-seismic-risk regions. Near-fault ground motions are most likely to affect either structures with a long natural period, like isolated structures, or structures sensitive to velocity content, such as viscously damped structures. The work analyzes the seismic performance of a three-span continuous bridge designed with different isolation systems having different levels of damping. The case study was analyzed in different configurations: (a) simply supported, (b) isolated with lead rubber bearings (LRBs), (c) isolated with rubber isolators and 10% classical damping (HDLRBs), and (d) isolated with rubber isolators and a 70% supplemental damping ratio. Case (d) represents an alternative control strategy that combines the effect of seismic isolation with additional supplemental damping, trying to take advantage of both solutions. The bridge is modeled in SAP2000 and solved by direct-integration time-history analyses under a set of six recorded near-fault ground motions. In addition, a set of analyses under the seismic action prescribed by the Italian code is also conducted in order to evaluate the effectiveness of the suggested optimal control strategies under far-field seismic action. Results of the analysis demonstrate that an isolated bridge equipped with HDLRBs and a total equivalent damping ratio of 70% represents a very effective design solution for both mitigation of the displacement demand at the isolation level and base shear reduction in the piers, also in the case of near-fault ground motion.
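As a generic illustration of the direct-integration time-history analysis mentioned above, the sketch below integrates a linear SDOF oscillator under ground acceleration with the Newmark average-acceleration method; this is not the SAP2000 bridge model, and the period, damping ratio, and input record are placeholders.

```python
import numpy as np

def newmark_sdof(ag, dt, period=2.5, zeta=0.10, beta=0.25, gamma=0.5):
    """Relative displacement response of a linear SDOF system to ground acceleration ag(t)."""
    wn = 2 * np.pi / period
    m, c, k = 1.0, 2 * zeta * wn, wn ** 2                 # unit mass normalization
    u = v = 0.0
    a = -ag[0] - c * v - k * u                            # initial acceleration
    keff = k + gamma / (beta * dt) * c + m / (beta * dt ** 2)
    disp = np.empty(len(ag))
    for i, agi in enumerate(ag):
        p = -m * agi                                      # effective earthquake force
        p_eff = (p + m * (u / (beta * dt ** 2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
                   + c * (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                          + dt * (gamma / (2 * beta) - 1) * a))
        u_new = p_eff / keff
        v_new = (gamma / (beta * dt) * (u_new - u) + (1 - gamma / beta) * v
                 + dt * (1 - gamma / (2 * beta)) * a)
        a_new = (u_new - u) / (beta * dt ** 2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
        u, v, a = u_new, v_new, a_new
        disp[i] = u
    return disp

dt = 0.01
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 1.0 * np.arange(0, 10, dt))   # placeholder pulse-like input
print(f"peak isolation-level displacement: {np.abs(newmark_sdof(ag, dt)).max():.3f} m")
```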

Keywords: isolated bridges, near-fault motion, seismic response, supplemental damping, optimal design

Procedia PDF Downloads 286
26255 Eliciting and Confirming Data, Information, Knowledge and Wisdom in a Specialist Health Care Setting - The Wicked Method

Authors: Sinead Impey, Damon Berry, Selma Furtado, Miriam Galvin, Loretto Grogan, Orla Hardiman, Lucy Hederman, Mark Heverin, Vincent Wade, Linda Douris, Declan O'Sullivan, Gaye Stephens

Abstract:

Healthcare is a knowledge-rich environment. This knowledge, while valuable, is not always accessible outside the borders of individual clinics. This research aims to address part of this problem (at a study site) by constructing a maximal data set (knowledge artefact) for motor neurone disease (MND). This data set is proposed as an initial knowledge base for a concurrent project to develop an MND patient data platform. It represents the domain knowledge at the study site for the duration of the research (12 months). A knowledge elicitation method was also developed from the lessons learned during this process - the WICKED method. WICKED is an anagram of the words: eliciting and confirming data, information, knowledge, wisdom. It is also a reference to the concept of wicked problems, which are complex and challenging, as is eliciting expert knowledge. The method was evaluated at a second site, and benefits and limitations were noted. Benefits include that the method provided a systematic way to manage data, information, knowledge and wisdom (DIKW) from various sources, including healthcare specialists and existing data sets. Limitations include the time required and the fact that the data set produced only represents the DIKW known during the research period. Future work is underway to address these limitations.

Keywords: healthcare, knowledge acquisition, maximal data sets, action design science

Procedia PDF Downloads 367
26254 Influence and Dissemination of Solecism among Moroccan High School and University Students

Authors: Rachid Ed-Dali, Khalid Elasri

Abstract:

Mass media seem to provide rich content for language acquisition. Exposure to television, the Internet, the mobile phone and other technological gadgets and devices helps enrich the student's lexicon both positively and negatively. The difficulties encountered by students while learning and acquiring second languages, together with their eagerness to comprehend the content of a particular program, prompt them to diversify their methods so as to achieve their targets. The present study highlights the significance of certain media channels and their involvement in language acquisition, employing the Natural Approach to examine whether students, especially secondary and high school students, learn and acquire errors through watching subtitled television programs. The chief objective is to investigate the deductive and inductive relevance of certain programs, besides the involvement of peripheral learning, in the acquisition of mistakes.

Keywords: errors, mistakes, Natural Approach, peripheral learning, solecism

Procedia PDF Downloads 118
26253 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photons Beams in Varian Linear Accelerator

Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain

Abstract:

Objective: The main purpose of this study is to compare the percent depth dose (PDD) and in-plane and cross-plane profiles of the Varian golden beam data to the measured data of 6 and 18 MV photon beams for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles and dose rate tables for open and wedged fields into the treatment planning system, enabling it to calculate the MUs and the dose distribution. Varian offers a generic set of beam data as reference data; however, it is not recommended for clinical use. In this study, we compared the generic beam data with the measured beam data to evaluate the reliability of the generic beam data for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a Semiflex ion chamber and MEPHYSTO software. The online available Varian golden beam data were compared with the measured data to evaluate the accuracy of the golden beam data for the commissioning of the Eclipse treatment planning system. Results: The deviation between the measured and golden beam data was at most 2%. In the PDDs, the deviation increases at deeper depths relative to shallower depths. Similarly, the profiles show the same trend of increasing deviation at large field sizes and increasing depths. Conclusion: The study shows that the percentage deviation between the measured and golden beam data is within the acceptable tolerance and can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
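A minimal sketch of the kind of point-by-point comparison described (interpolating the golden-beam PDD onto the measured depths and reporting percent deviation); the file names, column layout and 2% tolerance check are assumptions about the data format, not Varian's or the clinic's actual files.

```python
import numpy as np

# Columns: depth [mm], percent depth dose; hypothetical file layout
depth_m, pdd_m = np.loadtxt("measured_pdd_6MV_10x10.txt", unpack=True)
depth_g, pdd_g = np.loadtxt("golden_pdd_6MV_10x10.txt", unpack=True)

pdd_g_on_m = np.interp(depth_m, depth_g, pdd_g)      # resample golden data onto measured depths
deviation = 100.0 * (pdd_m - pdd_g_on_m) / pdd_g_on_m

print(f"max |deviation| = {np.abs(deviation).max():.2f} %")
print("points outside 2% tolerance:", np.count_nonzero(np.abs(deviation) > 2.0))
```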

Keywords: percent depth dose, flatness, symmetry, golden beam data

Procedia PDF Downloads 490
26252 Corpus Stylistics and Multidimensional Analysis for English for Specific Purposes Teaching and Assessment

Authors: Svetlana Strinyuk, Viacheslav Lanin

Abstract:

Academic English has become the lingua franca of the international scientific community, which stimulates universities to introduce English for Academic Purposes (EAP) courses into the curriculum. Teaching L2 EAP students can be supported with corpus technologies and digital stylistics. Special software was created to address the manifold task of teaching, assessing and researching the academic writing of L2 students on the basis of digital stylistics and multidimensional analysis. A set of annotations (style markers) was built, covering the grammatical, lexical and syntactic features most significant for academic writing. A contrastive comparison of two corpora - a 'model corpus' of subject-domain-limited papers published by competent writers in leading academic journals, and a 'students' corpus' of subject-domain-limited papers written by final-year students - provides data about the features of academic writing underused or overused by L2 EAP students. Both corpora are tagged with software created in GATE Developer. The style markers within the research framework may be replaced depending on the relevance and validity of the results obtained from the research corpora; thus, by selecting relevant (high-frequency) style markers and excluding less relevant, i.e. less frequent, annotations, high validity of the model is achieved. The software allows comparing the data obtained from processing the model corpus to the students' corpus and producing reports which can be used in teaching and assessment: the less students' writing deviates from the model corpus, the higher their academic writing skill acquisition. The research showed that several style markers (hedging devices) were underused by L2 EAP students, whereas lexical linking devices were used excessively. Special software implemented in the teaching of EAP courses serves as a successful visual aid, makes assessment more valid, is indicative of the degree of writing skill acquisition, and provides data for further research.

Keywords: corpus technologies in EAP teaching, multidimensional analysis, GATE Developer, corpus stylistics

Procedia PDF Downloads 202
26251 Plantation Forests Height Mapping Using Unmanned Aerial System

Authors: Shiming Li, Qingwang Liu, Honggan Wu, Jianbing Zhang

Abstract:

Plantation forests are useful for timber production, recreation, environmental protection and social development. Stand height is an important parameter for the estimation of forest volume and carbon stocks. Although lidar is a suitable technology for extracting the vertical parameters of forests, its high cost makes it unsuitable for operational inventory. With the development of computer vision and photogrammetry, aerial photos from an unmanned aerial system can be used as an alternative solution for height mapping. The structure-from-motion (SfM) photogrammetry technique can be used to extract DSM and DEM information, and a canopy height model (CHM) can be obtained by subtracting the DEM from the DSM. Our results show that overlapping aerial photos are a potential solution for plantation forest height mapping.
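A minimal sketch of the CHM step described above (subtracting a DEM from a DSM), assuming the two rasters are already co-registered on the same grid; the file names and the >2 m canopy mask are placeholders.

```python
import rasterio
import numpy as np

with rasterio.open("plantation_dsm.tif") as dsm_src, rasterio.open("plantation_dem.tif") as dem_src:
    dsm = dsm_src.read(1).astype("float32")
    dem = dem_src.read(1).astype("float32")
    profile = dsm_src.profile

chm = np.clip(dsm - dem, 0, None)        # canopy height model; negative values treated as ground noise

profile.update(dtype="float32")
with rasterio.open("plantation_chm.tif", "w", **profile) as dst:
    dst.write(chm, 1)

print(f"mean stand height: {chm[chm > 2].mean():.1f} m")   # simple >2 m canopy mask (assumption)
```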

Keywords: forest height mapping, plantation forests, structure-from-motion photogrammetry, UAS

Procedia PDF Downloads 278
26250 3D Seismic Acquisition Challenges in the NW Ghadames Basin Libya, an Integrated Geophysical Sedimentological and Subsurface Studies Approach as a Solution

Authors: S. Sharma, Gaballa Aqeelah, Tawfig Alghbaili, Ali Elmessmari

Abstract:

There were abrupt discontinuities in the Brute Stack at the northernmost locations during the acquisition of 2D (2007) and 3D (2021) seismic data in the northwest region of the Ghadames Basin, Libya. In both campaigns, complete loss of fluid circulation was seen in these areas during up-hole drilling. Geophysics, sedimentology and shallow subsurface geology were integrated to investigate what was causing the seismic signal to disappear at shallow depths. The Upper Cretaceous Nalut Formation is the near-surface or surface formation in the studied area. It is distinguished by abnormally high resistivity in all the neighboring wells. The Nalut Formation in all the nearby wells from the present study, and from a previous outcrop study, suggests a lithology of dolomite and chert/flint in nodular or layered forms. Karstic caverns, vugs, and thick cracks are also reported, and together these produce the high resistivity. Four up-hole samples analyzed for microfacies revealed a near-coastal to tidal environment. Algal (Chara)-infested deposits up to 30 feet thick, monotonous and very porous, are seen in two up-hole sediment sections; these deposits are interpreted as scattered, continental algal travertine mounds. Chert/flint, dolomite, and calcite in varying amounts are confirmed by XRD analysis. The high resistivity of the Nalut Formation, which is thought to be connected to the sea-level drop that created the paleokarst layer, can be tracked regionally. It is abruptly overlain by a blanket marine transgressive deposit caused by rapid sea-level rise, a regional, relatively highly radioactive layer of argillaceous limestone. The examined area's close proximity to the mountainous, E-W trending ridges of northern Libya facilitated recent freshwater circulation, which later enhanced cavern development and mineralization in the paleokarst layer. Seismic signal loss at shallow depth is caused by the extremely heterogeneous mineralogy of the pore fill, or its absence. The scattering effect of a shallow karstic layer on the seismic signal is well documented. Higher velocity inflection points at shallower depths in the northern part and at deeper intervals in the southern part, in both cases at the Nalut level, demonstrate the layer's influence on the seismic signal. During the Permian-Carboniferous, the Ghadames Basin underwent uplift and extensive erosion, which resulted in this karstic layer of the Nalut Formation being uplifted to a shallow depth in the northern part of the studied area, weakening the acoustic signal, whereas in the southern part of the 3D acquisition area the Nalut Formation remained at a deeper interval without affecting the seismic signal. Actions taken during seismic processing to deal with this signal loss show visible improvement. This study recommends using denser spacing or dynamite sources in comparable geological settings in order to circumvent the karst layer and prevent signal loss at shallower depths.

Keywords: well logging, seismic data acquisition, seismic data processing, up-holes

Procedia PDF Downloads 86
26249 Mathematical Description of Functional Motion and Application as a Feeding Mode for General Purpose Assistive Robots

Authors: Martin Leroux, Sylvain Brisebois

Abstract:

Eating a meal is among the activities of daily living, but it takes a lot of time and effort for people with physical or functional limitations. Dedicated technologies are cumbersome and not portable, while general-purpose assistive robots such as wheelchair-based manipulators are too hard to control for an elaborate continuous motion like eating. Eating with such devices had not previously been automated, since no description of a feeding motion existed for uncontrolled environments. In this paper, we introduce a feeding mode for assistive manipulators, including a mathematical description of trajectories for motions that are difficult to perform manually, such as gathering and scooping food at a desired pace. We implement these trajectories in a sequence of movements for a semi-automated feeding mode which can be controlled with a very simple 3-button interface, allowing the user to control the feeding pace. Finally, we demonstrate the feeding mode with a JACO robotic arm and compare the eating speed, measured in bites per minute, of three eating methods: a healthy person eating unaided, a person with upper limb limitations or disability using JACO with manual control, and a person with limitations using JACO with the feeding mode. We found that the feeding mode allows eating about 5 bites per minute, which should be sufficient to eat a meal in under 30 minutes.

Keywords: assistive robotics, automated feeding, elderly care, trajectory design, human-robot interaction

Procedia PDF Downloads 162
26248 Stimulus-Response and the Innateness Hypothesis: Childhood Language Acquisition of “Genie”

Authors: Caroline Kim

Abstract:

Scholars have long disputed the relationship between the origins of language and human behavior. Historically, the behaviorist psychologist B. F. Skinner argued that language is one instance of the general stimulus-response phenomenon that characterizes the essence of human behavior. A more recent approach argues, by contrast, that language is an innate cognitive faculty and does not arise from behavior, which might develop and reinforce linguistic facility but is not its source. Pinker, among others, proposes that linguistic defects arise from damage to the brain, both congenital and acquired in life. Much of his argument is based on case studies in which damage to the Broca's and Wernicke's areas of the brain results in loss of the ability to produce coherent grammatical expressions when speaking or writing; though affected speakers often utter quite fluent streams of sentences, the words articulated lack discernible semantic content. Pinker concludes on this basis that language is an innate component of specific, classically language-correlated regions of the human brain. Taking a notorious 1970s case of linguistic maladaptation, this paper queries the dominant materialist paradigm of language-correlated regions. Susan "Genie" Wiley was physically isolated from language interaction in her home and beaten by her father when she attempted to make any sort of sound. Though without any measurable resulting damage to the brain, Wiley was never able to develop the level of linguistic facility normally achieved in adulthood. Having received negative reinforcement of language acquisition from her father and lacking the usual language acquisition period, Wiley was able to develop language only at a quite limited level in later life. From a contemporary behaviorist perspective, this case confirms the possibility of language deficiency without brain pathology. Wiley's potential language-determining areas of the brain were intact, and she was exposed to language later in her life, but she was unable to achieve the normal level of communication skills, which deterred socialization. This phenomenon and others like it in the limited case literature on linguistic maladaptation pose serious clinical, scientific, and indeed philosophical difficulties for both of the major competing theories of language acquisition: innateness and linguistic stimulus-response. The implications of such cases for future research in language acquisition are explored, with a particular emphasis on the interaction of innate capacity and stimulus-based development in early childhood.

Keywords: behaviorism, innateness hypothesis, language, Susan "Genie" Wiley

Procedia PDF Downloads 294
26247 The Phonology and Phonetics of Second Language Intonation in Case of “Downstep”

Authors: Tayebeh Norouzi

Abstract:

This study aims to investigate the acquisition process of intonation. It examines the intonation structure of Tokyo Japanese and its realization by Iranian learners of Japanese. Seven Iranian learners of Japanese, differing in fluency, and two Japanese speakers participated in the experiment. Two sentences were used to test the phonological and phonetic characteristics of lexical pitch-accent as well as the intonation patterns produced by the speakers. Both sentences consisted of similar words with the same number of syllables and lexical pitch-accents but different syntactic structure. Speakers were asked to read each sentence three times at normal speed, and the data were analyzed by Praat. The results show that lexical pitch-accent, Accentual Phrase (AP) and AP boundary tone realization vary depending on sentence type. For sentences of type XdeYwo, the lexical pitch-accent is realized properly. However, there is a rise in AP boundary tone regardless of speakers’ level of fluency. In contrast, in sentences of type XnoYwo, the lexical pitch-accent and AP boundary tone vary depending on the speakers’ fluency level. Advanced speakers are better at grouping words into phrases and produce more native-like intonation patterns, though they are not able to realize downstep properly. The non-native speakers tried to realize proper intonation patterns by making changes in lexical accent and boundary tone.

Keywords: intonation, Iranian learners, Japanese prosody, lexical accent, second language acquisition

Procedia PDF Downloads 170
26246 Determination of the Gain in Learning the Free-Fall Motion of Bodies by Applying the Resource of Previous Concepts

Authors: Ricardo Merlo

Abstract:

In this paper, we analyze the different didactic proposals available online for teaching the free-fall motion of bodies. An important aspect was the interpretation of the direction and sense of the acceleration of gravity and of the falling velocity of a body; accordingly, we found different uses of the Cartesian reference system and different graphical presentations of the velocity as a function of time and of the distance travelled vertically by the body during the period after it was dropped from a height h0. In this framework, a survey of previous concepts was administered to a voluntary group of first-year university Engineering students before and after the class on the subject. Hake's index (0.52) was then determined, indicating an average learning gain from the meaningful use of the reference system and the respective graphs of v=ƒ(t) and h=ƒ(t).
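For reference, the normalized gain (Hake index) reported above is computed from the class-average pre- and post-test scores, expressed as percentages:

```latex
\langle g \rangle \;=\; \frac{\langle \%\,\mathrm{post} \rangle - \langle \%\,\mathrm{pre} \rangle}{100 - \langle \%\,\mathrm{pre} \rangle}
```

For example, class averages of 40% before and 71% after instruction give a gain of about 0.52, in the "medium gain" range commonly cited for interactive teaching methods.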

Keywords: didactic gain, free–fall, physics teaching, previous knowledge

Procedia PDF Downloads 163
26245 Optimized Electron Diffraction Detection and Data Acquisition in Diffraction Tomography: A Complete Solution by Gatan

Authors: Saleh Gorji, Sahil Gulati, Ana Pakzad

Abstract:

Continuous electron diffraction tomography, also known as microcrystal electron diffraction (MicroED) or three-dimensional electron diffraction (3DED), is a powerful technique which, in combination with cryo-electron microscopy (cryo-EM), can provide atomic-scale 3D information about the crystal structure and composition of different classes of crystalline materials such as proteins, peptides, and small molecules. Unlike the well-established X-ray crystallography method, 3DED does not require large single crystals and can collect accurate electron diffraction data from crystals as small as 50 – 100 nm. This is a critical advantage, as growing larger crystals, as required by X-ray crystallography methods, is often very difficult, time-consuming, and expensive. In most cases, specimens studied via the 3DED method are electron-beam sensitive, which means there is a limit on the maximum electron dose one can use to collect the data required for a high-resolution structure determination. Therefore, collecting data using a conventional scintillator-based, fiber-coupled camera brings additional challenges: the noise inherent in the electron-to-photon conversion in the scintillator and in the transfer of light via the fibers to the sensor results in a poor signal-to-noise ratio and requires relatively higher, and commonly specimen-damaging, electron dose rates, especially for protein crystals. As in other cryo-EM techniques, damage to the specimen can be mitigated if a direct detection camera is used, which provides a high signal-to-noise ratio at low electron doses. In this work, we have used two classes of such detectors from Gatan, namely the K3® camera (a monolithic active pixel sensor) and Stela™ (which utilizes DECTRIS hybrid-pixel technology), to address this problem. The K3 is an electron counting detector optimized for low-dose applications (such as structural biology cryo-EM), and Stela is also a counting electron detector but optimized for diffraction applications with high speed and high dynamic range. Lastly, data collection workflows, including crystal screening, microscope optics setup (for imaging and diffraction), stage height adjustment at each crystal position, and tomogram acquisition, are another challenge of the 3DED technique. Traditionally this has all been done manually, or in a partly automated fashion using open-source software and scripting, requiring long hours on the microscope (extra cost) and extensive user interaction with the system. We have recently introduced Latitude® D in DigitalMicrograph® software, which is compatible with all pre- and post-energy-filter Gatan cameras and enables 3DED data acquisition in an automated and optimized fashion. Higher-quality 3DED data enable structure determination with higher confidence, while automated workflows allow it to be completed considerably faster than before. Using multiple examples, this work will demonstrate how direct detection electron counting cameras enhance 3DED results (3 to better than 1 Angstrom) for protein and small molecule structure determination. We will also show how Latitude D software facilitates collecting such data in an integrated and fully automated user interface.

Keywords: continuous electron diffraction tomography, direct detection, diffraction, Latitude D, Digitalmicrograph, proteins, small molecules

Procedia PDF Downloads 107
26244 Verification of a Simple Model for Rolling Isolation System Response

Authors: Aarthi Sridhar, Henri Gavin, Karah Kelly

Abstract:

Rolling Isolation Systems (RISs) are a simple and effective means to mitigate earthquake hazards to equipment in critical and precious facilities, such as hospitals, network collocation facilities, supercomputer centers, and museums. The RIS works by isolating components from floor accelerations, reducing the inertial forces felt by the subsystem. The RIS consists of two platforms with counter-facing concave surfaces (dishes) in each corner. Steel balls lie inside the dishes and allow relative motion between the top and bottom platforms. Formerly, a mathematical model for the dynamics of RISs was developed using Lagrange's equations (LE) and experimentally validated. A new mathematical model was developed using Gauss's Principle of Least Constraint (GPLC) and verified by comparing impulse response trajectories of the GPLC model and the LE model in terms of the peak displacements and accelerations of the top platform. Mathematical models for the RIS are tedious to derive because of the non-holonomic rolling constraints imposed on the system. However, using Gauss's Principle of Least Constraint to find the equations of motion removes some of the obscurity and yields a system that can be easily extended. Though the GPLC model requires more state variables, the equations of motion are far simpler. The non-holonomic constraint is enforced in terms of accelerations and therefore requires additional constraint stabilization methods in order to avoid the possibility that numerical integration causes the system to go unstable. The GPLC model allows the incorporation of more physical aspects of the RIS, such as the contribution of the vertical velocity of the platform to the kinetic energy and the mass of the balls. This mathematical model of the RIS is a tool to predict the motion of the isolation platform. The ability to statistically quantify the expected responses of the RIS is critical to the implementation of earthquake hazard mitigation.
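For context, Gauss's Principle of Least Constraint states that, among all accelerations consistent with the constraints, the actual accelerations minimize the mass-weighted deviation from the unconstrained accelerations; in the standard matrix form (the notation below is assumed here, not taken from the paper):

```latex
\min_{\ddot{\mathbf{q}}} \;\; G(\ddot{\mathbf{q}}) \;=\; \tfrac{1}{2}
\left(\ddot{\mathbf{q}} - \mathbf{M}^{-1}\mathbf{f}\right)^{\!\top}
\mathbf{M}
\left(\ddot{\mathbf{q}} - \mathbf{M}^{-1}\mathbf{f}\right)
\quad \text{subject to} \quad
\mathbf{A}(\mathbf{q},\dot{\mathbf{q}},t)\,\ddot{\mathbf{q}} = \mathbf{b}(\mathbf{q},\dot{\mathbf{q}},t),
```

where M is the mass matrix, f the applied forces, and the linear constraint on the accelerations expresses conditions such as the non-holonomic rolling constraints; enforcing the constraint at the acceleration level is what makes additional constraint stabilization necessary during numerical integration.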

Keywords: earthquake hazard mitigation, earthquake isolation, Gauss’s Principle of Least Constraint, nonlinear dynamics, rolling isolation system

Procedia PDF Downloads 252