Search results for: motion capture
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2414


494 Milling Simulations with a 3-DOF Flexible Planar Robot

Authors: Hoai Nam Huynh, Edouard Rivière-Lorphèvre, Olivier Verlinden

Abstract:

Manufacturing technologies have become increasingly diversified over the years. The growing use of robots for applications such as assembly, painting, and welding has also reached the field of machining. Machining robots can cover larger workspaces than conventional machine tools at a lower cost and thus represent a very promising alternative for machining applications. Furthermore, their inherent structure gives them great flexibility of motion to reach any location on the workpiece with the desired orientation. Nevertheless, machining robots suffer from a lack of stiffness at their joints, restricting their use to applications involving low cutting forces, especially finishing operations. Vibratory instabilities may also occur during machining and degrade precision, leading to scrapped parts. Some researchers are therefore concerned with the identification of optimal parameters in robotic machining. This paper continues the development of a virtual robotic machining simulator intended to find optimized cutting parameters, for example in terms of depth of cut or feed per tooth. The simulation environment combines an in-house milling routine (DyStaMill), which computes the cutting forces and material removal, with an in-house multibody library (EasyDyn), which is used to build a dynamic model of a 3-DOF planar robot with flexible links. The position of the robot end-effector, subjected to milling forces, is controlled through an inverse kinematics scheme, with the position of each joint controlled separately. Each joint is actuated by a servomotor whose transfer function has been computed in order to tune the corresponding controller. The output results feature the evolution of the cutting forces with and without link flexibility, together with the tracking errors of the end-effector. Illustrations of the resulting machined surfaces are also presented. Accounting for link flexibility revealed an increase in the magnitude of the cutting forces. This proof of concept aims to enrich the database of results in robotic machining for potential improvements in production.
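The inverse kinematics scheme mentioned in the abstract can be illustrated for a generic 3-link planar arm. This is a minimal closed-form sketch, not the authors' DyStaMill/EasyDyn implementation; the link lengths, the elbow-down branch choice, and the function name are illustrative assumptions.

```python
import math

def ik_planar_3dof(x, y, phi, L1, L2, L3):
    """Closed-form inverse kinematics for a 3-link planar arm.

    (x, y) is the end-effector position, phi its orientation;
    returns joint angles (t1, t2, t3) for the elbow-down branch.
    """
    # Wrist centre: step back from the tool tip along its orientation by link 3.
    xw = x - L3 * math.cos(phi)
    yw = y - L3 * math.sin(phi)
    # Standard two-link IK for the first two joints.
    d2 = xw * xw + yw * yw
    c2 = (d2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    if abs(c2) > 1.0:
        raise ValueError("target out of reach")
    t2 = math.acos(c2)                      # elbow-down solution
    t1 = math.atan2(yw, xw) - math.atan2(L2 * math.sin(t2),
                                         L1 + L2 * math.cos(t2))
    t3 = phi - t1 - t2                      # third joint closes the orientation
    return t1, t2, t3
```

In a simulator such as the one described, a solution like this would be evaluated at every time step to convert the commanded end-effector trajectory into joint setpoints for the three servo controllers.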

Keywords: control, milling, multibody, robotic, simulation

Procedia PDF Downloads 228
493 Density Measurement of Underexpanded Jet Using Stripe Patterned Background Oriented Schlieren Method

Authors: Shinsuke Udagawa, Masato Yamagishi, Masanori Ota

Abstract:

The Schlieren method, conventionally used to visualize high-speed flows, has disadvantages such as a complex experimental setup and the inability to quantitatively analyze the amount of light refraction. The Background Oriented Schlieren (BOS) method proposed by Meier is one measurement method that solves these problems. Like the Schlieren method, the BOS method exploits the refraction of light, but it is characterized by the use of a digital camera to capture images of a background placed behind the observation area. The images are later analyzed by computer to quantitatively detect the shift of the background pattern. The experimental setup for BOS does not require the concave mirrors, pinholes, or color filters of the conventional Schlieren method, which simplifies it considerably. However, the BOS method suffers from defocusing of the observed object, because the camera is focused on the background image. The defocusing grows with the distance between the background and the object; on the other hand, a larger distance yields higher sensitivity. It is therefore necessary to set the distance between the background and the object appropriately for the experiment, considering the trade-off between defocus and sensitivity. The purpose of this study is to experimentally clarify the effect of defocus on density field reconstruction. In this study, visualization experiments on an underexpanded jet have been performed using a BOS measurement system that we constructed with a Ronchi ruling as the background. The reservoir pressure of the jet and the distance between the camera and the jet axis were fixed, and the distance between the background and the jet axis was varied as the parameter. 
The images were later analyzed on a personal computer to quantitatively detect the shift of the background pattern by comparing the undisturbed background with the captured image of the underexpanded jet. The measured shifts were reconstructed into a density field using the Abel transformation and the Gladstone-Dale equation. The experimental results show that the reconstructed density image becomes more blurred, while the noise decreases, as the distance between the background and the jet axis increases. Consequently, it is clarified that, at least in this experimental setup, the sensitivity constant should be greater than 20 and the circle of confusion diameter should be less than 2.7 mm.
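The reconstruction chain described in the abstract, an inverse Abel transformation followed by the Gladstone-Dale relation, can be sketched as follows. This is a simplified illustration, assuming an onion-peeling discretisation of the Abel inversion and the Gladstone-Dale constant for air (K ≈ 2.26e-4 m³/kg); the authors' exact processing may differ.

```python
import numpy as np

def inverse_abel_onion_peeling(P, dr):
    """Recover a radial profile f(r) from its line-of-sight projection P(y)
    for an axisymmetric field, assuming f is piecewise constant on annuli.
    P[i] is the projection sampled at lateral offset y = i*dr;
    returns f sampled at r = i*dr."""
    n = len(P)
    f = np.zeros(n)
    r = np.arange(n + 1) * dr
    # Peel from the outermost annulus inward.
    for i in range(n - 1, -1, -1):
        y = i * dr
        s = 0.0
        for j in range(i + 1, n):
            # Chord length of annulus j along the ray at offset y.
            chord = 2.0 * (np.sqrt(r[j + 1]**2 - y**2) - np.sqrt(r[j]**2 - y**2))
            s += chord * f[j]
        chord_i = 2.0 * np.sqrt(r[i + 1]**2 - y**2)  # innermost annulus on this ray
        f[i] = (P[i] - s) / chord_i
    return f

def density_from_refractive_index(n_field, K=2.26e-4):
    """Gladstone-Dale relation n - 1 = K * rho  =>  rho = (n - 1) / K."""
    return (n_field - 1.0) / K
```

In a BOS workflow, the measured background shifts would first be converted to refractive-index projections before the Abel inversion; that deflection-to-index step is omitted here.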

Keywords: BOS method, underexpanded jet, Abel transformation, density field visualization

Procedia PDF Downloads 47
492 The Role of Leisure in Older Adults Transitioning to New Homes

Authors: Kristin Prentice, Carri Hand

Abstract:

As the Canadian population ages and chronic health conditions continue to escalate, older adults will require various types of housing, such as long term care or retirement homes. Moving to a new home may require a change in leisure activities and social networks, which can make it challenging to maintain identity and create a sense of home. Leisure has been known to help older adults maintain or increase their quality of life and life satisfaction and may help older adults in moving to new homes. Sense of home and identity within older adults' transitions to new homes are concepts that may also relate to leisure engagement. Literature is scant regarding the role of leisure in older adults moving to new homes and how sense of home and identity inter-relate. This study aims to explore how leisure may play a role in older adults' transitions to new homes, including how sense of home and identity inter-relate. An ethnographic approach will be used to understand the culture of older adults transitioning to new homes. This study will involve older adults who have recently relocated to a mid-sized city in Ontario, Canada. The study will focus on older adults' interactions with and connections to their home environment through leisure. Data collection will take place via video-conferencing and will include a narrative interview and two other interviews to discuss an activity diary of leisure engagement pre- and post-move and mental maps capturing the spaces where participants engaged in leisure. Participants will be encouraged to share photographs of leisure engagement taken inside and outside their home to help understand the social spaces the participants refer to in their activity diaries and mental maps. Older adults attempt to adjust to their new homes by maintaining their identity, developing a sense of home through creating attachment to place, and maintaining social networks, all of which have been linked to engaging in leisure. 
This research will provide insight into the role of leisure in this transition process and the extent that the home and community can contribute to aiding their transition to the new home. This research will contribute to existing literature on the inter-relationships of leisure, sense of home, and identity and how they relate to older adults moving to new homes. This research also has potential for influencing policy and practice for meeting the housing needs of older adults.

Keywords: leisure, older adults, transition, identity

Procedia PDF Downloads 95
491 Parametrical Analysis of Stain Removal Performance of a Washing Machine: A Case Study of Sebum

Authors: Ozcan B., Koca B., Tuzcuoglu E., Cavusoglu S., Efe A., Bayraktar S.

Abstract:

A washing machine is mainly used to remove dirt and stains and to eliminate malodorous substances from textile surfaces. Stains originate from various sources, from the human body to environmental contamination, and there are correspondingly various methods for removing them. They are roughly classified into four groups: oily (greasy) stains, particulate stains, enzymatic stains, and bleachable (oxidizable) stains. Oily stains on clothes commonly result from contact with organic substances of the human body (e.g., perspiration, skin shedding, and sebum) or from exposure to an oily environmental pollutant (e.g., oily foods). Studies have shown that human sebum is a major component of the oily soil found on garments, and if it ages under various environmental conditions, it can generate stubborn yellow stains on the textile surface. In this study, a parametric study was carried out to investigate the key factors affecting the cleaning performance (specifically the sebum removal performance) of a washing machine. These parameters are the mechanical agitation percentage of the tumble, the consumed water, and the total washing period. A full factorial design of experiments was used to capture all possible parametric interactions, using the Minitab 2021 statistical program. Tests were carried out with a commercial liquid detergent and two types of sebum-soiled fabric: cotton and cotton + polyester. The parametric results revealed that, for both test samples, increasing the washing time and the mechanical agitation led to much better sebum removal. However, the water amount had a different outcome for each sample: increasing it decreased the performance on cotton + polyester fabric, while it was favorable for cotton fabric. Besides this, it was also discovered that the type of textile can greatly affect the sebum removal performance. 
Results showed that cotton + polyester fabrics are much easier to clean than cotton fabric.
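A full factorial design over the three washing parameters can be generated as below. The study used Minitab; this sketch only illustrates the combinatorial structure, and the factor names and levels are invented placeholders, not the levels used in the paper.

```python
from itertools import product

# Hypothetical 2-level settings for the three parameters studied;
# the actual levels are not reported in the abstract.
factors = {
    "agitation_pct": [40, 80],   # mechanical agitation (% of tumble)
    "water_l":       [10, 16],   # consumed water (litres)
    "wash_min":      [30, 90],   # total washing period (minutes)
}

# Full factorial design: every combination of every factor level.
names = list(factors)
runs = [dict(zip(names, combo)) for combo in product(*factors.values())]

for run in runs:
    print(run)
```

A 2-level, 3-factor design yields 2**3 = 8 runs, enough to estimate all main effects as well as the two- and three-way interactions that the study set out to capture.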

Keywords: laundry, washing machine, low-temperature washing, cold wash, washing efficiency index, sustainability, cleaning performance, stain removal, oily soil, sebum, yellowing

Procedia PDF Downloads 113
490 Deploying a Transformative Learning Model in Technological University Dublin to Assess Transversal Skills

Authors: Sandra Thompson, Paul Dervan

Abstract:

Ireland’s first Technological University (TU Dublin) was established on 1 January 2019, and its creation is an exciting new milestone in Irish Higher Education. TU Dublin is now Ireland’s biggest university, supporting 29,000 students across three campuses with 3,500 staff. The University aspires to create work-ready graduates who are socially responsible, open-minded global thinkers, ambitious to change the world for the better. As graduates, they will be enterprising and daring in all their endeavors, ready to play their part in transforming the future. Feedback from Irish employers and students, coupled with evidence from other authoritative sources such as the World Economic Forum, points to a need for greater focus on the development of students’ employability skills as they prepare for today’s work environment. Moreover, with an increased focus on Universal Design for Learning (UDL) and inclusiveness, there is recognition that students are more than a numeric grade value. Robust grading systems have been developed to track students’ performance in discipline knowledge, but there is little or no global consensus on a definition of transversal skills, nor on a unified framework to assess them. The education and industry sectors often assess one or two skills, and some are developing their own frameworks to capture learners’ achievement in this area. Technological University Dublin (TU Dublin) has adopted a framework that allows students to develop, assess, and record their transversal skills using transformative learning theory. The model implemented is an adaptation of the Student Transformative Learning Record (STLR), which originated in the University of Central Oklahoma (UCO). The purpose of this paper, therefore, is to examine the views of students, staff, and employers in the context of deploying a transformative learning model within the University to assess transversal skills. It will examine the initial impact the transformative learning model is having socially, personally, and on the University as an organization. Crucially, it will also identify lessons learned from the deployment, to assist other universities and higher education institutes that may be considering a focused adoption of transformative learning to meet the challenge of preparing students for today’s work environment.

Keywords: assessing transversal skills, higher education, transformative learning, students

Procedia PDF Downloads 111
489 Ni-W-P Alloy Coating as an Alternate to Electroplated Hard Cr Coating

Authors: S. K. Ghosh, C. Srivastava, P. K. Limaye, V. Kain

Abstract:

Electroplated hard chromium is widely used in coatings and surface finishing and in the automobile and aerospace industries because of its excellent hardness, wear resistance, and corrosion properties. However, its precursor, Cr+6, is highly carcinogenic, and an international consensus has been adopted to replace this coating technology with an alternative. The search for alternatives to electroplated hard chrome continues worldwide. Various alloys and nanocomposites, such as Co-W alloys and Ni-graphene and Ni-diamond nanocomposites, have already shown promising results in this regard. In this study, electroless Ni-P alloy, with its excellent corrosion resistance, was taken as the base matrix, and tungsten was incorporated as a third alloying element to improve the hardness and wear resistance of the resultant alloy coating. The present work focuses on the preparation of Ni-W-P coatings by electrodeposition with different phosphorous contents and on the resulting electrochemical, mechanical, and tribological performance. The results were also compared with Ni-W alloys. Composition analysis by EDS showed deposition of Ni-32.85 wt% W-3.84 wt% P (designated Ni-W-LP) and Ni-18.55 wt% W-8.73 wt% P (designated Ni-W-HP) alloy coatings from electrolytes containing 0.006 and 0.01 M sodium hypophosphite, respectively. Inhibition of tungsten deposition in the presence of phosphorous was noted. SEM investigation showed cauliflower-like growth along with a few microcracks. The as-deposited Ni-W-P alloy coating was amorphous, as confirmed by XRD, and step-wise crystallization was noticed upon annealing at higher temperatures. For all the coatings, the nanohardness increased after heat treatment; typical nanohardness values for samples annealed at 400°C were 18.65±0.20 GPa, 20.03±0.25 GPa, and 19.17±0.25 GPa for the Ni-W, Ni-W-LP, and Ni-W-HP coatings, respectively. The nanohardness data are therefore very promising. Wear and coefficient-of-friction data were recorded by applying different normal loads in reciprocating motion using a ball-on-plate geometry. After the experiments, the wear mechanism was established by detailed investigation of the wear-scar morphology. Potentiodynamic measurements showed that the coating with the higher phosphorous content was the most corrosion resistant in 3.5 wt% NaCl solution.

Keywords: corrosion, electrodeposition, nanohardness, Ni-W-P alloy coating

Procedia PDF Downloads 331
488 A Single-Use Endoscopy System for Identification of Abnormalities in the Distal Oesophagus of Individuals with Chronic Reflux

Authors: Nafiseh Mirabdolhosseini, Jerry Zhou, Vincent Ho

Abstract:

The dramatic global rise in acid reflux has made oesophageal adenocarcinoma (OAC) the fastest-growing cancer in developed countries. While gastroscopy with biopsy is used to diagnose OAC patients, this labour-intensive and expensive process is not suitable for population screening. This study aims to design, develop, and implement a minimally invasive system to capture optical data of the distal oesophagus for rapid screening of potential abnormalities. To develop the system and understand user requirements, a user-centric approach employing co-design strategies was used. Target user segments were identified, and 38 patients and 14 health providers were interviewed. Next, the technical requirements were developed based on consultations with industry. A minimally invasive optical system was designed and developed with patient comfort in mind. The system consists of a sensing catheter, a controller unit, and an analysis program. The procedure takes only 10 minutes to perform and, because the catheter is single-use, requires no cleaning afterwards. A prototype was evaluated for safety and efficacy in both laboratory and clinical settings. It performed successfully when submerged in simulated gastric fluid, showing no evidence of erosion after 24 hours, and effectively recorded video of the mid-distal oesophagus of a healthy volunteer (34-year-old male). The recorded images were used to develop an automated program to identify abnormalities in the distal oesophagus; further data from a larger clinical study will be used to train the automated program. This system allows quick visual assessment of the lower oesophagus in primary care settings and can serve as a screening tool for oesophageal adenocarcinoma. In addition, it can be coupled with 24-hour ambulatory pH monitoring to better correlate oesophageal physiological changes with reflux symptoms, and it can provide additional information on lower oesophageal sphincter functions such as opening times and bolus retention.

Keywords: endoscopy, MedTech, oesophageal adenocarcinoma, optical system, screening tool

Procedia PDF Downloads 67
487 Comparison between the Roller-Foam and Neuromuscular Facilitation Stretching on Flexibility of Hamstrings Muscles

Authors: Paolo Ragazzi, Olivier Peillon, Paul Fauris, Mathias Simon, Raul Navarro, Juan Carlos Martin, Oriol Casasayas, Laura Pacheco, Albert Perez-Bellmunt

Abstract:

Introduction: Stretching techniques are frequently and widely used in the sports world for their many effects. Among the main benefits are gains in flexibility and range of motion and facilitation of sporting performance. Recently, the use of the roller foam (RF) has spread in sports practice at both elite and recreational levels, its benefits being considered similar to those observed with stretching. The objective of this study is to compare the results of the roller foam with proprioceptive neuromuscular facilitation (PNF) stretching, one of the stretching techniques with the most supporting evidence, on the hamstring muscles. Study design: A single-blind, randomized controlled trial with 40 healthy volunteers. Intervention: The subjects were randomly assigned to one of the following groups. PNF intervention group: 4 repetitions of PNF stretching (5 seconds of contraction, 5 seconds of relaxation, 20 seconds of stretch). Roller foam intervention group: 2 minutes of roller foam applied to the hamstring muscles. Main outcome measures: Hamstring flexibility was assessed at the beginning, during (after 30 s of intervention), and at the end of the session using the Modified Sit and Reach (MSR) test. Results: Baseline data in the two groups were comparable. The PNF group obtained an increase in flexibility of 3.1 cm at 30 seconds (first series) and of 5.1 cm at 2 minutes (after the last series). The RF group obtained a 0.6 cm difference at 30 seconds and 2.4 cm after 2 minutes of roller foam application. The results were statistically significant within groups but not between groups. Conclusions: Although the use of the roller foam is spreading in the sports and rehabilitation fields, the results of the present study suggest that the gain in hamstring flexibility is greater with PNF-type stretching than with RF. These results may be because the roller foam acts more on the fascial tissue, while stretching acts more on the myotendinous unit. Future studies are needed, with larger samples and diversified types of stretching.

Keywords: hamstring muscle, stretching, neuromuscular facilitation stretching, roller foam

Procedia PDF Downloads 170
486 Gender, Climate Change, and Resilience in Kenyan Pastoralist Communities

Authors: Anne Waithira Dormal

Abstract:

Climate change is threatening pastoral livelihoods in Kajiado County, Kenya, through water shortages, livestock deaths, and increasing poverty. This study examines how these impacts differ for men and women within these communities. Limited access to resources, limited land and livestock rights, and limited decision-making power increase women's vulnerability, which is further burdened by traditional gender roles in water procurement. The research recognizes the complexity of climate change and emphasizes that factors such as wealth, family dynamics, and socioeconomic status also influence resilience. Effective adaptation strategies must address all genders. While livestock farming provides a safety net, socioeconomic empowerment through access to credit, healthcare, and education strengthens entire communities. An intersectional perspective that takes ethnicity, social status, and other factors into account is also crucial. This research, therefore, aims to examine how gender-specific adaptation strategies interact with gender and socioeconomic factors to determine the resilience of these Kenyan pastoralist communities. Such strategies, which address the specific needs and vulnerabilities of men and women, are expected to lead to increased resilience to climate change. The aim of the study is to identify effective, gender-specific adaptation strategies that can be integrated into climate change planning and implementation. Additionally, the research seeks a deeper understanding of how socioeconomic factors interact with gender to influence vulnerability and resilience within these communities. The study uses a gender-sensitive qualitative approach with focus group discussions in four different pastoral and agropastoral communities. Both qualitative and demographic data are used to capture the sources of income, education level, and household size of focus group respondents to increase the power of the analysis. 
While the research acknowledges the limitations of specific focus sites and potential biases in self-reporting, it offers valuable insights into gender and climate change in pastoral contexts. This study contributes to understanding gender-based vulnerabilities and building resilience in these communities.

Keywords: climate adaptation strategies, climate change, climate resilience, gendered vulnerability, pastoralism

Procedia PDF Downloads 12
485 Assessment of Indigenous People Living Condition in Coal Mining Region: An Evidence from Dhanbad, India

Authors: Arun Kumar Yadav

Abstract:

Coal plays a significant role in India’s developmental mission. Ironically, on the other side, it causes large-scale population displacement and significant changes in indigenous people’s livelihood mechanisms. Dhanbad is regarded as one of the oldest and largest mining areas in India, the “Coal Capital of India”; mining exploration here started nearly a century ago. With the passage of time, mining has brought many changes to the lives of local people. In this context, the study conducts a comparative situational analysis of the living conditions of dwellers in mines-affected and non-mines-affected villages, based on a livelihoods approach. Since this place has a long history of mining, it is very difficult to conduct a before-and-after comparison between mines-affected and non-mines-affected areas; consequently, the present study is based on a relative comparison approach to elucidate the actual scenario. It uses primary survey data collected by the author between September 2014 and March 2015 in Dhanbad, Jharkhand. The data were collected from eight villages, categorised broadly into mines-affected and non-mines-affected villages. At the micro level, the mines-affected villages were further categorised into open-cast and underground mines, a categorisation that helps capture a deeper understanding of the issues facing mines-affected villages. In total, 400 households were surveyed. The results depict that mining-affected villages are more vulnerable in every sphere. Regarding financial capital, mines-affected villages are engaged in mining work and earn a higher mean income, but non-mines-affected villages are more occupationally diversified: they have opportunities to earn from agricultural land, work in the mining area, informal coal selling, and remittances. Non-mines-affected villages also have better physical capital, which comprises the basic infrastructure needed to support livelihoods: access to secure shelter, adequate water supply and sanitation, and affordable information and transport. Mining-affected villages are more prone to health risks. Regarding social capital, the results show that, compared to five years earlier, law and order have improved in mines-affected villages.

Keywords: displacement, indigenous, livelihood, mining

Procedia PDF Downloads 291
484 Specification and Unification of All Fundamental Forces Exist in Universe in the Theoretical Perspective – The Universal Mechanics

Authors: Surendra Mund

Abstract:

At the beginning, the physical entity force was defined mathematically by Sir Isaac Newton in his Principia Mathematica as F = dp/dt (in vector form), his second law of motion. Newton also defined his universal law of gravitational force in the same outstanding book. Yet at the end of the 20th century and the beginning of the 21st century, we have tried hard to specify and unify the four or five fundamental forces or interactions that exist in the universe, and we have failed every time. Usually, gravity creates problems in this unification, but in my previous papers and presentations, I defined and derived field and force equations for gravitation-like interactions for each and every kind of central system. This force is named by me the Variational Force, and it is generated by variation in the scalar field density around the body. In this particular paper, I first specify which types of interactions are fundamental in the universal sense (in all types of central systems or bodies predicted by my N-time Inflationary Model of the Universe) and then unify them in a universal framework (defined and derived by me as Universal Mechanics in a separate paper) as well. This will also be valid in the universal dynamical sense, which includes inflations and deflations of the universe, central system relativity, universal relativity, ϕ-ψ transformation and transformation of spin, the physical perception principle, the Generalized Fundamental Dynamical Law, and many other important generalized principles of Generalized Quantum Mechanics (GQM) and Central System Theory (CST). So, in this article, I first generalize some fundamental principles, then unify Variational Forces (the general form of gravitation-like interactions) and Flow Generated Forces (the general form of EM-like interactions), and then unify all fundamental forces by specifying weak and strong interactions in terms of more basic interactions: Variational, Flow Generated, and Transformational.

Keywords: Central System Force, Disturbance Force, Flow Generated Forces, Generalized Nuclear Force, Generalized Weak Interactions, Generalized EM-Like Interactions, Imbalance Force, Spin Generated Forces, Transformation Generated Force, Unified Force, Universal Mechanics, Uniform And Non-Uniform Variational Interactions, Variational Interactions

Procedia PDF Downloads 29
483 Non-intrusive Hand Control of Drone Using an Inexpensive and Streamlined Convolutional Neural Network Approach

Authors: Evan Lowhorn, Rocio Alba-Flores

Abstract:

The purpose of this work is to develop a method for classifying hand signals and using the output in a drone control algorithm. To achieve this, methods based on Convolutional Neural Networks (CNNs) were applied. CNNs are a subset of deep learning models that allow grid-like inputs to be processed and passed through a neural network trained for classification. This type of neural network allows classification via imaging, which is less intrusive than previous methods using biosensors, such as EMG sensors. Classification CNNs operate purely on the pixel values of an image; therefore, they can be used without additional exteroceptive sensors. A development bench was constructed using a desktop computer connected to a high-definition webcam mounted on a scissor arm. This allowed the camera to be pointed downwards at the desk to provide a constant solid background for the dataset and a clear detection area for the user. A MATLAB script was created to automate dataset image capture at the development bench and save the images to the desktop, allowing the user to create a dataset of 12,000 images within three hours. These images were evenly distributed among seven classes. The defined classes include forward, backward, left, right, idle, and land; the drone's popular flip function was included as an additional class. To simplify control, the corresponding hand signals chosen were the numerical hand signs for one through five for movements, a fist for land, and the universal “ok” sign for the flip command. Transfer learning with PyTorch (Python) was performed using a pre-trained 18-layer residual learning network (ResNet-18), retraining the network for the custom classification task. An algorithm was created to interpret the classification and send encoded messages to a Ryze Tello drone over its 2.4 GHz Wi-Fi connection. The drone’s movements were performed in half-metre increments at a constant speed. When combined with the drone control algorithm, the classification performed as desired, with negligible latency compared to the delay in the drone’s movement commands.

Keywords: classification, computer vision, convolutional neural networks, drone control

Procedia PDF Downloads 186
482 Modelling and Simulation of Aero-Elastic Vibrations Using System Dynamic Approach

Authors: Cosmas Pandit Pagwiwoko, Ammar Khaled Abdelaziz Abdelsamia

Abstract:

Flutter, a phenomenon of flow-induced, self-excited vibration, has to be recognized at the aircraft design stage because of its harmful effect on the structure. The phenomenon is also important for wind energy harvesters based on a fluttering surface, because it determines their effective operational velocity range. This multi-physics problem can be described by two governing equations, in the fluid and in the structure, solved simultaneously while respecting certain boundary conditions on the surface of the body. In this work, the equations are resolved separately by two distinct solvers, one time step at a time in each domain. Modelling and simulation of this fluid-structure interaction in ANSYS show the effectiveness of this loosely coupled method in representing the flutter phenomenon; however, the process is too time-consuming for design purposes. Therefore, another technique using the same weakly coupled aero-structure formulation is proposed, based on a system dynamics approach. In this technique, the aerodynamic forces are calculated using a singularity method for a range of frequencies and certain natural mode shapes, and are transformed into the time domain by employing a rational fraction approximation in the Laplace variable. The structure, represented as a multi-degree-of-freedom system coupled with a transfer function of the aerodynamic forces, can then be simulated in the time domain on a block-diagram platform such as Simulink (MATLAB). The dynamic response at a given velocity can be checked against an established flutter calculation in the frequency domain, the k-method, in which a parameter of artificial structural damping is inserted into the equation of motion to ensure the energy balance of the flow and the vibrating structure. The time-domain simulation is of particular interest because it allows structural nonlinearities to be applied accurately. Experimental tests on a fluttering airfoil in the wind tunnel are also conducted to validate the method.
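The self-excitation mechanism underlying flutter can be illustrated with a toy single-degree-of-freedom model, in which the airflow contributes a negative damping term that overwhelms the structural damping above a critical speed. This is a didactic sketch only; all parameter values are invented, and it does not reproduce the paper's multi-DOF rational-function aerodynamic model.

```python
def simulate_self_excited(m=1.0, c=0.2, k=10.0, aero=0.05, U=1.0,
                          x0=0.01, dt=1e-3, t_end=20.0):
    """Toy flow-induced vibration model:
        m x'' + (c - aero*U) x' + k x = 0
    The flow feeds energy in through the aero*U term; above the critical
    speed U_crit = c/aero the effective damping turns negative and the
    motion grows (flutter-like), below it the motion decays.
    Integrated with semi-implicit Euler; returns (final x, peak |x|)."""
    n = int(t_end / dt)
    x, v = x0, 0.0
    amp = abs(x0)
    for _ in range(n):
        a = -((c - aero * U) * v + k * x) / m
        v += a * dt
        x += v * dt
        amp = max(amp, abs(x))
    return x, amp
```

Sweeping U and watching where the peak amplitude starts to grow locates the critical speed, which is conceptually what a time-domain flutter simulation does for the full aeroelastic model.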

Keywords: flutter, flow-induced vibration, flow-structure interaction, non-linear structure

Procedia PDF Downloads 290
481 Probabilistic Life Cycle Assessment of the Nano Membrane Toilet

Authors: A. Anastasopoulou, A. Kolios, T. Somorin, A. Sowale, Y. Jiang, B. Fidalgo, A. Parker, L. Williams, M. Collins, E. J. McAdam, S. Tyrrel

Abstract:

Developing countries are confronted with great challenges in domestic sanitation services in view of imminent water scarcity. Sanitation technologies currently established in these countries are likely to pose health risks unless waste management standards are followed properly. This paper addresses sustainable sanitation through an innovative toilet system, the Nano Membrane Toilet (NMT), developed by Cranfield University and sponsored by the Bill & Melinda Gates Foundation. The technology converts human faeces into energy through gasification and provides treated wastewater from urine through membrane filtration. To evaluate the environmental profile of the NMT system, a deterministic life cycle assessment (LCA) was conducted in the SimaPro software using the Ecoinvent v3.3 database, identifying the factors contributing most to the system's environmental footprint. However, since sensitivity analysis identified certain operating parameters as critical to the robustness of the LCA results, a stochastic approach to the Life Cycle Inventory (LCI) is adopted to capture input data uncertainty comprehensively and enhance the credibility of the LCA outcome. For that purpose, Monte Carlo simulations, in combination with an artificial neural network (ANN) model, were conducted for the input parameters of raw material, produced electricity, NOX emissions, amount of ash, and transportation of fertilizer. The analysis provides the distributions and confidence intervals of the selected impact categories, allowing more credible conclusions to be drawn on the Life Cycle Impact Assessment (LCIA) profile of the NMT system.
Finally, the study also yields essential insights into a methodological framework that can be adopted in the environmental impact assessment of other complex engineering systems subject to a high level of input data uncertainty.
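The stochastic LCI step can be sketched as a plain Monte Carlo propagation of input uncertainty to an impact category. The distributions and characterization factors below are placeholders chosen for illustration only; they are not the study's actual inventory data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Illustrative (not the paper's) LCI inputs with assumed uncertainty models
electricity = rng.normal(1.2, 0.15, n)        # kWh per functional unit
nox = rng.lognormal(np.log(0.03), 0.2, n)     # kg NOx per functional unit
transport = rng.uniform(8.0, 12.0, n)         # tkm of fertilizer transport

# Placeholder characterization factors (kg CO2-eq per unit of each input)
gwp = 0.5 * electricity + 30.0 * nox + 0.1 * transport

mean = gwp.mean()
lo, hi = np.percentile(gwp, [2.5, 97.5])
print(f"GWP: {mean:.3f} kg CO2-eq, 95% CI [{lo:.3f}, {hi:.3f}]")
```

The resulting distribution, rather than a single deterministic value, is what supports the confidence intervals reported for each impact category.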

Keywords: sanitation systems, nano-membrane toilet, lca, stochastic uncertainty analysis, Monte Carlo simulations, artificial neural network

Procedia PDF Downloads 206
480 Advancing Phenological Understanding of Plants/Trees Through Phenocam Digital Time-lapse Images

Authors: Siddhartha Khare, Suyash Khare

Abstract:

Phenology, a crucial discipline in ecology, offers insights into the seasonal dynamics of organisms within natural ecosystems and the underlying environmental triggers. Leveraging the potent capabilities of digital repeat photography, PhenoCams capture invaluable data on the phenology of crops, plants, and trees. These cameras yield digital imagery in Red Green Blue (RGB) color channels, and some advanced systems even incorporate Near Infrared (NIR) bands. This study presents compelling case studies employing PhenoCam technology to unravel the phenology of black spruce trees. Through the analysis of RGB color channels, a range of essential color metrics including red chromatic coordinate (RCC), green chromatic coordinate (GCC), blue chromatic coordinate (BCC), vegetation contrast index (VCI), and excess green index (ExGI) are derived. These metrics illuminate variations in canopy color across seasons, shedding light on bud and leaf development. This, in turn, facilitates a deeper understanding of phenological events and aids in delineating the growth periods of trees and plants. The initial phase of this study addresses critical questions surrounding the fidelity of continuous canopy greenness records in representing bud developmental phases. Additionally, it discerns which color-based index most accurately tracks the seasonal variations in tree phenology within evergreen forest ecosystems. The subsequent section of this study delves into the transition dates of black spruce (Picea mariana (Mill.) B.S.P.) phenology. This is achieved through a fortnightly comparative analysis of the MODIS normalized difference vegetation index (NDVI) and the enhanced vegetation index (EVI). By employing PhenoCam technology and leveraging advanced color metrics, this study significantly advances our comprehension of black spruce tree phenology, offering valuable insights for ecological research and management.
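The color metrics listed above are simple ratios of the mean RGB digital numbers of each image. A minimal sketch follows; the exact VCI formulation used in the study is not given in the abstract, so the form below is an assumption.

```python
import numpy as np

def chromatic_coordinates(img):
    """Per-image color metrics from mean RGB digital numbers.

    img: H x W x 3 array of RGB values (any scale).
    """
    r, g, b = (img[..., i].astype(float).mean() for i in range(3))
    total = r + g + b
    return {
        "RCC": r / total,            # red chromatic coordinate
        "GCC": g / total,            # green chromatic coordinate
        "BCC": b / total,            # blue chromatic coordinate
        "ExGI": 2 * g - (r + b),     # excess green index
        "VCI": g / (r + b),          # assumed form of the contrast index
    }

# Synthetic "canopy" patch dominated by green
img = np.zeros((10, 10, 3), dtype=np.uint8)
img[..., 0], img[..., 1], img[..., 2] = 60, 120, 40
print(chromatic_coordinates(img))
```

Tracking GCC through the season is what yields the canopy greenness record from which transition dates are extracted.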

Keywords: phenology, remote sensing, phenocam, color metrics, NDVI, GCC

Procedia PDF Downloads 39
479 Segmenting 3D Optical Coherence Tomography Images Using a Kalman Filter

Authors: Deniz Guven, Wil Ward, Jinming Duan, Li Bai

Abstract:

Over the past two decades, Optical Coherence Tomography (OCT) has been used to diagnose retina and optic nerve diseases. The retinal nerve fibre layer, for example, is a powerful diagnostic marker for detecting and staging glaucoma. With advances in optical imaging hardware, OCT is now commonplace in clinics. More and more OCT images are being generated, and for these images to have clinical applicability, accurate automated segmentation software is needed. OCT image segmentation is still an active research area, as OCT images are inherently noisy, suffering from multiplicative speckle noise. Simple edge detection algorithms are unsuitable for detecting retinal layer boundaries in OCT images, and intensity fluctuations, motion artefacts, and the presence of blood vessels further degrade image quality. In this paper, we introduce a new method for segmenting three-dimensional (3D) OCT images based on a Kalman filter, a tool commonly used in computer vision for object tracking. The Kalman filter is applied to the 3D OCT volume to track the retinal layer boundaries through the slices within the volume, thus segmenting the 3D image. Specifically, after some pre-processing, points on the retinal layer boundaries in the first image are identified and fitted with curves, so that each layer boundary is represented by the coefficients of its curve equation. These coefficients form the state space for the Kalman filter, which produces an optimal estimate of the current state by updating its previous state with the incoming measurements in a feedback control loop. The results show that the algorithm can be used to segment the retinal layers in OCT images.
One limitation of the current algorithm is that the curve representation does not work well when a layer boundary splits into two, e.g., at the optic nerve. This may be resolved by representing the boundaries differently, for instance with B-splines or level sets. The use of a Kalman filter shows promise for developing accurate and effective 3D OCT segmentation methods.
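The slice-to-slice tracking described above can be sketched with a textbook linear Kalman filter in which the state is the vector of boundary-curve coefficients. The random-walk state model and the synthetic quadratic boundary below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def kalman_track(coeffs0, slices_pts, q=1e-4, r=1.0):
    """Track polynomial boundary coefficients from slice to slice.

    coeffs0: coefficients fitted on the first slice (highest power first)
    slices_pts: iterable of (xs, ys) boundary points for subsequent slices
    A random-walk model is assumed: coefficients persist between slices.
    """
    n = len(coeffs0)
    x = np.asarray(coeffs0, dtype=float)
    P = np.eye(n)
    Q = q * np.eye(n)
    estimates = [x.copy()]
    for xs, ys in slices_pts:
        H = np.vander(np.asarray(xs, float), n)      # coefficients -> heights
        R = r * np.eye(len(ys))
        P = P + Q                                    # predict (identity dynamics)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
        x = x + K @ (np.asarray(ys, float) - H @ x)  # update with this slice
        P = (np.eye(n) - K @ H) @ P
        estimates.append(x.copy())
    return estimates

# Synthetic quadratic boundary tracked over three noise-free slices,
# starting from a deliberately offset initial estimate
true_c = np.array([0.5, 1.0, 2.0])
xs = np.linspace(-1.0, 1.0, 50)
slices = [(xs, np.polyval(true_c, xs))] * 3
est = kalman_track(true_c + 0.2, slices)
print(est[-1])
```

Each slice's detected boundary points act as the measurement vector, so the filter smooths the boundary estimate through the volume rather than segmenting each slice independently.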

Keywords: optical coherence tomography, image segmentation, Kalman filter, object tracking

Procedia PDF Downloads 458
478 Effects of the Coagulation Bath and Reduction Process on SO2 Adsorption Capacity of Graphene Oxide Fiber

Authors: Özge Alptoğa, Nuray Uçar, Nilgün Karatepe Yavuz, Ayşen Önen

Abstract:

Sulfur dioxide (SO2) is a highly toxic air pollutant gas that contributes to the greenhouse effect, photochemical smog, and acid rain, which severely threaten human health. Thus, the capture of SO2 gas is very important for the environment. Graphene, a two-dimensional material, has excellent mechanical, chemical, and thermal properties and many application areas, such as energy storage devices, gas adsorption, sensing devices, and optical electronics. Furthermore, graphene oxide (GO) is regarded as a good adsorbent because of its surface functional groups (epoxy, carboxyl, and hydroxyl) and layered structure. SO2 adsorption on fibers has usually been investigated for carbon fibers; in this study, the potential adsorption capacity of GO fibers was investigated. GO dispersion was first obtained from graphite by Hummers' method, and GO fibers were then produced via a wet spinning process. These fibers were formed into discs, dried, and subjected to an SO2 gas adsorption test. The SO2 adsorption capacity of the GO fiber discs was investigated with respect to the use of different coagulation baths and reduction by hydrazine hydrate. As coagulation baths, single and triple baths were used. The single bath contained only ethanol and CaCl2 (calcium chloride) salt; in the triple bath, each bath had a different concentration of water/ethanol and CaCl2 salt, and the disc obtained from the triple bath served as the reference disc. The fibers produced with the single bath were flexible and rough, and the analyses show that they had a higher SO2 adsorption capacity than the triple-bath fibers (reference disc). However, the reduction process did not increase the adsorption capacity: SEM images showed that the layers and the uniform structure of the fibers were damaged, and reduction decreased the functional groups to which SO2 attaches.
Scanning Electron Microscopy (SEM), Fourier Transform Infrared Spectroscopy (FTIR), and X-Ray Diffraction (XRD) analyses were performed on the fibers and discs, and the results were interpreted accordingly. In future work, factors such as pH and additives will be examined.

Keywords: coagulation bath, graphene oxide fiber, reduction, SO2 gas adsorption

Procedia PDF Downloads 339
477 Short Term Effects of Mobilization with Movement in a Patient with Fibromyalgia: A Case Report

Authors: S. F. Kanaan, Fatima Al-Kadi, H. Khrais

Abstract:

Background: Fibromyalgia is a chronic condition characterized by chronic pain that limits physical and functional activities. To the best of our knowledge, there is currently no key physiotherapy approach recommended to reduce pain and improve function. In addition, few studies have investigated the effect of manual therapy in the management of fibromyalgia, and no study has investigated the efficacy of Mulligan's mobilization with movement (MWM) in particular. Methods: A 51-year-old female had been diagnosed with fibromyalgia for more than a year. The patient complained of generalized pain, including the neck, lower back, shoulders, elbows, hips, and knees, and reported severe limitation in activities and an inability to complete her work as a lawyer. The intervention consisted of 4 sessions (over two weeks) of MWM for the neck, lower back, shoulders, elbows, sacroiliac joint, hips, and knees. The visual analogue scale for pain (VAS), range of motion (ROM), 10-meter walk test, Roland Morris Low Back Pain and Disability Questionnaire (RMQ), and Disability of the Arm, Shoulder and Hand score (DASH) were collected at baseline and at the end of treatment. Results: Average improvement of ROM in the neck, lower back, shoulders, elbows, hips, and knees was 45%. VAS changed from pre-treatment to post-treatment as follows: neck pain (9 to 0), lower back pain (8 to 1), shoulder pain (8 to 2), elbow pain (7 to 1), and knee pain (9 to 0). The patient improved on all functional scales from pre-intervention to post-intervention: 10-meter walk test (9.8 to 4.5 seconds), RMQ (21 to 11/24), and DASH (88.7% to 40.5%). The patient did not report any side effects of this approach. Conclusion: Fibromyalgia can cause joint 'faulty position' leading to pain and dysfunction, which can be reversed by using MWM.
MWM produced clinically significant improvements in ROM, pain, and walking ability, and a clinically significant reduction in disability, in only 4 sessions. This work should be expanded to a larger sample.

Keywords: mobilization, fibromyalgia, dysfunction, manual therapy

Procedia PDF Downloads 148
476 Glyco-Biosensing as a Novel Tool for Prostate Cancer Early-Stage Diagnosis

Authors: Pavel Damborsky, Martina Zamorova, Jaroslav Katrlik

Abstract:

Prostate cancer is annually the most common newly diagnosed cancer among men. An extensive body of evidence suggests that the traditional serum prostate-specific antigen (PSA) assay still suffers from a lack of sufficient specificity and sensitivity, resulting in vast over-diagnosis and overtreatment. Thus, early-stage detection of prostate cancer (PCa) undisputedly plays a critical role in successful treatment and improved quality of life. Over the last decade, particular altered glycans have been described that are associated with a range of chronic diseases, including cancer and inflammation. These glycan differences enable a distinction to be made between physiological and pathological states and suggest a valuable biosensing tool for diagnosis and follow-up. Aberrant glycosylation is one of the major characteristics of disease progression. Consequently, the aim of this study was to develop a more reliable tool for early-stage PCa diagnosis employing lectins as glyco-recognition elements. Biosensor and biochip technology based on lectin glyco-profiling is one of the most promising strategies for fast and efficient analysis of glycoproteins. Proof-of-concept experiments were performed using a sandwich assay employing an anti-PSA antibody and an aptamer as capture molecules, followed by lectin glycoprofiling. We present a lectin-based biosensing assay for glycoprofiling of the serum biomarker PSA using different biosensor and biochip platforms, such as label-free surface plasmon resonance (SPR) and fluorescence-label microarrays. The results suggest significant differences in the interaction of particular lectins with PSA. Antibody-based assays are frequently associated with sensitivity, reproducibility, and cross-reactivity issues; aptamers provide remarkable advantages over antibodies due to their nucleic acid origin, stability, and lack of glycosylation.
These data are a further step toward the construction of highly selective, sensitive, and reliable sensors for early-stage diagnosis. The experimental set-up also holds promise for the development of comparable assays with other glycosylated disease biomarkers.

Keywords: biomarker, glycosylation, lectin, prostate cancer

Procedia PDF Downloads 384
475 Application of Neutron-Gamma Technologies for Soil Elemental Content Determination and Mapping

Authors: G. Yakubova, A. Kavetskiy, S. A. Prior, H. A. Torbert

Abstract:

In-situ soil carbon determination over large soil surface areas (several hectares) is required with regard to carbon sequestration and carbon credit issues. This capability is important for optimizing modern agricultural practices and enhancing soil science knowledge. Collecting and processing representative field soil cores for traditional laboratory chemical analysis is labor-intensive and time-consuming. The neutron-stimulated gamma analysis method can be used for in-situ measurements of primary elements in agricultural soils (e.g., Si, Al, O, C, Fe, and H). This non-destructive method can assess several elements in large soil volumes with no need for sample preparation. Neutron-gamma soil elemental analysis utilizes gamma rays issued from different neutron-nuclei interactions. This process has become possible due to the availability of commercial portable pulsed neutron generators, high-efficiency gamma detectors, reliable electronics, and measurement/data processing software, complemented by advances in state-of-the-art nuclear physics methods. In Pulsed Fast Thermal Neutron Analysis (PFTNA), soil irradiation is accomplished using a pulsed neutron flux, and gamma spectra acquisition occurs both during and between pulses. This method allows the inelastic neutron scattering (INS) gamma spectrum to be separated from the thermal neutron capture (TNC) spectrum. Based on PFTNA, a mobile system for field-scale soil elemental determinations (primarily carbon) was developed and constructed. Our scanning methodology acquires data that can be directly used for creating soil elemental distribution maps (based on ArcGIS software) in a reasonable timeframe (~20-30 hectares per working day). Created maps are suitable for both agricultural purposes and carbon sequestration estimates. The measurement system design, spectra acquisition process, strategy for acquiring field-scale carbon content data, and mapping of agricultural fields will be discussed.
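The PFTNA separation of the inelastic (INS) spectrum from the thermal neutron capture (TNC) spectrum amounts to time gating and subtraction: the between-pulse spectrum is capture-dominated, so removing it from the in-pulse spectrum isolates the INS component. The toy spectra below are synthetic, and the unit normalization factor is an assumption (real systems apply live-time and flux corrections).

```python
import numpy as np

# Synthetic channel spectra illustrating PFTNA time gating (not real data).
channels = np.arange(1024)
tnc = 500.0 * np.exp(-channels / 400.0)                     # capture continuum
ins = 200.0 * np.exp(-((channels - 480) ** 2) / (2 * 15.0**2))  # inelastic peak

gate_on = ins + tnc      # spectrum acquired during the neutron pulse (INS + TNC)
gate_off = tnc           # spectrum acquired between pulses (TNC only)

# With equal live times, subtracting the between-pulse spectrum isolates INS;
# in practice a live-time / flux normalization factor replaces the 1.0 here.
ins_est = gate_on - 1.0 * gate_off
print("recovered peak height near channel 480:", ins_est[480])
```

The carbon signal of interest (e.g., the 4.44 MeV gamma line) would sit in the INS spectrum recovered this way.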

Keywords: neutron gamma analysis, soil elemental content, carbon sequestration, carbon credit, soil gamma spectroscopy, portable neutron generators, ArcMap mapping

Procedia PDF Downloads 71
474 A Multidimensional Indicator-Based Framework to Assess the Sustainability of Productive Green Roofs: A Case Study in Madrid

Authors: Francesca Maria Melucci, Marco Panettieri, Rocco Roma

Abstract:

Cities are at the forefront of achieving the sustainable development goals set out in Agenda 2030. Increasing attention has therefore been given to the creation of resilient, sustainable, inclusive, and green cities, and finding solutions to these problems is one of the greatest challenges researchers face today. In particular, urban green infrastructures, including green roofs, play a key role in tackling environmental, social, and economic problems. The starting point was an extensive literature review on 1. research developments on the benefits (environmental, economic, and social) and implications of green roofs; 2. sustainability assessment and applied methodologies; 3. specific indicators to measure impacts on urban sustainability. Through this review, the relevant qualitative and quantitative characteristics of the complex 'green roof' system were identified, as studies that holistically capture its multifunctional nature are still lacking. This paper therefore aims to find a method to improve community participation in green roof initiatives and to support local governance processes in developing efficient proposals for better urban sustainability and resilience. To this end, the multidimensional indicator-based framework presented by Tapia in 2021 has been tested for the first time on a green roof in the city of Madrid. The framework's indicator set was extended with additional indicators, such as those for waste management and circularity (OECD Inventory of Circular Economy indicators) and sustainability performance. The specific indicators used in the case study were decided after a consultation phase with relevant stakeholders. Data on the community's willingness to participate in green roof implementation initiatives were collected through interviews and online surveys with a heterogeneous sample of citizens.
The results of the application of the framework suggest how the different aspects of sustainability influence the choice of a green roof and provide input on the main mechanisms involved in citizens' willingness to participate in such initiatives.

Keywords: urban agriculture, green roof, urban sustainability, indicators, multi-criteria analysis

Procedia PDF Downloads 55
473 Indirect Intergranular Slip Transfer Modeling Through Continuum Dislocation Dynamics

Authors: A. Kalaei, A. H. W. Ngan

Abstract:

In this study, a mesoscopic continuum dislocation dynamics (CDD) approach is applied to simulate intergranular slip transfer. The CDD scheme applies an efficient kinematics equation to model the evolution of the "all-dislocation density," the line length of dislocations of each character per unit volume. As tracking every dislocation line individually limits the simulation of slip transfer at large scales with many participating dislocations, a coarse-grained, extensive description of dislocations in terms of their density is used to resolve the effect of the collective motion of dislocation lines. For dynamic closure, namely to obtain the dislocation velocity from a velocity law involving the effective glide stress, the mutual elastic interaction of dislocations is calculated using Mura's equation after singularity removal at the dislocation cores. The developed scheme can therefore resolve the elastic interaction and pile-up of dislocations, important physics omitted in coarser models such as crystal plasticity finite element methods (CPFEM). Moreover, the length and time scales of the simulation are considerably larger than those of molecular dynamics (MD) and discrete dislocation dynamics (DDD) models. The present work successfully simulates that, as dislocation density piles up in front of a grain boundary, the elastic stress on the other side increases, leading to dislocation nucleation and stress relaxation when the local glide stress exceeds the operation stress of dislocation sources seeded on the other side of the grain boundary. More importantly, the simulation verifies a phenomenological misorientation factor often used by experimentalists: the ease of slip transfer increases with the product of the cosines of the misorientation angles between the slip-plane normals and between the slip directions on either side of the grain boundary.
Furthermore, to investigate the effect of the critical stress-intensity factor of the grain boundary, dislocation density sources are seeded at different distances from the boundary, and the critical applied stress required to trigger slip transfer is studied.
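The phenomenological misorientation factor mentioned above is simply the product of the cosines of the two misorientation angles. A minimal sketch follows; the vector choices are illustrative examples, not orientations from the study.

```python
import numpy as np

def transfer_factor(n1, d1, n2, d2):
    """Geometric slip-transfer factor m = cos(psi) * cos(kappa).

    psi: angle between slip-plane normals n1 and n2
    kappa: angle between slip directions d1 and d2
    Vectors are unit-normalized; m = 1 means perfectly aligned slip systems.
    """
    unit = lambda v: np.asarray(v, float) / np.linalg.norm(v)
    return float(unit(n1) @ unit(n2)) * float(unit(d1) @ unit(d2))

# Identical slip systems on both sides of the boundary transfer perfectly
print(transfer_factor([1, 1, 1], [1, -1, 0], [1, 1, 1], [1, -1, 0]))
```

Higher m corresponds to easier slip transfer, which is the trend the CDD simulations reproduce.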

Keywords: grain boundary, dislocation dynamics, slip transfer, elastic stress

Procedia PDF Downloads 110
472 Corporate Governance of Intellectual Capital: The Impact of Intellectual Capital Reporting

Authors: Cesar Julio Recalde

Abstract:

Background: The role of intangible assets in today's society is undeniable and continuously growing. More than 80% of corporate market value is related to intellectual capital (IC). However, corporate governance principles and practices remain strongly oriented towards tangible assets. The impact of intangible assets on corporate governance may require preventive and adaptive actions. Adherence to voluntary mechanisms of intellectual capital reporting (ICR) appears to be a gateway towards adapting corporate governance to the influence of intangible assets, and a conceptual cornerstone. The impact of adherence to ICR on corporate governance and performance needs to be evaluated. Purposes: This work has a sequential, twofold purpose: (1) exploring the influence exerted by IC on corporate governance theory and practice, and within that context (2) analyzing the impact of adherence to voluntary mechanisms of ICR on corporate governance. Design and summary: This work employs the theory of the firm and agency theory to conceptually explore the effects of each dimension of IC on key corporate governance issues, namely property rights and control by shareholders and residual claims by stakeholders, fiduciary duties of management and the board, opportunistic behavior, and transparency. A comprehensive IC taxonomy and map is presented. Within the resulting context, the internal and external impact of ICR on corporate governance and performance is conceptually analyzed. ICR constraints and barriers are identified. Intellectual liabilities are presented within the context of ICR. Finally, the ICR regulatory framework is surveyed. Findings: Relevant conclusions were reached on the influence of intellectual capital on corporate governance. Sufficient evidence of a positive impact of ICR on corporate governance and performance was found. Additionally, ICR was found to exert a leveraging effect on IC itself.
Intellectual liabilities are insufficiently researched and appear highly relevant to IC measurement. The ICR regulatory framework was found to be insufficiently developed to capture the essence of intangible assets and to meet the corporate governance challenges posed by IC. Originality: This work develops a progressive approach to conceptually analyzing the mutual influences between IC and corporate governance. An epistemic ideogram represents the intersection of the analyzed theories. An IC map is presented. The relatively new topic of intellectual liabilities is conceptually analyzed in the context of ICR. Social liabilities and client liabilities are presented.

Keywords: corporate governance, intellectual capital, intellectual capital reporting, intellectual assets, intellectual liabilities, voluntary mechanisms, regulatory framework

Procedia PDF Downloads 361
471 Estimating CO₂ Storage Capacity under Geological Uncertainty Using 3D Geological Modeling of Unconventional Reservoir Rocks in Block Nv32, Shenvsi Oilfield, China

Authors: Ayman Mutahar Alrassas, Shaoran Ren, Renyuan Ren, Hung Vo Thanh, Mohammed Hail Hakimi, Zhenliang Guan

Abstract:

The significant effect of CO₂ on global climate and the environment has gained increasing concern worldwide. Enhanced oil recovery (EOR) combined with sequestration of CO₂, particularly in depleted oil reservoirs, is considered a viable approach under financial constraints, since it improves recovery from existing reservoirs while linking global-scale CO₂ capture with geological sequestration. Consequently, practical measures are required to attain large-scale CO₂ emission reduction. This paper presents an integrated modeling workflow to construct an accurate 3D reservoir geological model and estimate the CO₂ storage capacity under geological uncertainty in an unconventional oil reservoir of the Paleogene Shahejie Formation (Es1) in block Nv32, Shenvsi oilfield, China. Geophysical data, including well logs from twenty-two well locations and seismic data, were combined with geological and engineering data to construct the 3D reservoir geological model, focusing on four tight reservoir units of the Shahejie Formation (Es1-x1, Es1-x2, Es1-x3, and Es1-x4). The validated 3D reservoir models were subsequently used to calculate the theoretical CO₂ storage capacity in block Nv32. Well logs were used to predict petrophysical properties such as porosity, permeability, and lithofacies, and indicate that the Es1 reservoir units are mainly sandstone, shale, and limestone in proportions of 38.09%, 32.42%, and 29.49%, respectively. The log-based petrophysical results also show that the Es1 reservoir units generally exhibit 2–36% porosity, 0.017 mD to 974.8 mD permeability, and moderate to good net-to-gross ratios.
These estimated values of porosity, permeability, lithofacies, and net-to-gross ratio were upscaled and distributed laterally using Sequential Gaussian Simulation (SGS) and Sequential Indicator Simulation (SIS) methods to generate the 3D reservoir geological models. The models reveal lateral heterogeneity in reservoir properties and lithofacies, with the best reservoir rocks in the Es1-x4, Es1-x3, and Es1-x2 units, respectively. In addition, the reservoir volumetrics of the Es1 units in block Nv32 were estimated from the petrophysical property models and found to be between 0.554368

Keywords: CO₂ storage capacity, 3D geological model, geological uncertainty, unconventional oil reservoir, block Nv32

Procedia PDF Downloads 152
470 MIMO Radar-Based System for Structural Health Monitoring and Geophysical Applications

Authors: Davide D’Aria, Paolo Falcone, Luigi Maggi, Aldo Cero, Giovanni Amoroso

Abstract:

The paper presents a methodology for real-time structural health monitoring and geophysical applications. The key elements of the system are a high-performance MIMO radar sensor, an optical camera, and a dedicated set of software algorithms encompassing interferometry, tomography, and photogrammetry. The MIMO radar sensor proposed in this work provides extremely high sensitivity to displacements, making the system able to react to deformations as small as tens of microns on time scales spanning from milliseconds to hours. The MIMO feature makes the system capable of providing a set of two-dimensional images of the observed scene, each mapped on the azimuth-range directions with notable resolution in both dimensions and an outstanding repetition rate. The back-scattered energy, distributed in 3D space, is projected onto a 2D plane in which each pixel has as coordinates the line-of-sight distance and the cross-range azimuthal angle. At the same time, the high-performance processing unit senses the observed scene with remarkably short refresh periods (down to milliseconds), opening the way for combined static and dynamic structural health monitoring. Thanks to the smart TX/RX antenna array layout, the MIMO data can be processed through a tomographic approach to reconstruct the three-dimensional map of the observed scene. This 3D point cloud is then accurately mapped onto a 2D digital optical image through photogrammetric techniques, allowing easy and straightforward interpretation of the measurements. Once the three-dimensional image is reconstructed, a 'repeat-pass' interferometric approach provides the user with high-frequency three-dimensional motion/vibration estimates for each point of the reconstructed image. At this stage, the methodology leverages consolidated atmospheric correction algorithms to provide reliable displacement and vibration measurements.
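For the repeat-pass interferometric step, line-of-sight displacement follows from the wrapped phase difference as d = λ·Δφ/(4π). The sketch below shows this relation; the wavelength is an assumed example value, not the sensor's actual specification.

```python
import numpy as np

def los_displacement(phase1, phase2, wavelength):
    """Line-of-sight displacement from repeat-pass interferometric phase.

    d = wavelength * wrapped(phase2 - phase1) / (4 * pi)
    Assumes phase in radians and displacement well below the phase
    ambiguity (no unwrapping is performed in this sketch).
    """
    dphi = np.angle(np.exp(1j * (phase2 - phase1)))  # wrap to (-pi, pi]
    return wavelength * dphi / (4 * np.pi)

# Assumed Ku-band-like wavelength in metres; a pi/2 phase change maps to
# wavelength/8 of line-of-sight motion
print(los_displacement(0.0, np.pi / 2, 0.0175))
```

Because d scales with the wavelength over 4π, sub-millimetre motion produces an easily measurable phase change, which is the origin of the micron-level sensitivity claimed above.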

Keywords: interferometry, MIMO RADAR, SAR, tomography

Procedia PDF Downloads 170
469 An Adaptive Conversational AI Approach for Self-Learning

Authors: Airy Huang, Fuji Foo, Aries Prasetya Wibowo

Abstract:

In recent years, the focus of Natural Language Processing (NLP) development has been gradually shifting from the semantics-based approach to a deep learning one, which performs faster with fewer resources. Although it performs well in many applications, the deep learning approach, lacking semantic understanding, has difficulty recognizing and handling novel business cases outside a pre-defined scope. To meet the requirements of specific robotic services, the deep learning approach is very labor-intensive and time-consuming; it is difficult to improve the capabilities of conversational AI in a short time, and even more difficult to self-learn from experience to deliver the same service in a better way. In this paper, we present an adaptive conversational AI algorithm that combines semantic knowledge and deep learning to address this issue by learning new business cases through conversations. After self-learning from experience, the robot adapts to business cases originally out of scope. The idea is to build new or extended robotic services in a systematic and fast-training manner with self-configured programs and constructed dialog flows. For every cycle in which a chatbot (conversational AI) delivers a given set of business cases, it is prompted to self-measure its performance and reconsider every unknown dialog flow, improving the service by retraining on those new business cases. If the training process reaches a bottleneck, a human operator is informed and may retrain the chatbot with newly configured programs or new dialog flows for new services. One approach employs semantic analysis to learn the dialogues for new business cases and then establish the necessary ontology for the new service.
With the newly learned programs, the chatbot completes its understanding of the reaction behavior and finally uses dialog flows to connect all the understanding results and programs, achieving the goal of the self-learning process. We have developed a chatbot service mounted on a kiosk, with a camera for facial recognition and a directional microphone array for voice capture; the chatbot serves as a concierge offering polite conversation to visitors. As a proof of concept, we demonstrated completion of 90% of reception services with limited self-learning capability.

Keywords: conversational AI, chatbot, dialog management, semantic analysis

Procedia PDF Downloads 116
468 Knowledge Loss Risk Assessment for Departing Employees: An Exploratory Study

Authors: Muhammad Saleem Ullah Khan Sumbal, Eric Tsui, Ricky Cheong, Eric See To

Abstract:

Organizations face the threat of losing valuable knowledge when employees leave, whether through retirement, resignation, job change, disability, or death. Due to changing economic conditions, globalization, and an aging workforce, organizations face challenges in retaining valuable knowledge. On the one hand, large numbers of employees are approaching retirement; on the other, the younger generation does not want to work for one company for long, and frequent job changes are an increasing trend among them. Because of these factors, organizations need to capture an employee's knowledge before he or she walks out the door. The first step in this process is to know what type of knowledge the employee possesses and whether this knowledge is important for the organization. The literature reveals that, despite the serious consequences of knowledge loss for organizational productivity and competitive advantage, little work has been done on knowledge loss assessment of departing employees. An important step in the knowledge retention process is to determine the critical 'at risk' knowledge. Knowledge loss risk assessment is thus a process by which organizations can gauge the importance of a departing employee's knowledge. The purpose of this study is to explore knowledge loss risk assessment through a qualitative study in the oil and gas sector. By engaging in dialogues with managers and executives through in-depth interviews and adopting a grounded theory methodology, the research explores: i) Are there any measures adopted by organizations to assess the risk of knowledge loss from departing employees? ii) Which factors are crucial for knowledge loss assessment in organizations? iii) How can employees be prioritized for knowledge retention according to their criticality?
The grounded theory approach is used when little knowledge is available in the area under research; new knowledge about the topic is generated through in-depth exploration using methods such as interviews and a systematic approach to data analysis. The outcome of the study will be a model of knowledge loss risk built on factors such as the likelihood of knowledge loss, the consequence/impact of the loss, and the quality of the knowledge held by departing employees. Initial results show that knowledge loss assessment is crucial for organizations and helps determine what types of knowledge employees possess, e.g. organizational knowledge, subject matter expertise, or relationship knowledge. On that basis, it can be assessed which employees are more important to the organization and how to prioritize the knowledge retention process for departing employees.
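The risk model outlined above, combining likelihood, impact, and knowledge quality, could be operationalized along these lines. The 1-5 rating scale, the multiplicative combination, and the example ratings are illustrative assumptions, not the study's validated model.

```python
def knowledge_loss_risk(likelihood, impact, quality):
    """Combine the three risk factors into one score.

    Each factor is rated 1 (low) to 5 (high); a higher product
    means a higher retention priority. The scale and the product
    rule are assumptions for illustration.
    """
    for factor in (likelihood, impact, quality):
        if not 1 <= factor <= 5:
            raise ValueError("factors must be rated on a 1-5 scale")
    return likelihood * impact * quality


# Hypothetical employees rated on (likelihood, impact, quality)
employees = {
    "subject matter expert": knowledge_loss_risk(4, 5, 5),
    "new graduate trainee": knowledge_loss_risk(5, 2, 2),
}

# Prioritize retention effort by descending risk score
priority = sorted(employees, key=employees.get, reverse=True)
```

Here the expert outranks the trainee despite a lower likelihood of leaving, because the impact and quality of the knowledge at risk dominate the score.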

Keywords: knowledge loss, risk assessment, departing employees, Hong Kong organizations

Procedia PDF Downloads 384
467 A Consideration of Dialectal and Stylistic Shifts in Literary Translation

Authors: Pushpinder Syal

Abstract:

Literary writing carries the stamp of the current language of its time. In translating such texts, it becomes a challenge to capture such reflections, which may be evident at several levels: the dialectal use of language by characters in stories, alterations in syntax as tools of writers' individual stylistic choices, the insertion of quasi-proverbial and gnomic utterances, and even the pragmatics of narrative discourse. Discourse strategies may differ between earlier and later texts, reflecting changing relationships between narrators and readers in changed cultural and social contexts. This paper considers these features through an approach that combines historicity with description, contextualizing language change within a discourse framework. A collection of Punjabi literature spanning 100 years was translated for this study, and the historicity of language was observed to play a role. While intended for contemporary readers, translating literature spanning a century poses the dual challenge of achieving accessibility and immediacy while adhering to the 'old world' styles of communicating and narrating. The linguistic changes are most obvious in diction and word formation, with more hybridized and borrowed forms in modern and contemporary writings than in the older ones. The latter not only contain vestiges of proverbs and folk sayings but are also closer to oral speech styles. These will be presented and analysed in chronological listing, by which means the social process of translation from orality to written text can be traced in the above-mentioned works.
More subtle and underlying shifts can be seen through the analysis of speech acts and implicatures in the same literature, in which the social relationships underlying language use are evident as discourse systems of belief and understanding. They present distinct shifts in worldview as seen at different points in time. However, some continuities of language and style are also clearly visible, and these aid the translator in putting together a set of thematic links which identify the literature of a region and community, and constitute essential outcomes in the effort to preserve its distinctive nature.

Keywords: cultural change, dialect, historicity, stylistic variation

Procedia PDF Downloads 112
466 Departing beyond the Orthodoxy: An Integrative Review and Future Research Avenues of Human Capital Resources Theory

Authors: Long Zhang, Ian Hampson, Loretta O' Donnell

Abstract:

Practitioners in various industries, especially finance, which conventionally benefits from financial capital and resources, appear increasingly aware of the importance of human capital resources (HCR) since the 2008 Global Financial Crisis. Scholars from diverse fields have conducted extensive and fruitful research on HCR within their own disciplines. This review suggests that mainstream, purely quantitative research alone cannot provide a precise or comprehensive understanding of HCR. The complex relationships and interactions in HCR call for more integrative and cross-disciplinary research, and its complex nature requires deep qualitative exploration based on in-depth data to capture the everydayness of organizational activities and to register its individuality and variety. Despite previous efforts, a systematic and holistic integration of HCR research across disciplines is lacking. Through a retrospective analysis of articles published in economics, finance, and management, including psychology, human resources management (HRM), organizational behaviour (OB), industrial and organizational psychology (I-O psychology), organizational theory, and strategy, this study summarizes and compares the major perspectives, theories, and findings of HCR research. A careful examination of the debates over HCR definitions and measurements in distinct disciplines enables identification of the limitations and gaps in existing research. It enables an analysis of the interplay of these concepts, as well as of the related concepts of intellectual capital, social capital, and Chinese guanxi, and of how they provide a broader perspective on HCR-related influences on firms' competitive advantage.
The study also introduces the theme of Environmental, Social and Governance (ESG) based investing, as the burgeoning body of ESG studies illustrates the rising importance of human and non-financial capital in the investment process. The ESG literature places HCR in the broader research context of the value of non-financial capital in explaining firm performance. The study concludes with a discussion of new directions for future research that may help advance our knowledge of HCR.

Keywords: human capital resources, social capital, Chinese guanxi, human resources management

Procedia PDF Downloads 336
465 Integration of a Microbial Electrolysis Cell and an Oxy-Combustion Boiler

Authors: Ruth Diego, Luis M. Romeo, Antonio Morán

Abstract:

In the present work, a study of the coupling of a bioelectrochemical system with an oxy-combustion boiler is carried out. Specifically, it proposes connecting the combustion gas outlet of a boiler to a microbial electrolysis cell (MEC), where the CO2 from the gases is transformed into methane in the cathode chamber, and the oxygen produced in the anode chamber is recirculated to the oxy-combustion boiler. The MEC mainly consists of two electrodes (anode and cathode) immersed in an aqueous electrolyte and separated by a proton exchange membrane (PEM). In this case, the anode is abiotic (where oxygen is produced), and it is at the cathode that an electroactive biofilm forms, with microorganisms catalyzing the CO2 reduction reactions. Real data from an oxy-combustion process in a boiler of around 20 MW thermal have been used for this study and combined with data obtained at laboratory-pilot scale to determine the yields that could be obtained, treating the system as environmentally sustainable energy storage. In this way, an attempt is made to integrate a relatively conventional energy production system (oxy-combustion) with a biological system (microbial electrolysis cell), a challenge to be addressed in this new type of hybrid scheme. A novel concept is thus presented, with the basic dimensioning of the necessary equipment and the efficiency of the global process. Calculations show that the efficiency of this power-to-gas system based on MEC cells, when coupled to industrial processes, is of the same order of magnitude as the most promising equivalent routes. The proposed process has two main limitations: the overpotentials at the electrodes, which penalize the overall efficiency, and the need for storage tanks for the process gases. The results show that acceptable performance can be achieved at realistic electrode potentials.
Regarding the tanks, with adequate dimensioning it is possible to achieve complete autonomy. The proposed system, called OxyMES, provides energy storage without energetically penalizing the process compared to an oxy-combustion plant with conventional CO2 capture. According to the results obtained, this system can be applied to decarbonize an industry by switching the oxy-combustion boiler's original fuel to the biogas generated in the MEC cell. It could also be used to neutralize industrial CO2 emissions by converting the CO2 to methane and injecting it into the natural gas grid.
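An order-of-magnitude check of the power-to-gas efficiency discussed above can be sketched from the cathode stoichiometry (CO2 + 8 H+ + 8 e- -> CH4 + 2 H2O). The operating cell voltage and the use of methane's higher heating value are illustrative assumptions; the abstract does not report these figures.

```python
F = 96485.0             # Faraday constant, C per mol of electrons
ELECTRONS_PER_CH4 = 8   # electrons transferred per mol of CH4 produced
HHV_CH4 = 890.0e3       # higher heating value of methane, J/mol


def power_to_gas_efficiency(cell_voltage):
    """Energy stored in methane per unit of electrical energy supplied."""
    electrical_in = ELECTRONS_PER_CH4 * F * cell_voltage  # J per mol CH4
    return HHV_CH4 / electrical_in


# Near the thermoneutral voltage (HHV / (8 F) ~ 1.15 V) the ratio
# approaches 1; every volt of overpotential at the electrodes erodes
# the overall efficiency, which is the first limitation noted above.
eff = power_to_gas_efficiency(1.5)  # assumed operating voltage, V
```

With an assumed 1.5 V cell voltage the storage efficiency comes out around 75-80%, consistent with the abstract's claim that the route is of the same order of magnitude as competing power-to-gas schemes.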

Keywords: microbial electrolysis cells, oxy-combustion, CO2, power-to-gas

Procedia PDF Downloads 82