Search results for: time domain reflectometry (TDR)
18941 An Automatic Model Transformation Methodology Based on Semantic and Syntactic Comparisons and the Granularity Issue Involved
Authors: Tiexin Wang, Sebastien Truptil, Frederick Benaben
Abstract:
Model transformation, as a pivotal aspect of model-driven engineering, is attracting increasing attention from both researchers and practitioners. Many domains (enterprise engineering, software engineering, knowledge engineering, etc.) use model transformation principles and practices to address their domain-specific problems; furthermore, model transformation can also bridge the gap between different domains by sharing and exchanging knowledge. As model transformation has become widely used, a new requirement has emerged: to define the transformation process effectively and efficiently and to reduce the manual effort involved. This paper presents an automatic model transformation methodology based on semantic and syntactic comparisons and focuses particularly on the granularity issue that arises in the transformation process. Compared to traditional model transformation methodologies, this methodology serves a general, cross-domain purpose. Semantic and syntactic checking measures are combined into a refined transformation process, which resolves the granularity issue. Moreover, the semantic and syntactic comparisons are supported by a software tool, replacing much of the manual effort.
Keywords: automatic model transformation, granularity issue, model-driven engineering, semantic and syntactic comparisons
Procedia PDF Downloads 394
18940 Local Spectrum Feature Extraction for Face Recognition
Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd ZaizuIlyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh
Abstract:
This paper presents two techniques, local feature extraction from the image spectrum and low-frequency spectrum modelling with Gaussian mixture models (GMMs), to capture the underlying statistical information and improve the performance of a face recognition system. Local spectrum features are extracted using overlapping sub-block windows mapped onto the face image. Each block is transformed from the spatial domain to the frequency domain using the DFT. The low-frequency coefficients are preserved, and the high-frequency coefficients discarded, by applying a rectangular mask to the spectrum of the facial image. The low-frequency information is non-Gaussian in the feature space, and by combining several Gaussian functions with different statistical properties, the best feature representation can be modelled as a probability density function. Recognition is performed using the maximum likelihood value computed from the pre-calculated GMM components. The method is tested on the FERET data sets and achieves a 92% recognition rate.
Keywords: local features modelling, face recognition system, Gaussian mixture models, FERET
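The abstract’s pipeline (overlapping sub-blocks → 2-D DFT → rectangular low-frequency mask) can be sketched in a few lines; the block size, step, and mask size below are illustrative choices, not the paper’s settings:

```python
import numpy as np

def local_spectrum_features(image, block=8, step=4, keep=4):
    """Extract low-frequency DFT magnitudes from overlapping sub-blocks.

    A sketch of the block-wise spectrum idea: each overlapping
    sub-block is transformed with a 2-D DFT and only a keep x keep
    rectangle of low-frequency magnitudes is retained.
    """
    h, w = image.shape
    feats = []
    for r in range(0, h - block + 1, step):
        for c in range(0, w - block + 1, step):
            spec = np.fft.fft2(image[r:r + block, c:c + block])
            # low frequencies sit in the top-left corner of the
            # unshifted spectrum; keep a small rectangular mask
            feats.append(np.abs(spec[:keep, :keep]).ravel())
    return np.array(feats)

# toy 16x16 "face" image
img = np.arange(256, dtype=float).reshape(16, 16)
F = local_spectrum_features(img)
print(F.shape)  # (9, 16): one 16-dim vector per overlapping block
```

Each feature vector would then be scored against per-subject GMMs fitted on such vectors.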
Procedia PDF Downloads 667
18939 Quantification of Magnetic Resonance Elastography for Tissue Shear Modulus Using U-Net Trained with Finite-Difference Time-Domain Simulation
Authors: Jiaying Zhang, Xin Mu, Chang Ni, Jeff L. Zhang
Abstract:
Magnetic resonance elastography (MRE) non-invasively assesses tissue elastic properties, such as shear modulus, by measuring the tissue’s displacement in response to mechanical waves. The estimated metrics of tissue elasticity or stiffness have been shown to be valuable for monitoring the physiologic or pathophysiologic status of tissue, such as a tumor or fatty liver. To quantify tissue shear modulus from MRE-acquired displacements (essentially an inverse problem), multiple approaches have been proposed, including local frequency estimation (LFE) and direct inversion (DI). However, one common problem with these methods is that the estimates are severely noise-sensitive, due to either the inverse-problem nature or noise propagation in the pixel-by-pixel process. With the advent of deep learning (DL) and its promise in solving inverse problems, a few groups in the field of MRE have explored the feasibility of using DL methods for quantifying shear modulus from MRE data. Most of these groups chose to use real MRE data for DL model training and to cut the training images into smaller patches, which enriches the feature characteristics of the training data but inevitably increases computation time and produces outcomes with patched patterns. In this study, simulated wave images generated by finite-difference time-domain (FDTD) simulation are used for network training, and a U-Net is used to extract features from each training image without cutting it into patches. The use of simulated data for model training offers the flexibility of customizing training datasets to match specific applications. The proposed method aims to estimate tissue shear modulus from MRE data with high robustness to noise and high model-training efficiency. Specifically, a set of 3000 maps of shear modulus (with a range of 1 kPa to 15 kPa) containing randomly positioned objects was simulated, and the corresponding wave images were generated.
The two types of data were fed into the training of a U-Net model as its output and input, respectively. For an independently simulated set of 1000 images, the performance of the proposed method was compared against DI and LFE using the relative error (root mean square error, or RMSE, divided by the average shear modulus) between the true shear modulus map and the estimated one. The results showed that the shear modulus estimated by the proposed method achieved a relative error of 4.91%±0.66%, substantially lower than the 78.20%±1.11% achieved by LFE. Using simulated data, the proposed method significantly outperformed LFE and DI in resilience to increasing noise levels and in resolving fine changes of shear modulus. The feasibility of the proposed method was also tested on MRE data acquired from phantoms and from human calf muscles, resulting in maps of shear modulus with low noise. In future work, the method’s performance on phantoms and its repeatability on human data will be tested in a more quantitative manner. In conclusion, the proposed method shows much promise for quantifying tissue shear modulus from MRE with high robustness and efficiency.
Keywords: deep learning, magnetic resonance elastography, magnetic resonance imaging, shear modulus estimation
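The comparison metric described here (RMSE divided by the average shear modulus) is straightforward to compute; the following sketch uses toy maps rather than MRE data:

```python
import numpy as np

def relative_error(true_map, est_map):
    """RMSE between the true and estimated maps divided by the average
    true shear modulus, the comparison metric described above."""
    rmse = np.sqrt(np.mean((true_map - est_map) ** 2))
    return rmse / np.mean(true_map)

true = np.full((4, 4), 10.0)   # kPa, toy shear modulus map
est = true + 0.5               # estimate with a uniform 0.5 kPa bias
print(relative_error(true, est))  # 0.05
```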
Procedia PDF Downloads 68
18938 Dido: An Automatic Code Generation and Optimization Framework for Stencil Computations on Distributed Memory Architectures
Authors: Mariem Saied, Jens Gustedt, Gilles Muller
Abstract:
We present Dido, a source-to-source auto-generation and optimization framework for multi-dimensional stencil computations. It enables a large programmer community to easily and safely implement stencil codes on distributed-memory parallel architectures, with Ordered Read-Write Locks (ORWL) as the execution and communication back-end. ORWL provides inter-task synchronization for data-oriented parallel and distributed computations. It has been proven to guarantee equity, liveness, and efficiency for a wide range of applications, particularly for iterative computations. Dido consists mainly of an implicitly parallel domain-specific language (DSL) implemented as a source-level transformer. It captures domain semantics at a high level of abstraction and generates parallel stencil code that leverages all ORWL features. The generated code is well-structured and lends itself to different possible optimizations. In this paper, we enhance Dido to handle both Jacobi and Gauss-Seidel grid traversals. We integrate temporal blocking into the Dido code generator in order to reduce the communication overhead and minimize data transfers. To increase data locality and improve intra-node data reuse, we couple the code generation technique with the polyhedral parallelizer Pluto. The accuracy and portability of the generated code are guaranteed thanks to a parametrized solution. The combination of ORWL features, the code generation pattern and the suggested optimizations makes Dido a powerful code generation framework for stencil computations in general, and for distributed-memory architectures in particular. We present a wide range of experiments over a number of stencil benchmarks.
Keywords: stencil computations, ordered read-write locks, domain-specific language, polyhedral model, experiments
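For readers unfamiliar with the grid traversals mentioned above, a plain NumPy sketch of one Jacobi sweep of the 5-point stencil (the kind of kernel such a DSL would generate distributed code for) may help; the grid size and boundary values are illustrative:

```python
import numpy as np

def jacobi_step(u):
    """One Jacobi sweep of the 5-point Laplace stencil: every interior
    point is replaced by the average of its four neighbours, reading
    only the previous iterate (Gauss-Seidel, by contrast, reads
    already-updated values in place)."""
    v = u.copy()
    v[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                            u[1:-1, :-2] + u[1:-1, 2:])
    return v

u = np.zeros((5, 5))
u[0, :] = 1.0                 # hot top boundary, cold elsewhere
for _ in range(100):
    u = jacobi_step(u)
print(round(u[2, 2], 3))      # 0.25 at the cavity centre, by symmetry
```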
Procedia PDF Downloads 127
18937 Geographic Information System for District Level Energy Performance Simulations
Authors: Avichal Malhotra, Jerome Frisch, Christoph van Treeck
Abstract:
The utilization of semantic, cadastral and topological data from geographic information systems (GIS) for building- and urban-scale energy performance simulations has increased exponentially. Urban planners, simulation scientists, and researchers use virtual 3D city models in energy analyses, algorithms and simulation tools. For dynamic energy simulations at city and district level, this paper provides an overview of the available GIS data models and their levels of detail. These models, which adhere to different norms and standards, are also intended to describe building and construction industry data. For the further investigations, CityGML data models are considered for simulations. Although geographical information modelling has many different implementations, virtual city data can also be extended for domain-specific applications. Highlighting the use of extended CityGML models for energy research, a brief introduction to the Energy Application Domain Extension (ADE) and its significance is given. Finally, addressing specific simulation input data, a workflow using Modelica is presented that underlines the usage of GIS information and quantifies its significance for the annual heating energy demand.
Keywords: CityGML, EnergyADE, energy performance simulation, GIS
Procedia PDF Downloads 168
18936 Influence of Viscous Dampers on Seismic Response of Isolated Bridges Including Soil Structure Interaction
Authors: Marija Vitanova, Aleksandra Bogdanovic, Kemal Edip, Viktor Hristovski, Vlado Micov
Abstract:
Bridges are critical structures in lifeline systems. They provide reliable modes of transportation, so their failure can seriously obstruct relief and rehabilitation work. Earthquake ground motions can cause significant damage to bridges, and during strong earthquakes they can easily collapse. The base isolation technique has been quite effective in mitigating the seismic response of bridges by reducing the base shear of the piers. The effect of soil-structure interaction (SSI) on the dynamic response of a seismically isolated three-span girder bridge with viscous dampers is investigated. Viscous dampers are installed at the mid-span of the bridge to control bearing displacement. The soil surrounding the pier foundations is analyzed with different soil densities in order to account for the soil stiffness. The soil medium is modelled as a four-layered infill of dense and loose material. The boundaries of the soil medium are modelled as infinite elements in order to absorb the radiating waves. The formulation of the infinite elements is the same as for the finite elements, with the addition of the mapping of the domain. Based on the isoparametric concept, the infinite element in global coordinates is mapped onto an element in the local coordinate system. In the formulation of the infinite element, only the positive direction extends to infinity, allowing the waves to propagate out of the soil medium. Dynamic analyses for two levels of earthquake intensity are performed in the time domain using the direct integration method. To quantify the effects of the SSI, the responses of the isolated and the controlled isolated bridges are compared. It is observed that the soil surrounding the piers has a significant effect on the bearing displacement of the isolated RC bridges.
In addition, it is observed that the seismic response of the isolated RC bridge is reduced significantly by the installation of the viscous dampers.
Keywords: viscous dampers, reinforced concrete girder bridges, seismic response, SSI
Procedia PDF Downloads 124
18935 HRV Analysis Based Arrhythmic Beat Detection Using kNN Classifier
Authors: Onder Yakut, Oguzhan Timus, Emine Dogru Bolat
Abstract:
Heart diseases critically affect human life and quality of life. Sudden death events can be prevented through early diagnosis and treatment. The electrical signal taken from the human body using non-invasive methods and showing the activity of the heart is called the electrocardiogram (ECG). Clinicians use the ECG signal to follow the daily activity of the heart. Heart rate variability (HRV) is a physiological parameter describing the variation between heartbeats. ECG data taken from the MIT-BIH Arrhythmia Database are used in the model employed in this study. The aim is to detect arrhythmic heartbeats using features extracted from HRV time-domain parameters. The developed model provides satisfactory performance, with approximately 89% accuracy, 91.7% sensitivity and 85% specificity for the detection of arrhythmic beats.
Keywords: arrhythmic beat detection, ECG, HRV, kNN classifier
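A minimal sketch of such a pipeline, assuming standard HRV time-domain features (mean RR interval, SDNN, RMSSD) and a hand-rolled kNN vote; the RR series and labels are toy values, not MIT-BIH data:

```python
import numpy as np

def hrv_features(rr):
    """Classic HRV time-domain parameters from RR intervals (ms)."""
    rr = np.asarray(rr, dtype=float)
    sdnn = rr.std()                             # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # beat-to-beat variability
    return np.array([rr.mean(), sdnn, rmssd])

def knn_predict(train_X, train_y, x, k=3):
    """Minimal kNN: majority vote among the k nearest feature vectors."""
    d = np.linalg.norm(train_X - x, axis=1)
    votes = train_y[np.argsort(d)[:k]]
    return np.bincount(votes).argmax()

# toy training set: label 0 = normal, 1 = arrhythmic
normal = [hrv_features([800, 810, 790, 805]) for _ in range(3)]
arrhythmic = [hrv_features([600, 950, 580, 990]) for _ in range(3)]
X = np.array(normal + arrhythmic)
y = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X, y, hrv_features([795, 805, 800, 810])))  # 0
```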
Procedia PDF Downloads 352
18934 Simulation of Pedestrian Service Time at Different Delay Times
Authors: Imran Badshah
Abstract:
Pedestrian service time reflects the performance of a facility and is a key parameter for analyzing the capacity of the facilities provided to serve pedestrians. The pedestrian level of service (LOS) mainly depends on pedestrian time and safety. The time a pedestrian spends obtaining a service is mainly influenced by the number of available service points and the time each pedestrian spends receiving the service, called the delay time. In this paper, we analyze simulated pedestrian service times under different delay times. A simulation is performed in AnyLogic by developing a model that reflects real pedestrian service scenarios, such as ticket machine gates at rail stations, airports, shopping malls, and cinema halls. The simulated pedestrian time is determined for various delay values. The simulation results show how the pedestrian time changes with the delay pattern. The histogram and time-plot graphs of the model give the mean, maximum and minimum values of the pedestrian time. This study helps to assess the behavior of pedestrian time for various services, such as subway stations, airports, shopping malls, and cinema halls.
Keywords: agent-based simulation, AnyLogic model, pedestrian behavior, time delay
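A stripped-down stand-in for such a gate model can be written directly in Python; the arrival pattern, number of gates, and the exponential delay distribution below are illustrative assumptions, not the AnyLogic model’s parameters:

```python
import random
import statistics

def simulate(n_pedestrians, n_gates, mean_delay, seed=1):
    """Toy stand-in for the gate model: pedestrians arrive one per
    second, each gate serves one pedestrian at a time, and the service
    (delay) time at a gate is exponentially distributed."""
    rng = random.Random(seed)
    gate_free = [0.0] * n_gates          # time each gate becomes free
    times = []
    for i in range(n_pedestrians):
        arrival = float(i)               # one arrival per second
        g = min(range(n_gates), key=lambda j: gate_free[j])
        start = max(arrival, gate_free[g])
        delay = rng.expovariate(1.0 / mean_delay)
        gate_free[g] = start + delay
        times.append(gate_free[g] - arrival)  # waiting + service time
    return times

t = simulate(200, 3, mean_delay=2.5)
print(round(statistics.mean(t), 1), round(max(t), 1), round(min(t), 2))
```

Sweeping `mean_delay` reproduces the qualitative effect studied here: longer delays shift the whole pedestrian-time distribution upward.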
Procedia PDF Downloads 210
18933 The Effects of a Nursing Dignity Care Program on Patients’ Dignity in Care
Authors: Yea-Pyng Lin
Abstract:
Dignity is a core element of nursing care. Maintaining the dignity of patients is an important issue because the health and recovery of patients can be adversely affected by a lack of dignity in their care. The aim of this study was to explore the effects of a nursing dignity care program upon patients’ dignity in care. A quasi-experimental research design was implemented. Nurses were recruited by purposive sampling, and their patients were recruited by simple random sampling. Nurses in the experimental group received the nursing educational program on dignity care, while nurses in the control group received in-service education as usual. Data were collected via two instruments: the dignity-in-care scale for nurses and the dignity-in-care scale for patients, both developed by the researcher. Both questionnaires consisted of three domains: agreement, importance, and frequency of providing dignity care. A total of 178 nurses in the experimental group and 193 nurses in the control group completed the pretest and the follow-up evaluations at the first month, the third month, and the sixth month. The number of patients who were cared for by the nurses in the experimental group was 94 in the pretest; the numbers of patients in the post-test at the first, third, and sixth months were 91, 85, and 77, respectively. In the control group, 88 patients completed the pretest, and 80 filled out the post-test at the first month, 77 at the third, and 74 at the sixth month. The major findings revealed that the scores in the agreement domain of nurses in the experimental group differed significantly from those in the control group at each point in time. The scores in the importance domain also displayed significant differences between the two groups at pretest and at the first month of the post-test. Moreover, the frequencies of providing dignity care to patients differed significantly at pretest and at the third and sixth months of the post-test.
However, the experimental group differed significantly from the control group in the frequency of receiving dignity care only in the items of ‘privacy care,’ ‘communication care,’ and ‘emotional care’ for the patients. The results show that the nursing program on dignity care could increase nurses’ dignity care for patients in the three domains of agreement, importance, and frequency of providing dignity care. For patients, only the frequency of receiving dignity care increased significantly. Therefore, the nursing program on dignity care is applicable to nurses’ in-service education and practice to enhance nurses’ ability to care for patients’ dignity.
Keywords: nurses, patients, dignity care, quasi-experimental, nursing education
Procedia PDF Downloads 466
18932 About Multi-Resolution Techniques for Large Eddy Simulation of Reactive Multi-Phase Flows
Authors: Giacomo Rossi, Bernardo Favini, Eugenio Giacomazzi, Franca Rita Picchia, Nunzio Maria Salvatore Arcidiacono
Abstract:
A numerical technique for mesh refinement in the HeaRT (Heat Release and Transfer) numerical code is presented. In the CFD framework, the Large Eddy Simulation (LES) approach is gaining importance as a tool for simulating turbulent combustion processes, although it has a high computational cost due to the complexity of the turbulence modelling and the high number of grid points necessary to obtain a good numerical solution. In particular, when a numerical simulation of a large domain is performed on a structured grid, the number of grid points can increase so much that the simulation becomes infeasible; this problem can be overcome with a mesh refinement technique. The mesh refinement technique developed for the HeaRT numerical code (a staggered finite-difference code) is based on a high-order reconstruction of the variables at the grid interfaces by means of a least-squares quasi-ENO interpolation. The numerical code is written in modern Fortran (the 2003 standard or newer) and is parallelized using domain decomposition and the Message Passing Interface (MPI) standard.
Keywords: LES, multi-resolution, ENO, Fortran
Procedia PDF Downloads 365
18931 Designing a Refractive Index Gas Biosensor Exploiting Defects in Photonic Crystal Core-Shell Rods
Authors: Bilal Tebboub, Amel Labbani
Abstract:
This article introduces a compact sensor based on high-transmission, high-sensitivity two-dimensional photonic crystals. The photonic crystal consists of a square lattice of silicon rods in air. The sensor is composed of two waveguide couplers and a microcavity designed for monitoring the percentage of hydrogen in air and identifying gas types. Through the finite-difference time-domain (FDTD) method, we demonstrate that the sensor's resonance wavelength depends on changes in the gas refractive index. We analyze the transmission spectra, quality factors, and sensor sensitivity. The sensor exhibits a notable quality factor and a sensitivity of 1374 nm/RIU. Notably, the sensor's compact structure occupies an area of 74.5 μm², rendering it suitable for integrated optical circuits.
Keywords: 2-D photonic crystal, sensitivity, FDTD method, label-free biosensing
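The quoted sensitivity (1374 nm/RIU) is simply the slope of the resonance wavelength versus the refractive index; a minimal sketch, with hypothetical resonance readings chosen to reproduce that slope:

```python
def sensitivity_nm_per_riu(lam1_nm, lam2_nm, n1, n2):
    """Refractive-index sensitivity of a resonant sensor: shift of the
    resonance wavelength per unit change of the refractive index of
    the gas filling the cavity."""
    return (lam2_nm - lam1_nm) / (n2 - n1)

# hypothetical readings: the resonance shifts by 1.374 nm when the
# gas index changes by 0.001 RIU, matching the reported 1374 nm/RIU
print(round(sensitivity_nm_per_riu(1550.000, 1551.374, 1.000, 1.001), 3))
```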
Procedia PDF Downloads 92
18930 Comparative Performance Analysis of Fiber Delay Line Based Buffer Architectures for Contention Resolution in Optical WDM Networks
Authors: Manoj Kumar Dutta
Abstract:
Wavelength division multiplexing (WDM) is the most promising technology for proper utilization of the huge raw bandwidth provided by an optical fiber. One of the key problems in implementing an all-optical WDM network is packet contention. This problem can be solved by several different techniques. In the time-domain approach, packet contention can be reduced by incorporating fiber delay lines (FDLs) as optical buffers in the switch architecture. Different types of buffering architectures are reported in the literature. In the present paper, a comparative performance analysis of the three most popular FDL architectures is presented in order to identify the best contention resolution performance. The analysis is further extended to consider the effect of different fiber nonlinearities on the network performance.
Keywords: WDM network, contention resolution, optical buffering, non-linearity, throughput
Procedia PDF Downloads 451
18929 Active Contours for Image Segmentation Based on Complex Domain Approach
Authors: Sajid Hussain
Abstract:
A complex domain approach for image segmentation based on active contours has been designed, in which the contour deforms step by step to partition an image into useful regions. A novel region-based trigonometric complex pressure force function is proposed, which propagates around the region of interest using image forces. The signed trigonometric force function controls the propagation of the active contour, and the contour stops accurately on the exact edges of the object. The proposed model makes the level set function binary and uses a Gaussian smoothing kernel for regularization, avoiding the re-initialization procedure. The working principle of the proposed model is as follows: the real image data are transformed into complex data by multiplying the image data by the imaginary unit i, and the average of i times the horizontal and vertical components of the image gradient is inserted into the proposed model to capture the complex gradient of the image data. A simple finite-difference scheme has been used to implement the proposed model. The efficiency and robustness of the proposed model have been verified and compared with other state-of-the-art models.
Keywords: image segmentation, active contour, level set, Mumford and Shah model
Procedia PDF Downloads 113
18928 Revised Tower Earthing Design in High-Voltage Transmission Network for High-Frequency Lightning Condition
Authors: Azwadi Mohamad, Pauzi Yahaya, Nadiah Hudi
Abstract:
The earthing system of a high-voltage transmission tower is designed to protect working personnel and equipment and to maintain the quality of supply during a fault. The existing earthing system for transmission towers in TNB’s network is designed for normal power-frequency (low-frequency) fault conditions, taking the step and touch voltages into account. This earthing design is found to be inadequate, to a certain extent, for the lightning (transient) condition, which involves the high-frequency domain. The current earthing practice of laying the electrodes radially in straight 60 m horizontal lines under the ground, in order to achieve the specified impedance value of less than 10 Ω, is ineffective in reducing the high-frequency impedance. This paper introduces a new earthing design that produces a low impedance value in the high-frequency domain without compromising the low-frequency impedance performance. The performance of this new design, as well as of the existing design, is simulated for various soil resistivity values at varying frequency. The proposed concentrated earthing design is found to possess low impedance at both low and high frequency. A good earthing design should strike a fine balance between compact and radial electrodes under the ground.
Keywords: earthing design, high-frequency, lightning, tower footing impedance
Procedia PDF Downloads 161
18927 On the Equalization of Nonminimum Phase Electroacoustic Systems Using Digital Inverse Filters
Authors: Avelino Marques, Diamantino Freitas
Abstract:
Some important electroacoustic systems, such as loudspeaker systems, exhibit nonminimum phase behavior that demands considerable effort when advanced digital signal processing techniques, such as linear equalization, are applied. In this paper, the position and the number of zeros and poles of the inverse filter, of FIR or IIR type, designed using time-domain techniques, are studied, compared and related to the nonminimum phase zeros of the system to be equalized. Conclusions about the impact of the position of the system's nonminimum phase zeros on the length/order of the inverse filter and on the delay of the equalized system are outlined, as a guide for deciding in advance which type of filter will be more adequate.
Keywords: loudspeaker systems, nonminimum phase system, FIR and IIR filter, delay
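One common time-domain design of the kind discussed here, the least-squares FIR inverse, can be sketched with a convolution matrix; the toy system below (a single zero outside the unit circle) illustrates why filter length and a modelling delay matter for nonminimum phase responses:

```python
import numpy as np

def ls_inverse_fir(h, n_taps, delay):
    """Least-squares FIR inverse of impulse response h: find g of
    length n_taps such that (h * g) approximates a unit impulse
    delayed by `delay` samples. For a nonminimum phase h, a longer
    filter and a larger delay reduce the residual error."""
    m = len(h) + n_taps - 1
    # convolution matrix: C @ g == np.convolve(h, g)
    C = np.zeros((m, n_taps))
    for i in range(n_taps):
        C[i:i + len(h), i] = h
    d = np.zeros(m)
    d[delay] = 1.0                      # delayed target impulse
    g, *_ = np.linalg.lstsq(C, d, rcond=None)
    return g, np.linalg.norm(C @ g - d)

# toy nonminimum phase system: zero at z = -2, outside the unit circle
h = np.array([1.0, 2.0])
_, err_short = ls_inverse_fir(h, 8, 0)    # short filter, no delay
_, err_long = ls_inverse_fir(h, 32, 16)   # longer filter plus delay
print(err_short > err_long)  # more taps and delay: better equalization
```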
Procedia PDF Downloads 77
18926 Collocation Method for Coupled System of Boundary Value Problems with Cubic B-Splines
Authors: K. N. S. Kasi Viswanadham
Abstract:
Coupled systems of second-order linear and nonlinear boundary value problems occur in various fields of science and engineering. In the formulation of such a problem, any one of 81 possible types of boundary conditions may occur. These 81 possible boundary conditions can be written as combinations of four basic boundary conditions. To solve a coupled system of boundary value problems with these converted boundary conditions, a collocation method with cubic B-splines as basis functions has been developed. In the collocation method, the mesh points of the space-variable domain are selected as the collocation points. The basis functions are redefined into a new set of basis functions whose number matches the number of mesh points in the space-variable domain. The solution of a nonlinear boundary value problem is obtained as the limit of a sequence of solutions of linear boundary value problems generated by the quasilinearization technique. Several linear and nonlinear boundary value problems are presented to test the efficiency of the proposed method, and the numerical results obtained by the present method are found to be in good agreement with the exact solutions available in the literature.
Keywords: collocation method, coupled system, cubic B-splines, mesh points
Procedia PDF Downloads 209
18925 Discrete-Time Bulk Queue with Service Capacity Depending on Previous Service Time
Authors: Yutae Lee
Abstract:
This paper considers a discrete-time bulk-arrival, bulk-service queueing system in which the service capacity varies depending on the previous service time. Using the generating function technique and the supplementary variable method, we derive the distributions of the queue length at an arbitrary slot boundary and at a departure time.
Keywords: discrete-time queue, bulk queue, variable service capacity, queue length distribution
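A simulation counterpart of such a system is easy to sketch; the arrival distribution and the rule linking the service capacity to the previous service time below are illustrative assumptions, not the paper’s analytical model:

```python
import random

def simulate_bulk_queue(slots, seed=7):
    """Toy sketch of a discrete-time bulk queue: a batch arrives each
    slot, the server takes customers up to its current capacity, and
    the capacity of the next service shrinks when the previous service
    ran long. (Distributions and the capacity rule are illustrative.)"""
    rng = random.Random(seed)
    queue = 0
    capacity = 4
    lengths = []
    for _ in range(slots):
        queue += rng.choice([0, 1, 2, 3])        # bulk arrivals
        served = min(queue, capacity)
        queue -= served
        service_time = 1 + served                # longer for big batches
        capacity = 2 if service_time > 3 else 4  # depends on prev. service
        lengths.append(queue)                    # queue length at slot end
    return lengths

L = simulate_bulk_queue(1000)
print(max(L), sum(L) / len(L))  # empirical queue-length statistics
```

Histogramming `L` gives an empirical queue-length distribution to compare against the analytical one.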
Procedia PDF Downloads 476
18924 Modelling of a Biomechanical Vertebral System for Seat Ejection in Aircrafts Using Lumped Mass Approach
Authors: R. Unnikrishnan, K. Shankar
Abstract:
In high-speed fighter aircraft, seat ejection is designed mainly for the safety of the pilot in an emergency. The strong windblast due to the high flight velocity is one major difficulty in clearing the tail of the aircraft, and the excessive G-forces generated can immobilize the pilot during escape. In most cases, seats are ejected out of the aircraft by explosives or by rocket motors attached to the bottom of the seat. The ejection forces act primarily in the vertical direction, with the objective of attaining the maximum possible velocity in a specified period of time. The safe ejection parameters are studied to estimate the critical ejection time for various geometries and flight velocities. An equivalent analytical two-dimensional biomechanical model of the human spine has been built, consisting of vertebrae and intervertebral discs, using a lumped-mass approach. The 24 vertebrae of the cervical, thoracic and lumbar regions, together with the head mass and the pelvis, are modelled as 26 rigid structures, and the intervertebral discs are treated as 25 flexible joint structures. The rigid structures are modelled as mass elements and the flexible joints as spring and damper elements. Motion is restricted to the mid-sagittal plane, forming a 26-degree-of-freedom system. The equations of motion are derived for the translational movement of the spinal column. An ejection force with a linearly increasing acceleration profile is applied as vertical base excitation at the pelvis. The dynamic vibrational response of each vertebra is estimated in the time domain.
Keywords: biomechanical model, lumped mass, seat ejection, vibrational response
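The lumped-mass formulation can be illustrated at reduced scale: a short mass-spring-damper chain under a linearly increasing base acceleration, integrated with semi-implicit Euler. All parameter values below are illustrative, not the paper’s:

```python
import numpy as np

def chain_response(n=5, m=1.0, k=5e4, c=50.0, jerk=500.0,
                   dt=1e-4, steps=2000):
    """Axial response of a lumped mass-spring-damper chain (a scaled-
    down stand-in for the 26-mass spine model) to a base acceleration
    that ramps up linearly, as in an ejection pulse. Returns the
    displacement history of the top mass relative to the base."""
    x = np.zeros(n)          # displacements relative to the base
    v = np.zeros(n)
    hist = []
    for i in range(steps):
        a_base = jerk * (i * dt)          # linearly increasing accel.
        f = np.zeros(n)                   # spring/damper forces
        for j in range(n):
            if j > 0:
                f[j] += -k * (x[j] - x[j-1]) - c * (v[j] - v[j-1])
            else:
                f[j] += -k * x[0] - c * v[0]   # pelvis tied to the base
            if j < n - 1:
                f[j] += k * (x[j+1] - x[j]) + c * (v[j+1] - v[j])
        a = f / m - a_base     # base excitation enters as inertia force
        v += a * dt            # semi-implicit Euler: velocity first,
        x += v * dt            # then position, for numerical stability
        hist.append(x[-1])
    return np.array(hist)

h = chain_response()
print(len(h), float(h[-1]) < 0.0)  # the top mass lags the rising base
```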
Procedia PDF Downloads 231
18923 Mapping Feature Models to Code Using a Reference Architecture: A Case Study
Authors: Karam Ignaim, Joao M. Fernandes, Andre L. Ferreira
Abstract:
Mapping the artifacts of a set of similar products, developed in an ad hoc manner, to the resulting software product line (SPL) plays a key role in maintaining consistency between requirements and code. This paper presents a feature-mapping approach that traces the artifact produced by the migration process, the current feature model (FM), to the other artifacts of the resulting SPL: the reference architecture and the code. Our approach relates each feature of the current FM to its locations in the implementation code, using the reference architecture as an intermediate (central) artifact to preserve consistency among them during SPL evolution. The approach uses a particular artifact, a traceability tree, to manage the mapping process. Tool support is provided by friendlyMapper. We have evaluated the feature-mapping approach and tool support by putting the approach into practice in a case study from the automotive domain, the Classical Sensor Variants Family at Bosch Car Multimedia S.A. The evaluation reveals that the mapping approach presented in this paper fits the automotive domain.
Keywords: feature location, feature models, mapping, software product lines, traceability
Procedia PDF Downloads 127
18922 Graphics Processing Unit-Based Parallel Processing for Inverse Computation of Full-Field Material Properties Based on Quantitative Laser Ultrasound Visualization
Authors: Sheng-Po Tseng, Che-Hua Yang
Abstract:
Motivation and Objective: Ultrasonic guided waves have become an important tool for the nondestructive evaluation of structures and components. Guided waves are used to identify defects or to evaluate material properties in a nondestructive way. When guided waves are applied to evaluate material properties, the properties are not obtained directly; preliminary signals, such as time-domain signals or frequency-domain spectra, are first acquired. From the measured ultrasound data, an inversion calculation can then be employed to obtain the desired mechanical properties. Methods: This research develops a high-speed inversion calculation technique for obtaining full-field mechanical properties from the quantitative laser ultrasound visualization system (QLUVS). The QLUVS employs a mirror-controlled scanning pulsed laser to generate guided acoustic waves traveling in a two-dimensional target. The guided waves are detected with a piezoelectric transducer at a fixed location. With gyro-scanning of the generation source, the QLUVS has the advantage of fast, full-field, quantitative inspection. Results and Discussions: This research introduces two important tools to improve computational efficiency. First, graphics processing units (GPUs) with a large number of cores are introduced. Furthermore, combining CPU and GPU cores, a parallel processing scheme is developed for the inversion of full-field mechanical properties from the QLUVS data. The newly developed inversion scheme is applied to single-layered and double-layered plate-like samples, where the computation is shown to be 80 times faster than the unparallelized scheme. Conclusions: This research demonstrates a high-speed inversion technique for the characterization of full-field material properties based on the quantitative laser ultrasound visualization system.
Significant computation efficiency is shown, however not reaching the limit yet. Further improvement can be reached by improving the parallel computation. Utilizing the development of the full-field mechanical property inspection technology, full-field mechanical property measured by non-destructive, high-speed and high-precision measurements can be obtained in qualitative and quantitative results. The developed high speed computation scheme is ready for applications where full-field mechanical properties are needed in a nondestructive and nearly real-time way.Keywords: guided waves, material characterization, nondestructive evaluation, parallel processing
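The speedup rests on the fact that each scan point can be inverted independently, so the field maps onto a worker pool. The sketch below is illustrative only: the toy dispersion model v = A / thickness, the constant A, and the thread pool standing in for GPU cores are all assumptions, not the paper's actual QLUVS inversion.

```python
# Sketch of a parallelized full-field inversion (illustrative model only).
from concurrent.futures import ThreadPoolExecutor

A = 3000.0  # assumed model constant (hypothetical units)

def forward(thickness):
    # Toy forward model: measured wave velocity for a given plate thickness
    return A / thickness

def invert_point(measured_velocity):
    # Closed-form inversion of the toy model; the real scheme minimizes
    # the misfit between measured and simulated dispersion data per point
    return A / measured_velocity

def invert_field(velocity_map, workers=4):
    # Scan points are independent, so the whole field inverts in parallel
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(invert_point, velocity_map))

velocities = [forward(t) for t in (1.0, 1.5, 2.0)]
thicknesses = invert_field(velocities)
```

In practice the per-point work would be dispatched to GPU cores rather than threads; the independence of scan points is what makes the reported 80-fold speedup possible.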
Procedia PDF Downloads 202
18921 Conjugate Mixed Convection Heat Transfer and Entropy Generation of Cu-Water Nanofluid in an Enclosure with Thick Wavy Bottom Wall
Authors: Sanjib Kr Pal, S. Bhattacharyya
Abstract:
Mixed convection of a Cu-water nanofluid in an enclosure with a thick wavy bottom wall has been investigated numerically. A coordinate transformation method is used to map the computational domain onto an orthogonal coordinate system. The governing equations in the computational domain are solved through a pressure-correction based iterative algorithm. The fluid flow and heat transfer characteristics are analyzed for a wide range of Richardson number (0.1 ≤ Ri ≤ 5), nanoparticle volume concentration (0.0 ≤ ϕ ≤ 0.2), amplitude (0.0 ≤ α ≤ 0.1) of the thick wavy bottom wall, and wave number (ω) at a fixed Reynolds number. The results show that the heat transfer rate increases markedly when nanoparticles are added. The heat transfer rate depends on the wavy-wall amplitude and wave number, and decreases with increasing Richardson number for a fixed amplitude and wave number. The Bejan number and the entropy generation are determined to analyze the thermodynamic optimization of the mixed convection. Keywords: conjugate heat transfer, mixed convection, nano fluid, wall waviness
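The coordinate-transformation step can be illustrated with a simple algebraic mapping that flattens a wavy bottom wall y = h(x) onto the line η = 0 of a rectangular computational domain. The sinusoidal wall profile and the mapping form below are assumptions chosen for illustration; the paper's exact transformation to an orthogonal coordinate system is not reproduced here.

```python
# Illustrative algebraic mapping between physical and computational domains.
from math import sin, pi

H = 1.0                   # enclosure height (illustrative)
alpha, wave_num = 0.1, 2  # wavy-wall amplitude and wave number (illustrative)

def h(x):
    # Assumed sinusoidal bottom-wall profile
    return alpha * sin(2 * pi * wave_num * x)

def to_computational(x, y):
    # Map physical (x, y) to (xi, eta) with eta in [0, 1]:
    # eta = 0 on the wavy wall, eta = 1 on the flat top lid.
    return x, (y - h(x)) / (H - h(x))

def to_physical(xi, eta):
    # Inverse mapping back to the physical enclosure
    return xi, eta * (H - h(xi)) + h(xi)

xi, eta = to_computational(0.3, 0.5)
x_back, y_back = to_physical(xi, eta)
```

In the computational (ξ, η) domain the governing equations pick up metric terms, but the grid becomes rectangular, which is what makes a pressure-correction iteration straightforward to apply.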
Procedia PDF Downloads 254
18920 Enhanced Tensor Tomographic Reconstruction: Integrating Absorption, Refraction and Temporal Effects
Authors: Lukas Vierus, Thomas Schuster
Abstract:
A general framework is examined for dynamic tensor field tomography within an inhomogeneous medium characterized by refraction and absorption, treated as an inverse source problem concerning the associated transport equation. Guided by Fermat’s principle, the Riemannian metric within the specified domain is determined by the medium's refractive index. While considerable literature exists on the inverse problem of reconstructing a tensor field from its longitudinal ray transform within a static Euclidean environment, limited inversion formulas and algorithms are available for general Riemannian metrics and time-varying tensor fields. It is established that tensor field tomography, akin to an inverse source problem for a transport equation, persists in dynamic scenarios. Framing dynamic tensor tomography as an inverse source problem embodies a comprehensive perspective within this domain. Ensuring well-defined forward mappings necessitates establishing existence and uniqueness for the underlying transport equations. However, the bilinear forms of the associated weak formulations fail to meet the coercivity condition. Consequently, recourse to viscosity solutions is taken, demonstrating their unique existence within suitable Sobolev spaces (in the static case) and Sobolev-Bochner spaces (in the dynamic case), under a specific assumption restricting variations in the refractive index. Notably, the adjoint problem can also be reformulated as a transport equation, with analogous results regarding uniqueness. Analytical solutions are expressed as integrals over geodesics, facilitating more efficient evaluation of forward and adjoint operators compared to solving partial differential equations. 
Numerical experiments are conducted using a Nesterov-accelerated Landweber method, encompassing various fields, absorption coefficients, and refractive indices, thereby illustrating the enhanced reconstruction achieved through this holistic modeling approach.Keywords: attenuated refractive dynamic ray transform of tensor fields, geodesics, transport equation, viscosity solutions
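The Nesterov-accelerated Landweber iteration named in the abstract has a compact generic form: x_{k+1} = z_k + ω A*(g − A z_k) with extrapolation z_k = x_k + β_k (x_k − x_{k−1}). The sketch below applies it to a toy 2×2 linear operator, not to the geodesic ray transform of the paper; the operator, data, and step size are illustrative assumptions.

```python
# Generic Nesterov-accelerated Landweber iteration on a toy linear system.
def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def transpose(M):
    return [list(row) for row in zip(*M)]

def landweber_nesterov(A, g, omega, iters):
    # x_{k+1} = z_k + omega * A^T (g - A z_k),
    # with Nesterov extrapolation z_k = x_k + beta_k * (x_k - x_{k-1})
    At = transpose(A)
    x = [0.0] * len(A[0])
    x_prev = list(x)
    for k in range(iters):
        beta = (k - 1) / (k + 2) if k > 0 else 0.0
        z = [x[i] + beta * (x[i] - x_prev[i]) for i in range(len(x))]
        residual = [g[i] - Az_i for i, Az_i in enumerate(matvec(A, z))]
        step = matvec(At, residual)
        x_prev, x = x, [z[i] + omega * step[i] for i in range(len(x))]
    return x

A = [[2.0, 0.0], [0.0, 1.0]]  # toy operator; exact solution of A x = g is [1, 3]
g = [2.0, 3.0]
x = landweber_nesterov(A, g, omega=0.2, iters=200)
```

In the paper's setting, applying A and its adjoint amounts to integrating over geodesics, which is why the analytical solution formulas make each iteration cheaper than solving the transport PDE directly.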
Procedia PDF Downloads 51
18919 Using Log Files to Improve Work Efficiency
Authors: Salman Hussam
Abstract:
As a monitoring system to manage employees' time and the employer's business, this system (logger) monitors employees at work and alerts them if they spend too much time on social media (even if they are using a proxy, it will catch them). In this way, people will spend less time at work and more time with family.Keywords: clients, employees, employers, family, monitoring, systems, social media, time
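The core of such a logger is summing time per site from timestamped log entries. The sketch below assumes a simple "timestamp site" line format and a hard-coded social-media list; both are illustrative, as the abstract does not specify a format.

```python
# Minimal sketch: total minutes spent on social media from browsing logs.
from datetime import datetime

SOCIAL = {"facebook.com", "twitter.com"}  # assumed watch list

def parse(lines):
    # Each line: ISO timestamp, a space, then the visited site
    events = []
    for line in lines:
        ts, site = line.split(" ", 1)
        events.append((datetime.fromisoformat(ts), site.strip()))
    return events

def social_minutes(lines):
    # Time on a site = gap until the next logged event
    events = parse(lines)
    total = 0.0
    for (t0, site), (t1, _) in zip(events, events[1:]):
        if site in SOCIAL:
            total += (t1 - t0).total_seconds() / 60
    return total

log = [
    "2024-01-01T09:00:00 facebook.com",
    "2024-01-01T09:12:00 docs.internal",
    "2024-01-01T09:30:00 twitter.com",
    "2024-01-01T09:33:00 docs.internal",
]
minutes = social_minutes(log)
```

A production logger would also need to resolve proxied traffic back to the real destination, which is the harder part the abstract alludes to.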
Procedia PDF Downloads 493
18918 Improved Signal-To-Noise Ratio by the 3D-Functionalization of Fully Zwitterionic Surface Coatings
Authors: Esther Van Andel, Stefanie C. Lange, Maarten M. J. Smulders, Han Zuilhof
Abstract:
False outcomes of diagnostic tests are a major concern in medical health care. To improve the reliability of surface-based diagnostic tests, it is of crucial importance to diminish background signals that arise from the non-specific binding of biomolecules, a process called fouling. The aim is to create surfaces that repel all biomolecules except the molecule of interest. This can be achieved by incorporating antifouling protein-repellent coatings between the sensor surface and its recognition elements (e.g. antibodies, sugars, aptamers). Zwitterionic polymer brushes are considered excellent antifouling materials; however, to be able to bind the molecule of interest, the polymer brushes have to be functionalized, and so far this has only been achieved at the expense of either antifouling or binding capacity. To overcome this limitation, we combined both features into one single monomer: a zwitterionic sulfobetaine, ensuring antifouling capabilities, equipped with a clickable azide moiety which allows for further functionalization. By copolymerizing this monomer together with a standard sulfobetaine, the number of azides (and with that the number of recognition elements) can be tuned depending on the application. First, the clickable azido-monomer was synthesized and characterized, followed by its copolymerization to yield functionalizable antifouling brushes. The brushes were fully characterized using surface characterization techniques like XPS, contact angle measurements, G-ATR-FTIR and XRR. As a proof of principle, the brushes were subsequently functionalized with biotin via strain-promoted alkyne-azide click reactions, which yielded a fully zwitterionic biotin-containing 3D-functionalized coating. The sensing capacity was evaluated by reflectometry using avidin- and fibrinogen-containing protein solutions.
The surfaces showed excellent antifouling properties as illustrated by the complete absence of non-specific fibrinogen binding, while at the same time clear responses were seen for the specific binding of avidin. A great increase in signal-to-noise ratio was observed, even when the amount of functional groups was lowered to 1%, compared to traditional modification of sulfobetaine brushes that rely on a 2D-approach in which only the top-layer can be functionalized. This study was performed on stoichiometric silicon nitride surfaces for future microring resonator based assays, however, this methodology can be transferred to other biosensor platforms which are currently being investigated. The approach presented herein enables a highly efficient strategy for selective binding with retained antifouling properties for improved signal-to-noise ratios in binding assays. The number of recognition units can be adjusted to a specific need, e.g. depending on the size of the analyte to be bound, widening the scope of these functionalizable surface coatings.Keywords: antifouling, signal-to-noise ratio, surface functionalization, zwitterionic polymer brushes
Procedia PDF Downloads 306
18917 Time Management in the Public Sector in Nigeria
Authors: Sunny Ewankhiwimen Aigbomian
Abstract:
Time is a scarce resource, and in everything we do, time is required to accomplish any given task. The need for this presentation is predicated on the way the majority of Nigerians, especially operators in the public sector, view time management. Time as a resource cannot be regained if lost or managed badly. As a significant aspect of human life, it should be handled with diligence and utmost seriousness if the public sector is to function as a coordinated entity. In our homes, private lives, and offices, we schedule activities to ensure that things do not go unexpectedly wrong. When it comes to service delivery on the part of government, time ought to be taken even more seriously, because government is all about effective and efficient service delivery, and time is a significant variable necessary for successful accomplishment. This paper argues the need for the Nigerian government to re-examine time management in its public sector with a view to repositioning the sector to compete well with other public sectors in the world. The peculiarity of time management in the public sector in the Nigerian context is examined, and some useful recommendations of immense assistance are proffered.Keywords: Nigeria, public sector, time management, task
Procedia PDF Downloads 99
18916 A Review of Serious Games Characteristics: Common and Specific Aspects
Authors: B. Ben Amara, H. Mhiri Sellami
Abstract:
Serious game adoption is increasing in multiple fields, including health, education, and business. Likewise, many studies have examined serious games (SGs) for various purposes, such as classification, positive impacts, or learning outcomes. Although most of this research examines SG characteristics (SGCs), to the authors' best knowledge there is no consensus on these features, either in their number or in their description. In this paper, we conduct a literature review to collect essential game attributes regardless of the application area and the study objectives. Firstly, we aimed to define common SGCs (CSGCs) that characterize the game aspect, by gathering features with the same meaning. Secondly, we tried to identify specific features related to the application area or to the study purpose, as the serious aspect. The findings suggest that any type of SG can be defined by a number of CSGCs depicting the gaming side, such as adaptability and rules. In addition, we outlined a number of specific SGCs describing the 'serious' aspect, including specific needs of the domain and intended outcomes. In conclusion, our review shows that it is possible to bridge the research gap caused by the lack of consensus by using CSGCs. Moreover, these features facilitate the design and development of successful serious games in any domain and provide a foundation for further research in this area.Keywords: serious game characteristics, serious games common aspects, serious games features, serious games outcomes
Procedia PDF Downloads 135
18915 Combined Safety and Cybersecurity Risk Assessment for Intelligent Distributed Grids
Authors: Anders Thorsén, Behrooz Sangchoolie, Peter Folkesson, Ted Strandberg
Abstract:
As more parts of the power grid become connected to the internet, the risk of cyberattacks increases. To identify the cybersecurity threats and subsequently reduce vulnerabilities, the common practice is to carry out a cybersecurity risk assessment. For safety classified systems and products, there is also a need for safety risk assessments in addition to the cybersecurity risk assessment in order to identify and reduce safety risks. These two risk assessments are usually done separately, but since cybersecurity and functional safety are often related, a more comprehensive method covering both aspects is needed. Some work addressing this has been done for specific domains like the automotive domain, but more general methods suitable for, e.g., intelligent distributed grids, are still missing. One such method from the automotive domain is the Security-Aware Hazard Analysis and Risk Assessment (SAHARA) method that combines safety and cybersecurity risk assessments. This paper presents an approach where the SAHARA method has been modified in order to be more suitable for larger distributed systems. The adapted SAHARA method has a more general risk assessment approach than the original SAHARA. The proposed method has been successfully applied on two use cases of an intelligent distributed grid.Keywords: intelligent distribution grids, threat analysis, risk assessment, safety, cybersecurity
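A SAHARA-style assessment derives a security level from the attacker resources (R) and know-how (K) an attack requires, combined with the criticality of the threat (T). The sketch below uses invented scales and an invented combination rule purely to illustrate the lookup idea; it is neither the published SAHARA tables nor the paper's adapted method.

```python
# Illustrative SAHARA-style lookup; all scales and rules are invented.
def attack_likelihood(resources, know_how):
    # Lower required resources/know-how (each 0..3, illustrative) means
    # the attack is easier, so its likelihood score is higher.
    return max(0, 4 - resources - know_how)

def security_level(resources, know_how, threat):
    # Combine likelihood with threat criticality (0..3, illustrative)
    # into a 0..4 security level. Invented rule, not the SAHARA table.
    return min(4, max(0, attack_likelihood(resources, know_how) + threat - 2))

trivial_high_impact = security_level(0, 0, 3)  # easy attack, severe threat
expert_low_impact = security_level(3, 2, 1)    # expert-only attack, minor threat
```

The point of such a table is that an easy, severe attack lands at the top of the scale while an expensive, low-impact one drops out of scope, letting safety and security risks be ranked on one shared ladder.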
Procedia PDF Downloads 153
18914 A Method for Clinical Concept Extraction from Medical Text
Authors: Moshe Wasserblat, Jonathan Mamou, Oren Pereg
Abstract:
Natural Language Processing (NLP) has made a major leap in the last few years in its practical integration into medical solutions; for example, extracting clinical concepts from medical texts such as medical conditions, medications, treatments, and symptoms. However, training and deploying those models in real environments still demands a large amount of annotated data and NLP/Machine Learning (ML) expertise, which makes this process costly and time-consuming. We present a practical and efficient method for clinical concept extraction that requires neither costly labeled data nor ML expertise. The method includes three steps. Step 1: the user injects a large in-domain text corpus (e.g., PubMed); the system then builds a contextual model containing vector representations of concepts in the corpus, in an unsupervised manner (e.g., Phrase2Vec). Step 2: the user provides a seed set of terms representing a specific medical concept (e.g., for the concept of symptoms, the user may provide ‘dry mouth,’ ‘itchy skin,’ and ‘blurred vision’); the system matches the seed set against the contextual model and extracts the most semantically similar terms (e.g., additional symptoms). The result is a complete set of terms related to the medical concept. Step 3: in production, medical concepts must be extracted from unseen medical text. The system extracts key phrases from the new text, matches them against the complete set of terms from step 2, and the most semantically similar phrases are annotated with the same medical concept category. As an example, the seed symptom concepts would result in the following annotation: “The patient complains of fatigue [symptom], dry skin [symptom], and weight loss [symptom], which can be an early sign of diabetes.” Our evaluations show promising results for extracting concepts from medical corpora.
The method allows medical analysts to easily and efficiently build taxonomies (in step 2) representing their domain-specific concepts, and automatically annotate a large number of texts (in step 3) for classification/summarization of medical reports.Keywords: clinical concepts, concept expansion, medical records annotation, medical records summarization
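Step 2 of the method (seed-set expansion) can be sketched as cosine-similarity matching against a contextual model. The tiny hand-made vectors below stand in for learned Phrase2Vec embeddings, and the vocabulary and threshold are illustrative assumptions.

```python
# Sketch of seed-set expansion by embedding similarity (toy vectors).
from math import sqrt

model = {  # term -> toy 3-d "embedding" standing in for Phrase2Vec vectors
    "dry mouth":      [0.9, 0.1, 0.0],
    "itchy skin":     [0.8, 0.2, 0.1],
    "blurred vision": [0.85, 0.15, 0.05],
    "weight loss":    [0.8, 0.1, 0.1],
    "metformin":      [0.1, 0.9, 0.2],
    "insulin":        [0.05, 0.95, 0.1],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def expand(seeds, model, threshold=0.95):
    # Pull in every modeled term close enough to any seed vector
    seed_vecs = [model[s] for s in seeds]
    expanded = set(seeds)
    for term, vec in model.items():
        if any(cosine(vec, sv) >= threshold for sv in seed_vecs):
            expanded.add(term)
    return expanded

symptoms = expand({"dry mouth", "itchy skin"}, model)
```

With these toy vectors, the symptom seeds pull in the other symptom terms while the medication terms stay out, which mirrors how the unsupervised model separates concept categories in the real corpus.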
Procedia PDF Downloads 135
18913 Selection of Green Fluorescent Protein and mCherry Nanobodies Using the Yeast Surface Display Method
Authors: Lavinia Ruta, Ileana Farcasanu
Abstract:
The yeast surface display (YSD) technique enables the expression of proteins on yeast cell surfaces, facilitating the identification and isolation of proteins with targeted binding properties, such as nanobodies. Nanobodies, derived from camelid species, are single-domain antibody fragments renowned for their high affinity and specificity towards target proteins, making them valuable in research and potentially in therapeutics. Their advantages include a compact size (~15 kDa), robust stability, and the ability to target challenging epitopes. The project endeavors to establish and validate a platform for producing Green Fluorescent Protein (GFP) and mCherry nanobodies using the yeast surface display method. mCherry, a prevalent red fluorescent protein sourced from coral species, is commonly utilized as a genetic marker in biological studies due to its vibrant red fluorescence. The GFP-nanobody, a single variable domain of heavy-chain antibodies (VHH), exhibits specific binding to GFP, offering a potent means for isolating and engineering fluorescent protein fusions across various biological research domains. Both GFP and mCherry nanobodies find specific utility in cellular imaging and protein analysis applications.Keywords: YSD, nanobodies, GFP, Saccharomyces cerevisiae
Procedia PDF Downloads 61
18912 Resources-Based Ontology Matching to Access Learning Resources
Authors: A. Elbyed
Abstract:
Nowadays, ontologies are used for achieving a common understanding within a user community and for sharing domain knowledge. However, the decentralized nature of the web makes it inevitable that small communities will use their own ontologies to describe their data and to index their own resources. Accessing resources from various independently created ontologies is therefore an important challenge for answering end-user queries. Ontology mapping is thus required for combining ontologies. However, mapping complete ontologies at run time is a computationally expensive task. This paper proposes a system in which mappings between concepts may be generated dynamically as the concepts are encountered during user queries. In this way, the interaction itself defines the context in which small and relevant portions of the ontologies are mapped. We illustrate the application of the proposed system in the context of Technology Enhanced Learning (TEL), where learners need to access learning resources covering specific concepts.Keywords: resources query, ontologies, ontology mapping, similarity measures, semantic web, e-learning
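The dynamic-mapping idea can be sketched as matching only the concept in the user's query against the target ontology, instead of aligning the ontologies up front. The similarity measure below (difflib string similarity), the concept names, and the threshold are assumptions for illustration; the paper's actual similarity measures are not specified here.

```python
# Sketch of on-demand concept mapping with a simple string similarity.
from difflib import SequenceMatcher

target_ontology = ["LearningResource", "Course", "Lecture", "Assessment"]

def best_match(query_concept, concepts, threshold=0.6):
    # Score the query concept against each target concept and keep the
    # best match if it clears the threshold; otherwise report no mapping.
    scored = [(SequenceMatcher(None, query_concept.lower(), c.lower()).ratio(), c)
              for c in concepts]
    score, concept = max(scored)
    return concept if score >= threshold else None

match = best_match("learning_resource", target_ontology)
```

Only the concepts a query actually touches are mapped, so the expensive whole-ontology alignment is never paid; each interaction maps just the small, relevant portion.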
Procedia PDF Downloads 312