Search results for: generalized linear mixed model (GLMM)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20933

7403 Proposal for an Inspection Tool for Damaged Structures after Disasters

Authors: Karim Akkouche, Amine Nekmouche, Leyla Bouzid

Abstract:

This study focuses on the development of a multifunctional Expert System (ES) called the post-seismic damage inspection tool (PSDIT), a powerful tool which allows the evaluation, processing, and archiving of the data collected after earthquakes. PSDIT can be operated by two user types: an ordinary user (engineer, expert, or architect) for visual damage inspection, and an administrative user for updating the knowledge base and/or for adding or removing ordinary users. Knowledge acquisition is driven by a hierarchical knowledge model; information from investigation reports and feedback obtained through expert/engineer questionnaires both form part of it.

Keywords: disaster, damaged structures, damage assessment, expert system

Procedia PDF Downloads 75
7402 Student Absenteeism as a Challenge for Inclusion: A Comparative Study of Primary Schools in an Urban City in India

Authors: Deepa Idnani

Abstract:

Attendance is an important factor in school success among children. Studies show that better attendance is related to higher academic achievement for students of all backgrounds, but particularly for children with lower socio-economic status. Beginning from the early years, students who attend school regularly score higher on tests than their peers who are frequently absent. The present study, conducted in different types of schools in Delhi, tries to highlight the impact of student absenteeism and the challenges it poses for the students. The study relies on Lewin's 'Model of Exclusion' and focuses on the analysis of children with special needs and the inclusion and exclusion of students in school.

Keywords: student absenteeism, pedagogy, learning, right to education act, exclusion

Procedia PDF Downloads 292
7401 Virtual Player for Learning by Observation to Assist Karate Training

Authors: Kazumoto Tanaka

Abstract:

It is well known that sport skill learning is facilitated by video observation of players' actions. The optimal viewpoint for observing actions depends on the sport scene. On the other hand, it is generally impossible to change the viewpoint during observation, because most videos are filmed from fixed points. The study tackles this problem and focuses on karate matches as a first step. The study developed a method for observing a karate player's actions from any point of view by using a 3D-CG model (i.e., a virtual player) obtained from video images, and verified the effectiveness of the method on karate matches.

Keywords: computer graphics, karate training, learning by observation, motion capture, virtual player

Procedia PDF Downloads 266
7400 Eating Behaviours in Islam and Mental Health: A Preventative Approach

Authors: Muhammad Rafiq, Lamae Zulfiqar, Nazish Idrees Chaudhary

Abstract:

A growing body of research focuses on healthy and unhealthy eating behaviors and their impact on health. This study was intended to examine the Islamic point of view on eating behavior, its impact on mental health, and preventative strategies in the light of the Quran and Sunnah. Different articles and Islamic sayings related to eating behaviors and mental health were reviewed in detail. Both scientific evidence and the Islamic point of view indicate that the appropriate quantity, quality, and timing of food have positive effects on mental health. Therefore, a 3Rs model of eating behaviors has been proposed.

Keywords: food intake, mental health, quality of food, quantity of food

Procedia PDF Downloads 224
7399 A Design for Application of Mobile Agent Technology to MicroService Architecture

Authors: Masayuki Higashino, Toshiya Kawato, Takao Kawamura

Abstract:

A monolithic service is based on the N-tier architecture in many cases. In order to divide a monolithic service into microservices, it is necessary to redefine a model as a new microservice by extracting and merging existing models across layers. Refactoring a monolithic service into microservices requires advanced technical capabilities and is a difficult undertaking. This paper proposes a design and concept that eases the migration of a monolithic service to microservices using mobile agent technology. Our proposed mobile-agent-based design and concept makes it easier to divide and merge services.

Keywords: mobile agent, microservice, web service, distributed system

Procedia PDF Downloads 153
7398 The Modelling of Real Time Series Data

Authors: Valeria Bondarenko

Abstract:

We proposed algorithms for the estimation of the parameters of fractional Brownian motion (fBm), namely volatility and the Hurst exponent, and for the approximation of random time series by functionals of fBm. We proved the consistency of the estimators that constitute these algorithms and established the optimal forecast of the approximated time series. The adequacy of the estimation, approximation, and forecasting algorithms is demonstrated by numerical experiments. In the process of creating the software, a system with a hierarchical structure was built. A comparative analysis of the proposed algorithms with other methods gives evidence of the advantage of the approximation method. The results can be used to develop methods for the analysis and modeling of time series describing economic, physical, biological and other processes.
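
The abstract does not give the estimator itself; for illustration only, the following is a minimal, generic Hurst-exponent estimator based on the variance scaling of increments (for fBm, the standard deviation of k-lag increments grows like k^H). It is a sketch of a standard technique, not the authors' algorithm.

```python
import numpy as np

def hurst_exponent(series, max_lag=20):
    """Estimate H from the scaling of increment variability:
    for fBm, the std of k-lag increments grows like k**H."""
    lags = np.arange(2, max_lag)
    tau = np.array([np.std(series[lag:] - series[:-lag]) for lag in lags])
    hurst, _ = np.polyfit(np.log(lags), np.log(tau), 1)  # slope = H
    return hurst

# sanity check: ordinary Brownian motion should give H close to 0.5
rng = np.random.default_rng(0)
brownian = np.cumsum(rng.standard_normal(10_000))
print(hurst_exponent(brownian))
```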

Keywords: mathematical model, random process, Wiener process, fractional Brownian motion

Procedia PDF Downloads 349
7397 Assessment of ATC with Shunt FACTS Devices

Authors: Ashwani Kumar, Jitender Kumar

Abstract:

In this paper, an optimal power flow (OPF) based approach has been applied in a multi-transaction deregulated environment for ATC determination with SVC and STATCOM. The main contributions of the paper are (i) an OPF-based approach for the evaluation of ATC with multi-transactions, (ii) ATC enhancement with FACTS devices, viz. SVC and STATCOM, for intact and line contingency cases, and (iii) the impact of ZIP load on ATC determination and a comparison of the ATC obtained with SVC and STATCOM. The results have been determined for intact and line contingency cases, taking simultaneous as well as single transaction cases, for the IEEE 24-bus RTS.
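
For readers unfamiliar with the ZIP load model mentioned above, it represents a load as a mix of constant-impedance (Z), constant-current (I) and constant-power (P) components of the bus voltage. A minimal sketch follows; the coefficient and load values are assumptions for illustration, not those used in the paper.

```python
def zip_load(v_pu, p0, q0, zp=0.4, ip=0.3, pp=0.3, zq=0.4, iq=0.3, pq=0.3):
    """ZIP load model: P and Q as quadratic polynomials of per-unit voltage.
    Coefficients satisfy zp + ip + pp = 1 and zq + iq + pq = 1 (assumed values)."""
    p = p0 * (zp * v_pu ** 2 + ip * v_pu + pp)
    q = q0 * (zq * v_pu ** 2 + iq * v_pu + pq)
    return p, q

print(zip_load(0.97, p0=100.0, q0=40.0))  # load in MW, MVAr at 0.97 pu voltage
```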

Keywords: available transfer capability, FACTS devices, line contingency, multi-transactions, ZIP load model

Procedia PDF Downloads 583
7396 Dosimetric Comparison among Different Head and Neck Radiotherapy Techniques Using PRESAGE™ Dosimeter

Authors: Jalil ur Rehman, Ramesh C. Tailor, Muhammad Isa Khan, Jahnzeeb Ashraf, Muhammad Afzal, Geofferry S. Ibbott

Abstract:

Purpose: The purpose of this analysis was to investigate the dose distribution of different techniques (3D-CRT, IMRT and VMAT) for head and neck cancer using a 3-dimensional dosimeter called the PRESAGETM dosimeter. Materials and Methods: Computed tomography (CT) scans of the Radiological Physics Center (RPC) head and neck anthropomorphic phantom, with both the RPC standard insert and the PRESAGETM insert, were acquired separately with a Philips CT scanner, and both CT scans were exported via DICOM to the Pinnacle version 9.4 treatment planning system (TPS). Each plan was delivered twice to the RPC phantom, first containing the RPC standard insert holding TLD and film dosimeters, and then containing the PRESAGETM insert holding the 3D dosimeter, using a Varian TrueBeam linear accelerator. After irradiation, the standard insert, including point dose measurements (TLD) and planar Gafchromic® EBT film measurements, was read using the RPC standard procedure. The 3D dose distribution from PRESAGETM was read out with the Duke Midsized Optical Scanner dedicated to the RPC (DMOS-RPC). Dose volume histograms (DVH) and mean and maximal doses for organs at risk were calculated and compared among the head and neck techniques. The prescription dose was the same for all head and neck radiotherapy techniques, 6.60 Gy/fraction. Beam profile comparison and gamma analysis were used to quantify agreement among the film measurement, the PRESAGETM measurement, and the calculated dose distribution. Quality assurance of all plans was performed using the ArcCHECK method. Results: VMAT delivered the lowest mean and maximum doses to organs at risk (spinal cord, parotid) compared with IMRT and 3DCRT. This dose distribution was verified by absolute dose measurements using the thermoluminescent dosimeter (TLD) system. The central axial, sagittal and coronal planes were evaluated using 2D gamma map criteria (±5%/3 mm), and the results were 99.82% (axial), 99.78% (sagittal) and 98.38% (coronal) for the VMAT plan; the agreement between PRESAGE and Pinnacle was better than for the IMRT and 3D-CRT plans, excluding a 7 mm rim at the edge of the dosimeter. Profiles showed good agreement for all plans between film, PRESAGE and Pinnacle, and 3D gamma analysis was performed for the PTV and OARs, with VMAT and 3DCRT giving better agreement than IMRT. Conclusion: VMAT delivered lower mean and maximal doses to organs at risk and better PTV coverage during head and neck radiotherapy. TLD, EBT film and PRESAGETM dosimeters suggest that VMAT was better for the treatment of head and neck cancer than IMRT and 3D-CRT.
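
The gamma criterion quoted above (±5%/3 mm) combines a dose-difference tolerance and a distance-to-agreement tolerance. As an illustration only (the authors used dedicated analysis software), a brute-force 2D gamma pass-rate computation on a shared grid might look like this:

```python
import numpy as np

def gamma_pass_rate(ref, evl, spacing_mm=1.0, dose_tol=0.05, dta_mm=3.0):
    """Brute-force 2D gamma analysis with a global dose criterion.
    ref, evl: 2D dose arrays on the same grid; spacing_mm: pixel size in mm;
    dose_tol: fractional dose tolerance (0.05 = 5%); dta_mm: distance criterion."""
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    dd = dose_tol * ref.max()                       # global normalisation
    search = int(np.ceil(3 * dta_mm / spacing_mm))  # limit the search window
    gamma = np.zeros_like(ref, dtype=float)
    for iy in range(ny):
        for ix in range(nx):
            y0, y1 = max(0, iy - search), min(ny, iy + search + 1)
            x0, x1 = max(0, ix - search), min(nx, ix + search + 1)
            dist2 = ((yy[y0:y1, x0:x1] - iy) ** 2 +
                     (xx[y0:y1, x0:x1] - ix) ** 2) * spacing_mm ** 2
            diff2 = (evl[y0:y1, x0:x1] - ref[iy, ix]) ** 2
            gamma[iy, ix] = np.sqrt(np.min(dist2 / dta_mm ** 2 + diff2 / dd ** 2))
    return 100.0 * np.mean(gamma <= 1.0)

# toy example: two nearly identical dose planes should pass everywhere
ref = np.outer(np.hanning(40), np.hanning(40)) * 2.0   # ~2 Gy peak
print(gamma_pass_rate(ref, ref * 1.01))
```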

Keywords: RPC, 3DCRT, IMRT, VMAT, EBT2 film, TLD, PRESAGETM

Procedia PDF Downloads 378
7395 Development and Validation of a Green Analytical Method for the Analysis of Daptomycin Injectable by Fourier-Transform Infrared Spectroscopy (FTIR)

Authors: Eliane G. Tótoli, Hérida Regina N. Salgado

Abstract:

Daptomycin is an important antimicrobial agent used in clinical practice nowadays, since it is very active against some Gram-positive bacteria that are particularly challenging for medicine, such as methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococci (VRE). The importance of environmental preservation has been receiving special attention in recent years. Considering the evident need to protect the natural environment and the introduction of strict quality requirements regarding analytical procedures used in pharmaceutical analysis, industries must seek environmentally friendly alternatives for the analytical methods and other processes that they follow in their routine. In view of these factors, green analytical chemistry is prevalent and encouraged nowadays. In this context, infrared spectroscopy stands out. This is a method that does not use organic solvents and, although it is formally accepted for the identification of individual compounds, it also allows the quantitation of substances. Considering that there are few green analytical methods described in the literature for the analysis of daptomycin, the aim of this work was the development and validation of a green analytical method for the quantification of this drug in lyophilized powder for injectable solution by Fourier-transform infrared spectroscopy (FT-IR). Method: Translucent potassium bromide pellets containing predetermined amounts of the drug were prepared and subjected to spectrophotometric analysis in the mid-infrared region. After obtaining the infrared spectrum and with the assistance of the IR Solution software, quantitative analysis was carried out in the spectral region between 1575 and 1700 cm-1, related to a carbonyl band of the daptomycin molecule, and this band had its height analyzed in terms of absorbance. The method was validated according to ICH guidelines regarding linearity, precision (repeatability and intermediate precision), accuracy and robustness. Results and discussion: The method was shown to be linear (r = 0.9999), precise (RSD% < 2.0), accurate and robust over a concentration range from 0.2 to 0.6 mg/pellet. In addition, this technique does not use organic solvents, which is one great advantage over the most common analytical methods. This fact contributes to minimizing the generation of organic solvent waste by the industry and thereby reduces the impact of its activities on the environment. Conclusion: The validated method proved to be adequate to quantify daptomycin in lyophilized powder for injectable solution and can be used for its routine analysis in quality control. In addition, the proposed method is environmentally friendly, which is in line with the global trend.
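
For illustration, quantitation from a single-band calibration of this kind reduces to a linear fit of band height against drug amount; the numbers below are hypothetical, not the paper's data.

```python
import numpy as np

# hypothetical calibration points: drug amount per pellet (mg) vs band height (absorbance)
amount = np.array([0.2, 0.3, 0.4, 0.5, 0.6])
height = np.array([0.151, 0.224, 0.301, 0.374, 0.449])

slope, intercept = np.polyfit(amount, height, 1)
r = np.corrcoef(amount, height)[0, 1]
print(f"A = {slope:.3f} * C + {intercept:.3f},  r = {r:.4f}")

# quantify an unknown sample from its measured band height
unknown = 0.330
print(f"estimated amount: {(unknown - intercept) / slope:.3f} mg/pellet")
```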

Keywords: daptomycin, Fourier-transform infrared spectroscopy, green analytical chemistry, quality control, spectrometry in IR region

Procedia PDF Downloads 372
7394 Techno-Economic Assessments of Promising Chemicals from a Sugar Mill Based Biorefinery

Authors: Kathleen Frances Haigh, Mieke Nieder-Heitmann, Somayeh Farzad, Mohsen Ali Mandegari, Johann Ferdinand Gorgens

Abstract:

Lignocellulose can be converted to a range of biochemicals and biofuels. Where this is derived from agricultural waste, issues of competition with food are virtually eliminated. One such source of lignocellulose is the South African sugar industry. Lignocellulose could be accessed by changes to the current farming practices and investments in more efficient boilers. The South African sugar industry is struggling due to falling sugar prices and increasing costs, and it is proposed that annexing a biorefinery to a sugar mill will broaden the product range and improve viability. Process simulations of the selected chemicals were generated using Aspen Plus®. It was envisaged that a biorefinery would be annexed to a typical South African sugar mill. Bagasse would be diverted from the existing boilers to the biorefinery and mixed with harvest residues. This biomass would provide the feedstock for the biorefinery and the process energy for the biorefinery and sugar mill. Thus, in all scenarios a portion of the biomass was diverted to a new efficient combined heat and power plant (CHP). The Aspen Plus® simulations provided the mass and energy balance data to carry out an economic assessment of each scenario. The net present value (NPV), internal rate of return (IRR) and minimum selling price (MSP) were calculated for each scenario. As a starting point, scenarios were generated to investigate the production of ethanol, ethanol and lactic acid, ethanol and furfural, butanol, methanol, and Fischer-Tropsch syncrude. The bypass to the CHP plant is a useful indicator of the energy demands of the chemical processes. An iterative approach was used to identify a suitable bypass because increasing this value had the combined effect of increasing the amount of energy available and reducing the capacity of the chemical plant. Bypass values ranged from 30% for syncrude production to 50% for combined ethanol and furfural production. A hurdle rate of 15.7% was selected for the IRR. The butanol, combined ethanol and furfural, and Fischer-Tropsch syncrude scenarios are unsuitable for investment, with IRRs of 4.8%, 7.5% and 11.5%, respectively. This provides valuable insights into research opportunities. For example, furfural from sugarcane bagasse is an established process, although the integration of furfural production with ethanol is less well understood. The IRR for the ethanol scenario was 14.7%, which is below the investment criterion, but given the technological maturity it may still be considered for investment. The scenarios which met the investment criterion were the combined ethanol and lactic acid, and the methanol scenarios, with IRRs of 20.5% and 16.7%, respectively. These assessments show that the production of biochemicals from lignocellulose can be commercially viable. In addition, these assessments have provided valuable insights for research to improve the commercial viability of additional chemicals and scenarios. This has led to further assessments of the production of itaconic acid, succinic acid, citric acid, xylitol, polyhydroxybutyrate, polyethylene, glucaric acid and glutamic acid.
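
The NPV and IRR figures above follow from standard discounted-cash-flow arithmetic. A minimal sketch, with entirely hypothetical cash flows, is shown below; the 15.7% hurdle rate is the only number taken from the abstract.

```python
import numpy as np

def npv(rate, cash_flows):
    """Net present value of yearly cash flows (year 0 first)."""
    years = np.arange(len(cash_flows))
    return np.sum(np.asarray(cash_flows) / (1.0 + rate) ** years)

def irr(cash_flows, lo=-0.99, hi=1.0, tol=1e-6):
    """Internal rate of return by bisection (assumes a single sign change)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# hypothetical project: capital outlay followed by constant yearly net revenue
flows = [-250e6] + [45e6] * 20
print(f"NPV at the 15.7% hurdle rate: {npv(0.157, flows) / 1e6:.1f} M")
print(f"IRR: {irr(flows):.1%}")
```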

Keywords: biorefineries, sugar mill, methanol, ethanol

Procedia PDF Downloads 185
7393 Prismatic Bifurcation Study of a Functionally Graded Dielectric Elastomeric Tube Using Linearized Incremental Theory of Deformations

Authors: Sanjeet Patra, Soham Roychowdhury

Abstract:

In recent times, functionally graded dielectric elastomer (FGDE) has gained significant attention within the realm of soft actuation due to its dual capacity to exert highly localized stresses while maintaining its compliant characteristics on application of electro-mechanical loading. Nevertheless, the full potential of dielectric elastomers (DE) has not been explored due to their susceptibility to instabilities when subjected to electro-mechanical loads. As a result, the study and analysis of such instabilities become crucial for the design and realization of dielectric actuators. Prismatic bifurcation is a type of instability that has been recognized in a DE tube. Though several studies have reported on the analysis of prismatic bifurcation in an isotropic DE tube, there are few studies related to the prismatic bifurcation of FGDE tubes. Therefore, this paper aims to determine the onset of prismatic bifurcations in an incompressible FGDE tube subjected to electrical loading across the thickness of the tube and internal pressurization. The analysis has been conducted by imposing two axial boundary conditions on the tube, specifically axially free ends and axially clamped ends. Additionally, the rigidity modulus of the tube has been linearly graded in the direction of thickness, where the inner surface of the tube has a lower stiffness than the outer surface. The static equilibrium equations for deformation of the axisymmetric tube are derived and solved using a numerical technique. The condition for prismatic bifurcation of the axisymmetric static equilibrium solutions has been obtained by using the linearized incremental constitutive equations. Two modes of bifurcation, corresponding to two different non-circular cross-sectional geometries, have been explored in this study. The outcomes reveal that the FGDE tube experiences prismatic bifurcation before the Hessian criterion of failure is satisfied. It is observed that the lower mode of bifurcation can be triggered at a lower critical voltage as compared to the higher mode of bifurcation. Furthermore, tubes with a larger stiffness gradient require higher critical voltages for triggering the bifurcation. Moreover, with the increase in stiffness gradient, a linear variation of the critical voltage is observed with the thickness of the tube. It has been found that on applying internal pressure to a tube with low thickness, the tube becomes less susceptible to bifurcations. A thicker tube with axially free ends is found to be more stable than an axially clamped tube at the higher mode of bifurcation.

Keywords: critical voltage, functionally graded dielectric elastomer, linearized incremental approach, modulus of rigidity, prismatic bifurcation

Procedia PDF Downloads 69
7392 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction

Authors: Radul Shishkov, Orlin Davchev

Abstract:

The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in time and resources required for BIM model generation while maintaining high levels of accuracy and detail. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
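
The abstract does not disclose the element-detection algorithms; as a generic illustration of the kind of geometric primitive involved, a plain-numpy RANSAC plane fit (e.g., for extracting a wall or floor candidate from a point cloud) could look like this:

```python
import numpy as np

def ransac_plane(points, n_iters=500, dist_thresh=0.02, rng=None):
    """Fit a dominant plane (e.g. a wall or floor) to an Nx3 numpy point cloud
    with a simple RANSAC loop; returns (normal, d, inlier_mask)."""
    rng = rng or np.random.default_rng(0)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = (np.array([0.0, 0.0, 1.0]), 0.0)
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:            # degenerate (collinear) sample, skip it
            continue
        normal /= norm
        d = -normal @ p0
        dist = np.abs(points @ normal + d)      # point-to-plane distances
        inliers = dist < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    return best_model[0], best_model[1], best_inliers
```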

Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction

Procedia PDF Downloads 46
7391 Application of Granular Computing Paradigm in Knowledge Induction

Authors: Iftikhar U. Sikder

Abstract:

This paper illustrates an application of the granular computing approach, namely rough set theory, in data mining. The paper outlines the formalism of granular computing and elucidates the mathematical underpinning of rough set theory, which has been widely used by the data mining and machine learning communities. A real-world application is illustrated, and the classification performance is compared with other contending machine learning algorithms. The predictive performance of the rough set rule induction model shows comparative success with respect to the other contending algorithms.
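
For readers new to rough sets, concept approximation reduces to checking which indiscernibility classes fall inside or overlap a target concept. A minimal, self-contained sketch (toy decision table, not the paper's case study):

```python
from collections import defaultdict

def approximations(objects, attributes, concept):
    """Rough-set lower/upper approximation of `concept` (a set of object ids)
    under the indiscernibility relation induced by `attributes`.
    `objects` maps object id -> dict of attribute values."""
    blocks = defaultdict(set)
    for oid, values in objects.items():
        blocks[tuple(values[a] for a in attributes)].add(oid)
    lower, upper = set(), set()
    for block in blocks.values():
        if block <= concept:
            lower |= block      # block entirely inside the concept
        if block & concept:
            upper |= block      # block overlaps the concept
    return lower, upper

objs = {1: {"a": 0, "b": 1}, 2: {"a": 0, "b": 1}, 3: {"a": 1, "b": 0}, 4: {"a": 1, "b": 1}}
print(approximations(objs, ["a", "b"], concept={1, 3}))  # ({3}, {1, 2, 3})
```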

Keywords: concept approximation, granular computing, reducts, rough set theory, rule induction

Procedia PDF Downloads 522
7390 A Professional Learning Model for Schools Based on School-University Research Partnering That Is Underpinned and Structured by a Micro-Credentialing Regime

Authors: David Lynch, Jake Madden

Abstract:

There exists a body of literature that reports on the many benefits of partnerships between universities and schools, especially in terms of teaching improvement and school reform. This is because such partnerships can build significant teaching capital by deepening and expanding the skillsets and mindsets needed to create the connections that support ongoing and embedded teacher professional development and career goals. At the same time, this literature is critical of such initiatives when the partnership outcomes are short-term or one-sided, misaligned to fundamental problems, and not expressly focused on building the desired teaching capabilities. In response to this situation, research conducted by Professor David Lynch and his TeachLab research team has begun to shed light on the strengths and limitations of school/university partnerships, via the identification of key conceptual elements that appear to act as critical partnership success factors. These elements are theorised as an interplay between professional knowledge acquisition, readiness, talent management and organisational structure. However, knowledge of how these elements are established, and how they manifest within the school and its teaching workforce as an overall system, remains incomplete. Therefore, research designed to more clearly delineate these elements in relation to their impact on school/university partnerships is required. It is within this context that this paper reports on the development and testing of a Professional Learning (PL) model for schools and their teachers that incorporates school-university research partnering within a systematic, whole-of-school PL strategy that is underpinned and structured by a micro-credentialing (MC) regime. MC involves learning a narrow-focused certificate (a micro-credential) in a specific topic area (e.g., 'How to Differentiate Instruction for English as a Second Language Students') and is embedded in the teacher's day-to-day teaching work. The use of MC is viewed as important to the efficacy and sustainability of teacher PL because it (1) provides an evidence-based framework for teacher learning, (2) has the ability to promote teacher social capital and (3) engenders lifelong learning, keeping professional skills current in an embedded and seamless-to-work manner. The associated research is centred on a primary school in Australia (P-6) that acted as an arena to co-develop, test/investigate and report on outcomes for teacher PL that uses MC to support a whole-of-school partnership with a university.

Keywords: teaching improvement, teacher professional learning, talent management, education partnerships, school-university research

Procedia PDF Downloads 76
7389 Electroforming of 3D Digital Light Processing Printed Sculptures Used as a Low Cost Option for Microcasting

Authors: Cecile Meier, Drago Diaz Aleman, Itahisa Perez Conesa, Jose Luis Saorin Perez, Jorge De La Torre Cantero

Abstract:

In this work, two ways of creating small-sized metal sculptures are proposed: the first by means of microcasting and the second by electroforming, from models printed in 3D using an FDM (Fused Deposition Modeling) printer or a DLP (Digital Light Processing) printer. It is viable to replace the wax in artistic foundry processes with 3D printed objects. In this technique, the digital models are manufactured in polylactic acid (PLA) using a low-cost FDM 3D printer. This material is used because its properties make it a viable substitute for wax within the processes of artistic casting with the lost wax technique of Ceramic Shell casting. This technique consists of covering a sculpture of wax, or in this case PLA, with several layers of thermoresistant material. This material is heated to melt the PLA, obtaining an empty mold that is later filled with the molten metal. It is verified that the PLA models reduce the cost and time compared with hand modeling of the wax. In addition, one can manufacture parts with 3D printing that are not possible to create with manual techniques. However, the sculptures created with this technique have a size limit. The problem is that when pieces printed with PLA are very small, they lose detail, and the laminar texture hides the shape of the piece. A DLP-type printer allows obtaining smaller and more detailed pieces than the FDM. Such small models are quite difficult and complex to melt using the lost wax technique of Ceramic Shell casting. But, as an alternative, there are microcasting and electroforming, which are specialized in creating small metal pieces such as jewelry. Microcasting is a variant of lost wax casting that consists of introducing the model into a cylinder in which the refractory material is also poured. The molds are heated in an oven to melt the model and fire them. Finally, the metal is poured into the still hot cylinders, which rotate in a machine at high speed to properly distribute all the metal. Because microcasting requires expensive material and machinery to melt a piece of metal, electroforming is an alternative to this process. Electroforming uses models in different materials; for this study, micro-sculptures printed in 3D are used. These are subjected to an electroforming bath that covers the pieces with a very thin layer of metal. This work investigates the recommended sizes for using 3D printers, both with PLA and resin, and first tests are being done to validate the use of the electroforming process on micro-sculptures printed in resin using a DLP printer.

Keywords: sculptures, DLP 3D printer, microcasting, electroforming, fused deposition modeling

Procedia PDF Downloads 127
7388 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: Gaelle Candel, David Naccache

Abstract:

t-SNE is an embedding method that the data science community has widely used. It supports two main tasks: displaying results by coloring items according to their class or feature value, and, for forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are the structure preservation property and its answer to the crowding problem, where all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, where the cluster area is proportional to its size in number, and relationships between clusters are materialized by closeness on the embedding. This algorithm is non-parametric: the transformation from a high- to a low-dimensional space is described but not learned, and two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together. However, this process is costly, as the complexity of t-SNE is quadratic, and it would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of data. While this approach is highly scalable, points could be mapped at the same exact position, making them indistinguishable. This type of model would be unable to adapt to new outliers or concept drift. This paper presents a methodology to reuse an embedding to create a new one, where cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once with the newly obtained embedding. The successive embeddings can be used to study the impact of one variable over the dataset distribution or to monitor changes over time. This method has the same complexity as t-SNE per embedding, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity would be reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing observation of the birth, evolution, and death of clusters. The proposed approach facilitates identifying significant trends and changes, which empowers the monitoring of the dynamics of high-dimensional datasets.
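
The paper's own two-cost optimization is not reproduced here; as a rough, hypothetical sketch of the general idea of reusing a support embedding, one can initialise t-SNE for a new batch from the positions of its nearest neighbours in the existing embedding (scikit-learn accepts an array as init). The function name and jitter value are illustrative assumptions.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.neighbors import NearestNeighbors

def embed_with_support(support_X, support_emb, new_X, perplexity=30):
    """Sketch: start the embedding of a new batch from the 2D positions of its
    nearest neighbours in an existing (support) embedding, so that cluster
    locations stay roughly coherent across successive embeddings."""
    nn = NearestNeighbors(n_neighbors=1).fit(support_X)
    _, idx = nn.kneighbors(new_X)
    init = support_emb[idx[:, 0]].astype(np.float64)
    # small jitter so points sharing an initial position can still separate
    init += 1e-4 * np.random.default_rng(0).standard_normal(init.shape)
    return TSNE(n_components=2, init=init, perplexity=perplexity).fit_transform(new_X)
```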

Keywords: concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning

Procedia PDF Downloads 135
7387 Application of Signature Verification Models for Document Recognition

Authors: Boris M. Fedorov, Liudmila P. Goncharenko, Sergey A. Sybachin, Natalia A. Mamedova, Ekaterina V. Makarenkova, Saule Rakhimova

Abstract:

In modern economic conditions, the question of the possibility of correctly recognizing a signature on digital documents, in order to verify the expression of will or confirm a certain operation, is relevant. The additional complexity of processing lies in the dynamic variability of the signature for each individual, as well as in the way the information is processed, because the signature refers to biometric data. The article discusses the use of artificial intelligence models in order to improve the quality of signature confirmation in document recognition. An analysis of several possible options for using such models is carried out. The results of the study are presented, showing that it is possible to correctly determine the authenticity of a signature even on small samples.

Keywords: signature recognition, biometric data, artificial intelligence, neural networks

Procedia PDF Downloads 138
7386 Bidirectional Pendulum Vibration Absorbers with Homogeneous Variable Tangential Friction: Modelling and Design

Authors: Emiliano Matta

Abstract:

Passive resonant vibration absorbers are among the most widely used dynamic control systems in civil engineering. They typically consist of a single-degree-of-freedom mechanical appendage of the main structure, tuned to one structural target mode through frequency and damping optimization. One classical scheme is the pendulum absorber, whose mass is constrained to move along a curved trajectory and is damped by viscous dashpots. Even though the principle is well known, the search for improved arrangements is still under way. In recent years this investigation inspired a type of bidirectional pendulum absorber (BPA), consisting of a mass constrained to move along an optimal three-dimensional (3D) concave surface. For such a BPA, the surface principal curvatures are designed to ensure a bidirectional tuning of the absorber to both principal modes of the main structure, while damping is produced either by horizontal viscous dashpots or by vertical friction dashpots connecting the BPA to the main structure. In this paper, a variant of the BPA is proposed, where damping originates from the variable tangential friction force which develops between the pendulum mass and the 3D surface as a result of a spatially-varying friction coefficient pattern. Namely, a friction coefficient is proposed that varies along the pendulum surface in proportion to the modulus of the 3D surface gradient. With such an assumption, the dissipative model of the absorber can be proven to be nonlinear homogeneous in the small displacement domain. The resulting homogeneous BPA (HBPA) has a fundamental advantage over conventional friction-type absorbers, because its equivalent damping ratio is independent of the amplitude of oscillations, and therefore its optimal performance does not depend on the excitation level. On the other hand, the HBPA is more compact than viscously damped BPAs because it does not need the installation of dampers. This paper presents the analytical model of the HBPA and an optimal methodology for its design. Numerical simulations of single- and multi-story building structures under wind and earthquake loads are presented to compare the HBPA with classical viscously damped BPAs. It is shown that the HBPA is a promising alternative to existing BPA types and that homogeneous tangential friction is an effective means to realize systems provided with amplitude-independent damping.
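
The friction law described above can be written compactly as mu(x, y) = c * |grad S(x, y)|, where S is the pendulum surface. A small sketch follows; the proportionality constant and the paraboloid surface are chosen purely for illustration and are not taken from the paper.

```python
import numpy as np

def friction_coefficient(grad_s, c=0.05):
    """Spatially varying friction coefficient proportional to the modulus of
    the surface gradient: mu = c * |grad S|. The constant c is an assumed value."""
    return c * np.linalg.norm(grad_s, axis=-1)

# illustrative surface S(x, y) = x**2/(2*Rx) + y**2/(2*Ry), so grad S = (x/Rx, y/Ry);
# Rx and Ry stand in for curvature radii tuned to the two target modes
Rx, Ry = 2.0, 3.0
x, y = 0.10, -0.05
print(friction_coefficient(np.array([x / Rx, y / Ry])))
```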

Keywords: amplitude-independent damping, homogeneous friction, pendulum nonlinear dynamics, structural control, vibration resonant absorbers

Procedia PDF Downloads 138
7385 A Survey on Various Technique of Modified TORA over MANET

Authors: Shreyansh Adesara, Sneha Pandiya

Abstract:

The mobile ad-hoc network (MANET) is an important and open research area for the examination and determination of performance evolution. The Temporally Ordered Routing Algorithm (TORA) is an adaptive, distributed MANET routing algorithm which is totally dependent on the Internet MANET Encapsulation Protocol (IMEP) for link detection and link sensing. If IMEP detects a false link failure, then the network suffers from congestion and unnecessary route maintenance. Thus, improvements to the link detection method of TORA have been introduced by various authors through different modifications of IMEP, each from a different perspective. Different reactive routing protocols such as AODV, TORA and DSR have also been compared, to build knowledge of the routing scenario, for different parameters and using different models.

Keywords: IMEP, mobile ad-hoc network, protocol, TORA

Procedia PDF Downloads 436
7384 Mindset Change: Unlocking the Potential for Community-Based Rural Development in Uganda

Authors: Daisy Owomugasho Ndikuno

Abstract:

The paper explores the extent to which mindset change has been critical in rural community development in Uganda. It is descriptive research with the Parish Development Model as a case study. The results show that rural community development is possible and that its success largely depends on harnessing local resources and knowledge; leveraging education, empowerment and awareness; creating sustainable livelihoods and encouraging entrepreneurship and innovation; access to financial resources; and building collaborative networks and partnerships. In all of these, the role of mindset change is critical. By instilling a positive, collaborative and innovative mindset, rural communities can overcome challenges and chart a path towards sustainable development.

Keywords: community, development, mindset, change

Procedia PDF Downloads 62
7383 Detection, Isolation, and Raman Spectroscopic Characterization of Acute and Chronic Staphylococcus aureus Infection in an Endothelial Cell Culture Model

Authors: Astrid Tannert, Anuradha Ramoji, Christina Ebert, Frederike Gladigau, Lorena Tuchscherr, Jürgen Popp, Ute Neugebauer

Abstract:

Staphylococcus aureus is a facultative intracellular pathogen which, by entering host cells, may evade the immunologic host response as well as antimicrobial treatment. In that way, S. aureus can cause persistent intracellular infections which are difficult to treat. Depending on the strain, S. aureus may persist at different intracellular locations, such as the phagolysosome. The first barrier that pathogens invading from the bloodstream have to cross is the endothelial cells lining the inner surface of blood and lymphatic vessels. Upon proceeding from an acute to a chronic infection, intracellular pathogens undergo certain biochemical and structural changes, including a deceleration of metabolic processes to adapt for long-term intracellular survival and the development of a special phenotype designated as the small colony variant. In this study, the endothelial cell line Ea.hy 926 was used as a model for acute and chronic S. aureus infection. To this end, Ea.hy 926 cells were cultured on QIAscout™ Microraft Arrays, a special graded cell culture substrate that contains around 12,000 microrafts of 200 µm edge length. After attachment to the substrate, the endothelial cells were infected with GFP-expressing S. aureus for 3 weeks. The acute infection and the development of persistent bacteria were followed by confocal laser scanning microscopy, scanning the whole Microraft Array every second day for the presence of fluorescent intracellular bacteria and for detailed determination of their intracellular location. After three weeks of infection, representative microrafts containing infected cells, cells with protruded infections, and cells that never showed any infection were isolated and fixed for Raman micro-spectroscopic investigation. For comparison, microrafts with acute infection were also isolated. The acquired Raman spectra are correlated with the fluorescence microscopic images to give hints about a) the molecular alterations in endothelial cells during acute and chronic infection compared to non-infected cells, and b) metabolic and structural changes within the pathogen when entering a mode of persistence within host cells. We thank Dr. Ruth Kläver from QIAGEN GmbH for her support regarding QIAscout technology. Financial support by the BMBF via the CSCC (FKZ 01EO1502) and from the DFG via the Jena Biophotonic and Imaging Laboratory (JBIL, FKZ PO 633/29-1, BA 1601/10-1) is highly acknowledged.

Keywords: correlative image analysis, intracellular infection, pathogen-host adaption, Raman micro-spectroscopy

Procedia PDF Downloads 173
7382 Examining the Effects of Ticket Bundling Strategies and Team Identification on Purchase of Hedonic and Utilitarian Options

Authors: Young Ik Suh, Tywan G. Martin

Abstract:

Bundling strategy is a common marketing practice today. In the past decades, both academicians and practitioners have increasingly emphasized the strategic importance of bundling in today's markets. The reason for the increased interest in bundling strategy is the belief that it can significantly increase profits on an organization's sales over time and that it is convenient for the customer. However, little effort has been made on ticket bundling and purchase considerations for hedonic and utilitarian options in the sport consumer behavior context. Consumers often face choices between utilitarian and hedonic alternatives in decision making. When consumers purchase certain products, they are only interested in the functional dimensions, which are called utilitarian dimensions. On the other hand, others focus more on hedonic features such as fun, excitement, and pleasure. Thus, the current research examines how utilitarian and hedonic consumption can vary in a typical ticket purchasing process. The purpose of this research is to understand the following two research themes: (1) the differential effect of discount framing on ticket bundling with utilitarian and hedonic options, and (2) the moderating effect of team identification on ticket bundling. In order to test the research hypotheses, an experimental study using a two-way ANOVA, 3 (team identification: low, medium, and high) X 2 (discount frame: ticket bundle sales with a utilitarian product, and with a hedonic product), with a mixed factorial design, will be conducted to determine whether there is a statistically significant difference between purchase intentions for the two discount frames of ticket bundle sales within different team identification levels. To compare mean differences among the two different settings, we will create two conditions of ticket bundles: (1) offering a discount on a ticket ($5 off) if it is purchased along with a utilitarian product (e.g., iPhone8 case, t-shirt, cap), and (2) offering a discount on a ticket ($5 off) if it is purchased along with a hedonic product (e.g., pizza, drink, fans featured on the big screen). The findings of the current ticket bundling study are expected to have many theoretical and practical contributions and implications by extending the research and literature pertaining to the relationship between team identification and sport consumer behavior. Specifically, this study can provide a reliable and valid framework to understand the role of team identification as a moderator of behavioral intentions such as purchase intentions. From an academic perspective, the study will be the first known attempt to understand consumer reactions toward different discount frames related to ticket bundling. Even though the game ticket itself is the major commodity of sport event attendance and is significantly related to teams' revenue streams, most recent ticket pricing research has been done in terms of economic or cost-oriented pricing and not from a consumer psychological perspective. For sport practitioners, this study will also provide significant implications. The results will imply that sport marketers may need to develop two different ticketing promotions for loyal and non-loyal fans. Since loyal fans are more concerned with ticket price than with tie-in products when they see ticket bundle sales, advertising campaigns should focus more on discounting the ticket price.
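
For illustration, the 3 X 2 analysis described above can be run as a factorial ANOVA; the sketch below uses simulated purchase-intention scores and treats both factors as between-subjects, which simplifies the mixed design mentioned in the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_per_cell = 30  # hypothetical sample size per cell
rows = []
for ident in ["low", "medium", "high"]:
    for frame in ["utilitarian", "hedonic"]:
        # hypothetical purchase-intention scores on a 7-point scale
        rows += [{"identification": ident, "frame": frame,
                  "purchase_intention": rng.normal(4.0, 1.0)}
                 for _ in range(n_per_cell)]
df = pd.DataFrame(rows)

model = smf.ols("purchase_intention ~ C(identification) * C(frame)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # main effects and the interaction term
```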

Keywords: ticket bundling, hedonic, utilitarian, team identification

Procedia PDF Downloads 159
7381 Optimization for the Hydraulic Clamping System of an Internal Circulation Two-Platen Injection Molding Machine

Authors: Jian Wang, Lu Yang, Jiong Peng

Abstract:

The internal circulation two-platen clamping system for injection molding machines (IMM) has many potential advantages for energy saving. In order to estimate its properties, experiments were carried out in this paper. The displacement and pressure of the components were measured. For comparison, a model of the hydraulic clamping system was established using AMESim. The related parameters as well as the energy consumption could be calculated. According to the analysis, the hydraulic system was optimized in order to reduce the energy consumption.

Keywords: AMESim, energy-saving, injection molding machine, internal circulation

Procedia PDF Downloads 542
7380 A Human Activity Recognition System Based on Sensory Data Related to Object Usage

Authors: M. Abdullah Al-Wadud

Abstract:

Sensor-based activity recognition systems usually account only for which sensors have been activated to perform an activity. The system then combines the conditional probabilities of those sensors to represent different activities and makes the decision based on that. However, information about the sensors which are not activated may also be of great help in deciding which activity has been performed. This paper proposes an approach where the sensory data related to both usage and non-usage of objects are utilized to make the classification of activities. Experimental results also show the promising performance of the proposed method.
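
A Bernoulli naive Bayes classifier captures this idea directly, since it also multiplies in the probability of each object not being used. A toy sketch follows; the object names and data are invented for illustration and are not the paper's model.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

# hypothetical binary object-usage matrix: rows = activity instances,
# columns = objects/sensors (1 = used, 0 = not used during the activity)
X = np.array([[1, 1, 0, 0],   # kettle, cup        -> "make tea"
              [1, 1, 0, 0],
              [0, 0, 1, 1],   # toothbrush, tap    -> "brush teeth"
              [0, 0, 1, 1],
              [1, 0, 0, 1]])  # kettle, tap        -> "make tea"
y = ["make tea", "make tea", "brush teeth", "brush teeth", "make tea"]

# BernoulliNB multiplies P(x_j = 0 | class) for unused objects as well,
# so the *absence* of an activation also contributes evidence
clf = BernoulliNB(alpha=1.0).fit(X, y)
print(clf.predict([[1, 0, 0, 0]]))  # only the kettle fired
```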

Keywords: Naïve Bayesian-based classification, activity recognition, sensor data, object-usage model

Procedia PDF Downloads 313
7379 Research on Aerodynamic Brake Device for High-Speed Train

Authors: S. Yun, M. Kwak

Abstract:

This study concerns an aerodynamic brake device for a high-speed train. In order to apply an aerodynamic brake device, the influence of the device on a high-speed train was studied aerodynamically, acoustically and dynamically. A wind tunnel test was conducted with a 1/30 scale model to predict the effect on braking distance reduction. Aerodynamic drag increases by 244% with a brake panel at a 90 degree angle. The braking distance for an emergency state was predicted to decrease by 13%.

Keywords: aerodynamic brake, braking distance, drag coefficient, high-speed train, wind-tunnel test

Procedia PDF Downloads 313
7378 Geostatistical Analysis of Contamination of Soils in an Urban Area in Ghana

Authors: S. K. Appiah, E. N. Aidoo, D. Asamoah Owusu, M. W. Nuonabuor

Abstract:

Urbanization remains one of the predominant factors linked to the destruction of the urban environment and the associated soil contamination by heavy metals through natural and anthropogenic activities. These activities are important sources of toxic heavy metals such as arsenic (As), cadmium (Cd), chromium (Cr), copper (Cu), iron (Fe), manganese (Mn), lead (Pb), nickel (Ni) and zinc (Zn). Often, these heavy metals reach increased levels in some areas due to the impact of atmospheric deposition caused by proximity to industrial plants or the indiscriminate burning of substances. Information gathered on potentially hazardous levels of these heavy metals in soils helps to establish the serious health and urban agriculture implications. However, the characterization of spatial variations of soil contamination by heavy metals in Ghana is limited. Kumasi is a metropolitan city in Ghana, West Africa, and is challenged by a recent spate of deteriorating soil quality due to rapid economic development and other human activities such as 'Galamsey', illegal mining operations within the metropolis. The paper seeks to use both univariate and multivariate geostatistical techniques to assess the spatial distribution of heavy metals in soils and the potential risk associated with ingestion of sources of soil contamination in the metropolis. Geostatistical tools have the ability to detect changes in correlation structure, and a good knowledge of the study area can help to explain the different scales of variation detected. To achieve this task, point-referenced data on heavy metals, measured from topsoil samples in a previous study, were collected at various locations. Linear models of regionalisation and coregionalisation were fitted to all experimental semivariograms to describe the spatial dependence between the topsoil heavy metals at different spatial scales, which led to ordinary kriging and cokriging at unsampled locations and the production of risk maps of soil contamination by these heavy metals. Results obtained from both the univariate and multivariate semivariogram models showed strong spatial dependence, with autocorrelation ranges from 100 to 300 meters. The risk maps produced show strong spatial heterogeneity for almost all the soil heavy metals, with extreme risk of contamination found close to areas with commercial and industrial activities. Hence, ongoing pollution interventions should be geared towards these high-risk areas for efficient management of soil contamination to avert further pollution in the metropolis.
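
The semivariogram fitting mentioned above starts from the empirical semivariogram of the point-referenced samples. A minimal sketch is given below; the bin width and number of lags are placeholders, not the study's values.

```python
import numpy as np

def empirical_semivariogram(coords, values, lag_width=50.0, n_lags=10):
    """Empirical semivariogram gamma(h) = 0.5 * mean[(z_i - z_j)**2] over point
    pairs whose separation distance falls in each lag bin (lag_width in metres).
    coords: (N, 2) array of sample locations; values: (N,) metal concentrations."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    i, j = np.triu_indices(len(values), k=1)   # each pair counted once
    lags, gammas = [], []
    for k in range(n_lags):
        mask = (d[i, j] >= k * lag_width) & (d[i, j] < (k + 1) * lag_width)
        if mask.any():
            lags.append((k + 0.5) * lag_width)
            gammas.append(sq[i, j][mask].mean())
    return np.array(lags), np.array(gammas)
```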

Keywords: coregionalization, heavy metals, multivariate geostatistical analysis, soil contamination, spatial distribution

Procedia PDF Downloads 289
7377 Two-Stage Flowshop Scheduling with Unsystematic Breakdowns

Authors: Fawaz Abdulmalek

Abstract:

The two-stage flowshop assembly scheduling problem is considered in this paper. There is more than one parallel machine at stage one and an assembly machine at stage two. The jobs are processed in the flowshop based on Johnson's rule and two extensions of Johnson's rule. A simulation model of the two-stage flowshop is constructed in which both machines at stage one are subject to random failures. Three simulation experiments are conducted to test the effect of the three job-ranking rules on the makespan. The Johnson Largest heuristic outperformed both Johnson's rule and the Johnson Smallest heuristic in two of the experiments across all scenarios, with each experiment comprising five scenarios.
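
For reference, the classical two-machine Johnson rule that these ranking heuristics build on can be stated in a few lines; the job data below are invented for illustration and the sketch ignores the parallel machines and breakdowns studied in the paper.

```python
def johnson_order(jobs):
    """Johnson's rule for the two-machine flowshop: jobs maps name -> (p1, p2).
    Jobs with p1 <= p2 come first in increasing p1; the rest follow in decreasing p2."""
    front = sorted((n for n, (p1, p2) in jobs.items() if p1 <= p2),
                   key=lambda n: jobs[n][0])
    back = sorted((n for n, (p1, p2) in jobs.items() if p1 > p2),
                  key=lambda n: jobs[n][1], reverse=True)
    return front + back

def makespan(order, jobs):
    """Completion time of the last job on machine 2 for a given sequence."""
    t1 = t2 = 0
    for n in order:
        p1, p2 = jobs[n]
        t1 += p1
        t2 = max(t2, t1) + p2
    return t2

jobs = {"A": (3, 6), "B": (5, 2), "C": (1, 2), "D": (6, 6), "E": (7, 5)}
seq = johnson_order(jobs)
print(seq, makespan(seq, jobs))
```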

Keywords: flowshop scheduling, random failures, johnson rule, simulation

Procedia PDF Downloads 330
7376 Long Term Survival after a First Transient Ischemic Attack in England: A Case-Control Study

Authors: Padma Chutoo, Elena Kulinskaya, Ilyas Bakbergenuly, Nicholas Steel, Dmitri Pchejetski

Abstract:

Transient ischaemic attacks (TIAs) are warning signs for future strokes. TIA patients are at increased risk of stroke and cardiovascular events after a first episode. A majority of studies on TIA have focused on the occurrence of these ancillary events after a TIA. Long-term mortality after TIA has received only limited attention. We undertook this study to determine the long-term hazards of all-cause mortality following a first episode of a TIA using anonymised electronic health records (EHRs). This was a retrospective case-control study using electronic primary health care records from The Health Improvement Network (THIN) database. Patients born in or before 1960, resident in England, with a first diagnosis of TIA between January 1986 and January 2017 were matched to three controls on age, sex and general medical practice. The primary outcome was all-cause mortality. The hazards of all-cause mortality were estimated using a time-varying Weibull-Cox survival model which included both scale and shape effects and a random frailty effect of GP practice. 20,633 cases and 58,634 controls were included. Cases aged 39 to 60 years at the first TIA event had the highest hazard ratio (HR) of mortality compared to matched controls (HR = 3.04, 95% CI (2.91 - 3.18)). The HRs for cases aged 61-70 years, 71-76 years and 77+ years were 1.98 (1.55 - 2.30), 1.79 (1.20 - 2.07) and 1.52 (1.15 - 1.97) compared to matched controls. Aspirin provided long-term survival benefits to cases. Cases aged 39-60 years on aspirin had HRs of 0.93 (0.84 - 1.00), 0.90 (0.82 - 0.98) and 0.88 (0.80 - 0.96) at 5 years, 10 years and 15 years, respectively, compared to cases in the same age group who were not on antiplatelets. Similar beneficial effects of aspirin were observed in other age groups. There were no significant survival benefits with other antiplatelet options. No survival benefits of antiplatelet drugs were observed in controls. Our study highlights the excess long-term risk of death of TIA patients and cautions that TIA should not be treated as a benign condition. The study further recommends aspirin as the better option for secondary prevention in TIA patients, compared to the clopidogrel recommended by NICE guidelines. Management of risk factors and treatment strategies remain important challenges in reducing the burden of disease.
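
The hazard ratios above come from a time-varying Weibull-Cox model with a GP-practice frailty term, which is not reproduced here; as a rough orientation only, a plain Cox proportional-hazards fit on simulated data (using the lifelines package, with invented covariates and effect sizes) looks like this:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 400
tia = rng.integers(0, 2, n)        # 1 = TIA case, 0 = matched control
aspirin = rng.integers(0, 2, n)
# assumed hazard: a TIA roughly triples the death rate, aspirin lowers it slightly
rate = 0.02 * np.exp(1.1 * tia - 0.1 * aspirin)
time = rng.exponential(1.0 / rate)
death = (time < 15).astype(int)    # administrative censoring at 15 years

df = pd.DataFrame({"time": np.minimum(time, 15.0), "death": death,
                   "tia": tia, "aspirin": aspirin})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="death")
cph.print_summary()   # the exp(coef) column gives the hazard ratios
```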

Keywords: dual antiplatelet therapy (DAPT), general practice, multiple imputation, The Health Improvement Network (THIN), hazard ratio (HR), Weibull-Cox model

Procedia PDF Downloads 135
7375 Towards a Goal-Question-Metric Based Approach to Assess Social Sustainability of Software Systems

Authors: Rahma Amri, Narjès Bellamine Ben Saoud

Abstract:

Sustainable development, or sustainability, is one of the most urgent issues in current debate across almost all domains. In particular, the significant way in which software pervades our lives should place it at the center of sustainability concerns. The social aspects of sustainability have not been well studied in the context of software systems and remain an immature research field that needs more interest from the research community. This paper presents a Goal-Question-Metric based approach to assess the social sustainability of software systems. The approach is based on a generic social sustainability model taken from the social sciences.
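
In the Goal-Question-Metric paradigm, each goal is refined into questions and each question into measurable metrics. A small, entirely hypothetical tree is shown below; the actual goals, questions and metrics would come from the authors' social sustainability model.

```python
# hypothetical GQM tree for one social-sustainability goal
gqm = {
    "goal": "Ensure the software system supports social sustainability for its users",
    "questions": [
        {"question": "How accessible is the system to diverse user groups?",
         "metrics": ["WCAG conformance level",
                     "share of the UI available in local languages"]},
        {"question": "Does the system protect users' privacy?",
         "metrics": ["number of personal data items stored",
                     "consent coverage (%)"]},
    ],
}

for q in gqm["questions"]:
    print(q["question"])
    for m in q["metrics"]:
        print("   metric:", m)
```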

Keywords: software assessment approach, social sustainability, goal-question-metric paradigm, software project metrics

Procedia PDF Downloads 380
7374 Caged Compounds as Light-Dependent Initiators for Enzyme Catalysis Reactions

Authors: Emma Castiglioni, Nigel Scrutton, Derren Heyes, Alistair Fielding

Abstract:

By using light as a trigger, it is possible to study many biological processes, such as the activity of genes, proteins, and other molecules, with precise spatiotemporal control. Caged compounds, where biologically active molecules are generated from an inert precursor upon laser photolysis, offer the potential to initiate such biological reactions with high temporal resolution. As light acts as the trigger for cleaving the protecting group, the 'caging' technique provides a number of advantages, as it can be intracellular, rapid and controlled in a quantitative manner. We are developing caging strategies to study the catalytic cycle of a number of enzyme systems, such as nitric oxide synthase and ethanolamine ammonia lyase. These include the use of caged substrates, caged electrons and the possibility of caging the enzyme itself. In addition, we are developing a novel freeze-quench instrument to study these reactions, which combines rapid mixing and flashing capabilities. Reaction intermediates will be trapped at low temperatures and analysed by using electron paramagnetic resonance (EPR) spectroscopy to identify the involvement of any radical species during catalysis. EPR techniques typically require relatively long measurement times and, very often, low temperatures to fully characterise these short-lived species. Therefore, common rapid mixing techniques, such as stopped-flow or quench-flow, are not directly suitable. However, the combination of rapid freeze-quench (RFQ) followed by EPR analysis provides the ideal approach to kinetically trap and spectroscopically characterise these transient radical species. In a typical RFQ experiment, two reagent solutions are delivered to the mixer via two syringes driven by a pneumatic actuator or stepper motor. The newly mixed solution is then sprayed into a cryogenic liquid or onto a cryogenic surface, and the frozen sample is collected and packed into an EPR tube for analysis. The earliest RFQ instruments consisted of a hydraulic ram unit as a drive unit with direct spraying of the sample into a cryogenic liquid (nitrogen, isopentane or petroleum). Improvements to the RFQ technique have arisen from the design of new mixers in order to reduce both the volume and the mixing time. In addition, the cryogenic isopentane bath has been coupled to a filtering system or replaced by spraying the solution onto a surface that is frozen via thermal conductivity with a cryogenic liquid. In our work, we are developing a novel RFQ instrument which combines freeze-quench technology with flashing capabilities to enable studies of both thermally-activated and light-activated biological reactions. This instrument also uses a new rotating plate design based on magnetic couplings and removes the need for mechanical motorised rotation, which can otherwise be problematic at cryogenic temperatures.

Keywords: caged compounds, freeze-quench apparatus, photolysis, radicals

Procedia PDF Downloads 203