Search results for: software defined radios
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7525

6625 Diagnostics and Explanation of the Current Status of the 40-Year Railway Viaduct

Authors: Jakub Zembrzuski, Bartosz Sobczyk, Mikołaj Miśkiewicz

Abstract:

Besides designing new constructions, engineers all over the world must face another problem – maintenance, repairs, and assessment of the technical condition of existing bridges. To solve more complex issues, it is necessary to be familiar with the theory of the finite element method and to have access to software that provides tools sufficient to create sometimes significantly advanced numerical models. The paper includes a brief assessment of the technical condition, a description of the in situ non-destructive testing carried out, and the FEM models created for global and local analysis. In situ testing was performed using strain gauges and displacement sensors. Numerical models were created using various software and numerical modeling techniques. Particularly noteworthy is the method of modeling the riveted joints of the crossbeam of the viaduct. It is a simplified method that uses only basic numerical tools such as beam and shell finite elements, constraints, and simplified boundary conditions (fixed support and symmetry). The results of the numerical analyses are presented and discussed. It is clearly explained why the structure did not fail, despite the fact that the weld of the deck plate completely failed. A further research problem that was solved was to determine the cause of the rapid increase in values on the stress diagram in the cross-section of the crossbeam. The problems were solved using solely the aforementioned simplified method of modeling riveted joints, which demonstrates that such problems can be solved without access to sophisticated software capable of advanced nonlinear analysis. Moreover, the obtained results are of great importance in the field of assessing the operation of bridge structures with an orthotropic deck plate.
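For orientation on the "basic numerical tools" mentioned above, a minimal worked reference (standard textbook material, not taken from the paper): the bending stiffness matrix of a two-node Euler-Bernoulli beam element with degrees of freedom (w1, θ1, w2, θ2), the elementary building block in such simplified beam-and-shell models of riveted connections:

$$ \mathbf{k}_e=\frac{EI}{L^3}\begin{bmatrix}12 & 6L & -12 & 6L\\ 6L & 4L^2 & -6L & 2L^2\\ -12 & -6L & 12 & -6L\\ 6L & 2L^2 & -6L & 4L^2\end{bmatrix} $$

where E is Young's modulus, I the second moment of area, and L the element length.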

Keywords: bridge, diagnostics, FEM simulations, failure, NDT, in situ testing

Procedia PDF Downloads 74
6624 Measured versus Default Interstate Traffic Data in New Mexico, USA

Authors: M. A. Hasan, M. R. Islam, R. A. Tarefder

Abstract:

This study investigates how site-specific traffic data differ from the default values in the AASHTOWare Mechanistic-Empirical pavement design software. Two Weigh-in-Motion (WIM) stations were installed on Interstate-40 (I-40) and Interstate-25 (I-25) to develop site-specific data. A computer program named WIM Data Analysis Software (WIMDAS) was developed using Microsoft C-Sharp (.NET) for quality checking and processing of raw WIM data. A complete year of data, from November 2013 to October 2014, was analyzed using the developed program. From these data, the vehicle class distribution, directional distribution, lane distribution, monthly adjustment factors, hourly distribution, axle load spectra, average number of axles per vehicle, axle spacing, lateral wander distribution, and wheelbase distribution were calculated. A comparative study was then done between the measured data and the AASHTOWare default values. It was found that the measured general traffic inputs for I-40 and I-25 differ significantly from the default values.
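As a rough illustration of two of the traffic inputs listed above, the sketch below computes a vehicle class distribution and MEPDG-style monthly adjustment factors (each month's share of annual truck volume scaled so the twelve factors sum to 12) from generic WIM records. The record layout is hypothetical; it is not the WIMDAS format.

```python
from collections import Counter, defaultdict

# Hypothetical record: (timestamp, fhwa_class); real WIM rows carry much more.
def traffic_inputs(records):
    class_counts = Counter(c for _, c in records)
    total = sum(class_counts.values())
    # Vehicle class distribution: percent of trucks in each FHWA class
    class_dist = {c: 100.0 * n / total for c, n in sorted(class_counts.items())}

    # Monthly adjustment factors: 12 * (month's share of annual volume)
    monthly = defaultdict(int)
    for ts, _ in records:
        monthly[ts.month] += 1
    annual = sum(monthly.values())
    maf = {m: 12.0 * monthly[m] / annual for m in sorted(monthly)}
    return class_dist, maf
```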

Keywords: AASHTOWare, traffic, weigh-in-motion, axle load distribution

Procedia PDF Downloads 343
6623 Programming Language Extension Using Structured Query Language for Database Access

Authors: Chapman Eze Nnadozie

Abstract:

Relational databases constitute a vital tool for the effective management and administration of both personal and organizational data. Data access ranges from single-user database management software to more complex distributed server systems. This paper appraises the use of a programming language extension, Structured Query Language (SQL), to establish links to a relational database (Microsoft Access 2013) within the Visual C++ 9 programming environment. The methodology involves the creation of tables to form a database using Microsoft Access 2013, which is Object Linking and Embedding (OLE) database compliant. SQL commands are used to query the tables in the database for easy extraction of the expected records inside the Visual C++ environment. The findings of this paper reveal that records can easily be accessed and manipulated to filter exactly what the user wants, such as retrieval of records meeting specified criteria, updating of records, and deletion of some or all records in a table.
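The core of the approach, SQL statements issued from a host program, can be illustrated compactly. The sketch below uses Python's built-in sqlite3 module as a stand-in for the paper's Visual C++/OLE link to Access 2013; the SQL itself (criteria-based retrieval, update, deletion) is what carries over.

```python
import sqlite3

conn = sqlite3.connect(":memory:")          # stand-in for the Access database
cur = conn.cursor()
cur.execute("CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT, grade REAL)")
cur.executemany("INSERT INTO student (name, grade) VALUES (?, ?)",
                [("Ada", 88.5), ("Ben", 71.0), ("Chi", 95.0)])

# Retrieval of records with a specified criterion
cur.execute("SELECT name, grade FROM student WHERE grade >= ?", (80,))
print(cur.fetchall())                       # [('Ada', 88.5), ('Chi', 95.0)]

# Updating records, then deleting part of the records in a table
cur.execute("UPDATE student SET grade = ? WHERE name = ?", (75.0, "Ben"))
cur.execute("DELETE FROM student WHERE grade < ?", (80,))
conn.commit()
```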

Keywords: data access, database, database management system, OLE, programming language, records, relational database, software, SQL, table

Procedia PDF Downloads 187
6622 A Comparison of Alternative Traffic Controls for Interchange Ramp Areas Using Synchro Software

Authors: Mohamed Mesbah, Bruce Janson

Abstract:

An interchange is one of the most important components of freeway and highway facilities, working as a connector between the highway's elements. The main goal of designing interchanges is to provide an acceptable level of service and delay so that vehicles move smoothly when entering and exiting the interchange. Many factors can have a significant impact on the level of service; the main ones are traffic volumes and the type of interchange. This paper discusses interchanges with roundabouts under various traffic volumes to determine their level of service, and then replaces the roundabout control with traffic signals to make a meaningful comparison between the two systems. A secondary goal is to propose improvements for scenarios where the level of service is deemed unacceptable. This is achieved using the Synchro traffic simulation software, which facilitates the simulation and optimization of interchanges to enhance operational efficiency and safety.
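For context, delay maps to LOS through fixed thresholds. Below is a minimal sketch of the mapping for signalized intersections (HCM control-delay bands, which Synchro reports); the HCM uses slightly different bands for unsignalized control such as roundabouts, so treat this as illustrative only.

```python
# HCM control-delay thresholds (seconds/vehicle) for signalized intersections
SIGNALIZED_LOS = [(10, "A"), (20, "B"), (35, "C"), (55, "D"), (80, "E")]

def level_of_service(control_delay_s: float) -> str:
    for limit, grade in SIGNALIZED_LOS:
        if control_delay_s <= limit:
            return grade
    return "F"                    # anything above 80 s/veh

print(level_of_service(42.3))     # -> 'D'
```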

Keywords: interchange, roundabout, traffic signal, Synchro, delay, level of service, traffic volumes, vehicles, simulation, optimization, adjustment

Procedia PDF Downloads 26
6621 Role of Biomaterial Surface Nanotopography on Protein Unfolding and Immune Response

Authors: Rahul Madathiparambil Visalakshan, Alex Cavallaro, John Hayball, Krasimir Vasilev

Abstract:

The role of biomaterial surface nanotopography on fibrinogen adsorption and unfolding, and the subsequent immune response, was studied. Inconsistent topography and varying chemical functionalities, along with a lack of reproducibility, pose a challenge in determining the specific effects of nanotopography or chemistry on proteins and cells. It is important to have a well-defined nanotopography with a homogeneous chemistry to study the real effect of nanotopography on biological systems. Therefore, we developed a technique that can produce well-defined and highly reproducible topography to identify the role of specific roughness, size, height, and density in the presence of homogeneous chemical functionality. Using plasma polymerisation of oxazoline monomers and immobilized gold nanoparticles, we created surfaces with an equal number density of nanoparticles of different sizes. These surfaces were used to study the role of surface nanotopography and the interplay of surface chemistry on proteins and immune cells. The effect of nanotopography on fibrinogen adsorption was investigated using Quartz Crystal Microbalance with Dissipation and micro BCA. The mass of fibrinogen adsorbed on the surface increased with increasing size of the nanotopography. Protein structural changes upon adsorption to the nano-rough surfaces were studied using circular dichroism spectroscopy. Fibrinogen unfolding varied depending on the specific nanotopography of the surfaces. It was revealed that the in vitro immune response to the nanotopography surfaces changed due to this protein unfolding.

Keywords: biomaterial inflammation, protein and cell responses, protein unfolding, surface nanotopography

Procedia PDF Downloads 176
6620 An Adaptive Virtual Desktop Service in Cloud Computing Platform

Authors: Shuen-Tai Wang, Hsi-Ya Chang

Abstract:

Cloud computing has matured considerably over the last few years, and consequently the demand for better cloud services is increasing rapidly. One research topic for improving cloud services is desktop computing in a virtualized environment. This paper aims at the development of an adaptive virtual desktop service in a cloud computing platform, based on our previous research on virtualization technology. We implement a cloud virtual desktop and application software streaming technology that makes it possible to provide Virtual Desktop as a Service (VDaaS). Remote desktop virtualization allows shifting the user’s desktop from the traditional PC environment to the cloud-enabled environment, where it is stored on a remote virtual machine rather than locally. This proposed effort has the potential to provide an efficient, resilient, and elastic environment for online cloud services. Users no longer need to bear the burden of platform maintenance, and the overall cost of hardware and software licenses is drastically reduced. Moreover, this flexible remote desktop service represents the next significant step toward the mobile workplace, letting users access their desktop environments from virtually anywhere.

Keywords: cloud computing, virtualization, virtual desktop, VDaaS

Procedia PDF Downloads 286
6619 Philippine Film Industry and Cultural Policy: A Critical Analysis and Case Study

Authors: Michael Kho Lim

Abstract:

This paper examines the status of the film industry as an industry in the Philippines—where or how it is classified in the Philippine industrial classification system, how this positioning gives the film industry an identity (or not), and how it affects (film) policy development and the larger national economy. It is important to look at how the national government officially recognises Philippine cinema, as this will have a direct and indirect impact on the industry in terms of its representation, conduct of business, international relations, and most especially its implications for policy development and implementation. Therefore, it is imperative that the ‘identity’ of Philippine cinema be clearly established and defined in the overall industrial landscape. Having a clear understanding of Philippine cinema’s industry status provides a better view of the bigger picture and helps us determine cinema’s position in the national agenda in terms of priority setting, future direction, and how the state perceives and thereby values the film industry as an industry. This will then serve as a frame of reference that anchors the succeeding discussion. Once the Philippine film industry status is identified, the paper clarifies how cultural policy is defined, understood, and applied in the Philippines in relation to Philippine cinema by reviewing and analyzing existing policy documents and pending bills in the Philippine Congress and Senate. Lastly, the paper delves into the roles that (national) cultural institutions and industry organisations play as primary drivers or support mechanisms, and how they become platforms (or not) for the upliftment of the independent film sector and for the sustainability of the film industry. The paper concludes by arguing that the role of the government, and how government officials perceive and treat culture, is far more important than cultural policy itself, as these policies emanate from them.

Keywords: cultural and creative industries, cultural policy, film industry, Philippine cinema

Procedia PDF Downloads 445
6618 Determination of Temperature Dependent Characteristic Material Properties of Commercial Thermoelectric Modules

Authors: Ahmet Koyuncu, Abdullah Berkan Erdogmus, Orkun Dogu, Sinan Uygur

Abstract:

Thermoelectric modules are integrated into electronic components to keep their temperature at specific values in electronic cooling applications. They can be used at different ambient temperatures. The cold-side temperatures of thermoelectric modules depend on their hot-side temperatures, operating currents, and heat loads. Performance curves of thermoelectric modules are given for at most two different hot-surface temperatures in product catalogs. Characteristic properties are required to select appropriate thermoelectric modules in the thermal design phase of projects. Generally, manufacturers do not provide the characteristic material property values of thermoelectric modules to customers, for confidentiality reasons. Common commercial software packages such as ANSYS ICEPAK and FloEFD include thermoelectric modules in their libraries, so they can easily be used to predict the effect of thermoelectric usage in a thermal design. Some software requires only the performance values at different temperatures. Others, like ICEPAK, require three temperature-dependent equations for the material properties: the Seebeck coefficient (α), electrical resistivity (β), and thermal conductivity (γ). Since the number and variety of thermoelectric modules in this software are limited, definitions of the characteristic material properties of other thermoelectric modules may be required. In this manuscript, a method for deriving characteristic material properties from the datasheet of a thermoelectric module is presented. Material characteristics were estimated from two different performance curves, both experimentally and numerically. Numerical calculations were accomplished in ICEPAK using a thermoelectric module that exists in the ICEPAK library. A new experimental setup was established to perform the experimental study. Because the numerical and experimental results agree, the proposed equations can be considered validated. This approximation can be suggested for analyses that include different types or brands of TEC modules.
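One widely quoted simplified route from catalog maximum ratings to module-level properties is sketched below; it is an assumption for illustration, not necessarily the derivation used in the paper. Evaluating it at the two catalog hot-side temperatures and fitting across them gives temperature-dependent property equations of the kind ICEPAK expects; the numeric ratings are hypothetical.

```python
import numpy as np

def tec_properties(i_max, v_max, dt_max, t_h):
    """Module Seebeck coefficient, resistance and thermal conductance
    from datasheet maxima at hot-side temperature t_h (kelvin)."""
    alpha = v_max / t_h                                      # V/K
    r = v_max * (t_h - dt_max) / (i_max * t_h)               # ohm
    k = v_max * i_max * (t_h - dt_max) / (2 * t_h * dt_max)  # W/K
    return alpha, r, k

t_hs = np.array([300.0, 323.0])                   # catalog hot-side temps, K
ratings = [(6.0, 14.4, 66.0), (6.2, 15.7, 75.0)]  # (Imax, Vmax, dTmax), made up
props = np.array([tec_properties(*rt, th) for rt, th in zip(ratings, t_hs)])

# Linear temperature-dependent equations, e.g. alpha(T) ~ a1*T + a0
alpha_fit = np.polyfit(t_hs, props[:, 0], 1)
```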

Keywords: electrical resistivity, material characteristics, thermal conductivity, thermoelectric coolers, Seebeck coefficient

Procedia PDF Downloads 179
6617 An Extended X-Ray Absorption Fine Structure Study of CoTi Thin Films

Authors: Jose Alberto Duarte Moller, Cynthia Deisy Gomez Esparza

Abstract:

The cobalt-titanium system was grown as thin films in an INTERCOVAMEX V3 sputtering system equipped with four magnetrons assisted by pulsed DC and direct DC. A polished, highly oriented (400) silicon wafer was used as the substrate, and the growth temperature was 500 °C. X-ray absorption spectroscopy experiments were carried out at SSRL on beamline 4-3. The Extended X-ray Absorption Fine Structure spectra were numerically processed with the WINXAS software, from background subtraction through normalization and the FFT adjustment. Analyzing the absorption spectra of cobalt in the CoTi2 phase, we can see that they agree in energy with the reference spectrum of CoO, which indicates that the working valence is Co2+. The experimental RDFs were then compared with RDFs generated theoretically using the FEFF software, from a model compound of the CoTi2 phase obtained by XRD. The fitting procedure is a highly iterative process. Fits are also checked in R-space using both the real and imaginary parts of the Fourier transform. Finally, the presence of overlapping coordination shells and the correctness of the assumption about the nature of the coordinating atom were checked.
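The transform-to-R-space step described above can be sketched in a few lines; the synthetic chi(k), the k² weighting, and the Hanning window below are illustrative choices, not WINXAS settings.

```python
import numpy as np

k = np.linspace(2.0, 12.0, 512)          # wavenumber grid, 1/angstrom
chi = np.sin(2 * 2.5 * k) / k**2         # placeholder chi(k): one shell near 2.5 A

weighted = chi * k**2 * np.hanning(k.size)   # k^2 weighting + window
dk = k[1] - k[0]
chi_r = np.fft.rfft(weighted) * dk
r = np.pi * np.fft.rfftfreq(k.size, d=dk)    # conjugate variable of exp(2ikR)
rdf = np.abs(chi_r)                          # pseudo-RDF: peaks near shell radii
print(r[rdf.argmax()])                       # ~2.5 (before phase correction)
```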

Keywords: XAS, EXAFS, FEFF, CoTi

Procedia PDF Downloads 296
6616 3D Modeling Approach for Cultural Heritage Structures: The Case of Virgin of Loreto Chapel in Cusco, Peru

Authors: Rony Reátegui, Cesar Chácara, Benjamin Castañeda, Rafael Aguilar

Abstract:

Nowadays, heritage building information modeling (HBIM) is considered an efficient tool to represent and manage information about cultural heritage (CH). The basis of this tool relies on a 3D model generally obtained from a cloud-to-BIM procedure. There are different methods to create an HBIM model, ranging from manual modeling based on the point cloud to the automatic detection of shapes and creation of objects. The selection among these methods depends on the desired level of development (LOD), level of information (LOI), and grade of generation (GOG), as well as on the availability of commercial software. This paper presents the 3D modeling of a stone masonry chapel using Recap Pro, Revit, and the Dynamo interface, following a three-step methodology. The first step consists of the manual modeling of simple structural elements (e.g., regular walls, columns, floors, wall openings) and architectural elements (e.g., cornices, moldings, and other minor details) using the point cloud as reference. Then, Dynamo is used for generative modeling of complex structural elements such as vaults, infills, and domes. Finally, semantic information (e.g., materials, typology, state of conservation) and pathologies are added to the HBIM model as text parameters and generic model families, respectively. The application of this methodology allows the documentation of CH following a relatively simple process that ensures adequate LOD, LOI, and GOG levels. In addition, the easy implementation of the method, as well as the use of only one BIM software package with its respective plugin for the scan-to-BIM modeling process, means that this methodology can be adopted by a larger number of users with intermediate knowledge and limited resources, since the BIM software used has a free student license.

Keywords: cloud-to-BIM, cultural heritage, generative modeling, HBIM, parametric modeling, Revit

Procedia PDF Downloads 145
6615 Design, Modification and Structural Analysis of Bicycle Sprocket Using ANSYS

Authors: Roman Kalvin, Saba Arif, Anam Nadeem, Burhan Ali Ghumman, Juntakan Taweekun

Abstract:

Bicycles are an important part of the transportation industry, and the use of sprockets on bicycles is widespread. Sprockets and chains are key parts of power transmission in a bicycle; the transmission of power is highly dependent on sprocket design. In conventional bicycles, sprockets are made of mild steel, which undergoes wear and tear over time due to the high loads applied to it. In the current research, a new sprocket is designed by changing its structure and its material from mild steel to carbon fiber. The existing bicycle sprocket is compared with the new, modified sprocket design, which incorporates both structural and material changes. According to the results, the deformation of the carbon fiber sprocket is 0.091 mm, and its stress value is 371.13 N/mm². A comparison-based analysis is done by physical testing and software analysis: there is an 8.1% variation between the software and experimental results for steel, and overall the difference between the two methods comes to 8 to 9%. This improved design can be used in the future for greater durability and longer service life of bicycles.

Keywords: sprocket, mild steel, drafting, stress, deformation

Procedia PDF Downloads 256
6614 Parameter Identification Analysis in the Design of Rock Fill Dams

Authors: G. Shahzadi, A. Soulaimani

Abstract:

This research work aims to identify the physical parameters of the constitutive soil model in the design of a rockfill dam by inverse analysis. The best parameters of the constitutive soil model are those that minimize the objective function, defined as the difference between the measured and numerical results. The finite element code Plaxis has been utilized for the numerical simulation. Polynomial and neural-network-based response surfaces have been generated to analyze the relationship between soil parameters and displacements. The performance of the surrogate models has been analyzed and compared by evaluating the root mean square error. A comparative study has been done based on objective functions and optimization techniques. Objective functions are categorized by considering measured data with and without uncertainty in the instruments and are defined by the least squares method, which estimates the norm between the predicted displacements and the measured values. Hydro-Québec provided the data sets of measured values for the Romaine-2 dam. Stochastic optimization, an approach that can overcome local minima and solve non-convex and non-differentiable problems with ease, is used to obtain an optimum value. Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Differential Evolution (DE) are compared for the minimization problem; although all these techniques take time to converge to an optimum value, PSO provided the best convergence and the best soil parameters. Overall, parameter identification analysis can be effectively used for the rockfill dam application and has the potential to become a valuable tool for geotechnical engineers in assessing dam performance and dam safety.
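A minimal particle swarm optimizer for a least-squares objective of this type is sketched below. The `predict` function stands in for the Plaxis run or its response-surface surrogate; its form and the measured values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
measured = np.array([12.0, 8.5, 15.2])       # displacement readings, mm (made up)

def predict(p):
    e_ref, phi = p                           # e.g. stiffness and friction angle
    return np.array([300.0 / e_ref, 200.0 / e_ref, 9.0 * phi / e_ref])

def objective(p):                            # least-squares misfit
    return np.sum((predict(p) - measured) ** 2)

n, w, c1, c2 = 30, 0.7, 1.5, 1.5             # swarm size and PSO constants
lo, hi = np.array([10.0, 20.0]), np.array([100.0, 45.0])
x = rng.uniform(lo, hi, (n, 2))
v = np.zeros_like(x)
pbest, pval = x.copy(), np.array([objective(p) for p in x])
gbest = pbest[pval.argmin()]

for _ in range(200):
    r1, r2 = rng.random((2, n, 2))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)               # keep particles inside bounds
    f = np.array([objective(p) for p in x])
    improved = f < pval
    pbest[improved], pval[improved] = x[improved], f[improved]
    gbest = pbest[pval.argmin()]

print(gbest, objective(gbest))               # identified soil parameters
```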

Keywords: rockfill dam, parameter identification, stochastic analysis, regression, PLAXIS

Procedia PDF Downloads 146
6613 Urban Ecological Interaction: Air, Water, Light and New Transit at the Human Scale of Barcelona’s Superilles

Authors: Philip Speranza

Abstract:

As everyday transit options shift from autocentric to pedestrian- and bicycle-oriented modes for healthy living, downtown streets are becoming more attractive places to live. However, tools and methods to measure the natural environment at the small scale of streets do not exist. Fortunately, a combination of mobile data collection technology and parametric urban design software now allows an interface to relate urban ecological conditions. This paper describes the creation of an interactive tool to measure the urban phenomena of air, water, and heat/light at the scale of new three-by-three-block pedestrianized areas in Barcelona called Superilles. Each Superilla limits transit to the exterior of the blocks to create more walkable and bikeable interior streets for healthy living. The research describes the integration of data collection, analysis, and design output via a live interface using the parametric software Rhino Grasshopper and the Human User Interface (UI) plugin.

Keywords: transit, urban design, GIS, parametric design, Superilles, Barcelona, urban ecology

Procedia PDF Downloads 248
6612 Development of Automated Quality Management System for the Management of Heat Networks

Authors: Nigina Toktasynova, Sholpan Sagyndykova, Zhanat Kenzhebayeva, Maksat Kalimoldayev, Mariya Ishimova, Irbulat Utepbergenov

Abstract:

Any business needs stable operation and continuous improvement; it is therefore necessary to constantly interact with the environment, to analyze the work of the enterprise from the perspective of employees, executives, and consumers, and to correct any inconsistencies in certain types of processes and in their aggregate. In the case of heat supply organizations, local legislation must be considered in addition to suppliers, since it is often the main regulator of service pricing. Here, the process approach used to build a functional organizational structure in these types of businesses in Kazakhstan is a challenge not only in implementation but also in how employee remuneration is analyzed. To solve these problems, we investigated the management system of a heat supply enterprise, including strategic planning based on the balanced scorecard (BSC), quality management in accordance with the Quality Management System (QMS) standard ISO 9001, and analysis of the system based on expert judgment using fuzzy inference. To carry out this work, we used the theory of fuzzy sets, the QMS in accordance with ISO 9001, the BSC according to the method of Kaplan and Norton, construction of business processes according to the IDEF0 notation, and modeling using Matlab simulation tools and LabVIEW graphical programming. The results of the work are as follows: we determined possibilities for improving the management of a heat supply plant based on the QMS; after justification and adaptation, a software tool was used to automate a series of functions for management, for the reduction of resources, and for keeping the system up to date; and an application for analyzing the QMS based on fuzzy inference was created, with a novel organization of the communication software, enabling the analysis of relevant data of the enterprise management system.
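To make the fuzzy-inference ingredient concrete, here is a toy Mamdani-style step (triangular memberships, min implication, centroid defuzzification). The linguistic variables and the single rule are invented for illustration; they are not the enterprise's actual rule base.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

u = np.linspace(0, 10, 1001)            # output universe: QMS health score
process_ok = tri(7.2, 5, 8, 10)         # crisp expert input 7.2 -> degree ~0.73
complaints_low = tri(2.1, 0, 2, 5)      # crisp input 2.1 -> degree ~0.97

# Rule: IF process is OK AND complaints are low THEN score is HIGH
fire = min(process_ok, complaints_low)
clipped = np.minimum(tri(u, 6, 8, 10), fire)     # min implication

score = (clipped * u).sum() / clipped.sum()      # centroid defuzzification
print(round(score, 2))
```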

Keywords: balanced scorecard, heat supply, quality management system, the theory of fuzzy sets

Procedia PDF Downloads 368
6611 Generic Model for Timetabling Problems by Integer Linear Programming Approach

Authors: Nur Aidya Hanum Aizam, Vikneswary Uvaraja

Abstract:

The agenda showing the scheduled times for performing certain tasks is known as a timetable. Timetabling is widely used in many sectors such as transportation, education, and production. Difficulties arise in ensuring that all tasks happen at the times and places allocated. Therefore, many researchers have invented various programming models to solve scheduling problems in several fields. However, studies on developing a general integer programming model for many timetabling problems remain limited. This study describes the creation of a general model that solves different types of timetabling problems by considering their basic constraints. Initially, the common basic constraints from five different fields were selected and analyzed. A general basic integer programming model was created and then verified using a medium-sized, randomly generated data set that closely resembles realistic data. The mathematical software AIMMS, with CPLEX as the solver, was used to solve the model. The model obtained is significant in solving many timetabling problems easily, since it can be adapted to any scheduling problem that shares the same basic constraints.
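A hedged miniature of such a generic model, written with the open-source PuLP modeller in place of AIMMS/CPLEX, is shown below. The binary variable x[t, s] equals 1 when task t is assigned to timeslot s; the two constraint families shown (each task scheduled exactly once; clashing tasks never sharing a slot) are typical of the basic constraints common across fields. Tasks, slots, and costs are invented.

```python
from pulp import LpProblem, LpVariable, LpMinimize, LpBinary, lpSum

tasks, slots = ["T1", "T2", "T3"], ["S1", "S2"]
clashes = [("T1", "T2")]                 # pairs sharing a resource or audience
cost = {("T1", "S1"): 1, ("T1", "S2"): 3, ("T2", "S1"): 2,
        ("T2", "S2"): 1, ("T3", "S1"): 2, ("T3", "S2"): 2}

prob = LpProblem("timetable", LpMinimize)
x = {(t, s): LpVariable(f"x_{t}_{s}", cat=LpBinary)
     for t in tasks for s in slots}
prob += lpSum(cost[t, s] * x[t, s] for t in tasks for s in slots)

for t in tasks:                          # every task scheduled exactly once
    prob += lpSum(x[t, s] for s in slots) == 1
for a, b in clashes:                     # clashing tasks in different slots
    for s in slots:
        prob += x[a, s] + x[b, s] <= 1

prob.solve()                             # default CBC solver
print({ts: int(v.value()) for ts, v in x.items()})
```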

Keywords: AIMMS mathematical software, integer linear programming, scheduling problems, timetabling

Procedia PDF Downloads 438
6610 The Synthesis and Analysis of Two Long Lasting Phosphorescent Compounds: SrAl2O4: Eu2+, Dy3+

Authors: Ghayah Alsaleem

Abstract:

This research project focussed on specific compounds, while a literature review was completed on the broader subject of long-lasting phosphorescence. For the review and the subsequent laboratory work, long-lasting phosphorescent compounds were defined as materials that have an afterglow decay time greater than a few minutes. The decay time is defined as the time between the end of excitation and the moment the light intensity drops below 0.32 mcd/m²; this definition is widely used in industry and in most research studies. The experimental work focused on a known long-lasting phosphorescent compound, strontium aluminate (SrAl2O4: Eu2+, Dy3+). At first, preparation followed methods reported in the literature. Temperature, dopant levels, and mixing methods were then varied in order to expose their effects on long-lasting phosphorescence. The effect of temperature was investigated for SrAl2O4: Eu2+, Dy3+, which resulted in the discovery that 1350 °C was the only temperature to which the compound could be heated in the differential scanning calorimeter (DSC) in order to achieve any phosphorescence; however, no temperatures above 1350 °C were investigated. The variation of mixing method and co-dopant level in the strontium aluminate compounds showed that the dry mixing method using a Turbula mixer resulted in the longest afterglow. It was also found that increasing the europium content from 1 mol% to 2 mol% in these compounds increased the brightness of the phosphorescence. When this higher-europium batch was mixed using sonication, the phosphorescence time was actually reduced: sonicated mixing produced green long-lasting phosphorescence for up to 20 minutes following 30 minutes of excitation, and for 50 minutes when the europium content was doubled.

Keywords: long lasting, phosphorescence, excitation, europium

Procedia PDF Downloads 181
6609 R Data Science for Technology Management

Authors: Sunghae Jun

Abstract:

Technology management (TM) is an important issue for a company seeking to improve its competitiveness. Among the many activities of TM, technology analysis (TA) is a key factor, because most technology management decisions are based on the results of TA. TA analyzes the developed results of a target technology using statistics or the Delphi method. Delphi-based TA depends on the experts’ domain knowledge; in comparison, TA based on statistics and machine learning algorithms uses objective data such as patents or papers instead of the experts’ knowledge. Many quantitative TA methods based on statistics and machine learning have been studied, and these have been used for technology forecasting, technological innovation, and management of technology. They apply diverse computing tools and many analytical methods case by case, and it is not easy to select the suitable software and statistical method for a given TA task. So, in this paper, we propose a methodology for quantitative TA using the statistical computing software R and data science to construct a general framework of TA. From the results of a case study, we also show how our methodology is applied in a real field. This research contributes to R&D planning and technology valuation in TM areas.

Keywords: technology management, R system, R data science, statistics, machine learning

Procedia PDF Downloads 458
6608 Design and Assessment of Traffic Management Strategies for Improved Mobility on Major Arterial Roads in Lahore City

Authors: N. Ali, S. Nakayama, H. Yamaguchi, M. Nadeem

Abstract:

Traffic congestion is a matter of prime concern in developing countries. This can be primarily attributed to poor design practices and biased allocation of resources based on political will, neglecting technical feasibility in infrastructure design. During the last decade, Lahore has expanded at an unprecedented rate compared to surrounding cities due to greater funding and resource allocation by previous governments. As a result, people from surrounding cities and areas moved to Lahore for better opportunities and quality of life. This migration inflow left the city with an increased population, rendering the existing infrastructure unable to accommodate the enhanced traffic demand. This has led to traffic congestion on the major arterial roads of the city. In this simulation study, a major arterial road was selected to evaluate the performance of five intersections by changing the geometry of the intersections or the signal control type. Simulations were done in two software packages: Highway Capacity Software (HCS), and Synchro Studio with SimTraffic. The traffic management strategies that were employed include actuated signal control, semi-actuated signal control, fixed-time signal control, and roundabouts. For each intersection, the most feasible of the above-mentioned traffic management techniques was selected, based on the least delay time (seconds) and the most improved level of service (LOS). The results showed that the Jinnah Hospital and Akbar Chowk intersections improved by 92.97% and 92.67% in delay time reduction, respectively. These results can be used by traffic planners and policy makers in decision making for the expansion of these intersections, keeping in mind the traffic demand in future years.

Keywords: traffic congestion, traffic simulation, traffic management, congestion problems

Procedia PDF Downloads 470
6607 Approximation by Generalized Lupaş-Durrmeyer Operators with Two Parameters α and β

Authors: Preeti Sharma

Abstract:

This paper deals with the Stancu-type generalization of Lupaş-Durrmeyer operators. We establish some direct results in the polynomial weighted space of continuous functions defined on the interval [0, 1]. A Voronovskaja-type theorem is also studied.
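For readers unfamiliar with the construction, the classical Stancu device (a sketch of the general idea, not the paper's exact operator) introduces the two parameters by shifting the evaluation nodes of a positive linear operator:

$$ f\!\left(\tfrac{k}{n}\right) \;\longrightarrow\; f\!\left(\tfrac{k+\alpha}{n+\beta}\right), \qquad 0 \le \alpha \le \beta, $$

so that α = β = 0 recovers the unmodified operator.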

Keywords: Lupaş-Durrmeyer operators, Pólya distribution, weighted approximation, rate of convergence, modulus of continuity

Procedia PDF Downloads 346
6606 Portable System for the Acquisition and Processing of Electrocardiographic Signals to Obtain Different Metrics of Heart Rate Variability

Authors: Daniel F. Bohorquez, Luis M. Agudelo, Henry H. León

Abstract:

Heart rate variability (HRV) is defined as the temporal variation between heartbeats, or RR intervals (the distance between R waves in an electrocardiographic signal). This distance is currently a recognized biomarker. By analyzing it, it is possible to assess the sympathetic and parasympathetic nervous systems, which are responsible for the regulation of the cardiac muscle; the analysis allows health specialists and researchers to diagnose various pathologies based on this variation. For the acquisition and analysis of HRV taken from a cardiac electrical signal, electronic equipment and analysis software that work independently of each other are currently used. This complicates and delays the process of interpretation and diagnosis, putting patients’ health at greater risk and potentially leading to untimely treatment. This document presents a single portable device capable of acquiring electrocardiographic signals and calculating a total of 19 HRV metrics, reducing the time required and resulting in timelier intervention. The device has an electrocardiographic signal acquisition card attached to a microcontroller capable of transmitting the cardiac signal wirelessly to a mobile device. In addition, a mobile application was designed to analyze the cardiac waveform. The device calculates the RR intervals and the different metrics, and the application allows a user to visualize the cardiac signal and the 19 metrics in real time. The information is exported to a cloud database for remote analysis. The study was performed under controlled conditions in the simulated hospital of the Universidad de La Sabana, Colombia. A total of 60 signals were acquired and analyzed. The device was compared against two reference systems. The results show a strong level of correlation (r > 0.95, p < 0.05) between the 19 metrics compared. Therefore, the use of this portable system, evaluated in clinical scenarios controlled by medical specialists and researchers, is recommended for the evaluation of the condition of the cardiac system.
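Three of the classical time-domain metrics in such a set, computed from RR intervals in milliseconds, look like the following; these standard definitions (SDNN, RMSSD, pNN50) are a sketch, not the device's firmware.

```python
import numpy as np

def hrv_time_domain(rr_ms):
    rr = np.asarray(rr_ms, dtype=float)      # RR intervals, milliseconds
    d = np.diff(rr)                          # successive differences
    return {
        "SDNN": rr.std(ddof=1),              # overall variability, ms
        "RMSSD": np.sqrt(np.mean(d ** 2)),   # beat-to-beat variability, ms
        "pNN50": 100.0 * np.mean(np.abs(d) > 50.0),  # % of diffs > 50 ms
    }

print(hrv_time_domain([812, 790, 843, 805, 870, 798]))
```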

Keywords: biological signal analysis, heart rate variability (HRV), HRV metrics, mobile app, portable device

Procedia PDF Downloads 185
6605 Topology and Shape Optimization of MacPherson Control Arm under Fatigue Loading

Authors: Abolfazl Hosseinpour, Javad Marzbanrad

Abstract:

In this research, the topology and shape optimization of a MacPherson control arm has been accomplished to achieve lighter weight. The present automotive market demands low-cost and lightweight components to meet the need for fuel-efficient and cost-effective vehicles. This, in turn, gives rise to more effective use of materials for automotive parts, which can reduce the mass of the vehicle. Since automotive components are under dynamic loads that cause fatigue damage, considering fatigue criteria is essential in designing automotive components. First, in order to create severe loading conditions for the control arm, rough road profiles were generated from power spectral densities. Then, the most critical loading conditions were obtained through multibody dynamics analysis of a full vehicle model. Next, topology optimization was performed based on a fatigue life criterion using HyperMesh software, which resulted in a 50 percent mass reduction. In the final step, a CAD model was created using CATIA software and shape optimization was performed to achieve accurate dimensions with less mass.
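The road-generation step can be sketched as the usual superposition of random-phase cosines drawn from an ISO 8608-style displacement PSD; the roughness constant, frequency band, and exponent below are illustrative, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(1)

n0, w, phi0 = 0.1, 2.0, 256e-6        # ref. frequency (cycles/m), exponent, PSD
n = np.linspace(0.011, 2.83, 1000)    # spatial frequency band, cycles/m
dn = n[1] - n[0]
phi = phi0 * (n / n0) ** (-w)         # displacement PSD Phi(n), m^3
phase = rng.uniform(0.0, 2.0 * np.pi, n.size)

x = np.linspace(0.0, 200.0, 4000)     # distance along the road, m
z = (np.sqrt(2.0 * phi * dn)[:, None]
     * np.cos(2.0 * np.pi * np.outer(n, x) + phase[:, None])).sum(axis=0)
# z(x) is the elevation profile applied at the wheels in the multibody model
```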

Keywords: topology optimization, shape optimization, fatigue life, MacPherson control arm

Procedia PDF Downloads 317
6604 A Software Tool for Computer Forensic Investigation Using Client-Side Web History Visualization

Authors: Francisca Onaolapo Oladipo, Peter Afam Ugwu

Abstract:

Web browsers provide records of user activities that are valuable for forensic investigation purposes; in most cases, however, these records are not in visual formats that are easily understood, thereby requiring some extra processing. This paper describes the implementation of a software tool for client-side web history visualization, providing suitable forensic evidence for investigative purposes. Visual C#, Perl, and gnuplot were deployed in a Windows Operating System (OS) environment to implement the system, and the resulting tool parses and transforms a web browser history into a visual format that enables an investigator to quickly and efficiently explore, understand, and interpret the user’s online activities in the context of a specific investigation. The system was tested using two forensic cases: the client-side web history files generated by the Mozilla Firefox browser were extracted using the MozillaHistoryView utility, then parsed and visualized using bar and stacked column charts. From the visual representation, results of the user’s web activities across various productive and non-productive websites were obtained.
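A rough modern equivalent of that parse-and-plot pipeline, in Python rather than the paper's C#/Perl/gnuplot stack: Firefox keeps client-side history in places.sqlite, whose moz_places table exposes url and visit_count (verify the schema against the browser version under investigation).

```python
import sqlite3
import matplotlib.pyplot as plt

conn = sqlite3.connect("places.sqlite")          # copy of the evidence file
rows = conn.execute(
    "SELECT url, visit_count FROM moz_places "
    "ORDER BY visit_count DESC LIMIT 10").fetchall()
conn.close()

urls = [u[:40] for u, _ in rows]                 # truncate long URLs for the axis
counts = [c for _, c in rows]
plt.barh(urls, counts)
plt.xlabel("Visit count")
plt.title("Top visited URLs (client-side web history)")
plt.tight_layout()
plt.show()
```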

Keywords: history, forensics, visualization, web activities

Procedia PDF Downloads 298
6603 Integrating the Athena Vortex Lattice Code into a Multivariate Design Synthesis Optimisation Platform in JAVA

Authors: Paul Okonkwo, Howard Smith

Abstract:

This paper describes a methodology to integrate the Athena Vortex Lattice aerodynamic software for automated operation in a multivariate optimisation of the Blended Wing Body aircraft. The Athena Vortex Lattice (AVL) code, developed at the Massachusetts Institute of Technology by Mark Drela, allows the aerodynamic analysis of aircraft using the vortex lattice method. Ordinarily, AVL operation requires a text file containing the aircraft geometry to be loaded into the solver in order to determine the aerodynamic forces and moments. However, automated operation is required to enable integration into a multidisciplinary optimisation framework. Automated AVL operation within the JAVA design environment nonetheless requires a modification and recompilation of the AVL source code into an executable capable of running on Windows and other platforms without the X11 libraries. This paper describes the procedure for integrating the FORTRAN-written AVL software for automated operation within the multivariate design synthesis optimisation framework for the conceptual design of the BWB aircraft.
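Where recompilation and JNI linkage are not needed, a lighter automation route is to drive the unmodified AVL executable as a child process and pipe its interactive commands to stdin. The sketch below (in Python for brevity) follows AVL's usual menu commands, but the exact command sequence and file names should be checked against the AVL version in use; they are assumptions here.

```python
import subprocess

commands = "\n".join([
    "LOAD bwb.avl",   # read the aircraft geometry file (hypothetical name)
    "OPER",           # enter the operating-point menu
    "X",              # execute the vortex-lattice analysis
    "",               # blank line backs out of the OPER menu
    "QUIT",
])
result = subprocess.run(["avl"], input=commands, text=True,
                        capture_output=True, timeout=120)
print(result.stdout)  # forces and moments are parsed from this captured text
```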

Keywords: aerodynamics, automation, optimisation, AVL, JNI

Procedia PDF Downloads 583
6602 A Proposed Framework for Software Redocumentation Using Distributed Data Processing Techniques and Ontology

Authors: Laila Khaled Almawaldi, Hiew Khai Hang, Sugumaran A. l. Nallusamy

Abstract:

Legacy systems are crucial for organizations, but their intricacy and lack of documentation pose challenges for maintenance and enhancement. Redocumentation of legacy systems is vital for automatically or semi-automatically creating documentation for software lacking sufficient records. It aims to enhance system understandability, maintainability, and knowledge transfer. However, existing redocumentation methods need improvement in data processing performance and document generation efficiency. This stems from the necessity to efficiently handle the extensive and complex code of legacy systems. This paper proposes a method for semi-automatic legacy system re-documentation using semantic parallel processing and ontology. Leveraging parallel processing and ontology addresses current challenges by distributing the workload and creating documentation with logically interconnected data. The paper outlines challenges in legacy system redocumentation and suggests a method of redocumentation using parallel processing and ontology for improved efficiency and effectiveness.
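The distribution half of the proposal can be illustrated with a small map-style sketch: source files fanned out across worker processes, each returning a documentation fragment. Here, function names and docstrings are pulled from Python sources via the ast module; the legacy language, paths, and output fields are illustrative assumptions, and the ontology-linking step is omitted.

```python
import ast
import multiprocessing as mp
from pathlib import Path

def document_file(path):
    """Extract a documentation skeleton from one source file."""
    tree = ast.parse(Path(path).read_text(encoding="utf-8"))
    funcs = [{"name": n.name, "doc": ast.get_docstring(n) or "MISSING"}
             for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
    return {"module": path, "functions": funcs}

if __name__ == "__main__":
    sources = [str(p) for p in Path("legacy_system").rglob("*.py")]
    with mp.Pool() as pool:                  # one worker per CPU core
        records = pool.map(document_file, sources)
    for rec in records:
        print(rec["module"], len(rec["functions"]), "functions")
```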

Keywords: legacy systems, redocumentation, big data analysis, parallel processing

Procedia PDF Downloads 48
6601 Experimental Investigation of Low Strength Concrete (LSC) Beams Using Carbon Fiber Reinforced Polymer (CFRP) Wrap

Authors: Furqan Farooq, Arslan Akbar, Sana Gul

Abstract:

Inadequate seismic design of structures and the use of Low Strength Concrete (LSC) remain major causes of structural failure. A parametric investigation of the flexural behavior of LSC beams, based on experimental work using externally applied Carbon Fiber Reinforced Polymer (CFRP) wrap, is presented. The aim is to understand the behavior of the beams under loading and the strengthening enhancement after cracks are induced; the results are also compared with Abaqus software simulations. The results show a significant enhancement in load carrying capacity, and the experimental work agrees with the Abaqus simulations. The research concludes that existing structures with inadequate seismic design could increase their load carrying capacity through CFRP techniques, which not only strengthen them but also enable them to resist even larger potential earthquakes by improving both strength and ductility.

Keywords: seismic design, carbon fiber, strengthening, ductility

Procedia PDF Downloads 203
6600 Dynamic Process Model for Designing Smart Spaces Based on Context-Awareness and Computational Methods Principles

Authors: Heba M. Jahin, Ali F. Bakr, Zeyad T. Elsayad

Abstract:

Smart spaces can be defined as any working environment that integrates embedded computers, information appliances, and multi-modal sensors to remain focused on the interaction between the users, their activity, and their behavior in the space. Hence, smart spaces must be aware of their contexts and automatically adapt to changing context, by interacting with their physical environment through natural and multimodal interfaces and by proactively serving the information used. This paper suggests a dynamic framework for the architectural design process of the space, based on the principles of computational methods and context-awareness, to help in creating a field of changes and modifications. It generates possibilities and concerns about the physical, structural, and user contexts. The framework is concerned with five main processes: gathering and analyzing data to generate smart design scenarios, parameters, and attributes, which are transformed by coding into four types of models; connecting those models together in an interaction model that represents the context-awareness system; transforming that model into a virtual and ambient environment that represents the physical and real environments, acting as a linkage phase between the users and the activities taking place in that smart space; and, finally, a feedback phase from the users of that environment, to make sure that the design of the smart space fulfills their needs. The generated design process will therefore help in designing smart spaces that can be adapted and controlled to answer the users’ defined goals, needs, and activities.

Keywords: computational methods, context-awareness, design process, smart spaces

Procedia PDF Downloads 334
6599 Study of the Process of Climate Change According to Data Simulation Using LARS-WG Software during 2010-2030: Case Study of Semnan Province

Authors: Leila Rashidian

Abstract:

Temperature rise on Earth has had harmful effects on the Earth's surface and has led to changes in precipitation patterns all around the world. The present research aimed to study the process of climate change according to simulated future data and to compare these parameters with the current situation at the studied stations in Semnan province, including Garmsar, Shahrood, and Semnan. For this purpose, the LARS-WG software, the HADCM3 model, and the A2 scenario were used for the 2010-2030 period. In this model, climatic parameters such as maximum and minimum temperature, precipitation, and radiation were used at daily resolution. The obtained results indicate that there will be a 4.4% increase in precipitation in Semnan province compared with the observed data and, in general, a 1.9% increase in temperature. This temperature rise has a significant impact on precipitation patterns: most of the precipitation will fall as rain (torrential rain in some cases). According to the results, from west to east, the region will experience a greater temperature rise and will be warmer.

Keywords: climate change, Semnan province, LARS-WG model, climate parameters, HADCM3 model

Procedia PDF Downloads 253
6598 Site Selection of CNG Station by Using FUZZY-AHP Model (Case Study: Gas Zone 4, Tehran City, Iran)

Authors: Hamidrza Joodaki

Abstract:

The most complex issue in urban land use planning is site selection, which requires assessing a variety of elements and factors. Multi-Criteria Decision Making (MCDM) methods are the best approach for dealing with such complex problems. In this paper, a combination of the analytical hierarchy process (AHP) model and FUZZY logic was used as the MCDM method to select the best site for a gas station in the 4th gas zone of Tehran. The first and most important step in the FUZZY-AHP model is the selection of criteria and sub-criteria. Population, accessibility, proximity, and natural disasters were considered as the main criteria in this study. After choosing the criteria, they were weighted based on AHP using the EXPERT CHOICE software, and FUZZY logic was used to enhance accuracy and better approach reality. After these steps, criteria layers were produced and weighted based on the FUZZY-AHP model in GIS. Finally, through ArcGIS software, the layers were integrated and the best site for a gas station in the 4th gas zone of Tehran was selected.
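The AHP weighting step can be shown compactly: priorities are the normalized principal eigenvector of the pairwise comparison matrix, and a consistency ratio below 0.10 validates the judgments. The sketch below uses plain numpy instead of Expert Choice, and the pairwise values for the four criteria are hypothetical.

```python
import numpy as np

# Pairwise comparisons (Saaty 1-9 scale) for: population, accessibility,
# proximity, natural disasters -- values are illustrative only.
A = np.array([[1.0, 3.0, 5.0, 7.0],
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                 # criterion weights, sum to 1

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
cr = ci / 0.90                           # random index RI = 0.90 for n = 4
print(np.round(weights, 3), "CR =", round(cr, 3))  # CR < 0.10 is acceptable
```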

Keywords: multiple criteria decision making (MCDM), analytic hierarchy process (AHP), FUZZY logic, geographic information system (GIS)

Procedia PDF Downloads 363
6597 Secure Text Steganography for Microsoft Word Document

Authors: Khan Farhan Rafat, M. Junaid Hussain

Abstract:

Seamless modification of an entity for the purpose of hiding a message of significance inside its substance, in a manner that keeps the embedding oblivious to an observer, is known as steganography. Together with today's pervasive computing systems, steganography has developed into a science offering an assortment of strategies for stealth communication across the globe, strategies that nevertheless need critical appraisal from a security-breach standpoint. Microsoft Word is among the most widely used word processing software, coming as part of the Microsoft Office suite. With a user-friendly graphical interface and rich text editing and formatting features, the documents produced with this software are also well suited for stealth communication. This research aimed not only to summarize the fundamental concepts of steganography but also to expound on the use of the Microsoft Word document as a carrier for covert message exchange. The effort examines contemporary message-hiding schemes from a security perspective, presents the exploratory findings, and suggests enhancements that may serve as a wellspring of information to encourage future research endeavors.
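As a concrete instance of the genre the paper surveys (a textbook scheme, not the method the paper proposes, and trivially detectable without encryption): message bits can be appended to ordinary text as zero-width characters that Word and most editors render invisibly.

```python
ZERO, ONE = "\u200b", "\u200c"    # zero-width space / zero-width non-joiner

def embed(cover: str, secret: str) -> str:
    bits = "".join(f"{b:08b}" for b in secret.encode("utf-8"))
    return cover + "".join(ONE if bit == "1" else ZERO for bit in bits)

def extract(stego: str) -> str:
    bits = "".join("1" if ch == ONE else "0"
                   for ch in stego if ch in (ZERO, ONE))
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

stego = embed("An ordinary looking sentence.", "meet at 9")
print(extract(stego))             # -> meet at 9
```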

Keywords: hiding information in plain sight, stealth communication, oblivious information exchange, conceal, steganography

Procedia PDF Downloads 243
6596 Cardiothoracic Ratio in Postmortem Computed Tomography: A Tool for the Diagnosis of Cardiomegaly

Authors: Alex Eldo Simon, Abhishek Yadav

Abstract:

This study aimed to evaluate the utility of postmortem computed tomography (PMCT) and heart weight measurements in the assessment of cardiomegaly in cases of sudden death of cardiac origin, by comparing the results of these two diagnostic methods. The study retrospectively analyzed PMCT data from 54 cases of sudden natural death and compared the findings with those of autopsy. It involved measuring the cardiothoracic ratio (CTR) from coronal computed tomography (CT) images and determining the actual cardiac weight by weighing the heart during the autopsy. The inclusion criterion was sudden death suspected to be caused by cardiac pathology; exclusion criteria included death due to unnatural causes such as trauma or poisoning, diagnosed natural causes of death related to organs other than the heart, and cases of decomposition. Sensitivity, specificity, and diagnostic accuracy were calculated, and receiver operating characteristic (ROC) curves were generated to evaluate the accuracy of using the CTR to detect an enlarged heart. The CTR is a radiological tool for assessing cardiomegaly that measures the maximum cardiac diameter in relation to the maximum transverse diameter of the chest wall. The clinically used CTR criterion has been modified from 0.50 to 0.57 for use in postmortem settings, where abnormalities can be detected by comparing CTR values to this threshold. A CTR value of 0.57 or higher is suggestive of hypertrophy but not conclusive. Similarly, heart weight is measured during the traditional autopsy, and a cardiac weight greater than 450 grams is defined as hypertrophy. Of the 54 cases evaluated, 22 (40.7%) had a CTR above 0.50 and up to 0.57, and 12 cases (22.2%) had a CTR greater than 0.57, which was defined as hypertrophy. The mean CTR was calculated as 0.52 ± 0.06. The mean heart weight across the 54 cases was 369.4 ± 99.9 grams. Of the 54 cases, 12 were found to have hypertrophy as defined by PMCT, while only 9 were identified with hypertrophy at traditional autopsy. The sensitivity of the hypertrophy test was 55.56% (95% CI: 26.66-81.12), the specificity was 84.44% (95% CI: 71.22-92.25), and the diagnostic accuracy was 79.63% (95% CI: 67.1-88.23). A limitation of the study was the low sample size of only 54 cases, which may limit the generalizability of the findings. The comparison of the cardiothoracic ratio with heart weight in this study suggests that PMCT may serve as a screening tool for medico-legal autopsies when performed by forensic pathologists. However, the low sensitivity of the test (55.56%) may limit its diagnostic accuracy, and further studies with larger sample sizes and more diverse populations are needed to validate these findings.
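For transparency, the reported operating characteristics imply the following 2x2 table (counts reconstructed here from the stated percentages; they are consistent with 12 PMCT-positive cases, 9 autopsy-confirmed cases, and 54 cases in total):

```python
# TP, FP, FN, TN reconstructed from the reported Se/Sp/accuracy
tp, fp, fn, tn = 5, 7, 4, 38

sensitivity = tp / (tp + fn)                 # 5/9   = 55.56%
specificity = tn / (tn + fp)                 # 38/45 = 84.44%
accuracy = (tp + tn) / (tp + fp + fn + tn)   # 43/54 = 79.63%
print(f"Se={sensitivity:.2%}  Sp={specificity:.2%}  Acc={accuracy:.2%}")

# The screening rule itself, applied to a coronal PMCT slice:
def ctr(cardiac_diameter_cm, thoracic_diameter_cm):
    return cardiac_diameter_cm / thoracic_diameter_cm

print("suggestive of hypertrophy" if ctr(14.2, 24.0) >= 0.57
      else "within the postmortem threshold")   # example measurements
```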

Keywords: PMCT, virtopsy, CTR, cardiothoracic ratio

Procedia PDF Downloads 81