Search results for: real time PCR

19366 Slope Effect in Emission Evaluation to Assess Real Pollutant Factors

Authors: G. Meccariello, L. Della Ragione

Abstract:

Exposure to outdoor air pollution causes lung cancer and increases the risk of bladder cancer. Because air pollution in urban areas is mainly caused by transportation, pollutant exhaust emissions from vehicles must be evaluated during real-world use. Their evaluation and reduction remain a key problem, especially in cities, which account for more than 50% of the world population. Particular attention was given to the slope variability along the streets during each journey performed by the instrumented vehicle. This paper describes a quantitative approach for the reconstruction of GPS coordinates and altitude, in the context of a correlation study between driving cycles, emissions, and geographical location, during an experimental campaign carried out with instrumented cars. The resulting slope analysis can then be correlated with emission and fuel consumption values at specific road positions, and its influence on their behaviour can be evaluated.
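
The slope reconstruction step reduces, at its core, to computing a grade from consecutive GPS fixes: horizontal distance from latitude/longitude, elevation change from altitude. The following minimal Python sketch illustrates this idea under simple assumptions (spherical Earth, no GPS smoothing, invented function names); the paper's actual reconstruction and filtering procedure is not shown here.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def slopes(track):
    """track: list of (lat, lon, altitude_m); returns slope (%) per segment."""
    out = []
    for (la1, lo1, h1), (la2, lo2, h2) in zip(track, track[1:]):
        d = haversine_m(la1, lo1, la2, lo2)
        out.append(100.0 * (h2 - h1) / d if d > 1.0 else 0.0)  # skip near-zero moves
    return out
```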

Keywords: air pollution, driving cycles, GPS signal, slope, emission factor, fuel consumption

Procedia PDF Downloads 387
19365 Evaluating the Effectiveness of Plantar Sensory Insoles and Remote Patient Monitoring for Early Intervention in Diabetic Foot Ulcer Prevention in Patients with Peripheral Neuropathy

Authors: Brock Liden, Eric Janowitz

Abstract:

Introduction: Diabetic peripheral neuropathy (DPN) affects 70% of individuals with diabetes [1]. DPN causes a loss of protective sensation, which can lead to tissue damage and diabetic foot ulcer (DFU) formation [2]. These ulcers can result in infections and lower-extremity amputations of toes, the entire foot, or the lower leg. Even after a DFU has healed, recurrence is common, with 49% of DFU patients developing another ulcer within a year and 68% within 5 years [3]. This case series examines the use of sensory insoles, newly available plantar data (pressure, temperature, step count, adherence), and remote patient monitoring in patients at risk of DFU. Methods: Participants received custom-made sensory insoles that monitor plantar pressure, temperature, step count, and daily use, and that provide real-time cues for pressure offloading as the wearers go about their daily activities. The sensory insoles were used to track subject compliance, ulceration, and response to feedback from real-time alerts. Patients were remotely monitored by a qualified healthcare professional, who contacted them when areas of concern were seen, coached them on reducing risk factors, and provided overall support to improve foot health. Results: Of the 40 participants provided with the sensory insole system, 4 presented with a DFU. Based on flags generated from the available plantar data, patients were contacted by the remote monitor to address potential concerns. A standard clinical escalation protocol detailed when and how concerns should be escalated to the provider by the remote monitor. Upon escalation to the provider, patients were brought into the clinic as needed, allowing any issues to be addressed before more serious complications could arise. Conclusion: This case series explores the use of innovative sensory technology to collect plantar data (pressure, temperature, step count, and adherence) for DFU detection and early intervention. The results suggest the importance of sensory technology and remote patient monitoring in providing proactive, preventative care for patients at risk of DFU. These rich plantar data, combined with remote patient monitoring, allow patients to be seen in the clinic when concerns arise, giving providers the opportunity to intervene early and prevent more serious complications, such as wounds, from occurring.

Keywords: diabetic foot ulcer, DFU prevention, digital therapeutics, remote patient monitoring

Procedia PDF Downloads 72
19364 Temperature Effect on Changing of Electrical Impedance and Permittivity of Ouargla (Algeria) Dunes Sand at Different Frequencies

Authors: Naamane Remita, Mohammed Laïd Mechri, Nouredine Zekri, Smaïl Chihi

Abstract:

The goal of this study is to estimate the real and imaginary components of both the electrical impedance and the permittivity (z', z'' and ε', ε'', respectively) of Ouargla dunes sand at different temperatures and frequencies, under an alternating (AC) excitation of 1 volt, using impedance spectroscopy (IS). This method is simple and non-destructive, and its results can frequently be correlated with a number of physical and dielectric properties and with the effect of composition on the electrical conductivity of solids. The experimental results revealed that the real part of the impedance is higher at higher temperatures in the low-frequency region and gradually decreases with increasing frequency. At high frequencies, all values of the real part of the impedance were positive. At low frequencies, the values of the imaginary part were positive at all temperatures except 1200 °C, where they were negative. At intermediate frequencies, the reactance values were negative at 25, 200, 400 and 600 °C, and became positive at the remaining temperatures. At high frequencies, of the order of MHz, the values of the imaginary part of the impedance behaved oppositely to what was recorded at intermediate frequencies. The results also showed that the electrical permittivity decreases with increasing frequency: permittivity values of the order of 10^11 were recorded at low frequencies, 10^7 at intermediate frequencies, and 10^2 at high frequencies. The real part of the permittivity took its largest values at 200 and 600 °C at the lowest frequency, while the smallest permittivity value was recorded at 400 °C at the highest frequency. Large values of the imaginary part of the permittivity were observed at the lowest frequency, decreasing as the frequency increases. The character of the impedance variation indicates an opportunity to determine the polarization of Ouargla dunes sand and to establish whether this material consumes or produces energy, as well as to identify a satisfactory equivalent electric circuit, whether inductive or capacitive.
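
For orientation, the conversion from a measured complex impedance to a complex relative permittivity is standard for a parallel-plate sample: ε* = 1 / (jωC0·Z), where C0 is the empty-cell capacitance. The sketch below assumes this geometry with invented cell dimensions; it is not the authors' processing code.

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def complex_permittivity(z_re, z_im, freq_hz, area_m2, thick_m):
    """Convert measured impedance Z' + jZ'' to relative permittivity (e', e'').
    Assumes a parallel-plate sample cell of the given area and thickness."""
    Z = np.asarray(z_re) + 1j * np.asarray(z_im)
    w = 2 * np.pi * np.asarray(freq_hz)
    C0 = EPS0 * area_m2 / thick_m      # empty-cell capacitance
    eps = 1.0 / (1j * w * C0 * Z)      # epsilon* = 1 / (j w C0 Z)
    return eps.real, -eps.imag         # (e', e'')
```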

Keywords: electrical impedance, electrical permittivity, temperature, impedance spectroscopy, Ouargla dunes sand

Procedia PDF Downloads 41
19363 ANFIS Approach for Locating Faults in Underground Cables

Authors: Magdy B. Eteiba, Wael Ismael Wahba, Shimaa Barakat

Abstract:

This paper presents a fault identification, classification, and fault location estimation method based on the Discrete Wavelet Transform and an Adaptive Network Fuzzy Inference System (ANFIS) for medium-voltage cables in the distribution system. Different faults and locations are simulated by ATP/EMTP, and certain selected features of the wavelet-transformed signals are then used as input for training the ANFIS. An accurate fault classifier and locator algorithm was designed, trained, and tested using current samples only. The ANFIS outputs were compared with the real outputs, and the percentage error between them was found to be less than three percent. Hence, it can be concluded that the proposed technique offers high accuracy in both fault classification and fault location.
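
As a rough illustration of the feature-extraction stage, the sketch below computes detail-band energies from a discrete wavelet decomposition of a fault-current record, the kind of features that could feed ANFIS training. The wavelet family ("db4"), the decomposition level, and the use of the PyWavelets library are assumptions; the paper does not state which features were selected.

```python
import numpy as np
import pywt  # PyWavelets; wavelet family and level here are illustrative

def dwt_features(current, wavelet="db4", level=4):
    """Energy of each detail band of a fault-current record: one candidate
    feature vector for training a classifier/locator such as ANFIS."""
    coeffs = pywt.wavedec(np.asarray(current, float), wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs[1:]])  # detail-band energies
```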

Keywords: ANFIS, fault location, underground cable, wavelet transform

Procedia PDF Downloads 504
19362 High Rise Building Vibration Control Using Tuned Mass Damper

Authors: T. Vikneshvaran, A. Aminudin, U. Alyaa Hashim, Waziralilah N. Fathiah, D. Shakirah Shukor

Abstract:

This paper presents an experimental study conducted on a three-storey building model. Most vibrations are undesirable and can damage the buildings, machines, and people around us: vibration from earthquakes, construction, and wind has high potential to damage buildings, and excessive vibration can result in structural and machinery failures, with consequences for human life and the surrounding environment. Vibration-induced failure and damage of high-rise buildings can be controlled in practice by implementing a tuned mass damper (TMD) in the building structure. This research studies the effect of, and the performance improvement achieved by, applying a TMD to the building structure. A three-degree-of-freedom (3DOF) structural model is designed to demonstrate the performance of the TMD. The model is a reduced-scale physical representation of an actual building structure and is used for the experiments, so the results obtained compare more accurately with real-life behaviour. The experimental results show that applying the TMD to the structural model reduces the vibration forces and the displacement of the building, which helps to maintain the building in good condition.
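
Although the paper is experimental, the standard way to size a TMD for a target mode is Den Hartog's classic tuning rules. The sketch below applies those textbook formulas under an assumed mass ratio; it is not the authors' design procedure.

```python
import math

def den_hartog_tmd(m_structure, f_structure_hz, mass_ratio=0.05):
    """Classic Den Hartog tuning for a TMD on an undamped primary mode.
    mass_ratio (mu) is illustrative; the paper does not report its value."""
    mu = mass_ratio
    f_ratio = 1.0 / (1.0 + mu)                            # optimal frequency ratio
    zeta = math.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))  # optimal damping ratio
    m_d = mu * m_structure
    f_d = f_ratio * f_structure_hz
    k_d = m_d * (2.0 * math.pi * f_d) ** 2                # damper spring stiffness
    return {"damper_mass": m_d, "damper_freq_hz": f_d,
            "damper_stiffness": k_d, "damper_damping_ratio": zeta}
```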

Keywords: degrees-of-freedom, displacement mode, natural frequency, tuned mass damper

Procedia PDF Downloads 335
19361 Existence and Uniqueness of Solutions to Singular Higher Order Two-Point BVPs on Time Scales

Authors: Zhenjie Liu

Abstract:

This paper investigates the existence and uniqueness of solutions to singular higher-order two-point boundary value problems on time scales using the mixed monotone method. The theorems obtained are very general. Depending on the time scale chosen, the problem reduces to the corresponding continuous or discrete boundary value problem.

Keywords: mixed monotone operator, boundary value problem, time scale, Green's function, positive solution, singularity

Procedia PDF Downloads 250
19360 Stochastic Energy and Reserve Scheduling with Wind Generation and Generic Energy Storage Systems

Authors: Amirhossein Khazali, Mohsen Kalantar

Abstract:

Energy storage units can play an important role in providing economic and secure operation of future energy systems. In this paper, a stochastic energy and reserve market clearing scheme is presented that considers energy storage units. The approach is proposed to deal with stochastic and non-dispatchable renewable sources with a high level of penetration in the energy system. A two-stage stochastic programming scheme is formulated: in the first stage, the energy market is cleared according to the forecasted wind generation and demand, and in the second stage, the real-time market is solved according to the assumed scenarios.
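
To make the two-stage structure concrete, here is a deliberately tiny sketch: the first stage fixes a day-ahead dispatch, the second stage prices the balancing energy in each wind scenario, and the expected cost is minimized by brute force. All numbers, the simple recourse rule, and the omission of storage and reserve variables are simplifications for illustration only.

```python
# (probability, wind MW) scenarios and illustrative $/MWh prices
scenarios = [(0.3, 120.0), (0.5, 80.0), (0.2, 40.0)]
demand, c_gen, c_up, c_down = 200.0, 30.0, 45.0, 5.0

def expected_cost(g):
    """First-stage generation cost plus expected second-stage balancing cost."""
    cost = c_gen * g
    for prob, wind in scenarios:
        imbalance = demand - wind - g            # >0: shortfall, <0: surplus
        cost += prob * (c_up * max(imbalance, 0.0)
                        + c_down * max(-imbalance, 0.0))
    return cost

# crude first-stage search over day-ahead dispatch levels
best = min((expected_cost(g), g) for g in range(0, 201))
print(f"day-ahead dispatch {best[1]} MW, expected cost ${best[0]:.0f}")
```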

Keywords: energy and reserve market, energy storage device, stochastic programming, wind generation

Procedia PDF Downloads 566
19359 Study on Measuring Method and Experiment of Arc Fault Detection Device

Authors: Yang Jian-Hong, Zhang Ren-Cheng, Huang Li

Abstract:

Arc faults are one of the main causes of electrical fires. An Arc Fault Detection Device (AFDD) can detect arc faults effectively. Arc fault detection and unhooking (tripping) standards are the keys to practical AFDD application. First, a continuous arc fault generation system was developed that could count the number of arc half waves. Then, following the UL1699 standard, the ignition probability curve of cotton and the unhooking times at various current intensities were obtained by experiment. The combustion degree of an arc fault can be expressed effectively by the arc area. Experiments proved that electrical fires would be misjudged or missed if the arc half-wave count alone were used as the AFDD unhooking criterion. Finally, practical tests were carried out on the self-developed AFDD system. The results showed that the actual AFDD unhooking time is the sum of the arc half-wave counting time, the arc waveform identification time, and the unhooking mechanical operation time, with the first two accounting for the smaller share. The unhooking time standard therefore depends on the shortest mechanical operation time.
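
A minimal version of the half-wave counting idea: split the current record at zero crossings and count the half waves whose peak exceeds a threshold. The threshold and rule here are illustrative stand-ins, not the UL1699 criteria or the authors' detection logic.

```python
import numpy as np

def count_arc_half_waves(current, threshold):
    """Count half waves (intervals between zero crossings) whose peak current
    exceeds a threshold; the threshold and rule are illustrative only."""
    current = np.asarray(current, float)
    crossings = np.where(np.diff(np.sign(current)) != 0)[0]
    n = 0
    for a, b in zip(crossings, crossings[1:]):
        if np.max(np.abs(current[a:b + 1])) > threshold:
            n += 1
    return n
```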

Keywords: arc fault detection device, arc area, arc half wave, unhooking time, arc fault

Procedia PDF Downloads 500
19358 Comparison Analysis of Multi-Channel Echo Cancellation Using Adaptive Filters

Authors: Sahar Mobeen, Anam Rafique, Irum Baig

Abstract:

Acoustic echo cancellation in multichannel systems is a system identification application. In real-time environments, the signal changes very rapidly, which requires adaptive algorithms, such as Least Mean Square (LMS), Leaky Least Mean Square (LLMS), Normalized Least Mean Square (NLMS), and average (AFA), that have high convergence rates and are stable. LMS and NLMS are widely used adaptive algorithms due to their low computational complexity, while AFA is used for its high convergence rate. This research compares the cancellation of acoustic echo (generated in a room) using LMS, LLMS, NLMS, AFA, and the newly proposed Average Normalized Leaky Least Mean Square (ANLLMS) adaptive filters.
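
For reference, here is a compact NLMS echo canceller in Python; plain LMS drops the normalization, and leaky LMS adds a small decay on the weights. This is a textbook sketch, not the authors' implementation or their proposed ANLLMS variant.

```python
import numpy as np

def nlms(x, d, order=64, mu=0.5, eps=1e-8):
    """Normalized LMS echo canceller: x is the far-end (loudspeaker) signal,
    d the microphone signal with echo; returns the echo-cancelled error.
    Dropping the normalization gives plain LMS; multiplying w by (1 - leak)
    each step gives leaky LMS (LLMS)."""
    x, d = np.asarray(x, float), np.asarray(d, float)
    w = np.zeros(order)
    e = np.zeros(len(x))
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]            # most recent samples first
        e[n] = d[n] - w @ u                 # residual after echo estimate
        w += mu * e[n] * u / (eps + u @ u)  # normalized gradient step
    return e
```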

Keywords: LMS, LLMS, NLMS, AFA, ANLLMS

Procedia PDF Downloads 558
19357 Effect of MPPT and THD in Grid-Connected Photovoltaic System

Authors: Sajjad Yahaghifar

Abstract:

Since the end of the last century, the importance and use of renewable energy sources have grown, driven not only by reduced dependence on fossil fuels but mainly by environmental concerns related to climate change and its effects on humanity. Consequently, solar energy has aroused interest in several countries as a technology considered clean, with reduced environmental impact. The output power of photovoltaic (PV) arrays is always changing with weather conditions, i.e., solar irradiation and atmospheric temperature. Therefore, maximum power point tracking (MPPT) control, which extracts maximum power from the PV arrays in real time, is indispensable in PV generation systems. This paper studies MPPT and total harmonic distortion (THD) in the city of Tabriz, Iran, for a grid-connected PV system acting as distributed generation.
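
The abstract does not state which MPPT algorithm is used; perturb-and-observe is one common choice and serves here as a concrete illustration. The sketch below performs one P&O iteration with an assumed voltage step.

```python
def perturb_and_observe(v, i, state, step=0.5):
    """One iteration of perturb-and-observe MPPT (a common scheme; the paper
    does not specify its tracker). state holds the previous (voltage, power);
    returns the new reference voltage and the updated state."""
    p = v * i
    v_prev, p_prev = state
    # keep perturbing in the same direction while power rises, else reverse
    direction = 1.0 if (p - p_prev) * (v - v_prev) > 0 else -1.0
    return v + direction * step, (v, p)
```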

Keywords: MPPT, THD, grid-connected, PV system

Procedia PDF Downloads 395
19356 Congestion Control in Mobile Network by Prioritizing Handoff Calls

Authors: O. A. Lawal, O. A Ojesanmi

Abstract:

The demand for wireless cellular services continues to increase while radio resources remain limited. Thus, network operators have to continuously manage the scarce radio resources in order to maintain an improved quality of service for mobile users. This paper proposes handling congestion in the mobile network by prioritizing handoff calls, using a guard channel allocation scheme. A specific threshold value governs channel allocation in the algorithm. The scheme is simulated by generating data for different traffic loads in the network, as would occur in real life. The results are used to determine the handoff call dropping probability and the new call blocking probability as measures of network performance.
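
The guard channel idea can be made concrete with a small Monte Carlo sketch: with C channels, new calls are admitted only while fewer than a threshold T channels are busy, whereas handoff calls may use all C channels. All rates below are invented for illustration; the paper's simulation parameters are not given.

```python
import random

def simulate_guard_channel(C=20, T=17, lam_new=8.0, lam_ho=2.0,
                           mu=1.0, horizon=100000, seed=1):
    """Monte Carlo sketch of a guard channel scheme. Returns the estimated
    (new-call blocking, handoff dropping) probabilities."""
    rng = random.Random(seed)
    busy, t = 0, 0.0
    new_tot = new_blk = ho_tot = ho_drp = 0
    while t < horizon:
        rate = lam_new + lam_ho + busy * mu   # competing exponential events
        t += rng.expovariate(rate)
        u = rng.random() * rate
        if u < lam_new:                       # new call arrival
            new_tot += 1
            if busy < T: busy += 1            # admitted below the threshold
            else: new_blk += 1
        elif u < lam_new + lam_ho:            # handoff arrival
            ho_tot += 1
            if busy < C: busy += 1            # may use guard channels too
            else: ho_drp += 1
        else:                                 # a call completes
            busy -= 1
    return new_blk / new_tot, ho_drp / ho_tot
```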

Keywords: call block, channel, handoff, mobile cellular network

Procedia PDF Downloads 391
19355 INCIPIT-CRIS: A Research Information System Combining Linked Data Ontologies and Persistent Identifiers

Authors: David Nogueiras Blanco, Amir Alwash, Arnaud Gaudinat, René Schneider

Abstract:

At a time when the access to and the sharing of information are crucial in the world of research, the use of technologies such as persistent identifiers (PIDs), Current Research Information Systems (CRIS), and ontologies may create platforms for information sharing, if they respond to the need for disambiguation of their data by ensuring interoperability within and between systems. INCIPIT-CRIS is a continuation of the former INCIPIT project, whose goal was to set up an infrastructure for the low-cost attribution of PIDs with high granularity, based on Archival Resource Keys (ARKs). INCIPIT-CRIS can be seen as its logical consequence and proposes a research information management system developed from scratch. The system has been created on and around the Schema.org ontology, with a further articulation of the use of ARKs. It is thus built upon the previously implemented infrastructure (i.e., INCIPIT) in order to enhance the persistence of URIs. As a consequence, INCIPIT-CRIS aims to be the hinge between previously separated aspects, namely CRIS, ontologies, and PIDs, in order to produce a powerful system that resolves disambiguation problems using the combination of an ontology such as Schema.org and unique persistent identifiers such as ARKs, allowing the sharing of information through a dedicated platform, as well as system interoperability by representing the entirety of the data as RDF triples. This paper presents the implemented solution as well as its real-life simulation. We describe the underlying ideas and inspirations while going through the logic and the different functionalities implemented, and their links with ARKs and Schema.org. Finally, we discuss the tests performed with our project partner, the Swiss Institute of Bioinformatics (SIB), using large, real-world data sets.
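
As a flavour of the data representation, the sketch below (using the rdflib library) mints a resource under a hypothetical ARK and types it with Schema.org terms. The NAAN, name, and chosen properties are invented for illustration and do not reflect the actual INCIPIT-CRIS data model.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

SDO = Namespace("https://schema.org/")
g = Graph()
g.bind("sdo", SDO)

# hypothetical ARK-based URI; the NAAN 99999 and name are placeholders
person = URIRef("https://n2t.net/ark:/99999/x1abc")
g.add((person, RDF.type, SDO.Person))
g.add((person, SDO.name, Literal("Ada Example")))
g.add((person, SDO.affiliation, Literal("Swiss Institute of Bioinformatics")))

print(g.serialize(format="turtle"))  # the triples, ready for interchange
```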

Keywords: current research information systems, linked data, ontologies, persistent identifier, schema.org, semantic web

Procedia PDF Downloads 129
19354 Investigation of Complexity Dynamics in a DC Glow Discharge Magnetized Plasma Using Recurrence Quantification Analysis

Authors: Vramori Mitra, Bornali Sarma, Arun K. Sarma

Abstract:

Recurrence is a ubiquitous feature of any real dynamical system: the states along a phase-space trajectory have an inherent tendency to return to the same state, or a close one, after a certain time lapse. The recurrence quantification analysis (RQA) technique, based on this fundamental feature, detects the evolution of the state under variation of a control parameter of the system. This paper presents an investigation of the nonlinear dynamical behaviour of plasma floating potential fluctuations, obtained with a Langmuir probe at different magnetic fields under variation of the discharge voltage. The main RQA measures considered are determinism (DET), maximal diagonal line length (Lmax), and entropy. The increase of the DET and Lmax variables indicates that the predictability and periodicity of the system are increasing. The Lmax variable indicates that chaoticity diminishes as the magnetic field drops, while an increase of the magnetic field enhances the chaotic behaviour. The fractal property of the plasma time series, estimated by the detrended fluctuation analysis (DFA) technique, shows that the long-range correlation of the plasma fluctuations decreases while the fractal dimension increases with the enhancement of the magnetic field, which corroborates the RQA analysis.
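
For orientation, determinism (DET) is the fraction of recurrence points that form diagonal lines of at least a minimum length. The plain sketch below builds a recurrence matrix and computes DET, without the phase-space embedding or Theiler window that a full RQA of the probe signal would normally include.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix of a scalar time series (no embedding)."""
    x = np.asarray(x, float)
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

def determinism(R, lmin=2):
    """DET: share of recurrence points lying on diagonals of length >= lmin."""
    N = R.shape[0]
    in_lines = 0
    for k in range(-(N - 1), N):
        run = 0
        for v in list(np.diagonal(R, k)) + [0]:  # trailing 0 flushes the run
            if v:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
    total = R.sum()
    return in_lines / total if total else 0.0
```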

Keywords: detrended fluctuation analysis, chaos, phase space, recurrence

Procedia PDF Downloads 324
19353 Downhole Logging and Dynamics Data Resolving Lithology-Related Drilling Behavior

Authors: Christopher Viens, Steve Krase

Abstract:

Terms such as “riding a hard streak”, “formation push”, and “fighting formation” are commonly used in the directional drilling world to explain BHA behaviour that causes unwanted trajectory change. Theories about downhole directional tendencies are commonly speculated upon from personal experience, with little merit, due to the lack of hard data revealing the actual mechanisms behind the phenomenon, which leaves interpretation of the root cause up to personal perception. Understanding and identifying in real time the lithological factors that influence the BHA to change or hold direction adds tremendous value in terms of reducing sliding time and targeting zones for optimal ROP. Utilizing surface drilling parameters and downhole measurements of azimuthal gamma, continuous inclination, and bending moment, a direct measure of the rock-related directional phenomenon has been captured and quantified. Furthermore, identifying continuous zones of like lithology with consistent bit-to-rock interaction has value from a reservoir characterization and completions standpoint. The paper shows specific examples of lithology-related directional tendencies from the Spraberry and Wolfcamp in the Delaware Basin.

Keywords: azimuthal gamma imaging, bending moment, continuous inclination, downhole dynamics measurements, high-frequency data

Procedia PDF Downloads 286
19352 Secured Transmission and Reserving Space in Images Before Encryption to Embed Data

Authors: G. R. Navaneesh, E. Nagarajan, C. H. Rajam Raju

Abstract:

Nowadays, multimedia data are used to store secure information. All previous methods allocate space in an image for data embedding after encryption. In this paper, we propose a novel method that reserves space, within a surrounding boundary, in the image before encryption with a traditional RDH algorithm, which makes it easy for the data hider to reversibly embed data in the encrypted image. The proposed method achieves real-time performance; that is, data extraction and image recovery are free of any error. A secure transmission process is also discussed, which improves efficiency tenfold compared to the other processes discussed.
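
The embedding stage itself is simple once room has been reserved: write the payload into the least significant bits of the reserved region. The sketch below shows only that step, with the reserving-room and encryption stages omitted; it follows the generic LSB idea rather than the authors' exact RDH algorithm.

```python
import numpy as np

def embed_lsb(region, bits):
    """Embed a bit sequence into the least significant bits of the reserved
    image region (a uint8 array). Embedding step only; the room-reserving
    and encryption stages of the paper are not shown."""
    flat = region.flatten()  # copy; the original region is left untouched
    assert len(bits) <= flat.size, "reserved region too small"
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | np.asarray(bits, dtype=np.uint8)
    return flat.reshape(region.shape)

def extract_lsb(region, n_bits):
    """Recover the embedded bits from the reserved region."""
    return region.flatten()[:n_bits] & 1
```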

Keywords: secure communication, reserving room before encryption, least significant bits, image encryption, reversible data hiding

Procedia PDF Downloads 409
19351 Investigation of Optical Requirements for Power System Assets Monitoring with Unmanned Aerial Vehicles

Authors: Ioana Pisica, Dimitrios Gkritzapis

Abstract:

The significance of UAS in scientific applications has been amply demonstrated in recent years. The combination of portability and quasi-static positioning, by means of flying closed-loop paths, makes them versatile and efficient for the inspection of power system infrastructure. In this paper, we critically assess several platforms and sensor capabilities to identify their pros and cons in relation to the power system assets to be monitored. In this respect, it is paramount that flights be conducted using UAS with suitable features, such as responsive and easy control, real-time video capture, and autonomous routing of pre-planned flight programs with differentiated payloads. The outcome of this research is a set of optimal requirements for power system asset monitoring with UAS.

Keywords: platforms, power system, sensors, UAVs

Procedia PDF Downloads 282
19350 Production Planning, Scheduling and SME

Authors: Markus Heck, Hans Vettiger

Abstract:

Small and medium-sized enterprises (SME) are the backbone of central Europe's economies and contribute significantly to gross domestic product. Production planning and scheduling (PPS) is still a crucial element in the manufacturing industries of the 21st century, even though this area of research is more than a century old. The topic of PPS is well researched, especially in the context of large manufacturing enterprises; however, the implementation of PPS methodologies within SME is mostly unobserved. This work analyzes how PPS is implemented in SME, with a geographical focus on Switzerland and its vicinity. Owing to restricted resources compared to large enterprises, SME face different challenges. The real PPS problem areas of selected enterprises are identified and evaluated, and clear, detailed recommendations are created for them, covering concepts, best practices, and the efficient use of PPS. Furthermore, the economic and entrepreneurial value for companies is outlined, together with why implementation of the introduced recommendations is advised.

Keywords: central Europe, PPS, production planning, SME

Procedia PDF Downloads 386
19349 Power Transformers Insulation Material Investigations: Partial Discharge

Authors: Jalal M. Abdallah

Abstract:

There is a great problem in testing and investigating the reliability of different types of transformer insulation materials: how to recreate and simulate the real conditions of a working transformer and test its insulation materials for partial discharge (PD), exactly as in the working mode. Many tests may give untrue results, as the physical behaviour of the insulation material under test differs from that in its working condition. In this work, real working conditions were simulated and a large number of specimens were tested. The first stage of the investigation begins with choosing samples of different types of insulation materials (papers, pressboards, etc.). In the second stage, the samples were dried in ovens at 105 °C and 0.01 bar for 48 hours, then impregnated with dried, degassed oil (water content less than 6 ppm) at 105 °C and 0.01 bar for 48 hours, after which the specimens were cooled at room pressure and temperature for 24 hours. The third stage is investigating PD in the samples using an ICM PD measuring device. After that, a continuous test on oil-impregnated insulation materials (paper, pressboards) was developed, and the phase-resolved partial discharge patterns of the PD signals were measured. The importance of this work lies in providing the industrial sector with trusted, highly accurate measurement results based on realistic simulated working conditions. All the PD patterns (results) are associated with discharges produced in well-controlled laboratory conditions, and they are compared with previous results and with results from other laboratories. In addition, the influence of different temperature conditions on partial discharge activity was studied.

Keywords: transformers, insulation materials, voids, partial discharge

Procedia PDF Downloads 313
19348 An Evolutionary Approach for QAOA for Max-Cut

Authors: Francesca Schiavello

Abstract:

This work aims to create a hybrid algorithm, combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOAs were first introduced in 2014; at the time, the algorithm performed better than the best known classical algorithm for Max-Cut. Whilst classical algorithms have improved since then and have returned to being faster and more efficient, this was a huge milestone for quantum computing, and that work is often used as a benchmarking tool and a foundation for exploring variants of QAOA. This, alongside other famous algorithms like Grover’s or Shor’s, highlights to the world the potential that quantum computing holds. It also presents the prospect of a real quantum advantage where, if the hardware continues to improve, this could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate when creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of available qubits, which restricts the scale of the problem that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that the algorithm searches the solution space through a population of solutions, it can also be parallelized to speed up the search and optimization. The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOAs with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using a COBYLA optimizer, which is a linear-approximation-based method, and in some instances it can even produce a better Max-Cut. Whilst the final objective of the work is to create an algorithm that consistently beats the original QAOA, or its variants, through either speedups or solution quality, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
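
A gradient-free loop of the kind described can be sketched as a small evolution strategy over the 2p QAOA angles. The cost function below is a toy stand-in for the QAOA expectation value (which would come from a quantum simulator or device), and all population settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def evaluate(params):
    """Stand-in for the QAOA expectation value over angles (gamma, beta);
    a real run would call a simulator or device here. Toy surface only."""
    return -np.sum(np.sin(params) ** 2)

def evolve(n_params=4, pop=20, gens=100, sigma=0.3):
    """Simple evolution strategy: no gradients (hence less exposure to barren
    plateaus), and the population evaluations can run in parallel."""
    population = rng.uniform(0, np.pi, (pop, n_params))
    for _ in range(gens):
        scores = np.array([evaluate(ind) for ind in population])
        parents = population[np.argsort(scores)[:pop // 2]]      # keep best half
        children = parents + rng.normal(0, sigma, parents.shape)  # mutate
        population = np.vstack([parents, children])
    scores = np.array([evaluate(ind) for ind in population])
    return population[np.argmin(scores)]

best_angles = evolve()  # n_params=4 corresponds to a depth p=2 circuit
```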

Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization

Procedia PDF Downloads 57
19347 Cognitive STAP for Airborne Radar Based on Slow-Time Coding

Authors: Fanqiang Kong, Jindong Zhang, Daiyin Zhu

Abstract:

Space-time adaptive processing (STAP) techniques have been motivated as a key enabling technology for advanced airborne radar applications. In this paper, the notion of cognitive radar is extended to the STAP technique, and cognitive STAP is discussed. The principle of improving the signal-to-clutter-plus-noise ratio (SCNR) based on slow-time coding is given, and the corresponding optimization algorithm, based on cyclic and power-like algorithms, is presented. Numerical examples show the effectiveness of the proposed method.

Keywords: space-time adaptive processing (STAP), airborne radar, signal-to-clutter ratio, slow-time coding

Procedia PDF Downloads 269
19346 Q-Efficient Solutions of Vector Optimization via Algebraic Concepts

Authors: Elham Kiyani

Abstract:

In this paper, we first introduce the concept of Q-efficient solutions in a real linear space not necessarily endowed with a topology, where Q is some nonempty (and not necessarily convex) set. We also use a scalarization technique, based on the Gerstewitz function generated by a nonconvex set, to characterize these Q-efficient solutions. The algebraic concepts of interior and closure are useful for studying optimization problems without topology; since the topological interior equals the algebraic interior for a convex cone, studying nonconvex vector optimization in this setting is worthwhile. We therefore use the algebraic concepts of interior and closure to define Q-weak efficient solutions and Q-Henig proper efficient solutions of set-valued optimization problems, where Q is not a convex cone. Optimization problems with set-valued maps have a wide range of applications, so a useful analytical tool for them is to be expected in optimization theory; such problems are closely related to stochastic programming, control theory, and economic theory. The paper focuses on nonconvex problems, and the results are obtained under generalized non-convexity assumptions on the data of the problem. In convex problems, the main mathematical tools are convex separation theorems, alternative theorems, and algebraic counterparts of some usual topological concepts, whereas in nonconvex problems a nonconvex separation function is needed. Thus, we consider the Gerstewitz function generated by a general set in a real linear space and re-examine its properties in this more general setting. A useful approach for solving a vector problem is to reduce it to a scalar problem; in general, scalarization means replacing a vector optimization problem by a suitable scalar problem, which tends to be an optimization problem with a real-valued objective function. The Gerstewitz function is well known and widely used in optimization as the basis of scalarization. Its essential properties, which are well known in the topological framework, are studied here using algebraic counterparts of the topological concepts of interior and closure. We therefore study the properties of the Gerstewitz function when it takes values in a real linear space and use it to characterize Q-efficient solutions of vector problems whose image space is not endowed with any particular topology. In summary, we deal with a constrained vector optimization problem in a real linear space without assuming any topology, define Q-weak efficient and Q-proper efficient solutions in the sense of Henig, and, by means of the Gerstewitz function, provide necessary and sufficient optimality conditions for set-valued vector optimization problems.
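
For readers unfamiliar with the scalarization, the Gerstewitz (Tammer) functional generated by a set C and a direction k is usually written as below; the paper studies this object with algebraic interior and vector closure in place of the topological notions.

```latex
% Gerstewitz (Tammer) scalarizing functional generated by a set C and a
% direction k (standard form; the paper re-derives its properties using
% algebraic interior and vector closure instead of topology):
\varphi_{C,k}(y) \;=\; \inf\{\, t \in \mathbb{R} \;:\; y \in t\,k - C \,\},
\qquad y \in Y .
```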

Keywords: algebraic interior, Gerstewitz function, vector closure, vector optimization

Procedia PDF Downloads 213
19345 Exploring Deep Neural Network Compression: An Overview

Authors: Ghorab Sara, Meziani Lila, Rubin Harvey Stuart

Abstract:

The rapid growth of deep learning has led to intricate and resource-intensive deep neural networks widely used in computer vision tasks. However, their complexity results in high computational demands and memory usage, hindering real-time application. To address this, research focuses on model compression techniques. The paper provides an overview of recent advancements in compressing neural networks and categorizes the various methods into four main approaches: network pruning, quantization, network decomposition, and knowledge distillation. This paper aims to provide a comprehensive outline of both the advantages and limitations of each method.
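
As a one-method taste of the survey's first category, unstructured magnitude pruning zeroes the smallest-magnitude weights. The sketch below is a generic illustration with an assumed sparsity level, not code from any surveyed paper.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Unstructured magnitude pruning, the simplest of the surveyed network
    pruning methods: zero out the smallest-magnitude fraction of weights.
    The 90% sparsity level is illustrative."""
    flat = np.abs(weights).flatten()
    k = int(sparsity * flat.size)
    threshold = np.partition(flat, k)[k]   # k-th smallest magnitude
    mask = np.abs(weights) >= threshold
    return weights * mask, mask            # pruned weights + binary mask
```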

Keywords: model compression, deep neural network, pruning, knowledge distillation, quantization, low-rank decomposition

Procedia PDF Downloads 38
19344 Twitter's Impact on Print Media with Respect to Real World Events

Authors: Basit Shahzad, Abdullatif M. Abdullatif

Abstract:

Recent advancements in Information and Communication Technologies (ICT) and easy access to the Internet have made social media the first choice for sharing information related to important events or news. On Twitter, a trend is a common feature that quantifies the level of popularity of certain news or events. In this work, we examine the impact of Twitter trends on real-world events by hypothesizing that Twitter trends influence print media in Pakistan. For this, Twitter is used as a platform and Twitter trends as a baseline. We first collected data from two sources (Twitter trends and print media) in the period May to August 2016. The data obtained from the two sources were analyzed, and it is observed that social media significantly influences print media: the majority of the news items printed in newspapers were posted earlier on Twitter.

Keywords: twitter trends, text mining, effectiveness of trends, print media

Procedia PDF Downloads 256
19343 pscmsForecasting: A Python Web Service for Time Series Forecasting

Authors: Ioannis Andrianakis, Vasileios Gkatas, Nikos Eleftheriadis, Alexios Ellinidis, Ermioni Avramidou

Abstract:

pscmsForecasting is an open-source web service that implements a variety of time series forecasting algorithms and exposes them to the user via the ubiquitous HTTP protocol. It allows developers to enhance their applications by adding time series forecasting functionality through an intuitive and easy-to-use interface. This paper provides some background on time series forecasting and gives details about the implemented algorithms, aiming to enhance the end user's understanding of the underlying methods before incorporating them into their applications. A detailed description of the web service's interface and its various parameterizations is also provided. Being an open-source project, pscmsForecasting can also be easily modified and tailored to the specific needs of each application.
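
To give a feel for how such a service is consumed, the sketch below exposes a naive seasonal forecaster behind an HTTP POST endpoint using Flask. The route name, JSON fields, and forecaster are assumptions for illustration; pscmsForecasting's actual interface and algorithms are those described in the paper.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/forecast", methods=["POST"])
def forecast():
    body = request.get_json()
    y = body["series"]                  # historical observations
    h = int(body.get("horizon", 1))     # steps ahead to predict
    m = int(body.get("season", 1))      # seasonal period (1 = none)
    # naive seasonal forecast: repeat the last full season
    preds = [y[-m + (i % m)] for i in range(h)]
    return jsonify({"forecast": preds})

if __name__ == "__main__":
    app.run(port=8000)
```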

Keywords: time series, forecasting, web service, open source

Procedia PDF Downloads 78
19342 The Study of Cost Accounting in S Company Based on TDABC

Authors: Heng Ma

Abstract:

Third-party warehousing logistics plays an important role in the development of external logistics. At present, third-party logistics in our country is still a new industry and an accounting system has not yet been established; the current financial accounting of third-party warehousing logistics mainly follows traditional thinking and can only provide the total cost of the entire enterprise over the accounting period, without reflecting indirect operating cost information. In order to solve the problem of cost information distortion in the third-party logistics industry and improve the level of logistics cost management, this paper combines theoretical research and case analysis to reflect cost allocation by building a third-party logistics costing model using Time-Driven Activity-Based Costing (TDABC), and takes S company as an example to account for and control warehousing logistics costs. Based on the idea that "products consume activities and activities consume resources", TDABC makes time the main cost driver and uses time-consuming equations to assign resources to cost objects. In S company, the cost objects are three warehouses engaged in warehousing and transportation services (the second warehouse acting as a transport point). Each of these three warehouses comprises five departments (Business Unit, Production Unit, Settlement Center, Security Department, and Equipment Division), and the activities in these departments are classified into in/out-of-storage forecasting, in/out-of-storage or transit handling, and safekeeping. By computing the capacity cost rate and building the time-consuming equations, the paper calculates the final operating cost so as to reveal the real cost. The numerical results show that TDABC can accurately reflect the cost allocation to service customers and reveal the spare capacity cost of the resource centers, verifying the feasibility and validity of TDABC for cost accounting in the third-party logistics industry. It encourages enterprises to focus on customer relationship management and to reduce idle cost, so as to strengthen the cost management of third-party logistics enterprises.
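
The two core TDABC computations are easy to state: a capacity cost rate (resource cost divided by practical capacity) and a time equation that accumulates minutes per transaction. The sketch below uses invented numbers and activity terms, not S company's actual figures.

```python
# TDABC in two steps, with invented numbers:
# (1) capacity cost rate = resource cost / practical capacity
# (2) activity cost = rate x minutes from the time equation
dept_cost = 560_000.0          # quarterly cost of one warehouse department
practical_capacity = 70_000.0  # usable working minutes in the quarter
rate = dept_cost / practical_capacity          # cost per supplied minute

def storage_minutes(pallets, needs_repack, needs_inspection):
    """Time equation: base handling time plus extras that apply to an order."""
    return 8.0 + 3.5 * pallets + (12.0 if needs_repack else 0.0) \
               + (6.0 if needs_inspection else 0.0)

t = storage_minutes(pallets=4, needs_repack=True, needs_inspection=False)
print(f"order cost = {rate * t:.2f} (rate {rate:.2f}/min x {t} min)")
```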

Keywords: third-party logistics enterprises, TDABC, cost management, S company

Procedia PDF Downloads 357
19341 Applied Actuator Fault Accommodation in Flight Control Systems Using Fault Reconstruction Based FDD and SMC Reconfiguration

Authors: A. Ghodbane, M. Saad, J. F. Boland, C. Thibeault

Abstract:

Historically, actuator redundancy was used to deal with faults occurring suddenly in flight systems. This technique is generally expensive and time consuming, and it involves increased weight and space in the system. Therefore, the on-line fault diagnosis of actuators and fault accommodation nowadays play a major role in the design of avionic systems. These approaches, known as Fault Tolerant Flight Control systems (FTFCs), are able to adapt to such sudden faults while keeping avionics systems lighter and less expensive. In this paper, an FTFC system based on the geometric approach and a Reconfigurable Flight Control (RFC) are presented. The geometric approach is used for cosmic-ray fault reconstruction, while a Sliding Mode Control (SMC), based on Lyapunov stability theory, is designed to reconfigure the controller in order to compensate for the fault effect. Matlab®/Simulink® simulations are performed to illustrate the effectiveness and robustness of the proposed flight control system against faulty actuator signals caused by cosmic rays. The results demonstrate the successful real-time implementation of the proposed FTFC system on a non-linear 6-DOF aircraft model.
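
As a generic illustration of the control law family used, a textbook sliding mode controller for one channel is shown below, with a smoothed switching term to limit chattering. The surface definition and gains are illustrative; the paper's SMC is designed from a Lyapunov stability analysis of the full aircraft model.

```python
import numpy as np

def smc_deflection(err, err_rate, lam=2.0, k=5.0, phi=0.05):
    """Textbook sliding mode control law for one actuator channel: s defines
    the sliding surface, and tanh(s/phi) smooths the switching term to limit
    chattering. Gains are illustrative, not the paper's design values."""
    s = err_rate + lam * err            # sliding surface s = e_dot + lam * e
    return -k * np.tanh(s / phi)        # control deflection command
```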

Keywords: actuators’ faults, fault detection and diagnosis, fault tolerant flight control, sliding mode control, geometric approach for fault reconstruction, Lyapunov stability

Procedia PDF Downloads 413
19340 Development and Evaluation of a Cognitive Behavioural Therapy Based Smartphone App for Low Moods and Anxiety

Authors: David Bakker, Nikki Rickard

Abstract:

Smartphone apps hold immense potential as mental health and wellbeing tools. Support can be made easily accessible and used in real time while users are experiencing distress, and data can be collected to enable machine learning and automated tailoring of support to users. While many apps have been developed for mental health purposes, few have adhered to evidence-based recommendations, and even fewer have pursued experimental validation. This paper details the development and experimental evaluation of an app, MoodMission, that aims to provide support for low moods and anxiety, help prevent clinical depression and anxiety disorders, and serve as an adjunct to professional clinical support. MoodMission was designed to deliver cognitive behavioural therapy for specifically reported problems in real-time, momentary interactions. Users report their low moods or anxious feelings to the app along with a subjective units of distress scale (SUDS) rating. MoodMission then provides a choice of 5-10 short, evidence-based mental health strategies called Missions. Users choose a Mission, complete it, and report their distress again. Automated tailoring, gamification, and in-built data collection for analysis of effectiveness were also included in the app's design. The development process involved construction of an evidence-based behavioural plan, design of the app, building and testing procedures, feedback-informed changes, and a public launch. A randomized controlled trial (RCT) was conducted comparing MoodMission to two other apps and a waitlist control condition. Participants completed measures of anxiety, depression, wellbeing, emotional self-awareness, coping self-efficacy, and mental health literacy at the start of their app use and 30 days later. At the time of submission (November 2016), over 300 participants had taken part in the RCT; data analysis will begin in January 2017. MoodMission currently has over 4000 users. A repeated-measures ANOVA of 1390 completed Missions reveals that SUDS (0-10) ratings were significantly reduced between pre-Mission ratings (M = 6.20, SD = 2.39) and post-Mission ratings (M = 4.93, SD = 2.25), F(1,1389) = 585.86, p < .001, ηp² = .30. This effect was consistent across both low moods and anxiety. Preliminary analyses of the outcome measures surveys reveal improvements across mental health and wellbeing measures as a result of using the app over 30 days, including a significant increase in coping self-efficacy, F(1,22) = 5.91, p = .024, ηp² = .21. Complete results from the RCT in which MoodMission was evaluated will be presented, along with results from the continuous outcome data recorded by MoodMission. MoodMission was successfully developed and launched, and preliminary analyses suggest that it is an effective mental health and wellbeing tool. In addition to its clinical applications, the app holds promise as a research tool for conducting component analyses of psychological therapies and overcoming the constraints of laboratory-based studies. The support provided by the app is discreet, tailored, evidence-based, and transcends barriers of stigma, geographic isolation, financial limitations, and low health literacy.

Keywords: anxiety, app, CBT, cognitive behavioural therapy, depression, eHealth, mission, mobile, mood, MoodMission

Procedia PDF Downloads 267
19339 Release Management with Continuous Delivery: A Case Study

Authors: A. Maruf Aytekin

Abstract:

We present our approach to using the continuous delivery pattern for release management. One of the key practices of agile and lean teams is the continuous delivery of new features to stakeholders. The main benefit of this approach lies in the ability to release new applications rapidly, which has real strategic impact on the competitive advantage of an organization. Organizations that successfully implement Continuous Delivery are able to evolve rapidly to support innovation, provide stable and reliable software more efficiently, decrease the resources needed for maintenance, and lower software delivery time and costs. One of the objectives of this paper is to elaborate a case study in which the IT division of the Central Securities Depository Institution (MKK) of Turkey applies the Continuous Delivery pattern to improve its release management process.

Keywords: automation, continuous delivery, deployment, release management

Procedia PDF Downloads 251
19338 Visualization Tool for EEG Signal Segmentation

Authors: Sweeti, Anoop Kant Godiyal, Neha Singh, Sneh Anand, B. K. Panigrahi, Jayasree Santhosh

Abstract:

This work is about developing a tool for the visualization and segmentation of electroencephalograph (EEG) signals based on frequency-domain features. Changes in the frequency-domain characteristics are correlated with changes in the mental state of the subject under study. The proposed algorithm provides a way to represent changes in mental state using the powers of the different frequency bands, in the form of a segmented EEG signal. Many segmentation algorithms with applications in brain-computer interfaces, epilepsy, and cognition studies have been suggested in the literature and used for data classification, but the proposed method focuses mainly on better presentation of the signal, which makes it a good visualization tool for clinicians. The algorithm performs basic filtering using band-pass and notch filters in the range 0.1-45 Hz. Advanced filtering is then performed by principal component analysis and a wavelet-transform-based de-noising method. Frequency-domain features are used for segmentation, exploiting the fact that the spectral power of the different frequency bands describes the mental state of the subject. Two sliding windows are used for segmentation: one provides the time scale and the other applies the segmentation rule. The segmented data are displayed second by second, successively, with different colour codes, and the segment length can be selected as needed for the objective. The proposed algorithm has been tested on an EEG data set obtained from the University of California San Diego's online data repository. The proposed tool gives a better visualization of the signal in the form of segmented epochs of the desired length, representing the power spectrum variation in the data. The algorithm is designed so that it takes the data points with respect to the sampling frequency for each time frame, so it can be extended to real-time visualization with a desired epoch length.
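
The band-power segmentation idea can be sketched compactly: band-pass the signal, estimate the spectrum per sliding window, and label each window by its dominant band. The sketch below uses SciPy with an assumed sampling rate and band definitions; the tool's PCA and wavelet de-noising stages are omitted.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

FS = 256  # sampling rate in Hz (assumed; set from the data set actually used)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def bandpass(x, lo=0.1, hi=45.0, fs=FS, order=2):
    """Basic 0.1-45 Hz band-pass stage (notch filtering omitted here)."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def band_powers(x, fs=FS):
    """Relative power in each EEG band from a Welch spectrum of one window."""
    f, pxx = welch(x, fs=fs, nperseg=min(len(x), 2 * fs))
    total = np.trapz(pxx, f)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        m = (f >= lo) & (f < hi)
        powers[name] = np.trapz(pxx[m], f[m]) / total
    return powers

def segment_labels(x, fs=FS, win_s=1.0):
    """Label each window by its dominant band; the labels drive colour codes."""
    x = bandpass(np.asarray(x, float))
    n = int(fs * win_s)
    labels = []
    for i in range(0, len(x) - n + 1, n):
        bp = band_powers(x[i:i + n], fs)
        labels.append(max(bp, key=bp.get))
    return labels
```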

Keywords: de-noising, multi-channel data, PCA, power spectra, segmentation

Procedia PDF Downloads 392
19337 Automated Evaluation Approach for Time-Dependent Question Answering Pairs on Web Crawler Based Question Answering System

Authors: Shraddha Chaudhary, Raksha Agarwal, Niladri Chatterjee

Abstract:

This work demonstrates a web crawler-based, generalized, end-to-end, open-domain Question Answering (QA) system. An efficient QA system requires a significant amount of domain knowledge to answer any question, with the aim of finding an exact and correct answer in the form of a number, a noun, a short phrase, or a brief piece of text for the user's question. Analysis of the question, searching the relevant documents, and choosing an answer are three important steps in a QA system. This work uses a web scraper (Beautiful Soup) to extract K documents from the web; the value of K can be calibrated on the basis of a trade-off between time and accuracy. This is followed by a passage-ranking process, using a model trained on the 500K queries of the MS MARCO dataset, to extract the most relevant text passages and shorten the lengthy documents. Further, a QA system is used to extract the answers from the shortened documents based on the query and return the top 3 answers. For the evaluation of such systems, accuracy is judged by the exact match between predicted answers and gold answers. But automatic evaluation methods fail due to the linguistic ambiguities inherent in the questions. Moreover, reference answers are often not exhaustive or are out of date; hence, correct answers predicted by the system are often judged incorrect by the automated metrics. One such scenario arises from the original Google Natural Questions (GNQ) dataset, which was collected and made available in 2016. Any such dataset proves inefficient for questions that have time-varying answers. For illustration, suppose the query is "where will be the next Olympics?". The gold answer for this query, as given in the GNQ dataset, is "Tokyo". Since the dataset was collected in 2016, and the next Olympics after 2016 were held in 2020 in Tokyo, this was absolutely correct at the time. But if the same question is asked in 2022, the answer is "Paris, 2024". Consequently, any evaluation based on the GNQ dataset will be incorrect for such questions. Such erroneous predictions are usually given to human evaluators for further validation, which is quite expensive and time-consuming. To address this erroneous evaluation, the present work proposes an automated approach for evaluating time-dependent question-answer pairs. In particular, it proposes a metric using the current timestamp along with the top-n predicted answers from a given QA system. To test the proposed approach, the GNQ dataset was used, and the system achieved an accuracy of 78% on a test set comprising 100 QA pairs. This test data was automatically extracted, using an analysis-based approach, from 10K QA pairs of the GNQ dataset. The results obtained are encouraging. The proposed technique appears to have the potential to develop into a useful scheme for gathering precise, reliable, and specific information in a real-time and efficient manner. Our subsequent experiments will be directed towards establishing the efficacy of the above system for a larger set of time-dependent QA pairs.
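
The proposed time-aware metric can be sketched as follows: a prediction in the top-n counts as correct if it matches a gold answer whose validity interval contains the evaluation timestamp. The gold-answer format and the dates below are invented for illustration.

```python
from datetime import date

def time_aware_match(predicted_topn, gold_entries, today=None):
    """Sketch of the proposed idea: a prediction counts as correct if it
    matches any gold answer whose validity interval contains the current
    timestamp. The (answer, valid_from, valid_to) format is illustrative."""
    today = today or date.today()
    valid = {a.lower() for a, start, end in gold_entries if start <= today <= end}
    return any(p.lower() in valid for p in predicted_topn)

# e.g. "where will be the next Olympics?" with illustrative validity windows
gold = [("tokyo", date(2016, 8, 22), date(2021, 8, 8)),
        ("paris", date(2021, 8, 9), date(2024, 8, 11))]
print(time_aware_match(["Paris", "London", "Rome"], gold, today=date(2022, 6, 1)))
```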

Keywords: web-based information retrieval, open domain question answering system, time-varying QA, QA evaluation

Procedia PDF Downloads 99