Search results for: real time digital simulator

20092 Simulation Study on Effects of Surfactant Properties on Surfactant Enhanced Oil Recovery from Fractured Reservoirs

Authors: Xiaoqian Cheng, Jon Kleppe, Ole Torsaeter

Abstract:

One objective of this work is to analyze the effects of surfactant properties (viscosity, concentration, and adsorption) on surfactant enhanced oil recovery at laboratory scale. The other objective is to obtain functional relationships between surfactant properties and the ultimate oil recovery and oil recovery rate. A core is cut into two parts from the middle to imitate a matrix with a horizontal fracture. An injector and a producer are placed at the left and right sides of the fracture, respectively. The middle slice of the core, measuring 4 cm x 0.1 cm x 4.1 cm with a fracture aperture of 0.1 cm in the middle, is used as the model in this paper. The original properties of the matrix, brine, and oil in the base case are from the Ekofisk Field, and the properties of the surfactant are taken from the literature. Eclipse is used as the simulator. The results are as follows: 1) The viscosity of the surfactant solution has a positive linear relationship with oil recovery time, and the relationship between viscosity and oil production rate is an inverse function. The viscosity of the surfactant solution has no obvious effect on ultimate oil recovery. Since most surfactants have little effect on the viscosity of brine, the viscosity of the surfactant solution is not a key screening parameter for surfactant flooding in fractured reservoirs. 2) An increase in surfactant concentration results in a decrease in oil recovery rate and an increase in ultimate oil recovery; however, no simple functions could describe these relationships, and an economic study should be conducted given the prices of surfactant and oil. 3) In the study of surfactant adsorption, it is assumed in all cases that the matrix wettability changes to water-wet when surfactant adsorption reaches its maximum, and the ratio of surfactant adsorption to surfactant concentration (Cads/Csurf) is used to estimate the functional relationship. The results show that the relationship between ultimate oil recovery and Cads/Csurf is a logarithmic function, while the oil production rate has a positive linear relationship with exp(Cads/Csurf). The work can serve as a reference for surfactant screening in surfactant enhanced oil recovery from fractured reservoirs, and the functional relationships between surfactant properties and the oil recovery rate and ultimate oil recovery help to improve upscaling methods.
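
The reported functional forms can be illustrated with a short fitting sketch. The snippet below is a minimal example of fitting the logarithmic relationship between ultimate recovery and Cads/Csurf described above; the sample values are made up and merely stand in for the Eclipse simulation outputs.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical (Cads/Csurf, ultimate recovery) pairs standing in for
# the simulation outputs described in the abstract.
ratio = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])
recovery = np.array([0.21, 0.28, 0.34, 0.38, 0.40, 0.42])

def log_model(x, a, b):
    # Ultimate recovery as a logarithmic function of Cads/Csurf.
    return a * np.log(x) + b

params, _ = curve_fit(log_model, ratio, recovery)
print("fitted a=%.3f, b=%.3f" % tuple(params))
```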

Keywords: fractured reservoirs, surfactant adsorption, surfactant concentration, surfactant EOR, surfactant viscosity

Procedia PDF Downloads 169
20091 Congestion Control in Mobile Network by Prioritizing Handoff Calls

Authors: O. A. Lawal, O. A. Ojesanmi

Abstract:

The demand for wireless cellular services continues to increase while radio resources remain limited. Thus, network operators have to continuously manage the scarce radio resources in order to provide an improved quality of service for mobile users. This paper proposes a way to handle congestion in the mobile network by prioritizing handoff calls, using the guard channel allocation scheme. The research uses a specific threshold value for the time of channel allocation in the algorithm. The scheme would be simulated by generating data for different traffic loads in the network, as would occur in real life. The results would be used to determine the probability of handoff call dropping and the probability of new call blocking as measures of network performance.
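
The admission logic of a guard channel scheme is compact enough to sketch directly. The following is a minimal illustration, not the authors' implementation, and the channel counts are assumed values: out of C channels, the last G are reserved for handoffs, so new calls are blocked once occupancy reaches C - G, while handoff calls are dropped only when all channels are busy.

```python
# Minimal guard channel admission sketch (illustrative, not the paper's code).
TOTAL_CHANNELS = 20   # C: channels in the cell (assumed value)
GUARD_CHANNELS = 3    # G: channels reserved for handoff calls (assumed value)

def admit(busy_channels: int, is_handoff: bool) -> bool:
    """Decide whether to admit a call given current occupancy."""
    if is_handoff:
        # Handoff calls may use every channel, including the guard band.
        return busy_channels < TOTAL_CHANNELS
    # New calls are blocked once only the guard channels remain.
    return busy_channels < TOTAL_CHANNELS - GUARD_CHANNELS

# Example: with 18 of 20 channels busy, a handoff is admitted, a new call is not.
print(admit(18, is_handoff=True), admit(18, is_handoff=False))
```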

Keywords: call block, channel, handoff, mobile cellular network

Procedia PDF Downloads 391
20090 Investigation of Complexity Dynamics in a DC Glow Discharge Magnetized Plasma Using Recurrence Quantification Analysis

Authors: Vramori Mitra, Bornali Sarma, Arun K. Sarma

Abstract:

Recurrence is a ubiquitous feature of any real dynamical system: the states along a phase space trajectory have an inherent tendency to return to the same state, or one close to it, after a certain time lapse. The recurrence quantification analysis (RQA) technique, based on this fundamental feature, detects the evolution of system states under variation of a control parameter. The paper presents an investigation of the nonlinear dynamical behavior of plasma floating potential fluctuations, obtained using a Langmuir probe, in different magnetic fields under variation of the discharge voltage. The main RQA measures considered are determinism (DET), linemax and entropy. The increase of the DET and linemax variables indicates that the predictability and periodicity of the system are increasing. The linemax variable indicates that chaoticity diminishes as the magnetic field drops, while an increasing magnetic field enhances the chaotic behavior. The fractal property of the plasma time series, estimated by the detrended fluctuation analysis (DFA) technique, shows that the long-range correlation of the plasma fluctuations decreases while the fractal dimension increases with increasing magnetic field, which corroborates the RQA results.
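
To make the DET measure concrete, here is a minimal, hand-rolled sketch (our illustration, not the authors' analysis pipeline): build a recurrence matrix from a scalar time series and estimate determinism as the fraction of recurrence points lying on diagonal lines of length at least lmin.

```python
import numpy as np

def recurrence_matrix(x, eps):
    d = np.abs(x[:, None] - x[None, :])   # pairwise distances
    return (d < eps).astype(int)

def determinism(R, lmin=2):
    n = R.shape[0]
    in_lines = 0
    total = R.sum() - np.trace(R)         # exclude the trivial main diagonal
    for k in range(1, n):                 # scan all off-main diagonals
        for diag in (np.diagonal(R, k), np.diagonal(R, -k)):
            run = 0
            for v in list(diag) + [0]:    # trailing 0 flushes the last run
                if v:
                    run += 1
                else:
                    if run >= lmin:
                        in_lines += run
                    run = 0
    return in_lines / total if total else 0.0

x = np.sin(np.linspace(0, 20 * np.pi, 400))   # toy periodic "fluctuation" signal
R = recurrence_matrix(x, eps=0.1)
print("DET =", round(determinism(R), 3))      # high for a periodic signal
```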

Keywords: detrended fluctuation analysis, chaos, phase space, recurrence

Procedia PDF Downloads 324
20089 Downhole Logging and Dynamics Data Resolving Lithology-Related Drilling Behavior

Authors: Christopher Viens, Steve Krase

Abstract:

Terms such as “riding a hard streak”, “formation push”, and “fighting formation” are commonly used in the directional drilling world to explain BHA behavior that causes unwanted trajectory change. Theories about downhole directional tendencies are commonly speculated from personal experience with little merit, due to the lack of hard data to reveal the actual mechanisms behind the phenomenon, leaving interpretation of the root cause up to personal perception. Understanding and identifying in real time the lithological factors that influence the BHA to change or hold direction adds tremendous value in terms of reducing sliding time and targeting zones for optimal ROP. Utilizing surface drilling parameters and employing downhole measurements of azimuthal gamma, continuous inclination, and bending moment, a direct measure of the rock-related directional phenomenon has been captured and quantified. Furthermore, identifying continuous zones of like lithology with consistent bit-to-rock interaction has value from a reservoir characterization and completions standpoint. The paper will show specific examples of lithology-related directional tendencies from the Spraberry and Wolfcamp in the Delaware Basin.

Keywords: azimuthal gamma imaging, bending moment, continuous inclination, downhole dynamics measurements, high frequency data

Procedia PDF Downloads 285
20088 INCIPIT-CRIS: A Research Information System Combining Linked Data Ontologies and Persistent Identifiers

Authors: David Nogueiras Blanco, Amir Alwash, Arnaud Gaudinat, René Schneider

Abstract:

At a time when access to and sharing of information are crucial in the world of research, technologies such as persistent identifiers (PIDs), Current Research Information Systems (CRIS), and ontologies may create platforms for information sharing if they respond to the need for disambiguation of their data by assuring interoperability inside and between systems. INCIPIT-CRIS is a continuation of the former INCIPIT project, whose goal was to set up an infrastructure for the low-cost attribution of PIDs with high granularity based on Archival Resource Keys (ARKs). INCIPIT-CRIS can be seen as its logical consequence and proposes a research information management system developed from scratch. The system is built on and around the Schema.org ontology, with a further articulation of the use of ARKs, and rests upon the infrastructure previously implemented (i.e., INCIPIT) in order to enhance the persistence of URIs. As a consequence, INCIPIT-CRIS aims to be the hinge between previously separated aspects, namely CRIS, ontologies and PIDs, in order to produce a powerful system that resolves disambiguation problems by combining an ontology such as Schema.org with unique persistent identifiers such as ARKs. It allows the sharing of information through a dedicated platform, as well as interoperability, by representing the entirety of the data as RDF triples. This paper presents the implemented solution as well as its evaluation in real-life conditions. We describe the underlying ideas and inspirations while going through the logic and the different functionalities implemented, and their links with ARKs and Schema.org. Finally, we discuss the tests performed with our project partner, the Swiss Institute of Bioinformatics (SIB), using large, real-world data sets.
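
As an illustration of the underlying data model (a sketch, not the project's actual code: the ARK, names and property choices below are invented for the example), a Schema.org description keyed to an ARK-based URI can be expressed as RDF triples with rdflib:

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

SDO = Namespace("https://schema.org/")
g = Graph()
g.bind("schema", SDO)

# Hypothetical ARK-based URI for a researcher; the NAAN and name are made up.
person = URIRef("https://n2t.net/ark:/99999/fk4-example-researcher")

g.add((person, RDF.type, SDO.Person))
g.add((person, SDO.name, Literal("Jane Doe")))
g.add((person, SDO.affiliation, Literal("Swiss Institute of Bioinformatics")))

print(g.serialize(format="turtle"))
```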

Keywords: current research information systems, linked data, ontologies, persistent identifier, schema.org, semantic web

Procedia PDF Downloads 129
20087 The Impact of Information and Communications Technology (ICT)-Enabled Service Adaptation on Quality of Life: Insights from Taiwan

Authors: Chiahsu Yang, Peiling Wu, Ted Ho

Abstract:

From emphasizing economic development to stressing public happiness, the international community mainly hopes to understand whether the quality of life for the public is becoming better. The Better Life Index (BLI) constructed by the OECD uses living conditions and quality of life as starting points, covering 11 areas of life to convey the state of the general public’s well-being. In light of the BLI framework, the Directorate General of Budget, Accounting and Statistics (DGBAS) of the Executive Yuan instituted the Gross National Happiness Index to understand the needs of the general public and to measure progress in the aforementioned conditions for residents across the island. Living conditions consist of income and wealth, jobs and earnings, and housing conditions; quality of life covers health status, work and life balance, education and skills, social connections, civic engagement and governance, environmental quality, and personal security. The ICT area consists of health care, living environment, ICT-enabled communication, transportation, government, education, pleasure, purchasing, and jobs and employment. In the wake of further science and technology development, the rapid formation of information societies, and the closer integration between lifestyles and information societies, the public’s well-being within information societies has become a noteworthy topic. The Board of Science and Technology of the Executive Yuan used the OECD’s BLI as a reference in establishing the Taiwan-specific ICT-Enabled Better Life Index. Using this index, the government plans to examine whether the public’s quality of life is improving, as well as to measure the public’s satisfaction with current digital quality of life. This understanding will enable the government to gauge the degree of influence and impact that each dimension of digital services has on digital life happiness, while also serving as an important reference for promoting digital service development. Information and communications technology (ICT) has been affecting people’s lifestyles and, in turn, impacts their quality of life (QoL). Even though studies have shown that ICT access and usage have both positive and negative impacts on life satisfaction and well-being, many governments continue to invest in e-government programs to initiate their path to the information society. This research is one of the few attempts to link e-government benchmarks to subjective well-being perception, to address the gap between users’ perception and existing hard-data assessment, and to propose a model that traces measurement results back to the original public policy so that policy makers can justify their future proposals.

Keywords: information and communications technology, quality of life, satisfaction, well-being

Procedia PDF Downloads 348
20086 Secured Transmission and Reserving Space in Images Before Encryption to Embed Data

Authors: G. R. Navaneesh, E. Nagarajan, C. H. Rajam Raju

Abstract:

Nowadays, multimedia data are used to store secure information. All previous methods allocate space in the image for data embedding after encryption. In this paper, we propose a novel method that reserves a boundary-surrounded space in the image before encryption with a traditional RDH algorithm, which makes it easy for the data hider to reversibly embed data in the encrypted image. The proposed method achieves real-time performance; that is, data extraction and image recovery are free of any error. A secure transmission process is also discussed in this paper, which improves efficiency by ten times compared to the other processes discussed.
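
The "reserving room" idea rests on plain LSB substitution within a region set aside before encryption. The sketch below is only a minimal illustration of that embedding primitive, assuming 8-bit grayscale pixels; it is not the authors' full RDH pipeline.

```python
import numpy as np

def embed_lsb(region: np.ndarray, bits: np.ndarray) -> np.ndarray:
    """Write payload bits into the least significant bits of a reserved region."""
    flat = region.flatten()
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits  # clear LSB, set payload
    return flat.reshape(region.shape)

def extract_lsb(region: np.ndarray, n_bits: int) -> np.ndarray:
    """Read the payload back; the reserved region makes this error-free."""
    return region.flatten()[:n_bits] & 1

reserved = np.random.randint(0, 256, (8, 8), dtype=np.uint8)  # toy reserved block
payload = np.random.randint(0, 2, 16, dtype=np.uint8)
stego = embed_lsb(reserved, payload)
assert np.array_equal(extract_lsb(stego, 16), payload)
```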

Keywords: secure communication, reserving room before encryption, least significant bits, image encryption, reversible data hiding

Procedia PDF Downloads 409
20085 Investigation of Optical Requirements for Power System Assets Monitoring with Unmanned Aerial Vehicles

Authors: Ioana Pisica, Dimitrios Gkritzapis

Abstract:

The significance of UAS in scientific applications has been amply demonstrated in recent years. The combination of portability and quasi-static positioning by means of flying in closed-loop paths makes them versatile and efficient in the inspection of power systems infrastructure. In this paper, we critically assess several platforms and sensor capabilities to identify their pros and cons in relation to the power system assets to be monitored. In this respect, it is paramount that flights be conducted using UAS that bear certain suitable features, such as responsive and easy control, real-time video capture, autonomous routing of pre-planned flight programs, and differentiated payloads. The outcome of this research is a set of optimal requirements for power system assets monitoring with UAS.

Keywords: platforms, power system, sensors, UAVs

Procedia PDF Downloads 282
20084 Delineating Floodplain along the Nasia River in Northern Ghana Using HAND Contour

Authors: Benjamin K. Ghansah, Richard K. Appoh, Iliya Nababa, Eric K. Forkuo

Abstract:

The Nasia River is an important source of water for domestic and agricultural purposes for the inhabitants of its catchment. Major farming activities take place within the floodplain of the river and its network of tributaries. The actual inundation extent of the river system is, however, unknown. Reasons for this lack of information include financial constraints and inadequate human resources, as flood modelling is becoming increasingly complex by the day. Knowledge of the inundation extent will help in assessing the risk posed by the annual flooding of the river and in planning flood recession agricultural activities. This study used a simple terrain-based algorithm, Height Above Nearest Drainage (HAND), to delineate the floodplain of the Nasia River and its tributaries. The HAND model is a drainage-normalized digital elevation model, whose height reference is the local drainage system rather than average mean sea level (AMSL). The underlying principle guiding the development of the HAND model is that hillslope flow paths behave differently when the reference gradient is toward the local drainage network rather than seaward. The new terrain model of the catchment was created using NASA’s 30 m SRTM Digital Elevation Model (DEM) as the only data input. Contours (HAND contours) were then generated from the normalized DEM. Based on a field flood inundation survey, historical information on flooding of the area, and satellite images, a HAND contour of 2 m was found to correlate best with the flood inundation extent of the river and its tributaries. An accuracy of 75% was obtained when the surface area enclosed by the 2 m contour was compared with the floodplain surface area computed from a satellite image captured during the peak flooding season in September 2016. It was estimated that the flooding of the Nasia River and its tributaries created a floodplain area of 1011 km².
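
The HAND thresholding step itself reduces to simple raster arithmetic. Below is a minimal sketch of deriving the 2 m floodplain mask, with tiny synthetic arrays standing in for the SRTM DEM and the drainage-elevation raster that would be derived from it in practice.

```python
import numpy as np

# Synthetic stand-ins: terrain elevations and, for each cell, the elevation of
# the nearest drainage cell it drains to (derived from the DEM in practice).
dem = np.array([[12.0, 11.5, 11.0],
                [11.2, 10.6, 10.1],
                [10.4, 10.0,  9.8]])
nearest_drainage_elev = np.array([[10.0, 10.0,  9.8],
                                  [10.0,  9.8,  9.8],
                                  [10.0,  9.8,  9.8]])

hand = dem - nearest_drainage_elev   # height above nearest drainage, in meters
floodplain = hand <= 2.0             # the 2 m HAND contour used in the study

print(hand)
print("floodplain cells:", int(floodplain.sum()), "of", floodplain.size)
```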

Keywords: digital elevation model, floodplain, HAND contour, inundation extent, Nasia River

Procedia PDF Downloads 448
20083 Towards an Effective Approach for Modelling near Surface Air Temperature Combining Weather and Satellite Data

Authors: Nicola Colaninno, Eugenio Morello

Abstract:

The urban environment affects local-to-global climate and, in turn, suffers from global warming phenomena, with worrying impacts on human well-being, health, and social and economic activities. Physico-morphological features of the built-up space affect urban air temperature locally, causing the urban environment to be warmer than the surrounding rural areas. This occurrence, typically known as the Urban Heat Island (UHI), is normally assessed by means of air temperature from fixed weather stations and/or traverse observations, or based on remotely sensed Land Surface Temperatures (LST). The information provided by ground weather stations is key for assessing local air temperature; however, their spatial coverage is normally limited due to the low density and uneven distribution of the stations. Although interpolation techniques such as Inverse Distance Weighting (IDW), Ordinary Kriging (OK), or Multiple Linear Regression (MLR) are used to estimate air temperature from observed points, such approaches may not effectively reflect the real climatic conditions at an interpolated point, and quantifying local UHI for extensive areas based on weather station observations only is not practicable. Alternatively, the use of thermal remote sensing has been widely investigated based on LST, with data from Landsat, ASTER, or MODIS used extensively. Indeed, LST has an indirect but significant influence on air temperature. However, high-resolution near-surface air temperature (NSAT) is currently difficult to retrieve. Here we have experimented with Geographically Weighted Regression (GWR) as an effective approach to NSAT estimation, by accounting for the spatial non-stationarity of the phenomenon. The model combines on-site measurements of air temperature from fixed weather stations with satellite-derived LST. The approach is structured in two main steps. First, a GWR model is set up to estimate NSAT at low resolution, combining air temperature from discrete observations retrieved by weather stations (dependent variable) and LST from satellite observations (predictor). At this step, MODIS data from the Terra satellite, at 1 kilometer spatial resolution, are employed, and two time periods are considered according to the satellite overpass times, i.e., 10:30 am and 9:30 pm. Afterward, the results are downscaled to 30 meters spatial resolution by setting up a GWR model between the previously retrieved near-surface air temperature (dependent variable) and, as predictors, the multispectral information provided by the Landsat mission, in particular the albedo, and the Digital Elevation Model (DEM) from the Shuttle Radar Topography Mission (SRTM), both at 30 meters. The area under investigation is the Metropolitan City of Milan, which covers approximately 1,575 km² and encompasses a population of over 3 million inhabitants. Both models, low-resolution (1 km) and high-resolution (30 meters), have been validated by cross-validation using indicators such as R², Root Mean Squared Error (RMSE) and Mean Absolute Error (MAE), all of which give evidence of highly efficient models. In addition, an alternative network of weather stations, available for the City of Milano only, has been employed for testing the accuracy of the predicted temperatures, giving an RMSE of 0.6 and 0.7 for daytime and night-time, respectively.
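
A minimal sketch of the first GWR step, assuming the PySAL mgwr package (treat the exact attribute names as indicative of the documented API rather than definitive) and synthetic station data; the real model regresses station air temperature on MODIS LST at 1 km:

```python
import numpy as np
from mgwr.gwr import GWR
from mgwr.sel_bw import Sel_BW

rng = np.random.default_rng(0)
n = 50                                   # synthetic "weather stations"
coords = list(zip(rng.uniform(0, 50, n), rng.uniform(0, 50, n)))  # km grid
lst = rng.uniform(15, 35, (n, 1))        # stand-in for MODIS LST (predictor)
air_t = 0.7 * lst + rng.normal(0, 1, (n, 1)) + 5  # stand-in for station NSAT

bw = Sel_BW(coords, air_t, lst).search() # calibrate a bandwidth
model = GWR(coords, air_t, lst, bw)      # locally varying regression
results = model.fit()
print("bandwidth:", bw, "| mean local R2:", float(results.localR2.mean()))
```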

Keywords: urban climate, urban heat island, geographically weighted regression, remote sensing

Procedia PDF Downloads 190
20069 Ultrasonographic Evaluation of Tarsus and Metatarsus Region of Dromedary Camel

Authors: Aboozar Dehghan, S. Sharifi, A. Ardeshiri, F. Jafari, F. Samani

Abstract:

Ultrasonography is a safe, precise, available and easy-to-use method for evaluating soft tissues, and tendons play the main role in the body's locomotor system. Ultrasonography was performed on the tarsus and metatarsus region of the rear limb of eight adult dromedary camels (Camelus dromedarius) of both sexes. Clinical examination and gait analysis were performed before slaughtering. The region from the tarsus to the 1st phalanx was divided into four equal zones: 1a, 2a, 1b and 2b. The flexor surface was clipped and covered with sufficient ultrasonography gel. Ultrasonography was performed with a linear phased-array 8-12 MHz transducer in transverse and longitudinal sections, and the superficial digital flexor tendon (SDFT), deep digital flexor tendon (DDFT) and suspensory ligament (SL) were imaged. The echogenicity and diameter of these structures were recorded, and the sizes of the tendons and SL were also measured after necropsy. Statistical analysis showed that the SDFT diameter was larger than the others in all described regions and that the mean DDFT diameter was larger than that of the suspensory ligament. The echogenicity of the SL was greater than that of the SDFT and DDFT. No significant relationship was seen between left and right rear limb structure sizes, nor between sex and tendon and SL diameters.

Keywords: dromedary camel, tarsus and metatarsus, ultrasonography

Procedia PDF Downloads 554
20081 Cognitive STAP for Airborne Radar Based on Slow-Time Coding

Authors: Fanqiang Kong, Jindong Zhang, Daiyin Zhu

Abstract:

Space-time adaptive processing (STAP) techniques have been motivated as a key enabling technology for advanced airborne radar applications. In this paper, the notion of cognitive radar is extended to the STAP technique, and cognitive STAP is discussed. The principle for improving the signal-to-clutter-plus-noise ratio (SCNR) based on slow-time coding is given, and the corresponding optimization algorithm, based on cyclic and power-like algorithms, is presented. Numerical examples show the effectiveness of the proposed method.

Keywords: space-time adaptive processing (STAP), airborne radar, signal-to-clutter ratio, slow-time coding

Procedia PDF Downloads 269
20080 Mobile Technology Use by People with Learning Disabilities: A Qualitative Study

Authors: Peter Williams

Abstract:

Mobile digital technology, in the form of smartphones, tablets, laptops and their accompanying functionality/apps, is ever more used by people with Learning Disabilities (LD) for entertainment, to communicate and socialize, and to enjoy self-expression. Despite this, there has been very little research into this cohort's experiences of such technology, its role in articulating personal identity and self-advocacy, and the barriers encountered in negotiating technology in everyday life. The proposed talk describes research funded by the British Academy addressing these issues. It aims to explore: i) the experiences of people with LD in using mobile technology in their everyday lives, covering the benefits, in terms of entertainment, self-expression, socialising, and possibly greater autonomy, and the barriers, such as accessibility or usability issues and privacy or vulnerability concerns; and ii) how the technology, and in particular the software/apps and interfaces, can be improved to enable greater access to the entertainment, information, communication and other benefits it can offer. It is also hoped that results will inform parents, carers and other supporters regarding how they can use the technology with their charges. Rather than simply following the standard research procedure of gathering and analysing ‘data’ to which individual ‘research subjects’ have no access, people with Learning Disabilities (and their supporters) will help co-produce an accessible, annotated and hyperlinked living e-archive of their experiences. Involving people with LD as informants, contributors and, in effect, co-researchers will facilitate digital inclusion and empowerment. The project is working with approximately 80 adults of all ages who have ‘mild’ learning disabilities (people who are able to read basic texts and write simple sentences). A variety of methods is being used. Small groups of participants have engaged in simple discussions or storytelling about some aspect of technology (such as ‘when my phone saved me’ or ‘my digital photos’). Some individuals have been ‘interviewed’ at a PC, laptop or with a mobile device, and asked to demonstrate their usage and interests. Social media users have shown their Facebook pages, Pinterest uploads or other material, giving them an additional focus they have used to discuss their ‘digital’ lives. During these sessions, participants have recorded (or employed the researcher to record) their observations in the e-archive. Parents, carers and other supporters are also being interviewed to explore their experiences of using mobile technology with the cohort, including any difficulties they have observed their charges having; the archive is supplemented with these observations. The presentation will outline the methods described above, highlighting some of the special considerations required when working inclusively with people with LD. It will describe some of the preliminary findings and demonstrate the e-archive with a commentary on the pages shown.

Keywords: inclusive research, learning disabilities, methods, technology

Procedia PDF Downloads 221
20079 Voices of Dissent: Case Study of a Digital Archive of Testimonies of Political Oppression

Authors: Andrea Scapolo, Zaya Rustamova, Arturo Matute Castro

Abstract:

The “Voices in Dissent” initiative aims at collecting and making available in digital format testimonies, letters, and other narratives produced by victims of political oppression from different geographical spaces across the Atlantic. By recovering silenced voices behind the official narratives, this open-access online database will provide indispensable tools for rewriting the history of authoritarian regimes from the margins, as memory debates continue to provoke controversy among academic and popular transnational circles. In providing an extensive database of non-hegemonic discourses in a variety of political and social contexts, the project will complement existing European and Latin American studies and invite further interdisciplinary and transnational research. This digital resource will be available to academic communities and the general audience and will be organized geographically and chronologically. “Voices in Dissent” will offer a first comprehensive study of these personal accounts of persecution and repression against determined historical backgrounds, and of their impact on collective memory formation in contemporary societies. The digitization of these texts will allow metadata analyses and comparatist approaches for a broad range of research endeavors. Most of the testimonies included in our archive are testimonies of trauma: the trauma of exile, imprisonment, torture, humiliation, censorship. Research on trauma has now reached critical mass and offers a broad spectrum of critical perspectives. By putting together testimonies from different geographical and historical contexts, our project will provide readers and scholars with an extraordinary opportunity to investigate how culture shapes individual and collective memories and provides or denies resources to make sense of and cope with trauma. For scholars dealing with the epistemological and rhetorical analysis of testimonies, an online open-access archive will prove particularly beneficial for testing theories on truth status and the formation of belief, as well as for studying the articulation of discourse. An important aspect of this project is also its pedagogical applications, since it will contribute to the creation of Open Educational Resources (OER) to support students and educators worldwide. Through collaborations with our library system, the archive will form part of the Digital Commons database. The texts collected in this online archive will be made available in the original languages as well as in English translation. They will be accompanied by a critical apparatus that contextualizes them historically by providing relevant background information and bibliographical references. All these materials can serve as a springboard for a broad variety of educational projects and classroom activities, and can also be used to design specific content courses or modules. In conclusion, the desirable outcomes of the “Voices in Dissent” project are: 1) the collection and digitization of political dissent testimonies; 2) the building of a network of scholars, educators, and learners involved in the design, development, and sustainability of the digital archive; and 3) the integration of the content of the archive in both research and teaching endeavors, such as the publication of scholarly articles, the design of new upper-level courses, and the integration of the materials in existing courses.

Keywords: digital archive, dissent, open educational resources, testimonies, transatlantic studies

Procedia PDF Downloads 102
20078 Production Planning, Scheduling and SME

Authors: Markus Heck, Hans Vettiger

Abstract:

Small and medium-sized enterprises (SME) are the backbone of central Europe’s economies and make a significant contribution to the gross domestic product. Production planning and scheduling (PPS) is still a crucial element in the manufacturing industries of the 21st century, even though this area of research is more than a century old. The topic of PPS is well researched, especially in the context of large enterprises in the manufacturing industry; however, the implementation of PPS methodologies within SME is mostly unobserved. This work analyzes how PPS is implemented in SME, with a geographical focus on Switzerland and its vicinity. With restricted resources compared to large enterprises, SME face different challenges. The real problem areas of selected enterprises with regard to PPS are identified and evaluated. For the identified real-life problem areas of SME, clear and detailed recommendations are created, covering concepts, best practices and the efficient usage of PPS. Furthermore, the economic and entrepreneurial value for companies is outlined, along with why implementation of the introduced recommendations is advised.

Keywords: central Europe, PPS, production planning, SME

Procedia PDF Downloads 385
20077 An Evolutionary Approach for QAOA for Max-Cut

Authors: Francesca Schiavello

Abstract:

This work aims to create a hybrid algorithm combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOAs were first introduced in 2014, when their algorithm performed better than the best known classical algorithm of the time for Max-Cut graphs. Whilst classical algorithms have improved since then and have returned to being faster and more efficient, this was a huge milestone for quantum computing, and that work is often used as a benchmarking tool and a foundation for exploring variants of QAOAs. This, alongside other famous algorithms like Grover’s or Shor’s, highlights the potential that quantum computing holds; it also points to the prospect of a real quantum advantage which, if the hardware continues to improve, could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate when creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of qubits, which restricts the scale of the problems that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm performs a search in the solution space through a population of solutions, it can be parallelized to speed up the search and optimization. The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOAs with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using a Cobyla optimizer, which is a linear-approximation-based method, and in some instances it can even produce a better Max-Cut. Whilst the final objective of the work is to create an algorithm that can consistently beat the original QAOA, or its variants, through either speedups or solution quality, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
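
To make the hybrid concrete, here is a minimal sketch (ours, not the author's code) of a (mu + lambda) evolution strategy tuning the two p = 1 QAOA angles for Max-Cut on an assumed toy 4-node graph, with the quantum circuit replaced by an exact statevector simulation:

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # assumed toy graph
n, dim = 4, 16

# Cut value of each basis state = diagonal of the Max-Cut cost Hamiltonian.
cuts = np.array([sum(((z >> i) & 1) != ((z >> j) & 1) for i, j in edges)
                 for z in range(dim)], dtype=float)

def rx_layer(psi, beta):
    """Apply the mixer exp(-i*beta*X) to every qubit of the statevector."""
    c, s = np.cos(beta), -1j * np.sin(beta)
    psi = psi.reshape((2,) * n)
    for q in range(n):
        a0, a1 = np.take(psi, 0, axis=q), np.take(psi, 1, axis=q)
        psi = np.stack([c * a0 + s * a1, s * a0 + c * a1], axis=q)
    return psi.reshape(dim)

def expected_cut(params):
    gamma, beta = params
    psi = np.full(dim, 1 / np.sqrt(dim), dtype=complex)   # |+...+> state
    psi = np.exp(-1j * gamma * cuts) * psi                # cost layer
    psi = rx_layer(psi, beta)                             # mixer layer
    return float(np.sum(cuts * np.abs(psi) ** 2))

# (mu + lambda) evolution strategy in place of a gradient-based optimizer.
rng = np.random.default_rng(1)
mu, lam, sigma = 4, 12, 0.3
pop = rng.uniform(0, np.pi, (mu, 2))
for _ in range(40):
    children = np.vstack([pop, pop[rng.integers(mu, size=lam)]
                          + rng.normal(0, sigma, (lam, 2))])
    fitness = np.array([expected_cut(p) for p in children])
    pop = children[np.argsort(fitness)[-mu:]]             # keep the best mu

print("best expected cut:", round(expected_cut(pop[-1]), 3), "(max cut is 4)")
```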

Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization

Procedia PDF Downloads 56
20076 pscmsForecasting: A Python Web Service for Time Series Forecasting

Authors: Ioannis Andrianakis, Vasileios Gkatas, Nikos Eleftheriadis, Alexios Ellinidis, Ermioni Avramidou

Abstract:

pscmsForecasting is an open-source web service that implements a variety of time series forecasting algorithms and exposes them to the user via the ubiquitous HTTP protocol. It allows developers to enhance their applications by adding time series forecasting functionalities through an intuitive and easy-to-use interface. This paper provides some background on time series forecasting and gives details about the implemented algorithms, aiming to enhance the end user’s understanding of the underlying methods before incorporating them into their applications. A detailed description of the web service’s interface and its various parameterizations is also provided. Being an open-source project, pscmsForecasting can also be easily modified and tailored to the specific needs of each application.
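
As a hedged illustration of how such an HTTP interface is typically consumed (the endpoint path, host, port and JSON fields below are invented for the example and are not pscmsForecasting's documented API; consult the project's own documentation for the real interface):

```python
import requests

# Hypothetical request: endpoint and payload schema are assumptions.
payload = {
    "series": [112, 118, 132, 129, 121, 135, 148, 148, 136, 119],
    "horizon": 3,                 # number of future points to forecast
    "algorithm": "exponential_smoothing",
}
resp = requests.post("http://localhost:8000/forecast", json=payload, timeout=30)
resp.raise_for_status()
print(resp.json())                # e.g. {"forecast": [...], ...}
```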

Keywords: time series, forecasting, web service, open source

Procedia PDF Downloads 78
20075 Power Transformers Insulation Material Investigations: Partial Discharge

Authors: Jalal M. Abdallah

Abstract:

There is a great problem in testing and investigating the reliability of different types of transformer insulation materials. It can be summarized as how to recreate and simulate the real conditions of a working transformer and test its insulation materials for partial discharge (PD), typically as in the working mode. Many tests may give untrue results, as the physical behavior of the insulation material under test differs from that in its working condition. In this work, the real working conditions were simulated, and a large number of specimens were tested. The first stage of the investigation begins with choosing samples of different types of insulation materials (papers, pressboards, etc.). In the second stage, the samples were dried in ovens at 105 °C and 0.01 bar for 48 hours, then impregnated with dried and degassed oil (water content less than 6 ppm) at 105 °C and 0.01 bar for 48 hours, after which the specimens were cooled at room pressure and temperature for 24 hours. The third stage is investigating PD in the samples using an ICM PD measuring device. After that, a continuous test on oil-impregnated insulation materials (paper, pressboards) was developed, and the phase-resolved partial discharge patterns of the PD signals were measured. The importance of this work lies in providing the industrial sector with trusted, highly accurate measurement results based on realistically simulated working conditions. All the PD patterns (results) are associated with discharges produced in well-controlled laboratory conditions, and they were compared with previous results and results from other laboratories. In addition, the influence of different temperature conditions on partial discharge activity was studied.

Keywords: transformers, insulation materials, voids, partial discharge

Procedia PDF Downloads 311
20074 Exploring Deep Neural Network Compression: An Overview

Authors: Ghorab Sara, Meziani Lila, Rubin Harvey Stuart

Abstract:

The rapid growth of deep learning has led to intricate and resource-intensive deep neural networks widely used in computer vision tasks. However, their complexity results in high computational demands and memory usage, hindering real-time applications. To address this, research focuses on model compression techniques. This paper provides an overview of recent advancements in compressing neural networks and categorizes the various methods into four main approaches: network pruning, quantization, network decomposition, and knowledge distillation. It aims to provide a comprehensive outline of both the advantages and limitations of each method.
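
As a concrete taste of the first category, here is a sketch of one-shot magnitude pruning using PyTorch's built-in utilities; it operates on a toy layer and is not tied to any model from the survey.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(64, 32)                 # toy layer standing in for a real model

# L1 unstructured pruning: zero out the 50% of weights smallest in magnitude.
prune.l1_unstructured(layer, name="weight", amount=0.5)

sparsity = (layer.weight == 0).float().mean().item()
print(f"weight sparsity after pruning: {sparsity:.0%}")

prune.remove(layer, "weight")             # make the pruning permanent
```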

Keywords: model compression, deep neural network, pruning, knowledge distillation, quantization, low-rank decomposition

Procedia PDF Downloads 38
20073 Sensory Ethnography and Interaction Design in Immersive Higher Education

Authors: Anna-Kaisa Sjolund

Abstract:

The doctoral thesis examines interaction design and sensory ethnography as tools to create immersive education environments. In recent years, there has been increasing interest and discussion among researchers and educators on immersive education, such as augmented reality tools and virtual glasses, and the possibilities of utilizing them in education at all levels. Using virtual devices as learning environments, it is possible to create multisensory learning environments. Sensory ethnography in this study refers to considering how the senses affect information dynamics in immersive learning environments. The past decade has seen rapid development of virtual world research and virtual ethnography. Christine Hine's Virtual Ethnography offers an anthropological explanation of online behavior and communication change. Despite her groundbreaking work, time has changed users’ communication styles and brought new ways of doing ethnographical research. Virtual reality, with all its new potential and its engagement of all the senses, has come to the fore. Film and image have played an important role in cultural research for centuries; only the focus has changed at different times and in different fields of research. According to Karin Becker, the role of the image in our society is information flow, and she identified two meanings of what research on visual culture is: images and pictures are the artifacts of visual culture, and images can be viewed as a symbolic language that allows digital storytelling. By combining the sense of sight with the other senses, such as hearing, touch, taste, smell and balance, the use of a virtual learning environment offers students a way to more easily absorb large amounts of information; it also offers teachers different ways to produce study material. In this article, sensory ethnography is used as the research tool to approach the core question: describing information dynamics in immersive environments through interaction design. An immersive education environment is understood as a three-dimensional, interactive learning environment where the audiovisual aspects are central but all senses can be taken into consideration. When designing learning environments, or any digital service, interaction design is always needed. The question of what interaction design is, is justified, because there is no simple or consistent idea of what interaction design is, how it can be used as a research method, or whether it is only a description of practical actions. When discussing immersive learning environments or their construction, consideration should be given to both interaction design and sensory ethnography.

Keywords: immersive education, sensory ethnography, interaction design, information dynamics

Procedia PDF Downloads 133
20072 The Study of Cost Accounting in S Company Based on TDABC

Authors: Heng Ma

Abstract:

Third-party warehousing logistics plays an important role in the development of external logistics. At present, third-party logistics in our country is still a new industry, and its accounting system has not yet been established. The current financial accounting of third-party warehousing logistics mainly follows the traditional way of thinking and is only able to provide the total cost of the entire enterprise during the accounting period; it cannot reflect indirect operating cost information. In order to solve the problem of cost information distortion in the third-party logistics industry and improve the level of logistics cost management, this paper combines theoretical research with case analysis: it builds a third-party logistics costing model using Time-Driven Activity-Based Costing (TDABC) to reflect cost allocation, and takes S company as an example to account for and control warehousing logistics costs. Based on the idea that products consume activities and activities consume resources, TDABC makes time the main cost driver and uses time equations to assign resource costs to cost objects. In S company, the cost objects are three warehouses engaged in warehousing and transportation services (the second warehouse serves as a transport point). Each of the three warehouses comprises five departments, namely the Business Unit, Production Unit, Settlement Center, Security Department and Equipment Division, and the activities in these departments are classified as in-out storage forecasting, in-out storage or transit, and safekeeping work. By computing the capacity cost rate and building the time equations, the paper calculates the final operating cost so as to reveal the real cost. The numerical results show that TDABC can accurately reflect the cost allocation to service customers and reveal the spare capacity cost of each resource center, verifying the feasibility and validity of TDABC in cost accounting for the third-party logistics industry. It encourages enterprises to focus on customer relationship management and to reduce idle cost in order to strengthen the cost management of third-party logistics enterprises.
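
The TDABC mechanics reduce to two formulas: a capacity cost rate (total resource cost divided by practical capacity in minutes) and a time equation that prices each transaction. A minimal sketch with invented figures standing in for one of S company's resource centers:

```python
# TDABC sketch with invented figures (not S company's actual data).
total_resource_cost = 84000.0   # quarterly cost of one warehouse department
practical_capacity = 42000.0    # practical capacity in staff-minutes

capacity_cost_rate = total_resource_cost / practical_capacity  # cost per minute

def order_time(pallets: int, needs_inspection: bool) -> float:
    """Time equation: base handling time plus per-pallet and optional terms."""
    return 5.0 + 2.5 * pallets + (10.0 if needs_inspection else 0.0)

minutes = order_time(pallets=8, needs_inspection=True)
print(f"rate = {capacity_cost_rate:.2f}/min, "
      f"order cost = {capacity_cost_rate * minutes:.2f}")
```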

Keywords: third-party logistics enterprises, TDABC, cost management, S company

Procedia PDF Downloads 355
20071 Providing Health Promotion Information by Digital Animation to International Visitors in Japan: A Factorial Design View of Nurses

Authors: Mariko Nishikawa, Masaaki Yamanaka, Ayami Kondo

Abstract:

Background: International visitors to Japan are at risk of travel-related illnesses or injuries that could result in hospitalization in a country where the language and customs are unique. Over twelve million international visitors came to Japan in 2015, and more are expected leading up to the Tokyo Olympics. One aspect of this is a potentially greater demand on healthcare services by foreign visitors. Nurses who take care of them have anxieties and concerns about their knowledge of the Japanese healthcare system. Objectives: An effective distribution of travel-health information is vital for facilitating care for international visitors. Our research investigates whether a four-minute digital animation (Mari Info Japan), designed and developed by the authors and applied in a survey of 513 nurses who take care of foreigners daily, could clarify travel health procedures and reduce anxieties while making learning enjoyable. Methodology: Respondents to a survey were divided into two groups. The intervention group watched Mari Info Japan, while the control group read a standard guidebook. The participants were requested to fill in a two-page questionnaire, called Mari Meter-X, and the STAI-Y in English, and to mark a face scale, before and after the interventions. The questions dealt with knowledge of health promotion, the Japanese healthcare system, cultural concerns, anxieties, and attitudes in Japan. Data were collected from an intervention group (n=83) and a control group (n=83) of nurses in a hospital for foreigners in Japan from February to March 2016. We analyzed the data using Text Mining Studio for open-ended questions and JMP for statistical significance. Results: We found that the intervention group displayed more confidence and less anxiety about taking care of foreign patients compared to the control group, and indicated greater comfort after watching the animation. However, both groups were most likely to be concerned about language, the cost of medical expenses, informed consent, and choice of hospital. Conclusions: From the viewpoint of nurses, the provision of travel-health information by digital animation to international visitors to Japan was more effective than traditional methods, as it helped them be better prepared to treat travel-related diseases and injuries among international visitors. This study was registered under number UMIN000020867. Funding: Grant-in-Aid for Challenging Exploratory Research 2010-2012 & 2014-16, Japanese Government.

Keywords: digital animation, health promotion, international visitor, Japan, nurse

Procedia PDF Downloads 303
20070 Estimating View-Through Ad Attribution from User Surveys Using Convex Optimization

Authors: Yuhan Lin, Rohan Kekatpure, Cassidy Yeung

Abstract:

In Digital Marketing, robust quantification of View-through attribution (VTA) is necessary for evaluating channel effectiveness. VTA occurs when a product purchase is aided by an Ad but without an explicit click (e.g. a TV ad). A lack of a tracking mechanism makes VTA estimation challenging. Most prevalent VTA estimation techniques rely on post-purchase in-product user surveys. User surveys enable the calculation of channel multipliers, which are the ratio of the view-attributed to the click-attributed purchases of each marketing channel. Channel multipliers thus provide a way to estimate the unknown VTA for a channel from its known click attribution. In this work, we use Convex Optimization to compute channel multipliers in a way that enables a mathematical encoding of the expected channel behavior. Large fluctuations in channel attributions often result from overfitting the calculations to user surveys. Casting channel attribution as a Convex Optimization problem allows an introduction of constraints that limit such fluctuations. The result of our study is a distribution of channel multipliers across the entire marketing funnel, with important implications for marketing spend optimization. Our technique can be broadly applied to estimate Ad effectiveness in a privacy-centric world that increasingly limits user tracking.
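
A minimal sketch of the idea (our illustration with invented survey numbers: the exact objective and constraints used in the study are not spelled out in the abstract): fit channel multipliers to noisy survey-derived estimates while a smoothness penalty and bound constraints damp the overfitting-driven fluctuations across the funnel.

```python
import cvxpy as cp
import numpy as np

# Survey-derived multiplier estimates for 5 funnel-ordered channels (invented).
m_survey = np.array([2.1, 0.4, 3.0, 0.9, 1.6])
lam = 2.0                                  # smoothness weight across the funnel

m = cp.Variable(5, nonneg=True)            # channel multipliers to estimate
objective = cp.Minimize(
    cp.sum_squares(m - m_survey)           # stay close to the survey evidence
    + lam * cp.sum_squares(cp.diff(m))     # penalize jumps between channels
)
problem = cp.Problem(objective, [m <= 4.0])  # cap encoding expected behavior
problem.solve()

vta = m.value * np.array([120, 300, 80, 150, 200])  # known click-attributed buys
print("multipliers:", np.round(m.value, 2), "| estimated VTA:", np.round(vta, 1))
```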

Keywords: digital marketing, survey analysis, operational research, convex optimization, channel attribution

Procedia PDF Downloads 187
20069 An Assessment of Digital Platforms, Student Online Learning, Teaching Pedagogies, Research and Training at Kenya College of Accounting University

Authors: Jasmine Renner, Alice Njuguna

Abstract:

The booming technological revolution is driving a change in the mode of delivery systems, especially for e-learning and distance learning in higher education. The report and findings of the study, an assessment of digital platforms, student online learning, teaching pedagogies, research and training at Kenya College of Accounting University (hereinafter 'KCA'), were produced as a joint collaboration project between the Carnegie African Diaspora Fellowship and the staff, students and faculty at KCA University. The participants in this assessment/research met on selected days during a six-week period, during which one-on-one consultations, surveys, questionnaires, focus groups, training, and seminars were conducted to ascertain online learning and teaching, curriculum development, research and training at KCA. The project was organized into an eight-week workflow, with each week culminating in activities designed to assess digital online teaching and learning at KCA. The project also included the training of distance learning instructors at KCA and the evaluation of KCA’s distance platforms and programs. Additionally, through a curriculum audit and redesign, the project sought to enhance the curriculum development activities related to distance learning at KCA. The findings of this assessment/research represent a systematic, deliberate process of gathering, analyzing and using data collected from DL students, DL staff and lecturers, and the librarian in charge of online learning resources and access at KCA. We engaged in one-on-one interviews and discussions with staff, students, and faculty and collated the findings to inform practices that are effective in the ongoing design and development of eLearning at KCA University. The overall findings of the project led to the following recommendations. First, there is a need to address the infrastructural challenges that led to poor internet connectivity for online learning, as well as training needs and content development for faculty and staff. Second, there is a need to manage cultural impediments within KCA, for example fears of change from one platform to another, and to secure institutional goodwill as a vital promise of effective online learning. Third, at a practical and short-term level, the following measures, based on the systematic findings of the research conducted, should be adopted at KCA University to promote the effective adoption of online learning: a) an eLearning-compatible faculty lab; b) revision of policy to include an eLearning strategy and its strategic management; c) recognition for faculty and staff engaged in training for the adoption and implementation of eLearning; and d) adequate website resources on eLearning. The report and findings represent a comprehensive approach to a systematic assessment of online teaching and learning, research and training at KCA.

Keywords: e-learning, digital platforms, student online learning, online teaching pedagogies

Procedia PDF Downloads 183
20068 Adaptive Discharge Time Control for Battery Operation Time Enhancement

Authors: Jong-Bae Lee, Seongsoo Lee

Abstract:

This paper proposes an adaptive discharge time control method to balance cell voltages in the alternating battery cell discharging method, in which battery cells are periodically discharged in turn. The recovery effect increases battery output voltage while a given battery cell rests without discharging, so the battery operation time of the target system increases. However, voltage mismatch between cells leads to two problems. First, the voltage difference between cells induces inter-cell current, which wastes power. Second, it degrades battery operation time, since the system stops when any cell reaches the minimum system operation voltage. To solve these problems, the proposed method adaptively controls cell discharge time to equalize the cell voltages. With the proposed method, battery operation time increases by about 19%, whereas the plain alternating battery cell discharging method shows about 7% improvement.
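
A toy sketch of the control idea (our illustration, with an invented linear allocation rule rather than the paper's circuit model): each cell's discharge slot is scaled by how far its voltage sits above the weakest cell, so higher-voltage cells are discharged longer and the pack equalizes.

```python
# Adaptive discharge slot allocation sketch (illustrative toy model).
voltages = [3.9, 3.7, 3.6]      # current cell voltages (V), invented values
base_slot = 10.0                # nominal discharge slot per cell (seconds)
gain = 50.0                     # extra slot seconds per volt of imbalance

v_min = min(voltages)
slots = [base_slot + gain * (v - v_min) for v in voltages]

for i, (v, t) in enumerate(zip(voltages, slots)):
    print(f"cell {i}: {v:.2f} V -> discharge for {t:.1f} s")
# Higher cells get longer slots, pulling all voltages toward the weakest cell.
```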

Keywords: battery, recovery effect, low-power, alternating battery cell discharging, adaptive discharge time control

Procedia PDF Downloads 348
20067 Applied Actuator Fault Accommodation in Flight Control Systems Using Fault Reconstruction Based FDD and SMC Reconfiguration

Authors: A. Ghodbane, M. Saad, J. F. Boland, C. Thibeault

Abstract:

Historically, actuator redundancy was used to deal with faults occurring suddenly in flight systems. This technique was generally expensive and time consuming, and it involved increased weight and space in the system. Therefore, nowadays, the on-line fault diagnosis of actuators and its accommodation play a major role in the design of avionic systems. These approaches, known as Fault Tolerant Flight Control systems (FTFCs), are able to adapt to such sudden faults while keeping avionics systems lighter and less expensive. In this paper, an FTFC system based on the geometric approach and a Reconfigurable Flight Control (RFC) scheme is presented. The geometric approach is used for cosmic-ray-induced fault reconstruction, while a Sliding Mode Control (SMC) law based on Lyapunov stability theory is designed for reconfiguring the controller in order to compensate for the fault effect. Matlab®/Simulink® simulations are performed to illustrate the effectiveness and robustness of the proposed flight control system against faulty actuator signals caused by cosmic rays. The results demonstrate the successful real-time implementation of the proposed FTFC system on a non-linear 6-DOF aircraft model.
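
The SMC reconfiguration rests on a standard sliding mode law. As a hedged, generic sketch (a textbook double-integrator example, not the paper's 6-DOF aircraft model), the controller drives a sliding surface s = c*e + de/dt to zero with a switching term whose gain dominates a bounded actuator fault:

```python
import numpy as np

# Generic sliding mode control sketch on a double integrator x'' = u + fault.
c, k, dt = 2.0, 5.0, 0.001      # surface slope, switching gain, time step
x, v = 1.0, 0.0                 # initial position error and velocity

for step in range(int(5.0 / dt)):
    fault = 0.8 * np.sin(0.01 * step)   # bounded actuator fault (|fault| < k)
    s = c * x + v                       # sliding surface
    u = -c * v - k * np.sign(s)         # equivalent + switching control
    a = u + fault
    v += a * dt
    x += v * dt

print(f"final error: {x:.4f} (driven near zero despite the fault)")
```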

Keywords: actuators’ faults, fault detection and diagnosis, fault tolerant flight control, sliding mode control, geometric approach for fault reconstruction, Lyapunov stability

Procedia PDF Downloads 410
20066 Stability of Hybrid Stochastic Systems

Authors: Manlika Ratchagit

Abstract:

This paper is concerned with robust mean square stability of uncertain stochastic switched discrete time-delay systems. The system to be considered is subject to interval time-varying delays, which allows the delay to be a fast time-varying function and the lower bound is not restricted to zero. Based on the discrete Lyapunov functional, a switching rule for the robust mean square stability for the uncertain stochastic discrete time-delay system is designed via linear matrix inequalities. Finally, some examples are exploited to illustrate the effectiveness of the proposed schemes.
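
The LMI machinery behind such designs can be illustrated in the simplest discrete-time case. The following sketch is generic, far simpler than the switched stochastic time-delay systems treated in the paper: it certifies stability of x(k+1) = A x(k) by finding P > 0 with A'PA - P < 0 via CVXPY.

```python
import cvxpy as cp
import numpy as np

# Discrete-time Lyapunov LMI sketch: find P > 0 with A'PA - P < 0.
A = np.array([[0.5, 0.2],
              [0.1, 0.7]])
eps = 1e-6

P = cp.Variable((2, 2), symmetric=True)
constraints = [P >> eps * np.eye(2),
               A.T @ P @ A - P << -eps * np.eye(2)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()

print("status:", prob.status)   # a feasible P certifies stability
print("P =\n", P.value)
```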

Keywords: robust mean square stability, discrete-time stochastic systems, hybrid systems, interval time-varying delays, Lyapunov functional, linear matrix inequalities

Procedia PDF Downloads 480
20065 Q-Efficient Solutions of Vector Optimization via Algebraic Concepts

Authors: Elham Kiyani

Abstract:

In this paper, we first introduce the concept of Q-efficient solutions in a real linear space not necessarily endowed with a topology, where Q is some nonempty (not necessarily convex) set. We also use the scalarization technique, including the Gerstewitz function generated by a nonconvex set, to characterize these Q-efficient solutions. The algebraic concepts of interior and closure are useful for studying optimization problems without topology; studying nonconvex vector optimization is valuable here, since the topological interior equals the algebraic interior for a convex cone. We therefore use the algebraic concepts of interior and closure to define Q-weak efficient solutions and Q-Henig proper efficient solutions of set-valued optimization problems, where Q is not a convex cone. Optimization problems with set-valued maps have a wide range of applications, so a useful analytical tool for set-valued maps is to be expected in optimization theory. These kinds of optimization problems are closely related to stochastic programming, control theory, and economic theory. The paper focuses on nonconvex problems; the results are obtained by imposing generalized non-convexity assumptions on the data of the problem. In convex problems, the main mathematical tools are convex separation theorems, alternative theorems, and algebraic counterparts of some usual topological concepts, while in nonconvex problems a nonconvex separation function is needed. Thus, we consider the Gerstewitz function generated by a general set in a real linear space and re-examine its properties in this more general setting. A useful approach for solving a vector problem is to reduce it to a scalar problem; in general, scalarization means the replacement of a vector optimization problem by a suitable scalar problem, which tends to be an optimization problem with a real-valued objective function. The Gerstewitz function is well known and widely used in optimization as the basis of scalarization. Its essential properties, which are well known in the topological framework, are studied here using algebraic counterparts rather than the topological concepts of interior and closure. Therefore, properties of the Gerstewitz function, when it takes values just in a real linear space, are studied, and we use it to characterize Q-efficient solutions of vector problems whose image space is not endowed with any particular topology. We thus deal with a constrained vector optimization problem in a real linear space without assuming any topology, define Q-weak efficient and Q-proper efficient solutions in the sense of Henig, and, by means of the Gerstewitz function, provide necessary and sufficient optimality conditions for set-valued vector optimization problems.
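
For reference, the scalarizing functional in question is usually written as follows (the standard textbook form, stated here as an aid to the reader; the paper's version generated by a general, not necessarily convex, set Q may differ in detail):

```latex
% Gerstewitz (Tammer) scalarizing functional generated by a set $C$ and a
% direction $e$ in a real linear space $Y$ (standard form):
\varphi_{e,C}(y) \;=\; \inf\{\, t \in \mathbb{R} \;:\; y \in t\,e - C \,\}
```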

Keywords: algebraic interior, Gerstewitz function, vector closure, vector optimization

Procedia PDF Downloads 210
20064 Development and Evaluation of a Cognitive Behavioural Therapy Based Smartphone App for Low Moods and Anxiety

Authors: David Bakker, Nikki Rickard

Abstract:

Smartphone apps hold immense potential as mental health and wellbeing tools. Support can be made easily accessible and can be used in real-time while users are experiencing distress. Furthermore, data can be collected to enable machine learning and automated tailoring of support to users. While many apps have been developed for mental health purposes, few have adhered to evidence-based recommendations and even fewer have pursued experimental validation. This paper details the development and experimental evaluation of an app, MoodMission, that aims to provide support for low moods and anxiety, help prevent clinical depression and anxiety disorders, and serve as an adjunct to professional clinical supports. MoodMission was designed to deliver cognitive behavioural therapy for specifically reported problems in real-time, momentary interactions. Users report their low moods or anxious feelings to the app along with a subjective units of distress scale (SUDS) rating. MoodMission then provides a choice of 5-10 short, evidence-based mental health strategies called Missions. Users choose a Mission, complete it, and report their distress again. Automated tailoring, gamification, and in-built data collection for analysis of effectiveness were also included in the app’s design. The development process involved construction of an evidence-based behavioural plan, designing of the app, building and testing procedures, feedback-informed changes, and a public launch. A randomized controlled trial (RCT) was conducted comparing MoodMission to two other apps and a waitlist control condition. Participants completed measures of anxiety, depression, well-being, emotional self-awareness, coping self-efficacy and mental health literacy at the start of their app use and 30 days later. At the time of submission (November 2016), over 300 participants had participated in the RCT. Data analysis will begin in January 2017. At the time of this submission, MoodMission has over 4000 users. A repeated-measures ANOVA of 1390 completed Missions reveals that SUDS (0-10) ratings were significantly reduced between pre-Mission ratings (M=6.20, SD=2.39) and post-Mission ratings (M=4.93, SD=2.25), F(1,1389)=585.86, p < .001, ηp²=.30. This effect was consistent across both low moods and anxiety. Preliminary analyses of the data from the outcome measures surveys reveal improvements across mental health and wellbeing measures as a result of using the app over 30 days, including a significant increase in coping self-efficacy, F(1,22)=5.91, p=.024, ηp²=.21. Complete results from the RCT in which MoodMission was evaluated will be presented, along with results from the continuous outcome data being recorded by MoodMission. MoodMission was successfully developed and launched, and preliminary analyses suggest that it is an effective mental health and wellbeing tool. In addition to the clinical applications of MoodMission, the app holds promise as a research tool to conduct component analysis of psychological therapies and overcome the constraints of laboratory-based studies. The support provided by the app is discrete, tailored, evidence-based, and transcends barriers of stigma, geographic isolation, financial limitations, and low health literacy.
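
As a rough illustration of the reported pre/post analysis, the following is a sketch on simulated data, not the study data: with a single within-subject factor at two levels (pre- vs. post-Mission), the repeated-measures ANOVA reduces to a paired t-test with F = t², and partial eta squared follows as F / (F + df_error). The simulated distributions below are invented to loosely mirror the reported means and SDs.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated SUDS ratings (0-10), invented to loosely mirror the reported
# means and SDs; these are NOT the study data.
n = 1390
pre  = np.clip(rng.normal(6.20, 2.39, n), 0, 10)
post = np.clip(pre - rng.normal(1.3, 1.8, n), 0, 10)

# With one within-subject factor at two levels (pre vs. post Mission),
# the repeated-measures ANOVA is equivalent to a paired t-test, F = t^2.
t, p = stats.ttest_rel(pre, post)
F = float(t) ** 2
eta_p2 = F / (F + (n - 1))  # partial eta squared, df_effect = 1
print(f"F(1,{n - 1}) = {F:.2f}, p = {p:.2g}, partial eta^2 = {eta_p2:.2f}")
```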

Keywords: anxiety, app, CBT, cognitive behavioural therapy, depression, eHealth, mission, mobile, mood, MoodMission

Procedia PDF Downloads 265
20063 Automated Evaluation Approach for Time-Dependent Question Answering Pairs on Web Crawler Based Question Answering System

Authors: Shraddha Chaudhary, Raksha Agarwal, Niladri Chatterjee

Abstract:

This work demonstrates a web crawler-based generalized end-to-end open domain Question Answering (QA) system. An efficient QA system requires a significant amount of domain knowledge, with the aim of finding an exact and correct answer to the user's question in the form of a number, a noun, a short phrase, or a brief piece of text. Analysis of the question, searching the relevant documents, and choosing an answer are three important steps in a QA system. This work uses a web scraper (Beautiful Soup) to extract K documents from the web, where the value of K can be calibrated on the basis of a trade-off between time and accuracy. This is followed by a passage-ranking model, trained on the MS MARCO dataset of 500K queries, that extracts the most relevant text passage in order to shorten the lengthy documents. Further, a QA system is used to extract the answers from the shortened documents based on the query and return the top 3 answers. In evaluating such systems, accuracy is judged by the exact match between predicted answers and gold answers. But automatic evaluation methods fail due to the linguistic ambiguities inherent in the questions. Moreover, reference answers are often not exhaustive or are out of date, so correct answers predicted by the system are often judged incorrect according to the automated metrics. One such scenario arises from the original Google Natural Questions (GNQ) dataset, which was collected and made available in the year 2016. Any such dataset proves problematic for questions that have time-varying answers. For illustration, consider the query "Where will the next Olympics be held?" The gold answer given in the GNQ dataset is “Tokyo”. Since the dataset was collected in 2016, and the next Olympics after 2016 were held in Tokyo in 2020, this answer was correct at the time. But if the same question is asked in 2022, the answer is “Paris, 2024”. Consequently, any evaluation based on the GNQ dataset will be incorrect. Such erroneous predictions are usually given to human evaluators for further validation, which is quite expensive and time-consuming. To address this erroneous evaluation, the present work proposes an automated approach for evaluating time-dependent question-answer pairs. In particular, it proposes a metric using the current timestamp along with the top-n predicted answers from a given QA system. To test the proposed approach, the GNQ dataset was used, and the system achieved an accuracy of 78% on a test dataset comprising 100 QA pairs. This test data was automatically extracted using an analysis-based approach from 10K QA pairs of the GNQ dataset. The results obtained are encouraging. The proposed technique appears to have the potential to develop into a useful scheme for gathering precise, reliable, and specific information in a real-time and efficient manner. Subsequent experiments will be directed towards establishing the efficacy of the above system for a larger set of time-dependent QA pairs.
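
The abstract does not fully specify the proposed metric, so the following sketch assumes one plausible reading: each question carries date-stamped gold answers, and a prediction counts as correct if any of the top-n answers matches the gold answer valid at the query timestamp. The GOLD entries, dates, and the lowercased substring matching rule are illustrative assumptions, not the authors' implementation.

```python
from datetime import date

# Hypothetical time-stamped gold answers: each entry is valid over a date
# range, so the "correct" answer depends on when the question is asked.
GOLD = {
    "where will the next olympics be held?": [
        (date(2016, 8, 22), date(2021, 8, 8),  "tokyo"),
        (date(2021, 8, 9),  date(2024, 8, 11), "paris"),
    ],
}

def time_aware_match(question, predicted_top_n, asked_on=None):
    """Return True if any of the top-n predictions matches the gold
    answer that is valid at the query timestamp (simple lowercased
    substring match -- a deliberately minimal sketch)."""
    asked_on = asked_on or date.today()
    for start, end, answer in GOLD.get(question.lower(), []):
        if start <= asked_on <= end:
            return any(answer in p.lower() for p in predicted_top_n)
    return False  # no gold answer is valid at this timestamp

print(time_aware_match("Where will the next Olympics be held?",
                       ["Paris, 2024", "Los Angeles"],
                       asked_on=date(2022, 6, 1)))  # True
```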

Keywords: web-based information retrieval, open domain question answering system, time-varying QA, QA evaluation

Procedia PDF Downloads 99