Search results for: continuous time domain estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21607


21097 Structure Domains Tuning Magnetic Anisotropy and Motivating Novel Electric Behaviors in LaCoO₃ Films

Authors: Dechao Meng, Yongqi Dong, Qiyuan Feng, Zhangzhang Cui, Xiang Hu, Haoliang Huang, Genhao Liang, Huanhua Wang, Hua Zhou, Hawoong Hong, Jinghua Guo, Qingyou Lu, Xiaofang Zhai, Yalin Lu

Abstract:

Great efforts have been made to reveal the intrinsic origin of the emerging ferromagnetism (FM) in strained LaCoO₃ (LCO) films. However, some macroscopic magnetic properties of LCO, such as the magnetic anisotropy, are still not well understood and remain controversial. Determining and understanding the magnetic anisotropy might, in turn, help to identify the true causes of the FM. Perpendicular magnetic anisotropy (PMA) was directly observed for the first time in high-quality LCO films of different thicknesses. The ratio between the in-plane (IP) and out-of-plane (OOP) remnant magnetic moments in 30 unit cell (u.c.) films is as large as 20. The easy axis lies along the OOP direction, with an IP/OOP coercive field ratio of 10. Moreover, the PMA can be tuned simply by changing the thickness: as the thickness increases, the IP/OOP magnetic moment ratio decreases remarkably and the magnetic easy axis changes from OOP to IP. Such a large and tunable PMA holds strong potential for both fundamental research and applications. What causes the PMA is the first concern. A larger occupation of OOP orbitals may be one of the microscopic origins of the PMA. A cluster-like magnetic domain pattern with no obvious color contrast was found in the 30 u.c. film, similar to that of LaAlO₃/SrTiO₃ films. The nanosized domains could not be completely switched even at a large OOP magnetic field of 23 T, indicating that some clusters have a strong IP character or no OOP magnetism. The IP magnetic domains might influence the magnetic performance and help to form the PMA. Meanwhile, possible nonmagnetic clusters might explain why the measured moments of LCO films are smaller than the calculated value of 2 μB/Co, one of the biggest puzzles in LCO films. What tunes the PMA seems even more interesting. A totally different magnetic domain pattern was found in 180 u.c. films, with cluster magnetic domains surrounded by <110> cross-hatch lines. These lines were identified as structure domain walls (DWs) by 3D reciprocal space mapping (RSM). Two groups of in-plane features with fourfold symmetry were observed near the film diffraction peaks in the (002) 3D-RSM. One lies along the <110> directions with a larger intensity and matches the lines on the surface well. The other is much weaker, lies along the <100> directions, and originates from the normal lattice tilting of films deposited on cubic substrates. The <110> domain features obtained from the (103) and (113) 3D-RSMs exhibit a similar evolution of the DW percentage and of the magnetic behavior. Structure domains and domain walls are believed to tune the PMA by transforming more IP magnetic moments to OOP. Last but not least, thick films with many structure domains exhibit different electrical transport behaviors. A metal-to-insulator transition (MIT) and an angle-dependent negative magnetoresistance were observed near 150 K, higher than the FM transition temperature but similar to that of the spin-orbit-coupling-related 1/4-order diffraction peaks.

Keywords: structure domain, magnetic anisotropy, magnetic domain, domain wall, 3D-RSM, strain

Procedia PDF Downloads 135
21096 Information in Public Domain: How Far It Measures Government's Accountability

Authors: Sandip Mitra

Abstract:

Studies on governance and accountability have often stressed the need to release data in the public domain to increase transparency, which in turn serves as evidence of performance. However, inefficient handling, lack of capacity, and the dynamics of transfers (especially fund transfers) are important issues that need appropriate attention. E-governance alone cannot serve as a measure of transparency unless comprehensive planning is instituted. Studies on governance and public exposure have often swayed public opinion in favour of or against a government. The root of the problem (especially in local governments) lies in the management of governance. The participation of people in the functioning of local government, the networks within and outside the locality, and the synergy with various layers of government are crucial to understanding the activities of any government. Unfortunately, data on such issues are not released in the public domain; if they are released at all, the extraction of information is often hindered by complicated designs. A study has been undertaken with a few local governments in India, and the data have been analysed to substantiate these views.

Keywords: accountability, e-governance, transparency, local government

Procedia PDF Downloads 414
21095 Active Surface Tracking Algorithm for All-Fiber Common-Path Fourier-Domain Optical Coherence Tomography

Authors: Bang Young Kim, Sang Hoon Park, Chul Gyu Song

Abstract:

A conventional optical coherence tomography (OCT) system has a limited imaging depth of 1-2 mm and suffers from unwanted noise such as speckle noise. The motorized-stage-based OCT system, which uses a common-path Fourier-domain optical coherence tomography (CP-FD-OCT) configuration, provides enhanced imaging depth and less noise, allowing these limitations to be overcome. Using this OCT system, images were obtained from an onion and its subsurface structure was observed. The images obtained with the developed motorized-stage-based system showed greater imaging depth than those from the conventional system, owing to its real-time, accurate depth tracking. Consequently, the developed CP-FD-OCT systems and algorithms have good potential for the further development of endoscopic OCT for microsurgery.

Keywords: common-path OCT, FD-OCT, OCT, tracking algorithm

Procedia PDF Downloads 360
21094 Artificial Neural Network and Satellite Derived Chlorophyll Indices for Estimation of Wheat Chlorophyll Content under Rainfed Condition

Authors: Muhammad Naveed Tahir, Wang Yingkuan, Huang Wenjiang, Raheel Osman

Abstract:

Numerous models are used in prediction and decision-making processes, but most of them are linear, and in the natural environment linear models reach their limitations when the data are non-linear, making accurate estimation difficult. Artificial Neural Networks (ANNs) have found extensive acceptance for modeling the complex, non-linear real world, since they offer more general and flexible functional forms than traditional statistical methods can effectively handle. The link between information technology and agriculture will become firmer in the near future. Monitoring crop biophysical properties non-destructively can provide a rapid and accurate understanding of a crop's response to various environmental influences. Crop chlorophyll content is an important indicator of crop health and, therefore, of crop yield estimation. In recent years, remote sensing has been accepted as a robust tool for site-specific management by detecting crop parameters at both local and large scales. The present research combined an ANN model with satellite-derived chlorophyll indices from LANDSAT 8 imagery for real-time estimation of wheat chlorophyll content. Cloud-free LANDSAT 8 scenes were acquired (February-March, 2016-17) at the same time as the ground-truthing campaign, in which chlorophyll was measured with a SPAD-502 meter. Different vegetation indices were derived from the LANDSAT 8 imagery using ERDAS Imagine (v. 2014) software for chlorophyll determination. The vegetation indices included the Normalized Difference Vegetation Index (NDVI), Green Normalized Difference Vegetation Index (GNDVI), Chlorophyll Absorption Ratio Index (CARI), Modified Chlorophyll Absorption Ratio Index (MCARI), and Transformed Chlorophyll Absorption Ratio Index (TCARI). For ANN modeling, the MATLAB and SPSS neural network tools were used; the Multilayer Perceptron (MLP) in MATLAB provided very satisfactory results. Of the data, 61.7% was used for training the MLP, 28.3% for validation, and the remaining 10% for evaluating and validating the ANN model results. For error evaluation, the sum of squares error and the relative error were used; the ANN model summary showed a sum of squares error of 10.786 and an average overall relative error of 0.099. MCARI and NDVI were revealed to be the most sensitive indices for assessing wheat chlorophyll content, with the highest coefficients of determination (R² = 0.93 and 0.90, respectively). The results suggest that using high-spatial-resolution satellite imagery with an ANN model for the retrieval of crop chlorophyll content provides an accurate and reliable assessment of crop health status at a larger scale, which can help in managing crop nutrition requirements in real time.
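
As an illustration of the workflow summarized above (not the authors' MATLAB/SPSS implementation), the following sketch derives two of the listed vegetation indices from reflectance bands and fits a small multilayer perceptron to SPAD chlorophyll readings; the band arrays, split fractions, and hyperparameters are placeholder assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Hypothetical per-plot reflectance bands (LANDSAT 8: band 4 = red, band 5 = NIR,
# band 3 = green) and co-located SPAD-502 readings; real data would replace these.
rng = np.random.default_rng(0)
n = 200
red, nir, green = rng.uniform(0.05, 0.4, (3, n))
spad = rng.uniform(20, 60, n)   # placeholder ground truth

# Spectral indices used as ANN inputs (NDVI and GNDVI shown; CARI/MCARI/TCARI
# would be added analogously as extra columns).
ndvi = (nir - red) / (nir + red)
gndvi = (nir - green) / (nir + green)
X = np.column_stack([ndvi, gndvi])

# Roughly the 62/28/10 train/validation/test split reported in the abstract.
X_train, X_rest, y_train, y_rest = train_test_split(X, spad, test_size=0.383, random_state=1)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.26, random_state=1)

mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=1)
mlp.fit(X_train, y_train)
print("validation R^2:", mlp.score(X_val, y_val))   # meaningful only with real data
print("test R^2:", mlp.score(X_test, y_test))
```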

Keywords: ANN, chlorophyll content, chlorophyll indices, satellite images, wheat

Procedia PDF Downloads 130
21093 Overview of Time, Resource and Cost Planning Techniques in Construction Management Research

Authors: R. Gupta, P. Jain, S. Das

Abstract:

One way to approach the construction scheduling optimization problem is to focus on the individual aspects of planning, which can be broadly classified as time scheduling, crew and resource management, and cost control. During the last four decades, construction planning has seen a great deal of research, but to date, no paper has attempted to summarize the available literature under these important heads. This paper addresses each of these aspects separately and presents the findings of an in-depth literature review of the various planning techniques. For techniques dealing with time scheduling, the authors have adopted a roughly chronological documentation. For crew and resource management, classification has been done on the basis of the different steps involved in the resource planning process. For cost control, techniques dealing with the estimation of costs and with the subsequent optimization of costs are dealt with separately.

Keywords: construction planning techniques, time scheduling, resource planning, cost control

Procedia PDF Downloads 463
21092 Inverse Cauchy Problem of Doubly Connected Domains via Spectral Meshless Radial Point Interpolation

Authors: Elyas Shivanian

Abstract:

In this paper, the spectral meshless radial point interpolation (SMRPI) technique is applied to Cauchy problems of two-dimensional elliptic PDEs in doubly connected domains. The unknown data on the inner boundary of the domain are obtained using the SMRPI while overspecified boundary data are imposed on the outer boundary. The shape functions, constructed through the point interpolation method using radial basis functions, allow the problem to be treated locally with the aim of a high-order convergence rate. In this way, the localization in the SMRPI can reduce the ill-conditioning of the Cauchy problem. Furthermore, we improve on previous results, and it is shown that the SMRPI remains more accurate and stable even when strong perturbations are added.
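
As background for how such shape functions arise, the snippet below is a generic radial point interpolation sketch (not the authors' SMRPI code): multiquadric radial-basis shape functions are built over a small set of 1D support nodes; the node layout and shape parameter are assumptions.

```python
import numpy as np

def rbf_shape_functions(x_eval, nodes, c=1.0):
    """Multiquadric point-interpolation shape functions phi_i(x).

    They satisfy phi_i(x_j) = delta_ij, so u(x) ~ sum_i phi_i(x) * u_i.
    """
    def R(a, b):
        return np.sqrt((a - b) ** 2 + c ** 2)   # multiquadric RBF

    R0 = R(nodes[:, None], nodes[None, :])      # moment matrix R0[i, j] = R(x_i, x_j)
    r = R(x_eval, nodes)                        # RBF vector evaluated at x
    return np.linalg.solve(R0, r)               # phi(x) = R0^{-1} r(x), R0 symmetric

nodes = np.linspace(0.0, 1.0, 5)                # small local support domain
phi = rbf_shape_functions(0.3, nodes)
print(phi)                                      # shape-function values at x = 0.3
```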

Keywords: cauchy problem, doubly connected domain, radial basis function, shape function

Procedia PDF Downloads 263
21091 Dynamic Analysis of Viscoelastic Plates with Variable Thickness

Authors: Gülçin Tekin, Fethi Kadıoğlu

Abstract:

In this study, the dynamic analysis of viscoelastic plates with variable thickness is examined. Solutions for the dynamic response of viscoelastic thin plates with variable thickness have been obtained using the functional analysis method in conjunction with the Gâteaux differential. A four-node serendipity element is used, with four degrees of freedom (deflection and the bending and twisting moments) at each node. Additionally, boundary condition terms are included in the functional in a systematic way. For the viscoelastic modeling, the three-parameter Kelvin solid model is employed. The solutions obtained in the Laplace-Carson domain are transformed to the real time domain using the MDOP, Dubner and Abate, and Durbin inverse transform techniques. Numerical examples are treated to test the performance of the proposed mixed finite element formulation.

Keywords: dynamic analysis, inverse laplace transform techniques, mixed finite element formulation, viscoelastic plate with variable thickness

Procedia PDF Downloads 309
21090 The Development of Space-Time and Space-Number Associations: The Role of Non-Symbolic vs. Symbolic Representations

Authors: Letizia Maria Drammis, Maria Antonella Brandimonte

Abstract:

The idea that people use spatial representations to think about time and number has received support from several lines of research. However, how these representations develop in children and then shape space-time and space-number mappings is still a debated issue. In the present study, 40 children (20 pre-schoolers and 20 elementary-school children) performed four main tasks, which required the use of more concrete (non-symbolic) or more abstract (symbolic) space-time and space-number associations. In the non-symbolic conditions, children were required to order pictures of everyday-life events occurring in a specific temporal order (Temporal sequences) and of quantities varying in numerosity (Numerical sequences). In the symbolic conditions, they were asked to perform the typical time-to-position and number-to-position tasks by mapping time-related words and numbers onto lines. Results showed that children performed reliably better in the non-symbolic Time conditions than in the symbolic Time conditions, independently of age, whereas only pre-schoolers performed worse in the Number-to-position task (symbolic) than in the Numerical sequence (non-symbolic) task. In addition, only older children mapped time-related words onto space following the typical left-right orientation, pre-schoolers’ performance being somewhat mixed. In contrast, mapping numbers onto space showed a clear left-right orientation, independently of age. Overall, these results indicate a cross-domain difference in the way younger and older children process time and number, with time-related tasks being more difficult than number-related tasks only when the space-time tasks require symbolic representations.

Keywords: space-time associations, space-number associations, orientation, children

Procedia PDF Downloads 318
21089 An Approach for Association Rules Ranking

Authors: Rihab Idoudi, Karim Saheb Ettabaa, Basel Solaiman, Kamel Hamrouni

Abstract:

Medical association rule induction is used to discover useful correlations between pertinent concepts in large medical databases. Nevertheless, association rule (AR) algorithms produce a huge number of rules and do not guarantee the usefulness and interestingness of the generated knowledge. To overcome this drawback, we propose an ontology-based interestingness measure for AR ranking. According to domain experts, the goal of using ARs is to discover implicit relationships between items of different categories, such as ‘clinical features and disorders’ or ‘clinical features and radiological observations’. That is to say, itemsets composed of ‘similar’ items are uninteresting. Therefore, the dissimilarity between a rule’s items can be used to judge the interestingness of association rules: the more different the items, the more interesting the rule. In this paper, we design a distinct approach for ranking semantically interesting association rules based on ontology knowledge mining. The basic idea is to organize the ontology’s concepts into a hierarchical structure of conceptual clusters of targeted subjects, where each cluster encapsulates ‘similar’ concepts suggesting a specific category of the domain knowledge. The interestingness of an association rule is then defined as the dissimilarity between the corresponding clusters; that is, the farther apart the clusters of the items in the AR, the more interesting the rule. We apply the method in our domain of interest, the mammographic domain, using an existing mammographic ontology called Mammo, with the goal of deriving interesting rules from past experience and discovering implicit relationships between the concepts modeling the domain.

Keywords: association rule, conceptual clusters, interestingness measures, ontology knowledge mining, ranking

Procedia PDF Downloads 303
21088 Ontologies for Social Media Digital Evidence

Authors: Edlira Kalemi, Sule Yildirim-Yayilgan

Abstract:

Online Social Networks (OSNs) are nowadays used widely and intensively for crime investigation and prevention activities. As they provide a wealth of information, they are used by law enforcement and intelligence agencies. This article presents an extensive review of existing solutions and models for collecting intelligence from this source of information and making use of it for solving crimes. The main focus is on smart solutions and models in which ontologies have been used as the main approach for representing criminal domain knowledge. A framework for a prototype ontology named SC-Ont is described, which defines the terms of the criminal domain ontology and the relations between them. The terms and the relations were extracted both during this review and during discussions carried out with domain experts. The development of SC-Ont is still ongoing work; in this paper, we report mainly on the motivation for using smart ontology models and the possible benefits of using them for solving crimes.

Keywords: criminal digital evidence, social media, ontologies, reasoning

Procedia PDF Downloads 368
21087 Characterization of the Catalytic and Structural Roles of the Human Hexokinase 2 in Cancer Progression

Authors: Mir Hussain Nawaz, Lyudmila Nedyalkova, Haizhong Zhu, Wael M. Rabeh

Abstract:

In this study, we aim to characterize, biochemically and structurally, the interactions of human HK2 with the mitochondria, in addition to the role of its N-terminal domain in the catalysis and stability of the full-length enzyme. Here, we solved the crystal structure of human HK2 in complex with glucose and glucose-6-phosphate (PDB code: 2NZT), in which it is a homodimer with catalytically active N- and C-terminal domains linked by a seven-turn α-helix. Unlike the inactive N-terminal domains of isozymes 1 and 3, the N-terminal domain of HK2 is not only capable of catalyzing a reaction but is also responsible for the thermodynamic stability of the full-length enzyme. Deletion of the first α-helix of the N-terminal domain, which binds to the mitochondria, altered the stability and catalytic activity of the full-length HK2. In addition, we found that the linker helix between the N- and C-terminal domains plays an important role in controlling the catalytic activity of the N-terminal domain. HK2 catalyzes a major step in the regulation of glucose metabolism in cancer, making it an ideal target for the development of new anticancer therapeutics. Characterizing the structural and molecular mechanisms of human HK2 and its role in cancer metabolism will accelerate the design and development of new cancer therapeutics that are safe and cancer-specific.

Keywords: cancer metabolism, enzymology, drug discovery, protein stability

Procedia PDF Downloads 241
21086 A Hybrid Watermarking Model Based on Frequency of Occurrence

Authors: Hamza A. A. Al-Sewadi, Adnan H. M. Al-Helali, Samaa A. K. Khamis

Abstract:

Ownership proofs for multimedia such as text, image, audio, or video files can be achieved by embedding a watermark in them. This is done by introducing modifications into these files that are imperceptible to the human senses but easily recoverable by a computer program. These modifications may be in the time domain, the frequency domain, or both. This paper presents a watermarking procedure that mixes amplitude modulation with a frequency-transformation histogram; namely, a specific value is used to modulate the intensity component Y of the YIQ representation of the carrier image. This scheme is referred to as the histogram embedding technique (HET). Comparison of the results with those of other techniques, such as the discrete wavelet transform (DWT), discrete cosine transform (DCT), and singular value decomposition (SVD), has shown enhanced efficiency in terms of ease and performance. The method exhibits a good degree of robustness against various environmental effects such as resizing, rotation, and different kinds of noise. It would prove a very useful technique for copyright protection and ownership judgment.
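
As a rough illustration of the amplitude-modulation step described above (not the authors' HET implementation, whose histogram-based embedding details are not reproduced here), the sketch below converts an RGB carrier image to YIQ, scales the Y (luminance) component by a small watermark-dependent factor, and converts back; the modulation strength and bit mapping are assumptions.

```python
import numpy as np

# Standard NTSC RGB <-> YIQ conversion matrices.
RGB2YIQ = np.array([[0.299, 0.587, 0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523, 0.312]])
YIQ2RGB = np.linalg.inv(RGB2YIQ)

def embed_bit(rgb, bit, alpha=0.02):
    """Modulate the Y component of an RGB image (floats in [0, 1]) by +/- alpha."""
    yiq = rgb @ RGB2YIQ.T
    yiq[..., 0] *= (1 + alpha) if bit else (1 - alpha)   # amplitude modulation of Y
    return np.clip(yiq @ YIQ2RGB.T, 0.0, 1.0)

carrier = np.random.rand(64, 64, 3)       # placeholder carrier image
marked = embed_bit(carrier, bit=1)
print(np.abs(marked - carrier).max())     # change remains perceptually small
```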

Keywords: authentication, copyright protection, information hiding, ownership, watermarking

Procedia PDF Downloads 546
21085 Development of Fuzzy Logic Control Ontology for E-Learning

Authors: Muhammad Sollehhuddin A. Jalil, Mohd Ibrahim Shapiai, Rubiyah Yusof

Abstract:

Nowadays, ontologies are common in many areas, such as artificial intelligence, bioinformatics, e-commerce, and education, and ontology is one of the focus areas in the field of information retrieval. The purpose of an ontology is to describe a conceptual representation of concepts and their relationships within a particular domain; in other words, an ontology provides a common vocabulary for anyone who needs to share information in that domain. Ontology domains exist in various fields, covering both engineering and non-engineering knowledge; however, only a few ontologies are available for engineering knowledge, and fuzzy logic as engineering knowledge is still not available as an ontology domain. In general, fuzzy logic requires step-by-step guidelines and instructions for laboratory experiments. In this study, we present a domain ontology for Fuzzy Logic Control (FLC) knowledge. We define a Table of Contents (ToC) with a middle-out strategy based on the Uschold and King method to develop the FLC ontology. The proposed framework is developed using Protégé as the ontology tool, and Protégé’s ontology reasoner, known as the Pellet reasoner, is then used to validate the presented framework. The presented framework offers better performance in terms of the consistency and classification parameter indices. In general, this ontology can provide a platform for anyone who needs to understand FLC knowledge.

Keywords: engineering knowledge, fuzzy logic control ontology, ontology development, table of content

Procedia PDF Downloads 277
21084 Frequency of Occurrence Hybrid Watermarking Scheme

Authors: Hamza A. Ali, Adnan H. M. Al-Helali

Abstract:

Generally, a watermark is information that identifies the ownership of multimedia (text, image, audio, or video files). It is embedded by introducing modifications into these files that are imperceptible to the human senses but easily recoverable by a computer program. These modifications are made according to a secret key in a descriptive model that may be in the time domain, the frequency domain, or both. This paper presents a watermarking procedure that mixes amplitude modulation with a frequency-transformation histogram; namely, a specific value is used to modulate the intensity component Y of the YIQ representation of the carrier image. This scheme is referred to as the histogram embedding technique (HET). Comparison of the results with those of other techniques, such as the discrete wavelet transform (DWT), discrete cosine transform (DCT), and singular value decomposition (SVD), has shown enhanced efficiency in terms of ease and performance. The method exhibits a good degree of robustness against various environmental effects such as resizing, rotation, and different kinds of noise. It would prove a very useful technique for copyright protection and ownership judgment.

Keywords: watermarking, ownership, copyright protection, steganography, information hiding, authentication

Procedia PDF Downloads 354
21083 Remote Sensing and GIS Integration for Paddy Production Estimation in Bali Province, Indonesia

Authors: Sarono, Hamim Zaky Hadibasyir, and Ridho Kurniawan

Abstract:

Estimation of paddy production is one of the areas in agriculture that can be examined using remote sensing and geographic information system (GIS) techniques. The purpose of this research is to estimate paddy production and to show how remote sensing and GIS can be used to perform this estimation in the Tegallalang and Payangan sub-districts, Bali Province, Indonesia. The method used is land suitability analysis. This method associates the physical parameters embodied in the smallest mapping unit, which represents a particular field, with that field's productivity. The production estimate is analysed using the standard FAO land suitability classification with a matching technique. The parameters used to create the land units are slope (FAO), climate classification (Oldeman), landform (Prapto Suharsono), and soil type. The land use map, consisting of paddy and non-paddy field information, was obtained from GeoEye-1 imagery using visual interpretation. Landsat imagery was used for the interpretation of the landform; the slope classification was obtained from elevation points using spline interpolation, whereas the climate and soil data are secondary data originating from the relevant institutions. The results indicate that the known wetland suitability in the Tegallalang and Payangan districts consists of S1 (very suitable), covering an area of 2,884.7 ha with a productivity of 5 tons/ha, and S2 (suitable), covering an area of 482.9 ha with a productivity of 3 tons/ha. The resulting estimate of total paddy production in the two districts is 31,744.3 tons in one year.
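
A quick arithmetic sketch of how the reported total relates to the per-class figures: multiplying each area by its productivity gives about 15,872 tons, so the stated annual figure appears to correspond to two cropping seasons per year, an assumption not stated explicitly in the abstract.

```python
# Per-class area (ha) and productivity (tons/ha), taken from the abstract.
s1_area, s1_yield = 2884.7, 5.0   # S1: very suitable
s2_area, s2_yield = 482.9, 3.0    # S2: suitable

per_season = s1_area * s1_yield + s2_area * s2_yield   # 15,872.2 tons per season
annual = 2 * per_season                                # ~31,744.4 tons, assuming two seasons/year
print(per_season, annual)
```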

Keywords: production estimation, paddy, remote sensing, geography information system, land suitability

Procedia PDF Downloads 318
21082 Adaptive Multipath Mitigation Acquisition Approach for Global Positioning System Software Receivers

Authors: Animut Meseret Simachew

Abstract:

The Parallel Code Phase Search Acquisition (PCSA) algorithm has been considered a promising method in GPS software receivers for detecting and estimating the correlation peak between the received Global Positioning System (GPS) signal and locally generated replicas. GPS signal acquisition in highly dense multipath environments is the main research challenge. In this work, we propose a robust variable step-size (RVSS) PCSA algorithm based on a fast Fourier transform (FFT) filtering technique to mitigate short-time-delay multipath signals. Simulation results reveal the effectiveness of the proposed algorithm over the conventional PCSA algorithm. The proposed RVSS-PCSA algorithm equalizes the received, carrier-wiped-off signal with the locally generated C/A code.
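
A minimal sketch of the conventional parallel code phase search step on which the proposal builds (the RVSS adaptation itself is not reproduced): the circular correlation of a carrier-wiped-off signal with a local code replica is computed over all code phases at once via the FFT; the code length, delay, and noise level are placeholder assumptions.

```python
import numpy as np

def pcsa_correlate(signal, code):
    """Parallel code phase search: circular correlation via the FFT."""
    S = np.fft.fft(signal)
    C = np.fft.fft(code)
    return np.abs(np.fft.ifft(S * np.conj(C)))   # magnitude over all code phases

# Placeholder example: a 1023-chip pseudo-random code received with a delay of
# 200 samples plus noise (carrier assumed already wiped off).
rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=1023)
received = np.roll(code, 200) + 0.5 * rng.standard_normal(1023)

corr = pcsa_correlate(received, code)
print("estimated code phase:", int(np.argmax(corr)))   # expect ~200
```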

Keywords: adaptive PCSA, detection and estimation, GPS signal acquisition, GPS software receiver

Procedia PDF Downloads 100
21081 Public and Private Domains: Contradictions and Covenants in Evolution of Game Policy

Authors: Mingzhu Lyu, Runlei Ren, Xinyu Dai, Jiaxuan Pi, Kanghua Li

Abstract:

The study of video game policy in China has been divided into two branches, "pedagogy" and "game industry", and this binary perspective reveals the "contradictory" side of policy performance. Starting from this observation, this paper constructs a three-dimensional sequence of game policy across time, content, and institutions, and builds a central-level database of game policies issued between 1949 and 2019, clarifying that these policies follow an evolutionary logic that shifts from reactive response to proactive guidance and from stigmatization to de-stigmatization. The study finds that the central government has always maintained strict requirements and prudent guidance in its game policy, that the deep contradictions in game policy stem from the essential conflict between the natural playfulness of games and the seriousness of the educational system, and that the Chinese government manages this conflict through its understanding of the public and private domains.

Keywords: game industry, gaming policy, public domain, private domain

Procedia PDF Downloads 125
21080 A Tool for Facilitating an Institutional Risk Profile Definition

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach for the easy creation of an institutional risk profile for the endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to support the setup of risk factors with just the values that are most important for a particular organisation. Subsequently, the risk profile employs fuzzy models and associated configurations for the file format metadata aggregator to support digital preservation experts with a semi-automatic estimation of the endangerment level of file formats. Our goal is to make use of a domain expert knowledge base, aggregated from a digital preservation survey, in order to detect preservation risks for a particular institution. Another contribution is support for the visualisation and analysis of risk factors for a required dimension. The proposed methods improve the visibility of risk factor information and the quality of the digital preservation process. The presented approach is meant to facilitate decision-making for the preservation of digital content in libraries and archives using domain expert knowledge and file format metadata automatically aggregated from linked open data sources. To facilitate decision-making, the aggregated information about the risk factors is presented as a multidimensional vector, and the goal is to visualise particular dimensions of this vector for analysis by an expert. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.

Keywords: digital information management, file format, endangerment analysis, fuzzy models

Procedia PDF Downloads 388
21079 Infinite Impulse Response Digital Filters Design

Authors: Phuoc Si Nguyen

Abstract:

Infinite impulse response (IIR) filters can be designed from an analogue low-pass prototype by using frequency transformation in the s-domain and the bilinear z-transformation with frequency pre-warping; this method is known as frequency transformation from the s-domain to the z-domain. This paper introduces a new method to transform an IIR digital filter into another type of IIR digital filter (low-pass, high-pass, band-pass, band-stop, or narrow-band) using a technique based on the inverse bilinear z-transformation and inverse matrices. First, a matrix equation is derived from the inverse bilinear z-transformation and Pascal's triangle. This Low-Pass Digital-to-Digital Filter Pascal Matrix Equation is used to transform a low-pass digital filter into other digital filter types. From this equation and the inverse matrix, a Digital-to-Digital Filter Pascal Matrix Equation can be derived that is able to transform any IIR digital filter. The paper also introduces some specific matrices to replace the inverse matrix, which is difficult to determine because of the large matrix size in the current method. This makes computing and hand calculation easier when transforming from one IIR digital filter to another in the digital domain.
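
For context, here is a short sketch of the conventional s-to-z design route mentioned in the first sentence (not the paper's Pascal-matrix method): a Butterworth analogue low-pass prototype is frequency-transformed in the s-domain and then discretized with the bilinear z-transformation, with the cutoff pre-warped; the filter order, cutoff, and sampling rate are placeholder values.

```python
import numpy as np
from scipy.signal import butter, lp2hp, bilinear

fs = 8000.0          # sampling rate (Hz), placeholder
fc = 1000.0          # desired digital cutoff (Hz), placeholder

# Pre-warp the digital cutoff to the analogue frequency used by the bilinear transform.
wc = 2 * fs * np.tan(np.pi * fc / fs)

# Analogue low-pass Butterworth prototype (cutoff 1 rad/s), then an s-domain
# low-pass -> high-pass frequency transformation to the pre-warped cutoff.
b_lp, a_lp = butter(4, 1.0, analog=True)
b_hp, a_hp = lp2hp(b_lp, a_lp, wo=wc)

# The bilinear z-transformation yields the digital IIR high-pass coefficients.
bz, az = bilinear(b_hp, a_hp, fs=fs)
print(np.round(bz, 4), np.round(az, 4))
```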

Keywords: bilinear z-transformation, frequency transformation, inverse bilinear z-transformation, IIR digital filters

Procedia PDF Downloads 399
21078 Estimation of OPC, Fly Ash and Slag Contents in Blended and Composite Cements by Selective Dissolution Method

Authors: Suresh Palla

Abstract:

This research paper presents the results of a study on the estimation of fly ash, slag, and cement contents in blended and composite cements by a novel selective dissolution method. The types of cement samples investigated include OPC with fly ash as a performance improver, OPC with slag as a performance improver, PPC, PSC, and composite cement conforming to the respective Indian Standards. The slag and OPC contents in PSC were estimated by selectively dissolving OPC in stage 1 and selectively dissolving slag in stage 2. In the case of the composite cement sample, the percentages of cement, slag, and fly ash were estimated systematically by selective dissolution of cement, slag, and fly ash in three stages. In the first stage, the cement is dissolved and separated, leaving a residue of slag and fly ash designated as R1. The second stage involves gravimetric estimation of the OPC fraction and the residue, together with selective dissolution of the fly ash and slag contents; the fly ash content, R2, is estimated through gravimetric analysis, and thereafter the difference between R1 and R2 is taken as the slag content. The cement, fly ash, and slag contents obtained using the selective dissolution method showed a standard deviation of 10% with respect to the corresponding percentages of the respective constituents. The results suggest that this novel selective dissolution method can be successfully used for the estimation of OPC and SCM contents in different types of cement.
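
A trivial sketch of the three-stage mass balance described above (the dissolution chemistry itself is outside its scope): given the residue measured after stage 1 (R1, slag plus fly ash) and the fly ash fraction from stage 2 (R2), the slag content is their difference and the cement content is the remainder; the sample masses below are placeholder values.

```python
def composite_cement_contents(sample_mass, r1_mass, r2_mass):
    """Mass balance for the three-stage selective dissolution (percent by weight).

    r1_mass: residue after dissolving the cement (slag + fly ash).
    r2_mass: fly ash fraction determined gravimetrically in stage 2.
    """
    fly_ash = 100.0 * r2_mass / sample_mass
    slag = 100.0 * (r1_mass - r2_mass) / sample_mass   # slag = R1 - R2
    cement = 100.0 - fly_ash - slag                    # remainder dissolved in stage 1
    return cement, slag, fly_ash

# Placeholder masses in grams.
print(composite_cement_contents(sample_mass=1.000, r1_mass=0.45, r2_mass=0.20))
```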

Keywords: selective dissolution method, fly ash, GGBFS slag, EDTA

Procedia PDF Downloads 136
21077 Realization of Hybrid Beams Inertial Amplifier

Authors: Somya Ranjan Patro, Abhigna Bhatt, Arnab Banerjee

Abstract:

The inertial amplifier has recently gained increasing attention as a new mechanism for the vibration control of structures. Theoretical investigations are currently being undertaken by researchers to reveal its fundamentals and to understand its underlying principles in altering the response of structures to dynamic loading. This paper presents experimental and analytical studies of the dynamic characteristics of a hybrid beam inertial amplifier (HBIA). The analytical formulation of the HBIA has been derived using the spectral element method and rigid body dynamics; this formulation gives the relation between the dynamic force and the response of the structure in the frequency domain. Further, experiments have been performed to validate the proposed HBIA. The experimental setup consists of a 3D-printed HBIA of polylactic acid (PLA) material screwed to the base plate of the shaker system. Two accelerometers are used to study the response, one at the base plate of the shaker and the other at the top of the inertial amplifier. A force transducer is also placed between the base plate and the inertial amplifier to measure the total load transferred from the base plate to the inertial amplifier. The time-domain responses obtained from the accelerometers are converted into the frequency domain using the fast Fourier transform (FFT) algorithm. The experimental transmittance values are successfully validated against the analytical results, providing essential confidence in the proposed methodology.
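
A small sketch of the FFT post-processing step described above (the signals here are synthetic placeholders, not the experimental data): the transmittance is estimated as the ratio of the top-of-amplifier spectrum to the base-plate spectrum at each frequency.

```python
import numpy as np

fs = 2000.0                              # sampling rate in Hz (placeholder)
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(0)

# Placeholder accelerometer signals: base-plate excitation and an amplified,
# phase-shifted response measured at the top of the inertial amplifier.
base = np.sin(2 * np.pi * 30 * t)
top = 1.8 * np.sin(2 * np.pi * 30 * t - 0.4) + 0.05 * rng.standard_normal(t.size)

# FFT of both channels; transmittance = |Top(f)| / |Base(f)|.
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
transmittance = np.abs(np.fft.rfft(top)) / (np.abs(np.fft.rfft(base)) + 1e-12)

k = np.argmin(np.abs(freqs - 30.0))
print("transmittance near 30 Hz:", round(float(transmittance[k]), 2))   # ~1.8
```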

Keywords: inertial amplifier, fast fourier transform, natural frequencies, polylactic acid, transmittance, vibration absorbers

Procedia PDF Downloads 81
21076 Transesterification of Refined Palm Oil to Biodiesel in a Continuous Spinning Disc Reactor

Authors: Weerinda Appamana, Jirapong Keawkoon, Yamonporn Pacthong, Jirathiti Chitsanguansuk, Yanyong Sookklay

Abstract:

In the present work, a spinning disc reactor has been used for the intensification of biodiesel synthesis from refined palm oil (RPO) based on the transesterification reaction. Experiments have been performed using different spinning disc surfaces and under varying operating parameters, viz. the oil-to-methanol molar ratio (over the range 1:4.5 to 1:9), rotational speed (over the range 500 to 2,000 rpm), and total flow rate (over the range 260 to 520 mL/min), with a KOH catalyst loading of 1.50% by weight of oil. The maximum FAME (fatty acid methyl ester) yield of biodiesel from RPO (97.5%) was obtained at an oil-to-methanol ratio of 1:6, a temperature of 60 °C, a rotational speed of 1,500 rpm, and a flow rate of 520 mL/min using the grooved disc at a KOH catalyst loading of 1.5 wt%. A higher yield efficiency (biodiesel produced per unit of energy consumed) was also obtained with the spinning-disc-based approach compared to ultrasound, hydrodynamic cavitation, and conventional mechanical stirrer reactors. The approach obviously offers a significant reduction in the transesterification reaction time, especially compared with the 90 minutes required by the conventional mechanical stirrer. It can be concluded that the spinning disc reactor is a promising alternative for continuous biodiesel production.

Keywords: spinning disc reactor, biodiesel, process intensification, yield efficiency

Procedia PDF Downloads 140
21075 Polynomially Adjusted Bivariate Density Estimates Based on the Saddlepoint Approximation

Authors: S. B. Provost, Susan Sheng

Abstract:

An alternative bivariate density estimation methodology is introduced in this presentation. The proposed approach involves estimating the density function associated with the marginal distribution of each of the two variables by means of the saddlepoint approximation technique and applying a bivariate polynomial adjustment to the product of these density estimates. Since the saddlepoint approximation is utilized in the context of density estimation, such estimates are determined from empirical cumulant-generating functions. In the univariate case, the saddlepoint density estimate is itself adjusted by a polynomial. Given a set of observations, the coefficients of the polynomial adjustments are obtained from the sample moments. Several illustrative applications of the proposed methodology shall be presented. Since this approach relies essentially on a determinate number of sample moments, it is particularly well suited for modeling massive data sets.
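
A compact sketch of the univariate building block described above, namely a saddlepoint density estimate computed from the empirical cumulant-generating function (the bivariate polynomial adjustment is not reproduced); the sample is synthetic and the numerical settings are assumptions.

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)
sample = rng.normal(loc=2.0, scale=1.0, size=5000)   # placeholder observations

def K(s):
    """Empirical cumulant-generating function K(s) = log(mean(exp(s*X)))."""
    return np.log(np.mean(np.exp(s * sample)))

def dK(s, h=1e-5):
    return (K(s + h) - K(s - h)) / (2 * h)            # numerical K'(s)

def d2K(s, h=1e-4):
    return (K(s + h) - 2 * K(s) + K(s - h)) / h ** 2  # numerical K''(s)

def saddlepoint_density(x):
    # Solve the saddlepoint equation K'(s) = x, then apply
    # f(x) ~ exp(K(s) - s*x) / sqrt(2*pi*K''(s)).
    s = brentq(lambda t: dK(t) - x, -5.0, 5.0)
    return np.exp(K(s) - s * x) / np.sqrt(2 * np.pi * d2K(s))

for x in (1.0, 2.0, 3.0):
    print(x, round(saddlepoint_density(x), 4))   # close to the N(2, 1) density
```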

Keywords: density estimation, empirical cumulant-generating function, moments, saddlepoint approximation

Procedia PDF Downloads 258
21074 Vaporization of a Single N-Pentane Liquid Drop in a Flowing Immiscible Liquid Media

Authors: Hameed B. Mahood, Ali Sh. Baqir

Abstract:

The vaporization of a single n-pentane drop in direct contact with another, flowing immiscible liquid (warm water) has been experimentally investigated. The experiments were carried out using a cylindrical Perspex tube of 10 cm diameter and 150 cm height. Saturated liquid n-pentane and warm water at 45 °C were used as the dispersed and continuous phases, respectively. A Photron FASTCAM SA 1.1 high-speed camera (75,000 f/s) with software v. 321 was used during the experiments. Five different continuous-phase (warm water) flow rates (10, 20, 30, 40, and 46 L/h) were used in the study. The results indicate that increasing the continuous-phase (warm water) flow rate increases the drop/bubble diameter.

Keywords: drop evaporation, direct contact heat transfer, drop/bubble growth, experimental technique

Procedia PDF Downloads 329
21073 Motion Estimator Architecture with Optimized Number of Processing Elements for High Efficiency Video Coding

Authors: Seongsoo Lee

Abstract:

Motion estimation accounts for the heaviest computation in HEVC (high efficiency video coding). Many fast algorithms, such as TZS (test zone search), have been proposed to reduce this computation; still, the huge computational load of motion estimation is a critical issue in the implementation of an HEVC video codec. In this paper, a motion estimator architecture with an optimized number of PEs (processing elements) is presented that exploits early termination. It also reduces hardware size by exploiting parallel processing. The presented motion estimator architecture has 8 PEs and can efficiently perform TZS with very high utilization of the PEs.
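
A software analogue of the early-termination idea the architecture exploits (a generic sketch, not the hardware design or the full TZS schedule): the SAD accumulation for a candidate block is abandoned as soon as it exceeds the best cost found so far; the block size and search range are assumptions.

```python
import numpy as np

def sad_early_term(cur, cand, best_so_far):
    """Row-wise SAD that abandons a candidate once it exceeds the best cost so far."""
    sad = 0
    for r in range(cur.shape[0]):
        sad += int(np.abs(cur[r].astype(int) - cand[r].astype(int)).sum())
        if sad >= best_so_far:
            return best_so_far          # early termination
    return sad

def block_search(cur_block, ref_frame, x, y, search=8):
    """Search around (x, y) in the reference frame (TZS would visit fewer candidates)."""
    h, w = cur_block.shape
    best, best_mv = float("inf"), (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = ref_frame[y + dy:y + dy + h, x + dx:x + dx + w]
            cost = sad_early_term(cur_block, cand, best)
            if cost < best:
                best, best_mv = cost, (dx, dy)
    return best_mv, best

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
x, y = 24, 20
cur = ref[y - 2:y - 2 + 16, x + 3:x + 3 + 16]   # block whose true motion vector is (+3, -2)
print(block_search(cur, ref, x, y))             # expect ((3, -2), 0)
```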

Keywords: motion estimation, test zone search, high efficiency video coding, processing element, optimization

Procedia PDF Downloads 339
21072 Human Posture Estimation Based on Multiple Viewpoints

Authors: Jiahe Liu, Hongyang Yu, Feng Qian, Miao Luo

Abstract:

This study aimed to address the problem of improving the confidence of key points by fusing multi-view information, thereby estimating human posture more accurately. We first obtained multi-view image information and then used the MvP algorithm to fuse it into a set of high-confidence human key points. These were used as the input to the Spatio-Temporal Graph Convolutional Network (ST-GCN). ST-GCN is a deep learning model for processing spatio-temporal data, which can effectively capture spatio-temporal relationships in video sequences. By using the MvP algorithm to fuse multi-view information and feeding it into the spatio-temporal graph convolution model, this study provides an effective method for improving the accuracy of human posture estimation and offers strong support for further research and application in related fields.

Keywords: multi-view, pose estimation, ST-GCN, joint fusion

Procedia PDF Downloads 46
21071 Low Complexity Carrier Frequency Offset Estimation for Cooperative Orthogonal Frequency Division Multiplexing Communication Systems without Cyclic Prefix

Authors: Tsui-Tsai Lin

Abstract:

Cooperative orthogonal frequency division multiplexing (OFDM) transmission, which possesses the advantages of better connectivity, expanded coverage, and resistance to frequency-selective fading, has become a powerful solution for the physical layer in wireless communications. However, such a hybrid scheme suffers from the carrier frequency offset (CFO) effects inherited from OFDM-based systems, which lead to a significant degradation in performance. In addition, the insertion of a cyclic prefix (CP) at the head of each symbol block to combat inter-symbol interference reduces the spectral efficiency. The design of CFO estimation for cooperative OFDM systems without a CP remains an open problem. This motivates us to develop a low-complexity CFO estimator for a cooperative OFDM decode-and-forward (DF) communication system without a CP over multipath fading channels. Specifically, using a block-type pilot, the CFO estimate is first derived according to the least-squares criterion. Reliable performance can be obtained through an exhaustive two-dimensional (2D) search, at the price of heavy computational complexity. As a remedy, an alternative solution realized with an iterative approach is proposed for the CFO estimation. In contrast to the 2D-search estimator, the iterative method enjoys substantially reduced implementation complexity without sacrificing estimation performance. Computer simulations are presented to demonstrate the efficacy of the proposed CFO estimator.
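
For orientation, here is a sketch of a textbook pilot-based CFO estimator (the classical repeated-block phase method, not the least-squares/iterative estimator proposed in the paper): the offset is read from the phase of the correlation between two identical pilot blocks; the block length, true offset, and noise level are placeholder values.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                                   # pilot block length in samples (placeholder)
eps = 0.12                               # true CFO, normalized to the subcarrier spacing

# Two identical pilot blocks sent back to back; the CFO rotates each sample
# by exp(j*2*pi*eps*n/N).
pilot = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
n = np.arange(2 * N)
tx = np.tile(pilot, 2)
rx = tx * np.exp(1j * 2 * np.pi * eps * n / N) \
     + 0.05 * (rng.standard_normal(2 * N) + 1j * rng.standard_normal(2 * N))

# Correlate the two received blocks; the CFO appears as a common phase rotation.
corr = np.sum(np.conj(rx[:N]) * rx[N:])
eps_hat = np.angle(corr) / (2 * np.pi)
print(round(eps_hat, 3))                 # close to 0.12 (valid for |eps| < 0.5)
```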

Keywords: cooperative transmission, orthogonal frequency division multiplexing (OFDM), carrier frequency offset, iteration

Procedia PDF Downloads 251
21070 Artificial Neural Network Model Based Setup Period Estimation for Polymer Cutting

Authors: Zsolt János Viharos, Krisztián Balázs Kis, Imre Paniti, Gábor Belső, Péter Németh, János Farkas

Abstract:

The paper presents results and industrial applications of production setup period estimation based on industrial data from the field of polymer cutting. The literature on polymer cutting is very limited in terms of the number of publications. The first polymer cutting machine has been known since the second half of the 20th century; however, the production of polymer parts with this kind of technology is still a challenging research topic. The products of the participating industrial partner must meet high technical requirements, as they are used in the medical, measurement instrumentation, and painting industries. Typically, 20% of these parts are new work each year, which means that almost the entire product portfolio is replaced every five years in their low-series manufacturing environment. Consequently, a flexible production system is required, in which the estimation of the lengths of the frequent setup periods is one of the key success factors. In the investigation, several (input) parameters have been studied and grouped to create an adequate training information set for an artificial neural network as a basis for the estimation of the individual setup periods. The first group collects product information such as the product name and the number of items. The second group contains material data such as material type and colour. The third group collects surface quality and tolerance information, including the finest surface and the tightest (narrowest) tolerance. The fourth group contains setup data such as machine type and work shift. One source of these parameters is the Manufacturing Execution System (MES), but some data were also collected from Computer Aided Design (CAD) drawings. The number of tools applied is one of the key factors on which the industrial partner's estimates were previously based. The artificial neural network model was trained on several thousand real industrial records. The mean estimation accuracy of the setup period lengths was improved by 30%, and at the same time the deviation of the prognosis was improved by 50%. Furthermore, the influence of the mentioned parameter groups with respect to the manufacturing order was also investigated. The paper also highlights the experience gained during manufacturing introduction and further improvements of the proposed methods, both on the shop floor and in quotation preparation. Every week, more than 100 real industrial setup events occur and the related data are collected.

Keywords: artificial neural network, low series manufacturing, polymer cutting, setup period estimation

Procedia PDF Downloads 227
21069 Parameter Estimation of False Dynamic EIV Model with Additive Uncertainty

Authors: Dalvinder Kaur Mangal

Abstract:

For the past decade, noise-corrupted output measurements have been a fundamental research problem under investigation. On the other hand, the estimation of the parameters of linear dynamic systems when the input is also affected by noise is recognized as a more difficult problem, one that has only recently received increasing attention. Representations in which errors or measurement noises/disturbances are present on both the inputs and the outputs are usually called errors-in-variables (EIV) models. These disturbances may also have additive effects, which are likewise considered in this paper. This paper deals with parameter estimation for the false EIV problem using the equation error, output error, and iterative prefiltering identification schemes, with and without additive uncertainty, when only the output observation is corrupted by noise. A comparative study of these three schemes has also been carried out.
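
A minimal sketch of the equation-error scheme mentioned above (a standard ARX least-squares fit, not the paper's full false-EIV treatment): the regressor matrix is built from past outputs and inputs and the parameters are obtained by ordinary least squares; the system coefficients, noise level, and model orders are placeholder assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500
u = rng.standard_normal(N)                   # input, assumed noise-free in this sketch

# Placeholder true system: y[k] = 1.5*y[k-1] - 0.7*y[k-2] + 0.5*u[k-1] + noise
y = np.zeros(N)
for k in range(2, N):
    y[k] = 1.5 * y[k - 1] - 0.7 * y[k - 2] + 0.5 * u[k - 1] + 0.05 * rng.standard_normal()

# Equation-error (ARX) regressors: phi[k] = [y[k-1], y[k-2], u[k-1]]
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
Y = y[2:]

theta = np.linalg.lstsq(Phi, Y, rcond=None)[0]
print(np.round(theta, 3))                    # close to [1.5, -0.7, 0.5]
```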

Keywords: errors-in-variable (EIV), false EIV, equation error, output error, iterative prefiltering, Gaussian noise

Procedia PDF Downloads 464
21068 Digitalisation of the Railway Industry: Recent Advances in the Field of Dialogue Systems: Systematic Review

Authors: Andrei Nosov

Abstract:

This paper discusses the development directions of dialogue systems within the digitalisation of the railway industry, where technologies based on conversational AI are already applied or can potentially be applied. Conversational AI is one of the popular natural language processing (NLP) tasks, as it has great prospects for real-world applications today. At the same time, it is a challenging task, as it involves many areas of NLP that rely on complex computations and deep insights from linguistics and psychology. In this review, we focus on dialogue systems and their implementation in the railway domain. We comprehensively review state-of-the-art research results on dialogue systems and analyse them from three perspectives: the type of problem to be solved, the type of model, and the type of system. In particular, from the perspective of the type of task to be solved, we discuss characteristics and applications, which helps in understanding how to prioritise tasks. In terms of the types of models, we give an overview that allows researchers to become familiar with how to apply them in dialogue systems. In analysing the types of dialogue systems, we take an unconventional approach: rather than the traditional contrast between goal-oriented and open-domain dialogue systems, our view focuses on retrieval-based and generative approaches. Furthermore, the work comprehensively presents evaluation methods and datasets for dialogue systems in the railway domain to pave the way for future research. Finally, some possible directions for future research are identified based on recent research results.

Keywords: digitalisation, railway, dialogue systems, conversational AI, natural language processing, natural language understanding, natural language generation

Procedia PDF Downloads 43