Search results for: panel data method

38156 Sensor Registration in Multi-Static Sonar Fusion Detection

Authors: Longxiang Guo, Haoyan Hao, Xueli Sheng, Hanjun Yu, Jingwei Yin

Abstract:

To prevent target splitting and ensure the accuracy of fusion, system error registration is an important step in a multi-static sonar fusion detection system. To eliminate the inherent system errors of each sonar, namely the distance error and the angle error, this paper uses an offline estimation method for error registration. Suppose several sonars on different platforms work together to detect a target; the target position detected by each sonar is expressed in that sonar's own reference coordinate system. Based on the two-dimensional stereo projection method, this paper uses the real-time quality control (RTQC) method and the least squares (LS) method to estimate the sensor biases. The RTQC method takes the average of each sonar's data as the observation value, whereas the LS method applies least-squares processing to each sonar's data to obtain the observation value. A MATLAB simulation is carried out for the underwater acoustic environment, and the results show that both algorithms can estimate the distance and angle errors of the sonar system. The performance of the two algorithms is also compared through the root mean square error, and the influence of measurement noise on registration accuracy is explored by simulation. The system error of the RTQC method converges rapidly, but the distribution of targets has a serious impact on its performance. The LS method is not affected by the target distribution, but increasing random noise slows down its convergence rate. The LS method is an improvement on the RTQC method and is widely used in two-dimensional registration. The improved method can be used for registration in underwater multi-target detection.
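
As an illustration of the least-squares step, the sketch below estimates the range and bearing biases of a second sonar from the disagreement between its measurements and those of a reference sonar observing the same targets. The geometry, the first-order linearisation, and all variable names are assumptions for this sketch, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
s1, s2 = np.array([0.0, 0.0]), np.array([800.0, 0.0])   # assumed sonar positions (m)
true_bias_r, true_bias_th = 25.0, np.deg2rad(1.5)        # sonar 2 range/bearing biases to recover

targets = rng.uniform(200, 1500, size=(50, 2))           # synthetic target positions
rows, rhs = [], []
for t in targets:
    # biased, noisy polar measurement from sonar 2; sonar 1 taken as the unbiased reference
    d2 = t - s2
    r2 = np.hypot(*d2) + true_bias_r + rng.normal(0, 2.0)
    th2 = np.arctan2(d2[1], d2[0]) + true_bias_th + rng.normal(0, 0.002)
    p2 = s2 + r2 * np.array([np.cos(th2), np.sin(th2)])  # sonar 2 estimate in the common frame
    p1 = t + rng.normal(0, 2.0, size=2)                  # reference estimate of the same target
    u = np.array([np.cos(th2), np.sin(th2)])             # unit vector along the measured bearing
    n = np.array([-np.sin(th2), np.cos(th2)])            # perpendicular direction
    # first-order model of the discrepancy: p2 - p1 ~ b_r * u + r * b_th * n
    rows.append(np.column_stack([u, r2 * n]))
    rhs.append(p2 - p1)

A, b = np.vstack(rows), np.concatenate(rhs)
bias_r, bias_th = np.linalg.lstsq(A, b, rcond=None)[0]
print(f"estimated range bias {bias_r:.1f} m, bearing bias {np.degrees(bias_th):.2f} deg")
```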

Keywords: data fusion, multi-static sonar detection, offline estimation, sensor registration problem

Procedia PDF Downloads 169
38155 A Data-Driven Monitoring Technique Using Combined Anomaly Detectors

Authors: Fouzi Harrou, Ying Sun, Sofiane Khadraoui

Abstract:

Anomaly detection based on Principal Component Analysis (PCA) has been studied intensively and widely applied to multivariate processes with highly cross-correlated process variables. Monitoring metrics such as Hotelling's T2 and the Q statistic are usually used in PCA-based monitoring to elucidate pattern variations in the principal and residual subspaces, respectively. However, these metrics are ill-suited to detecting small faults. In this paper, Exponentially Weighted Moving Average (EWMA) charts based on the Q and T2 statistics, T2-EWMA and Q-EWMA, are developed for detecting faults in the process mean. The performance of the proposed methods is compared with that of the conventional PCA-based fault detection method using synthetic data. The results clearly show the benefit and effectiveness of the proposed methods over the conventional PCA method, especially for detecting small faults in highly correlated multivariate data.
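
A minimal sketch of the idea, combining the PCA residual (Q) and Hotelling T2 statistics with EWMA smoothing; the number of retained components, the smoothing constant, and the synthetic data are illustrative assumptions rather than the paper's settings.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
train = rng.normal(size=(500, 6)) @ rng.normal(size=(6, 6))   # correlated "normal operation" data
test = train[-100:].copy()
test[50:] += 0.5                                              # small mean shift to be detected

mu, sd = train.mean(0), train.std(0)
pca = PCA(n_components=3).fit((train - mu) / sd)

def t2_and_q(X):
    Z = (X - mu) / sd
    scores = pca.transform(Z)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)   # Hotelling T2 in the principal subspace
    resid = Z - pca.inverse_transform(scores)
    q = np.sum(resid**2, axis=1)                               # Q (squared prediction error)
    return t2, q

def ewma(x, lam=0.2):
    out, prev = np.empty_like(x, dtype=float), x[0]
    for i, v in enumerate(x):
        prev = lam * v + (1 - lam) * prev
        out[i] = prev
    return out

t2, q = t2_and_q(test)
t2_ewma, q_ewma = ewma(t2), ewma(q)   # charted against control limits estimated from training data
```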

Keywords: data-driven method, process control, anomaly detection, dimensionality reduction

Procedia PDF Downloads 299
38154 The Effects of Corporate Governance on Firm’s Financial Performance: A Study of Family and Non-family Owned Firms in Pakistan

Authors: Saad Bin Nasir

Abstract:

This research will examine the impact of corporate governance on firm performance in family and non-family owned firms in Pakistan. For the purpose of this research, the corporate governance mechanisms included are board size, board composition, leadership structure, and board meetings, taken as independent variables; firm performance is taken as the dependent variable and will be measured by return on assets and return on equity. Firm size and firm age will be taken as control variables. Secondary data will be collected from the audited annual reports of companies, and a panel data regression model will be applied to check the impact of corporate governance on firm performance.
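
A minimal sketch of such a panel regression using the linearmodels package; the column names and the choice of entity fixed effects are assumptions for illustration, not the study's final specification.

```python
import pandas as pd
from linearmodels.panel import PanelOLS

# one row per (firm, year); column names are hypothetical
df = pd.read_csv("governance_panel.csv").set_index(["firm", "year"])

model = PanelOLS.from_formula(
    "roa ~ 1 + board_size + board_composition + duality + board_meetings"
    " + firm_size + firm_age + EntityEffects",
    data=df,
)
print(model.fit(cov_type="clustered", cluster_entity=True))
```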

Keywords: board size, board composition, leadership structure, board meetings, firm performance, family and non-family owned firms

Procedia PDF Downloads 373
38153 Real-Time Pedestrian Detection Method Based on Improved YOLOv3

Authors: Jingting Luo, Yong Wang, Ying Wang

Abstract:

Pedestrian detection in image or video data is a very important and challenging task in security surveillance. The difficulty of this task is to accurately locate and detect pedestrians of different scales in complex scenes. To solve these problems, a deep neural network (RT-YOLOv3) is proposed to realize real-time pedestrian detection at different scales in security monitoring. RT-YOLOv3 improves the traditional YOLOv3 algorithm. Firstly, a deep residual network is added to extract features. Then, six convolutional neural networks with different scales are designed and fused with the corresponding scale feature maps in the residual network to form the final feature pyramid used for the pedestrian detection task. This method characterizes pedestrians better. To further improve the accuracy and generalization ability of the model, a hybrid pedestrian data set training method is used: pedestrian data are extracted from the VOC data set and trained together with the INRIA pedestrian data set. Experiments show that the proposed RT-YOLOv3 method achieves a mean average precision (mAP) of 93.57% at 46.52 frames per second. In terms of accuracy, RT-YOLOv3 performs better than Fast R-CNN, Faster R-CNN, YOLO, SSD, YOLOv2, and YOLOv3. This method reduces the missed detection rate and the false detection rate, improves the positioning accuracy, and meets the requirements of real-time detection of pedestrian objects.
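
The core fusion step, upsampling a coarse feature map and concatenating it with a finer map from the residual backbone, can be sketched as follows; the channel sizes and module layout are assumptions for illustration and do not reproduce the exact RT-YOLOv3 architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaleFusion(nn.Module):
    """Fuse a coarse (deep) feature map with a finer feature map from the backbone."""
    def __init__(self, coarse_ch, fine_ch, out_ch):
        super().__init__()
        self.reduce = nn.Conv2d(coarse_ch, out_ch, kernel_size=1)
        self.merge = nn.Conv2d(out_ch + fine_ch, out_ch, kernel_size=3, padding=1)

    def forward(self, coarse, fine):
        up = F.interpolate(self.reduce(coarse), size=fine.shape[-2:], mode="nearest")
        return self.merge(torch.cat([up, fine], dim=1))

fuse = ScaleFusion(coarse_ch=512, fine_ch=256, out_ch=256)
out = fuse(torch.randn(1, 512, 13, 13), torch.randn(1, 256, 26, 26))  # -> (1, 256, 26, 26)
```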

Keywords: pedestrian detection, feature detection, convolutional neural network, real-time detection, YOLOv3

Procedia PDF Downloads 141
38152 A New Authenticable Steganographic Method via the Use of Numeric Data on Public Websites

Authors: Che-Wei Lee, Bay-Erl Lai

Abstract:

A new steganographic method with self-authentication capability that uses numeric data on public websites is proposed. The proposed technique transforms a secret message into partial shares by Shamir's (k, n)-threshold secret sharing scheme with n = k + 1. The generated k + 1 partial shares are then embedded into selected numeric items in a website as if they were part of the website's numeric content. Afterward, a receiver links to the website, extracts every combination of k shares among the k + 1 from the stego numeric content to compute k + 1 copies of the secret, and takes the consistency of the computed k + 1 copies as evidence to determine whether the extracted message is authentic, attaining the goal of self-authentication of the extracted secret message. Experimental results and discussions are provided to show the feasibility and effectiveness of the proposed method.
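
The sharing and consistency check can be sketched as follows; the prime modulus, the example secret, and the helper names are illustrative assumptions, and the embedding of shares into website numeric items is omitted.

```python
import random
from itertools import combinations

P = 2_147_483_647  # prime modulus for the finite field (illustrative choice)

def make_shares(secret, k, n):
    """Shamir (k, n)-threshold sharing: random degree-(k-1) polynomial with f(0) = secret."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for xj, yj in shares:
        num, den = 1, 1
        for xm, _ in shares:
            if xm != xj:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret

k = 3
shares = make_shares(secret=42424242, k=k, n=k + 1)            # n = k + 1 as in the paper
copies = [reconstruct(list(c)) for c in combinations(shares, k)]
print(copies, "authentic" if len(set(copies)) == 1 else "tampered")
```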

Keywords: steganography, data hiding, secret authentication, secret sharing

Procedia PDF Downloads 243
38151 Intrusion Detection System Using Linear Discriminant Analysis

Authors: Zyad Elkhadir, Khalid Chougdali, Mohammed Benattou

Abstract:

Most existing intrusion detection systems work on quantitative network traffic data with many irrelevant and redundant features, which makes the detection process more time-consuming and less accurate. Several feature extraction methods, such as linear discriminant analysis (LDA), have been proposed. However, LDA suffers from the small sample size (SSS) problem, which occurs when the number of training samples is small compared with the sample dimension. Hence, classical LDA cannot be applied directly to high-dimensional data such as network traffic data. In this paper, we propose two solutions to the SSS problem for LDA and apply them to a network IDS. The first method reduces the dimensionality of the original data using principal component analysis (PCA) and then applies LDA. In the second solution, we propose to use the pseudo-inverse to avoid the singularity of the within-class scatter matrix caused by the SSS problem. After that, the KNN algorithm is used for the classification process. We have chosen two well-known datasets, KDDcup99 and NSL-KDD, for testing the proposed approaches. The results show that the classification accuracy of the PCA+LDA method clearly outperforms that of the pseudo-inverse LDA method when large training data are available.
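
A minimal sketch of the first solution (PCA for dimensionality reduction, then LDA, then KNN) using scikit-learn; the feature files, label vector, and component count are placeholders, and the pseudo-inverse LDA variant would require a custom scatter-matrix implementation not shown here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# X: preprocessed numeric traffic features (e.g. from KDDcup99/NSL-KDD), y: attack/normal labels
X, y = np.load("features.npy"), np.load("labels.npy")          # placeholder data files
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pipe = make_pipeline(
    PCA(n_components=20),                   # step 1: remove redundancy, avoid the SSS problem
    LinearDiscriminantAnalysis(),           # step 2: class-discriminative projection
    KNeighborsClassifier(n_neighbors=5),    # step 3: classification
)
pipe.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, pipe.predict(X_te)))
```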

Keywords: LDA, Pseudoinverse, PCA, IDS, NSL-KDD, KDDcup99

Procedia PDF Downloads 226
38150 Bank Internal Controls and Credit Risk in Europe: A Quantitative Measurement Approach

Authors: Ellis Kofi Akwaa-Sekyi, Jordi Moreno Gené

Abstract:

Managerial actions that negatively profile banks and impair corporate reputation are addressed through effective internal control systems. Disregard for acceptable standards and procedures for granting credit has affected bank loan portfolios and could be cited in the crises of some European countries. The study intends to determine the effectiveness of internal control systems, to investigate whether perceived agency problems exist on the part of board members, and to establish the relationship between internal controls and credit risk among listed banks in the European Union. Drawing theoretical support from the behavioural compliance and agency theories, about seventeen internal control variables (drawn from the revised COSO framework) as well as bank-specific, country, stock market, and macroeconomic variables will be involved in the study. A purely quantitative approach will be employed to model internal control variables covering the control environment, risk management, control activities, information and communication, and monitoring. Panel data from 2005-2014 on listed banks from 28 European Union countries will be used for the study. Hypotheses will be tested, and Generalized Least Squares (GLS) regression will be run to establish the relationship between the dependent and independent variables. The Hausman test will be used to select between the random and fixed effects models. It is expected that listed banks will have sound internal control systems, but their effectiveness cannot be confirmed. A perceived agency problem on the part of the board of directors is expected to be confirmed. The study expects a significant effect of internal controls on credit risk. The study will uncover another perspective on internal controls, not only as an operational risk issue but as a credit risk issue too. Banks will be cautioned that observing effective internal control systems is an ethical and socially responsible act, since the collapse (crisis) of financial institutions as a result of excessive default is a major contagion. This study deviates from the usual primary data approach to measuring internal control variables and instead models internal control variables quantitatively for the panel data. Thus, a grey area in approaching the revised COSO framework for internal controls is opened for further research. Most bank failures and crises could be averted if effective internal control systems were religiously adhered to.
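
The fixed versus random effects choice mentioned above can be sketched with a manual Hausman test on linearmodels estimates; the formula, variable names, and data layout are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from scipy import stats
from linearmodels.panel import PanelOLS, RandomEffects

# panel indexed by (bank, year); regressor names are hypothetical internal control scores
df = pd.read_csv("banks_panel.csv").set_index(["bank", "year"])
rhs = "control_environment + risk_assessment + monitoring + bank_size"

fe = PanelOLS.from_formula(f"credit_risk ~ 1 + {rhs} + EntityEffects", df).fit()
re = RandomEffects.from_formula(f"credit_risk ~ 1 + {rhs}", df).fit()

common = [c for c in fe.params.index if c in re.params.index and c != "Intercept"]
d = (fe.params[common] - re.params[common]).values
V = (fe.cov.loc[common, common] - re.cov.loc[common, common]).values
h_stat = float(d @ np.linalg.inv(V) @ d)
p_value = stats.chi2.sf(h_stat, df=len(common))
print(f"Hausman chi2 = {h_stat:.2f}, p = {p_value:.3f}")  # small p favours the fixed effects model
```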

Keywords: agency theory, credit risk, internal controls, revised COSO framework

Procedia PDF Downloads 316
38149 Predicting the Human Impact of Natural Onset Disasters Using Pattern Recognition Techniques and Rule Based Clustering

Authors: Sara Hasani

Abstract:

This research focuses on natural sudden onset disasters, characterised as 'occurring with little or no warning and often cause excessive injuries far surpassing the national response capacities'. Based on a panel analysis of the historic record of 4,252 natural onset disasters between 1980 and 2015, a predictive method was developed to predict the human impact of a disaster (fatalities, injured, homeless) with less than 3% error. The geographical dispersion of the disasters includes every country where the data were available and cross-examined from various humanitarian sources. The records were then filtered to the 4,252 disasters for which the five predictive variables (disaster type, HDI, DRI, population, and population density) were clearly stated. The procedure was designed based on a combination of pattern recognition techniques and rule-based clustering for prediction, with discriminant analysis to validate the results further. The results indicate that there is a relationship between the human impact of a disaster and the five socio-economic characteristics of the affected country mentioned above. As a result, a framework was put forward that can predict a disaster's human impact based on its severity rank in the early hours of the disaster strike. The predictions in this model are outlined in worst-case and best-case scenarios, which respectively inform the lower and higher range of the prediction. The necessity of developing the predictive framework is highlighted by the fact that, despite the existing research in the literature, a framework for predicting the human impact and estimating the needs at the time of a disaster is yet to be developed. The framework can further be used to allocate resources in the response phase of a disaster, when data are scarce.
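
As a rough illustration of the prediction step, the sketch below trains a classifier on the five stated variables to predict a binned human-impact class; the file, column names, and model choice are assumptions and do not reproduce the paper's rule-based clustering procedure.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("disasters.csv")                      # hypothetical cleaned record of events
X = pd.get_dummies(df[["disaster_type", "hdi", "dri", "population", "pop_density"]])
y = df["impact_severity_rank"]                         # e.g. binned fatalities/injured/homeless

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("hold-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```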

Keywords: disaster management, natural disaster, pattern recognition, prediction

Procedia PDF Downloads 153
38148 Nazca: A Context-Based Matching Method for Searching Heterogeneous Structures

Authors: Karine B. de Oliveira, Carina F. Dorneles

Abstract:

Structure-level matching is the problem of matching the elements of a structure, which can be represented as entities, classes, XML elements, web forms, and so on. This is a challenge due to the large number of distinct representations of semantically similar structures. This paper describes a structure-based matching method applied to searching for different representations in data sources, considering the similarity between the elements of two structures and the data source context. Using real data sources, we have conducted an experimental study comparing our approach with our baseline implementation and with another important schema matching approach. We demonstrate that our proposal reaches higher precision than the baseline.

Keywords: context, data source, index, matching, search, similarity, structure

Procedia PDF Downloads 364
38147 Solutions to Reduce CO2 Emissions in Autonomous Robotics

Authors: Antoni Grau, Yolanda Bolea, Alberto Sanfeliu

Abstract:

Mobile robots can be used in many different applications, including mapping, search and rescue, reconnaissance, hazard detection, carpet cleaning, exploration, etc. However, they are limited by their reliance on traditional energy sources such as electricity and oil, which cannot always provide a convenient energy source in all situations. In an ever more eco-conscious world, solar energy offers the most environmentally clean option of all energy sources. Electricity presents threats of pollution resulting from its production process, and oil poses a huge threat to the environment: not only does it cause harm through the toxic emissions (for instance, CO2 emissions) of the combustion process needed to produce energy, but there is also the ever-present risk of oil spillages and damage to ecosystems. Solar energy can help to mitigate carbon emissions by replacing more carbon-intensive sources of heat and power. The challenge of this work is to propose the design and implementation of electric battery recharge stations. These recharge docks are based on the use of renewable energy, namely solar energy (with photovoltaic panels), with the objective of reducing CO2 emissions. In this paper, a comparative study of CO2 emission production (from the use of different energy sources: natural gas, gas oil, fuel, and solar panels) in the charging process of the Segway PT batteries is carried out. To carry out the study with solar energy, a photovoltaic panel and a buck-boost DC/DC block have been used. Specifically, the STP005S-12/Db solar panel has been used for the experiments. This module is a 5 Wp photovoltaic (PV) module configured with 36 serially connected monocrystalline cells. With these elements, a battery recharge station is built to recharge the robot batteries. For the energy storage DC/DC block, a series of ultracapacitors has been used. Due to the variation of the PV panel output with temperature and irradiation, the non-integer behavior of the ultracapacitors, and the non-linearities of the whole system, the authors have used a fractional control method to ensure that the solar panels supply the maximum allowed power and recharge the robots in the shortest time. Greenhouse gas emissions from the production of electricity vary due to regional differences in source fuel. The impact of an energy technology on the climate can be characterised by its carbon emission intensity, a measure of the amount of CO2 or CO2 equivalent emitted per unit of energy generated. In this work, coal is the most hazardous fossil energy source, producing 53% more gas emissions than natural gas and 30% more than fuel. Moreover, it is remarkable that existing fossil fuel technologies produce a high carbon emission intensity through the combustion of carbon-rich fuels, whilst renewable technologies such as solar produce little or no emissions during operation but may incur emissions during manufacture. Solar energy can thus help to mitigate carbon emissions.

Keywords: autonomous robots, CO2 emissions, DC/DC buck-boost, solar energy

Procedia PDF Downloads 422
38146 Wetting Treatment: Comparative Overview: Case of Polypropylene Top Sheet Layer on Disposable Baby Diaper

Authors: Tilouche Rahma, Sayeb Soumaya, Ben Hassen Mohamed

Abstract:

The wettability of materials is a very important aspect of surface science; it is a key factor in providing the best product characteristics, especially in the hygiene field. Hydrophobic polypropylene is used as the nonwoven topsheet in disposable diapers because of its interesting properties (toughness, lightness, etc.) compared with the traditional products previously used. Surfactants are widely used to reduce the contact angle (water contact angles larger than 90° on smooth surfaces) and to change wetting properties. In the present work, we study ways to obtain a hydrophilic polypropylene surface by depositing a variety of surfactants on surfaces of varying morphology. We used two different methods for surface wetting: the spraying method and the coating method. The concentration of the wetting agent, the type of nonwoven fabric, and the controlling parameters of the method greatly affect the quality of the treatment, which therefore needs to be effective in terms of contact angle without affecting the mechanical properties of the nonwoven. For the assessment of the treatment quality, two methods are used: measurement of the contact angle and of the strike-through time, together with a subjective evaluation by a hedonic test of consumer preference (a naive panel: a group of mothers). Finally, we selected the better-treated topsheet with reference to the assessment results.

Keywords: surfactant, polypropylene topsheet, hydrophilic, hydrophobic

Procedia PDF Downloads 545
38145 Analysis of Mutations Associated with Male Infertility in Patients and Healthy Males in the Russian Population

Authors: Svetlana Zhikrivetskaya, Nataliya Shirokova, Roman Bikanov, Elizaveta Musatova, Yana Kovaleva, Nataliya Vetrova, Ekaterina Pomerantseva

Abstract:

Nowadays, there is a growing number of couples with conception problems due to male or female infertility. Genetic abnormalities are responsible for about 31% of all cases of male infertility. These abnormalities include both chromosomal aberrations or aneuploidies and mutations in certain genes. Chromosomal abnormalities can be easily identified; thus, the development of screening panels able to reveal genetic causes of male infertility at the gene level is of current interest. Approximately 2,000 genes are involved in male fertility, which is why it is very important to determine the most clinically relevant ones under particular population and ethnic conditions. An infertility screening panel was designed containing 48 mutations in the genes AMHR2, CFTR, DNAI1, HFE, KAL1, and TSSK2 and in the AZF locus, which are the most clinically relevant for the European population according to the NCBI and ClinVar databases. The aim of this research was to confirm the clinical relevance of these mutations in the Russian population. Genotyping was performed in 220 patients with different types of male infertility and in 57 healthy males with normozoospermia. Mutations were identified by end-point PCR with TaqMan probes in microfluidic plates. The frequency of 5 mutations in healthy males and 13 mutations in patients with infertility was revealed and estimated. The frequency of the c.187C>G mutation in the HFE gene was significantly lower in healthy males (8.8%) than in patients (17.7%) and than the values for the European population according to the ExAC database (13.7%) and dbSNP (17.2%). Analysis of the c.3454G>C and c.1545_1546delTA mutations in the CFTR gene revealed increased frequencies in patients with infertility (0.9% and 0.2%, respectively) compared with the European population data (0.04% for c.3454G>C; ExAC, European (Non-Finnish)) and the aggregated populations (0.002% for c.1545_1546delTA; ExAC), since no European population data are available for the c.1545_1546delTA mutation. The frequency of the del508 mutation (CFTR) in patients (1.59%) was lower than in European males with infertility (3.34-6.25% depending on nationality) and at the same level as in healthy Europeans (1.06%; ExAC, European (Non-Finnish)). Analysis of the c.845G>A (HFE) mutation showed a decreased frequency in patients (1.8%) in contrast with the European population data (5.1%; ExAC, European (Non-Finnish)). Moreover, the obtained data revealed no statistically significant frequency difference for the c.845G>A mutation (HFE) between healthy males in the Russian and European populations. The allele frequencies of the mutations c.350G>A (CFTR), c.193A>T (HFE), c.774C>T, and c.80A>G (TSSK2) showed no significant differences among patients with infertility, healthy males, and Europeans. Analysis of the AZF locus revealed an increased frequency of the AZFc microdeletion in patients with male infertility. Thereby, new data on the allele frequencies in infertility patients in the Russian population were obtained, and the frequency differences of mutations associated with male infertility among patients, healthy males in the Russian population, and the European population were estimated. The revealed differences show that, for a screening panel detecting genetically caused male infertility to be highly effective, it is very important to consider the ethnic and population characteristics of the patients who will be screened.
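
Frequency differences of this kind are typically screened with a simple contingency-table test; the sketch below uses carrier counts that are only roughly consistent with the reported percentages (17.7% of 220 patients vs. 8.8% of 57 controls for c.187C>G) and is purely illustrative, not the study's actual statistical procedure.

```python
from scipy.stats import fisher_exact

patients_carriers, patients_total = 39, 220    # illustrative counts, ~17.7% carriers
controls_carriers, controls_total = 5, 57      # illustrative counts, ~8.8% carriers

table = [
    [patients_carriers, patients_total - patients_carriers],
    [controls_carriers, controls_total - controls_carriers],
]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio {odds_ratio:.2f}, p = {p_value:.3f}")
```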

Keywords: allele frequency, azoospermia, male infertility, mutation, population

Procedia PDF Downloads 392
38144 The Sensitivity of Credit Defaults Swaps Premium to Global Risk Factor: Evidence from Emerging Markets

Authors: Oguzhan Cepni, Doruk Kucuksarac, M. Hasan Yilmaz

Abstract:

Changes in the global risk appetite cause co-movement in emerging market risk premiums. However, the sensitivity of changes in the risk premium to the global risk appetite may vary across emerging markets. In this study, how the global risk appetite affects Credit Default Swap (CDS) premiums in emerging markets is analyzed using Principal Component Analysis (PCA) and rolling regressions. The PCA results indicate that the first common component derived by the PCA accounts for almost 76 percent of the common variation in CDS premiums. Additionally, the explanatory power of the first factor seems to be high over the sample period. However, the sensitivity to the global risk factor tends to change over time and across countries. In this regard, fixed effects panel regressions are used to identify the macroeconomic factors driving the heterogeneity across emerging markets. The panel regression results point to the significance of the ratios of government debt to GDP and international reserves to GDP in explaining the sensitivity. Accordingly, countries with lower government debt and higher reserves tend to be less subject to variations in the global risk appetite.
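
A minimal sketch of the two building blocks, extracting the common factor with PCA and tracking a country's time-varying sensitivity with a rolling regression; the data file, column name, and window length are assumptions for illustration.

```python
import pandas as pd
from sklearn.decomposition import PCA

# cds: DataFrame of (standardised) daily CDS premium changes, one column per emerging market
cds = pd.read_csv("cds_changes.csv", index_col=0, parse_dates=True)

pca = PCA(n_components=1)
factor = pd.Series(pca.fit_transform(cds.values).ravel(), index=cds.index, name="global_factor")
print("share of common variation:", pca.explained_variance_ratio_[0])

window = 250                       # roughly one trading year
country = cds["TUR"]               # hypothetical column for one market
beta = country.rolling(window).cov(factor) / factor.rolling(window).var()  # rolling sensitivity
```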

Keywords: credit default swaps, emerging markets, principal components analysis, sovereign risk

Procedia PDF Downloads 378
38143 Investigating the Nexus between Energy Deficiency, Environmental Sustainability and Renewable Energy: The Role of Energy Trade in Global Perspectives

Authors: Fahim Ullah, Muhammad Usman

Abstract:

Energy consumption and environmental sustainability are hard challenges of the 21st century. Energy richness increases environmental pollution, while energy poverty hinders economic growth. Considering these two aspects, the present study calculates energy deficiency and examines the role of renewable energy in overcoming rising energy deficiency and carbon emissions for selected countries from 1990 to 2021. For the empirical analysis, this study uses method-of-moments panel quantile regression and, to check robustness, panel quantile robust analysis. Graphical analysis indicated rising global energy deficiency over the last three decades, with energy consumption higher than energy production. The empirical results showed that renewable energy is a significant factor in reducing energy deficiency. Secondly, energy deficiency increases the carbon emission level, and again renewable energy decreases the emission level. This study recommends that global energy deficiency and rising carbon emissions can be controlled through structural change in the form of an energy transition that replaces non-renewable resources with renewable resources.

Keywords: energy deficiency, renewable energy, carbon emission, energy trade, PQL analysis

Procedia PDF Downloads 64
38142 Acoustic Induced Vibration Response Analysis of Honeycomb Panel

Authors: Po-Yuan Tung, Jen-Chueh Kuo, Chia-Ray Chen, Chien-Hsing Li, Kuo-Liang Pan

Abstract:

The main body structure of a satellite is mainly constructed of lightweight material and should be able to withstand certain vibration loads during launch. Because of the many possible changes in the space environment, studying the random vibration response of the satellite structure is extremely important. This paper is based on the reciprocity relationship between sound and structural response and evaluates the dynamic response of the satellite main body under random acoustic load excitation. It studies the technical process and verifies the feasibility of acoustically induced vibration analysis. A simple plate exposed to a uniform acoustic field is used to obtain some important parameters and to validate the acoustic field model of the reverberation chamber. Both the structural and acoustic chamber models are then imported into vibro-acoustic coupling analysis software to predict the structural response. During the modeling process, experimental verification is performed to ensure the quality of the numerical models. Finally, the surface vibration level is calculated through the modal participation factors, and the analysis results are presented as PSD spectra.

Keywords: vibration, acoustic, modal, honeycomb panel

Procedia PDF Downloads 555
38141 Value Chain Based New Business Opportunity

Authors: Seonjae Lee, Sungjoo Lee

Abstract:

Excavating new business opportunities is necessary to remain competitive in the current business environment. Companies survive rapidly changing industry conditions by adopting new business strategies and reducing technology challenges. Traditionally, two methods are used to excavate new businesses. The first is qualitative analysis, in which expert opinion on opportunities is gathered; in the second, new technologies are discovered through quantitative analysis of patent data. The second method increases time and cost, and patent data are restricted in their use for the purpose of discovering business opportunities. This study presents new business opportunities in a form customized to a company's characteristics (sector, size, etc.) by taking a value chain perspective, thereby contributing to the creation of new business opportunities with the proposed model. It utilizes the trademark database of the Korean Intellectual Property Office (KIPO) and the proprietary company information database of Korea Enterprise Data (KED). These data are key to discovering new business opportunities through the analysis of competitors' and advanced businesses' trademarks (Module 1) and trading analysis of the competitors found in KED (Module 2).

Keywords: value chain, trademark, trading analysis, new business opportunity

Procedia PDF Downloads 372
38140 Online Battery Equivalent Circuit Model Estimation in the Continuous-Time Domain Using the Linear Integral Filter Method

Authors: Cheng Zhang, James Marco, Walid Allafi, Truong Q. Dinh, W. D. Widanage

Abstract:

Equivalent circuit models (ECMs) are widely used in battery management systems in electric vehicles and other battery energy storage systems. The battery dynamics and the model parameters vary under different working conditions, such as different temperature and state of charge (SOC) levels, and therefore online parameter identification can improve the modelling accuracy. This paper presents a way of online ECM parameter identification using a continuous time (CT) estimation method. The CT estimation method has several advantages over discrete time (DT) estimation methods for ECM parameter identification due to the widely separated battery dynamic modes and fast sampling. The presented method can be used for online SOC estimation. Test data are collected using a lithium ion cell, and the experimental results show that the presented CT method achieves better modelling accuracy compared with the conventional DT recursive least square method. The effectiveness of the presented method for online SOC estimation is also verified on test data.
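
The discrete-time recursive least squares baseline mentioned above can be sketched as follows; the regressor construction for a first-order RC equivalent circuit and all names are assumptions for illustration, not the paper's continuous-time linear integral filter formulation.

```python
import numpy as np

def rls(phi, y, lam=0.999):
    """Recursive least squares with forgetting factor lam: y[k] ~ phi[k] @ theta."""
    n = phi.shape[1]
    theta, P = np.zeros(n), np.eye(n) * 1e3
    for k in range(len(y)):
        x = phi[k]
        gain = P @ x / (lam + x @ P @ x)
        theta = theta + gain * (y[k] - x @ theta)
        P = (P - np.outer(gain, x) @ P) / lam
    return theta

# For a first-order RC ECM sampled with current i[k] and overpotential v[k] (OCV removed),
# one common ARX parameterisation is v[k] = a1*v[k-1] + b0*i[k] + b1*i[k-1]:
#   phi = np.column_stack([v[:-1], i[1:], i[:-1]]);  theta = rls(phi, v[1:])
# R0, R1, and C1 can then be recovered from (a1, b0, b1) and the sample time.
```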

Keywords: electric circuit model, continuous time domain estimation, linear integral filter method, parameter and SOC estimation, recursive least square

Procedia PDF Downloads 383
38139 Installing Photovoltaic Panels to Generate Optimal Energy in SPAV Hostel, Vijayawada

Authors: J. Jayasuriya

Abstract:

In this research paper, a procedure for installing and assessing a solar PV plant to generate optimal solar energy for the SPAV hostel in Vijayawada city is analyzed. The hostel was experiencing power disruptions and needed an unceasing energy source. The solar panel is one of the best solutions for obtaining uninterrupted, clean, renewable energy for an institutional building, as it neither makes noise nor pollutes the atmosphere. The monthly electricity usage was initially measured to quantify the change in energy use. The solar array was installed along with a financial and environmental assessment based on recent market prices. All aspects related to the feasibility and efficiency of a PV plant at this site, i.e., the orientation of the site, the size and shape of the terrace, and the sun path, were considered while installing the panels. Various precautions were taken to counter the factors that interfere with energy generation, such as temperature, overshadowing, panel wiring, pollution, etc. The solar panels were properly installed and frequently monitored and maintained to procure optimal energy output. The results obtained from the assessment of the proposed plant and the reduction in the electricity bill show the maximum energy that can be generated in a month at that particular site.

Keywords: solar efficiency, building sustainability, PV panel, solar energy

Procedia PDF Downloads 136
38138 A Data Envelopment Analysis Model in a Multi-Objective Optimization with Fuzzy Environment

Authors: Michael Gidey Gebru

Abstract:

Most Data Envelopment Analysis models operate in a static environment with input and output parameters chosen from deterministic data. However, due to the ambiguity brought on by shifting market conditions, input and output data are not always precisely gathered in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity in input and output data. Therefore, this work aims to extend crisp Data Envelopment Analysis to Data Envelopment Analysis in a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. Then, the Data Envelopment Analysis model with a fuzzy environment is solved using a multi-objective method to gauge the efficiency of the Decision-Making Units. Finally, the developed Data Envelopment Analysis model is illustrated with an application to real data from 50 educational institutions.
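
For reference, the crisp starting point, an input-oriented CCR model in multiplier form, can be solved per DMU with a small linear program; the toy data below are illustrative, and the fuzzy multi-objective extension described in the abstract is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Crisp input-oriented CCR efficiency (multiplier form) of DMU j0.
    X: (n_dmu, n_in) inputs, Y: (n_dmu, n_out) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    # variables: [u (output weights), v (input weights)]; maximise u.y0 -> minimise -u.y0
    c = np.concatenate([-Y[j0], np.zeros(m)])
    A_ub = np.hstack([Y, -X])                 # u.yj - v.xj <= 0 for every DMU j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[j0]]).reshape(1, -1)   # normalisation v.x0 = 1
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return -res.fun                           # efficiency score in (0, 1]

# toy data: 5 DMUs, 2 inputs, 1 output (illustrative only)
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(len(X))])
```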

Keywords: efficiency, Data Envelopment Analysis, fuzzy, higher education, input, output

Procedia PDF Downloads 57
38137 Non-Linear Causality Inference Using BAMLSS and Bi-CAM in Finance

Authors: Flora Babongo, Valerie Chavez

Abstract:

Inferring causality from observational data is one of the fundamental subjects, especially in quantitative finance. So far, most papers have analyzed additive noise models with linearity, nonlinearity, or Gaussian noise. We fill in the gap by providing a nonlinear, non-Gaussian causal multiplicative noise model that aims to distinguish the cause from the effect using a two-step method based on Bayesian additive models for location, scale and shape (BAMLSS) and on causal additive models (CAM). We have tested our method on simulated and real data and reached an accuracy of 0.86 on average. As real data, we considered the causality between financial indices such as the S&P 500, Nasdaq, CAC 40, and Nikkei, and companies' log-returns. Our results can be useful for inferring causality when the data are heteroskedastic or non-injective.

Keywords: causal inference, DAGs, BAMLSS, financial index

Procedia PDF Downloads 151
38136 Efficiency Improvement of Ternary Nanofluid Within a Solar Photovoltaic Unit Combined with Thermoelectric Considering Environmental Analysis

Authors: Mohsen Sheikholeslami, Zahra Khalili, Ladan Momayez

Abstract:

The impacts of environmental parameters and dust deposition on the efficiency of a solar panel are scrutinized in this article. To gain thermal output, a trapezoidal cooling channel incorporating a ternary nanofluid has been attached to the bottom of the panel. To produce the working fluid, water has been mixed with Fe₃O₄-TiO₂-GO nanoparticles. Also, an arrangement of fins has been considered to increase the cooling rate of the silicon layer. The existence of a thermoelectric layer above the cooling channel leads to higher electrical output. The effects of the ambient temperature (Ta), wind speed (Vwind), and the inlet temperature (Tin) and velocity (Vin) of the ternary nanofluid on the performance of the PVT unit have been assessed. As Tin increases, the electrical efficiency declines by about 3.63%. An increase in ambient temperature enhances the thermal performance by about 33.46%. The PVT efficiency decreases by about 13.14% and 16.6% with increasing wind speed and dust deposition, respectively. CO₂ mitigation is reduced by about 15.49% in the presence of dust, while it increases by about 17.38% with rising ambient temperature.

Keywords: photovoltaic system, CO₂ mitigation, ternary nanofluid, thermoelectric generator, environmental parameters, trapezoidal cooling channel

Procedia PDF Downloads 89
38135 Data Transformations in Data Envelopment Analysis

Authors: Mansour Mohammadpour

Abstract:

Data transformation refers to the modification of any point in a data set by a mathematical function. When applying transformations, the measurement scale of the data is modified. Data transformations are commonly employed to turn data into the appropriate form, which can serve various functions in the quantitative analysis of the data. This study addresses the investigation of the use of data transformations in Data Envelopment Analysis (DEA). Although data transformations are important options for analysis, they do fundamentally alter the nature of the variable, making the interpretation of the results somewhat more complex.

Keywords: data transformation, data envelopment analysis, undesirable data, negative data

Procedia PDF Downloads 20
38134 The Use of Ward Linkage in Cluster Integration with a Path Analysis Approach

Authors: Adji Achmad Rinaldo Fernandes

Abstract:

Path analysis is an analytical technique for studying the causal relationships between independent and dependent variables. In this study, cluster integration with the Ward linkage method was combined with path analysis for various numbers of clusters. The variables used are character (x₁), capacity (x₂), capital (x₃), collateral (x₄), and condition of the economy (x₅), acting on on-time payment (y₂) through the variable willingness to pay (y₁). The purpose of this study was to compare cluster integration with the Ward linkage method, for various numbers of clusters, in path analysis in order to classify willingness to pay (y₁). The data used are primary data from questionnaires filled out by customers of Bank X, selected through purposive sampling. The measurement method used is the average score method. The results showed that cluster integration with the Ward linkage method and path analysis on 2 clusters is the best method when compared on the coefficient of determination. The effect of character (x₁), capacity (x₂), capital (x₃), collateral (x₄), and condition of the economy (x₅) on on-time payment (y₂) through willingness to pay (y₁) can be explained to the extent of 58.3%, while the remaining 41.7% is explained by variables outside the model.
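
The clustering step can be sketched with SciPy's hierarchical clustering; the synthetic 5C scores are placeholders, and the path model would then be fitted separately within each cluster.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.uniform(1, 5, size=(100, 5))             # hypothetical average scores for the 5C variables

Z = linkage(X, method="ward")                    # Ward linkage dendrogram
labels = fcluster(Z, t=2, criterion="maxclust")  # cut into 2 clusters, as in the best model
for c in (1, 2):
    print(f"cluster {c}: {np.sum(labels == c)} respondents")
# A path analysis (x1..x5 -> y1 -> y2) is then estimated within each cluster and the
# coefficients of determination are compared across different cluster counts.
```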

Keywords: cluster integration, linkage, path analysis, compliant paying behavior

Procedia PDF Downloads 186
38133 Comparison of Data Reduction Algorithms for Image-Based Point Cloud Derived Digital Terrain Models

Authors: M. Uysal, M. Yilmaz, I. Tiryakioğlu

Abstract:

A Digital Terrain Model (DTM) is a digital numerical representation of the Earth's surface. DTMs have been applied to a diverse range of tasks, such as urban planning, military applications, glacier mapping, and disaster management. To express the Earth's surface as a mathematical model, an infinite number of point measurements would be needed. Since this is impossible, points at regular intervals are measured to characterize the Earth's surface, and a DTM of the Earth is generated. Hitherto, classical measurement techniques and photogrammetry have been widely used in the construction of DTMs. At present, RADAR, LiDAR, and stereo satellite images are also used. In recent years, especially because of its advantages, Airborne Light Detection and Ranging (LiDAR) has seen increased use in DTM applications. A 3D point cloud is created with LiDAR technology by obtaining numerous point data. Recently, however, with developments in image mapping methods, the use of unmanned aerial vehicles (UAVs) for photogrammetric data acquisition has increased DTM generation from image-based point clouds. The accuracy of a DTM depends on various factors such as the data collection method, the distribution of elevation points, the point density, the properties of the surface, and the interpolation method. In this study, the random data reduction method is evaluated for DTMs generated from image-based point cloud data. The original image-based point cloud data set (100%) is reduced to a series of subsets by a random algorithm, representing 75, 50, 25, and 5% of the original data set. With the ANS campus of Afyon Kocatepe University as the test area, the DTM constructed from the original image-based point cloud data set is compared with DTMs interpolated from the reduced data sets by the kriging interpolation method. The results show that the random data reduction method can be used to reduce image-based point cloud data sets to a 50% density level while still maintaining DTM quality.
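
The random reduction itself is a simple subsampling step, sketched below; the file names are placeholders, and the kriging interpolation and accuracy comparison against the full-density DTM are not shown.

```python
import numpy as np

points = np.loadtxt("campus_cloud.xyz")        # hypothetical N x 3 array of (x, y, z) points
rng = np.random.default_rng(42)

for ratio in (0.75, 0.50, 0.25, 0.05):
    keep = rng.choice(len(points), size=int(ratio * len(points)), replace=False)
    np.savetxt(f"campus_cloud_{int(ratio * 100):02d}.xyz", points[keep])
    # each subset is then interpolated (e.g. ordinary kriging) and its DTM compared to the 100% DTM
```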

Keywords: DTM, Unmanned Aerial Vehicle (UAV), uniform, random, kriging

Procedia PDF Downloads 155
38132 Applying Hybrid Graph Drawing and Clustering Methods on Stock Investment Analysis

Authors: Mouataz Zreika, Maria Estela Varua

Abstract:

Stock investment decisions are often made based on current events in the global economy and on the analysis of historical data. Visual representation, in turn, could help investors gain a deeper understanding of and better insight into stock market trends more efficiently. The trend analysis is based on long-term data collection. The study adopts a hybrid method that combines a clustering algorithm with a force-directed algorithm to overcome the scalability problem of visualizing large data sets. The method exposes the potential relationships between stocks, as well as their degree of strength and connectivity, giving investors another view of stock relationships for reference. Information derived from the visualization will also help them make informed decisions. The results of the experiments show that the proposed method is able to produce aesthetically pleasing visualizations that provide clearer views of connectivity and edge weights.
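
A minimal sketch of such a hybrid pipeline using NetworkX, building a correlation graph, clustering it into communities, and computing a force-directed layout; the returns data, correlation threshold, and community algorithm are assumptions standing in for the paper's specific choices.

```python
import pandas as pd
import networkx as nx
from networkx.algorithms import community

# returns: DataFrame of periodic stock returns, one column per ticker (hypothetical file)
returns = pd.read_csv("stock_returns.csv", index_col=0, parse_dates=True)
corr = returns.corr()

G = nx.Graph()
for a in corr.columns:
    for b in corr.columns:
        if a < b and corr.loc[a, b] > 0.6:                 # keep only strong links to limit clutter
            G.add_edge(a, b, weight=float(corr.loc[a, b]))

clusters = community.greedy_modularity_communities(G, weight="weight")  # clustering step
pos = nx.spring_layout(G, weight="weight", seed=1)                      # force-directed layout
print(f"{G.number_of_nodes()} stocks, {G.number_of_edges()} edges, {len(clusters)} clusters")
```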

Keywords: clustering, force-directed, graph drawing, stock investment analysis

Procedia PDF Downloads 302
38131 Crossing Multi-Source Climate Data to Estimate the Effects of Climate Change on Evapotranspiration Data: Application to the French Central Region

Authors: Bensaid A., Mostephaoui T., Nedjai R.

Abstract:

Climatic factors are the subject of considerable research, both methodologically and instrumentally. Under the effect of climate change, estimating climate parameters with precision remains one of the main objectives of the scientific community, with a view to assessing climate change and its repercussions on humans and the environment. However, many regions of the world suffer from a severe lack of reliable instruments that could make up for this deficit. Alternatively, the use of empirical methods becomes the only way to assess certain parameters that can act as climate indicators. Several scientific methods are used for the evaluation of evapotranspiration, either directly at the climate stations or through empirical methods. All of these methods are point-based and in no case capture the spatial variation of this parameter. We therefore propose in this paper the use of three sources of information (the Meteo France network of weather stations, world databases, and MODIS satellite images) to evaluate spatial evapotranspiration (ETP) using the Turc method. This first step will reflect the degree of relevance of the indirect (satellite) methods and their generalization to sites without stations. Representing the spatial variation of this parameter with a geographical information system (GIS) accounts for the heterogeneity of its behaviour. This heterogeneity is due to the influence of site morphological factors and makes it possible to appreciate the role of certain topographic and hydrological parameters. A phase of predicting the medium- and long-term evolution of evapotranspiration under the effect of climate change, by applying the Intergovernmental Panel on Climate Change (IPCC) scenarios, gives a realistic overview of the contribution of aquatic systems at the scale of the region.
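
For orientation, one commonly cited monthly form of the Turc potential evapotranspiration formula is sketched below; the coefficients and units are quoted from secondary literature rather than from this paper, so they should be checked against the original Turc (1961) reference before use.

```python
def etp_turc_monthly(t_mean_c, rg_cal_cm2_day, february=False):
    """Monthly potential evapotranspiration (mm), Turc formula, for relative humidity above 50%.
    t_mean_c: mean monthly air temperature (deg C); rg_cal_cm2_day: mean global radiation (cal/cm2/day).
    A humidity correction applies when relative humidity falls below 50% (not shown here)."""
    k = 0.37 if february else 0.40
    return k * (t_mean_c / (t_mean_c + 15.0)) * (rg_cal_cm2_day + 50.0)

print(round(etp_turc_monthly(20.0, 400.0), 1))  # illustrative values only
```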

Keywords: climate change, ETP, MODIS, IPCC scenarios

Procedia PDF Downloads 100
38130 Wind Tunnel Tests on Ground-Mounted and Roof-Mounted Photovoltaic Array Systems

Authors: Chao-Yang Huang, Rwey-Hua Cherng, Chung-Lin Fu, Yuan-Lung Lo

Abstract:

Solar energy is one of the alternatives for reducing the CO2 emissions produced by conventional power plants in modern society. As an island frequently visited by strong typhoons and earthquakes, Taiwan urgently needs to revise its local regulations to strengthen the safety design of photovoltaic systems. Currently, the Taiwanese code for the wind-resistant design of structures does not clearly address photovoltaic systems, especially when the systems are arranged in an array format. Furthermore, when an arrayed photovoltaic system is mounted on a rooftop, the approaching flow is significantly altered by the building, leading to different pressure patterns in different areas of the photovoltaic system. In this study, an L-shaped arrayed photovoltaic system is mounted first on the ground of the wind tunnel and then on the building rooftop. The system consists of 60 PV panel models. Each panel model is equivalent to a full size of 3.0 m in depth and 10.0 m in length. Six pressure taps are installed on the upper surface of each panel model and another six on the bottom surface to measure the net pressures. The wind attack angle is varied from 0° to 360° in 10° intervals to cover the worst case with respect to wind direction. The sampling rate of the pressure scanning system is set high enough to precisely estimate the peak pressure, and at least 20 samples are recorded for good ensemble average stability. Each sample is equivalent to a 10-minute duration in full scale. All the scale factors, including the time scale, length scale, and velocity scale, are properly verified by similarity rules for the low-wind-speed wind tunnel environment. The purpose of the L-shaped arrayed system is to understand the pressure characteristics in the corner area. Extreme value analysis is applied to obtain the design pressure coefficient for each net pressure. The commonly used Cook-and-Mayne value of 78% is set as the target non-exceedance probability for design pressure coefficients under the Gumbel distribution. The best linear unbiased estimator method is utilized for Gumbel parameter identification. A careful moving-average procedure is also applied in data processing. Results show that when the arrayed photovoltaic system is mounted on the ground, the first row of panels reveals stronger positive pressure than when mounted on the rooftop. Due to the flow separation occurring at the building edge, the first row of panels on the rooftop is mostly under negative pressure; the last row, on the other hand, shows positive pressure because of flow reattachment. Different areas also have different pressure patterns, which correspond well to the area divisions for design values described in ASCE 7-16. Several minor observations are made in the parametric studies, such as the rooftop edge effect, parapet effect, building aspect effect, row interval effect, and so on. General comments are then made for the proposed revision of the Taiwanese code.
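
The extreme value step can be sketched as follows, fitting a Gumbel distribution to the recorded peak pressure coefficients and reading off the 78% non-exceedance design value; the sample values are illustrative placeholders, and maximum likelihood is used here for brevity where the paper uses the best linear unbiased estimator.

```python
import numpy as np
from scipy import stats

# 20 peak (negative) pressure coefficients from repeated runs; illustrative placeholder values
peaks = np.array([-1.8, -2.1, -1.9, -2.4, -2.0, -2.2, -1.7, -2.3, -2.5, -1.9,
                  -2.0, -2.2, -1.8, -2.1, -2.6, -1.9, -2.3, -2.0, -2.2, -2.4])

magnitudes = -peaks                                   # work with suction magnitudes
loc, scale = stats.gumbel_r.fit(magnitudes)           # MLE; the paper uses BLUE order-statistics weights
design_cp = -stats.gumbel_r.ppf(0.78, loc=loc, scale=scale)   # Cook-and-Mayne 78% non-exceedance
print(f"design suction coefficient: {design_cp:.2f}")
```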

Keywords: aerodynamic force coefficient, ground-mounted, roof-mounted, wind tunnel test, photovoltaic

Procedia PDF Downloads 138
38129 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study

Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos

Abstract:

This article presents a method to analyze the use of indoor spaces based on data analytics from built-in digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights into space occupancy, user behaviour, and comfort. These devices, originally installed to facilitate remote operations, report data over the internet, which the research uses to analyze real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable solution for analyzing building interior spaces without incorporating external data collection systems such as additional sensors. The methodology is applied to a real coliving case study: a residential building of 3000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify the IoT devices and to assess, clean, and analyze the collected data based on the analysis framework. The information is collected remotely through the devices' different platforms. The first step is to curate the data and understand what insights each device can provide according to the objectives of the study; this generates an analysis framework that can be scaled up for future building assessments, even beyond the residential sector. The method adjusts the parameters to be analyzed according to the dataset available from each building's IoT network. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.

Keywords: in-place devices, IoT, human-centred data-analytics, spatial design

Procedia PDF Downloads 197
38128 Study of Evapotranspiration for Pune District

Authors: Ranjeet Sable, Mahotsavi Patil, Aadesh Nimbalkar, Prajakta Palaskar, Ritu Sagar

Abstract:

Knowing the exact amount of water used by various crops in different climatic conditions is a necessary step in the design, planning, and management of irrigation schemes and water resources and in the scheduling of irrigation systems. Evaporation and transpiration are together called evapotranspiration: water loss from plants during photosynthesis is called transpiration, and the conversion of water into the gaseous state is called evaporation. To calculate evapotranspiration correctly, the method must be chosen so that it is suitable, requires minimal climatic data, and is applicable over a wide range of climatic conditions. In hydrology, multiple correlation and regression are generally used to develop relationships between three or more hydrological variables from the dependence between them. This research work studies various methods for calculating evapotranspiration and selects a reasonable and suitable one for the Pune region (Maharashtra state), since field methods are very costly and time-consuming and do not give appropriate results if suitable conditions are not maintained. Observations recorded at Pune meteorological stations are used to calculate evapotranspiration with the Radiation Method (RAD), Modified Penman Method (MPM), Thornthwaite Method (THW), Blaney-Criddle Method (BCL), Christiansen Equation (CNM), and Hargreaves Method (HGM), of which the Hargreaves and Thornthwaite methods are temperature-based. The performance of all these methods is compared with the Modified Penman method, and the method showing the least variation from the standard Modified Penman method (MPM) is selected as the suitable one. Evapotranspiration values are estimated on a monthly basis. The comparative analysis in this research is also used to select methods that depend on readily available raw data in case of missing data.
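
As an example of a temperature-based method, the widely cited Hargreaves-Samani form of the reference evapotranspiration equation is sketched below; the function name and sample inputs are illustrative, and the exact variant used in the study may differ.

```python
def et0_hargreaves(t_mean, t_max, t_min, ra):
    """Daily reference evapotranspiration (mm/day) by the Hargreaves-Samani (1985) equation.
    t_mean, t_max, t_min: air temperatures (deg C); ra: extraterrestrial radiation
    expressed as equivalent evaporation (mm/day)."""
    return 0.0023 * ra * (t_mean + 17.8) * (t_max - t_min) ** 0.5

print(round(et0_hargreaves(t_mean=27.0, t_max=33.0, t_min=21.0, ra=15.0), 2))  # illustrative values
```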

Keywords: Blaney-Criddle, Christiansen equation evapotranspiration, Hargreaves method, precipitations, Penman method, water use efficiency

Procedia PDF Downloads 271
38127 Applying Different Steganography Techniques in Cloud Computing Technology to Improve Cloud Data Privacy and Security

Authors: Muhammad Muhammad Suleiman

Abstract:

Cloud computing is a versatile concept that refers to a service allowing users to outsource their data without having to worry about local storage issues. However, the most pressing issue to be addressed is maintaining a secure and reliable data repository rather than relying on untrustworthy service providers. In this study, we look at how steganography approaches, in combination with digital watermarking, can greatly improve the system's effectiveness and data security when used for cloud computing. The main requirement of such frameworks, where data are transferred or exchanged between servers and users, is safe data management in cloud environments. Steganography is among the most effective methods for safe communication in the cloud. It is a method of writing coded messages in such a way that only the sender and the recipient can safely interpret and display the information hidden in the communication channel. This study presents a new text steganography method for hiding a secret English text file in a cover English text file to ensure data protection in cloud computing. Data protection, data hiding capability, and time were all improved using the proposed technique.
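
For context, a generic text steganography scheme (not the one proposed in the paper) can be sketched with zero-width characters; the cover sentence and secret are placeholders, and a practical scheme would also encrypt the payload before hiding it.

```python
ZERO, ONE = "\u200b", "\u200c"   # zero-width space / zero-width non-joiner

def hide(cover: str, secret: str) -> str:
    """Append the secret as invisible zero-width characters (a generic illustration only)."""
    bits = "".join(f"{b:08b}" for b in secret.encode("utf-8"))
    return cover + "".join(ZERO if bit == "0" else ONE for bit in bits)

def reveal(stego: str) -> str:
    """Recover the hidden bytes from the zero-width characters."""
    bits = "".join("0" if ch == ZERO else "1" for ch in stego if ch in (ZERO, ONE))
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

stego = hide("Quarterly report follows.", "meet at dawn")
assert reveal(stego) == "meet at dawn"
```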

Keywords: cloud computing, steganography, information hiding, cloud storage, security

Procedia PDF Downloads 191