Search results for: decision-making approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5039

509 Distributed Automation System Based Remote Monitoring of Power Quality Disturbance on LV Network

Authors: Emmanuel D. Buedi, K. O. Boateng, Griffith S. Klogo

Abstract:

Electrical distribution networks are prone to power quality disturbances originating from the complexity of the distribution network, the mode of distribution (overhead or underground) and the types of loads used by customers. Data on the types of disturbances present and their frequency of occurrence are needed for economic evaluation and hence for finding a solution to the problem. Utility companies have resorted to using secondary power quality devices such as smart meters to help gather the required data. Even though this approach is easier to adopt, data gathered from these devices may not serve the required purpose, since the installation of these devices in the electrical network usually does not conform to available power quality monitor (PQM) placement methods. This paper presents a design of a PQM that is capable of integrating into an existing distributed automation system (DAS) infrastructure to take advantage of available placement methodologies. The monitoring component of the design is implemented and installed to monitor an existing LV network. Data from the monitor are analyzed and presented. A portion of the LV network of the Electricity Company of Ghana is modeled in MATLAB-Simulink and analyzed under various earth fault conditions. The results presented show the ability of the PQM to detect and analyze PQ disturbances such as voltage sag and overvoltage. By adopting a placement methodology and installing these nodes, utilities are assured of accurate and reliable information with respect to the quality of power delivered to consumers.

Keywords: Power quality, remote monitoring, distributed automation system, economic evaluation, LV network.

508 Instant Location Detection of Objects Moving at High Speed in C-OTDR Monitoring Systems

Authors: Andrey V. Timofeev

Abstract:

A practical and efficient approach is suggested for estimating the instantaneous bounds of high-speed objects in C-OTDR monitoring systems. In the case of super-dynamic objects (trains, cars), it is difficult to obtain an adequate estimate of the instantaneous object localization because of estimation lag. In other words, reliable estimation of the coordinates of a monitored object requires time for the C-OTDR system to collect observation data, and only when the required sample volume has been collected can the final decision be issued. This is contrary to the requirements of many real applications. For example, in rail traffic management systems we need to obtain the localization of dynamic objects in real time. The way to solve this problem is to use a set of statistically independent parameters of C-OTDR signals to obtain the most reliable solution in real time. Parameters of this type can be called "signaling parameters" (SP). There are several SPs which carry information about the instantaneous localization of dynamic objects in each C-OTDR channel. The problem is that some of these parameters are very sensitive to the dynamics of seismoacoustic emission sources but are not stable, while a very stable SP tends, as a rule, to be insensitive. This report describes a method for co-processing SPs which is designed to obtain the most effective dynamic-object localization estimates within the C-OTDR monitoring system framework.

Keywords: C-OTDR-system, co-processing of signaling parameters, high-speed objects localization, multichannel monitoring systems.

507 Optimizing and Evaluating Performance Quality Control of the Production Process of Disposable Essentials Using a Vague Goal Programming Approach

Authors: Hadi Gholizadeh, Ali Tajdin

Abstract:

To have effective production planning, it is necessary to control the quality of processes. This paper aims at improving the performance of the disposable essentials process using statistical quality control and goal programming in a vague environment. Uncertainty is expressed because there is always measurement error in the real world; therefore, in this study, the conditions are examined in a vague, distance-based environment. The disposable essentials process at Kach Company was studied. Statistical control tools were used to characterize the existing process for four factor responses: the average of the disposable glasses' weights, heights, crater diameters, and volumes. Goal programming was then utilized to find the combination of optimal factor settings in a vague environment, which accounts for the uncertainty of the initial information when some of the parameters of the models are vague; a fuzzy regression model is also used to predict the responses of the four described factors. Optimization results show that the process capability index values for the disposable glasses' average weights, heights, crater diameters and volumes were improved. This increases the quality of the products and reduces waste, which will reduce the cost of the finished product and ultimately bring customer satisfaction; this satisfaction, in turn, will mean increased sales.
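
As a rough illustration of the process-capability side of this approach, the sketch below computes Cp and Cpk for one response (glass weight) together with goal-programming style deviation variables; the specification limits, target and measurements are invented for illustration and are not taken from the Kach Company study.

```python
import numpy as np

# Hypothetical specification limits and measurements for disposable-glass weight (grams).
LSL, USL, target = 9.5, 10.5, 10.0
weights = np.array([9.92, 10.05, 9.98, 10.11, 9.87, 10.02, 10.08, 9.95, 10.01, 9.97])

mu, sigma = weights.mean(), weights.std(ddof=1)

# Process capability indices: Cp ignores centering, Cpk penalizes an off-target mean.
cp = (USL - LSL) / (6 * sigma)
cpk = min(USL - mu, mu - LSL) / (3 * sigma)

# Goal-programming style deviations from the target (under- and over-achievement).
d_minus = max(target - mu, 0.0)   # shortfall below the goal
d_plus = max(mu - target, 0.0)    # excess above the goal

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, d- = {d_minus:.3f}, d+ = {d_plus:.3f}")
```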

Keywords: Goal programming, quality control, vague environment, disposable glasses’ optimization, fuzzy regression.

506 A Machine Learning Approach for Earthquake Prediction in Various Zones Based on Solar Activity

Authors: Viacheslav Shkuratskyy, Aminu Bello Usman, Michael O’Dea, Mujeeb Ur Rehman, Saifur Rahman Sabuj

Abstract:

This paper examines relationships between solar activity and earthquakes by applying machine learning techniques: K-nearest neighbour, support vector regression, random forest regression, and a long short-term memory network. Data from the SILSO World Data Center, the NOAA National Center, the GOES satellite, NASA OMNIWeb, and the United States Geological Survey were used for the experiment. The 23rd and 24th solar cycles, daily sunspot number, solar wind velocity, proton density, and proton temperature were all included in the dataset. The study also examined sunspots, solar wind, and solar flares, which all reflect solar activity, and the earthquake frequency distribution by magnitude and depth. The findings showed that the long short-term memory network model predicts earthquakes more accurately than the other models applied in the study, and that solar activity is more likely to affect earthquakes of lower magnitude and shallow depth than earthquakes of magnitude 5.5 or larger at intermediate and deep depths.
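
For readers who want to reproduce the model comparison in outline, the sketch below fits three of the four regressors named in the abstract on synthetic stand-in data (the LSTM is omitted for brevity); the feature set and the earthquake-count target are assumptions, not the study's dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Synthetic stand-in features: sunspot number, solar wind velocity, proton density, proton temperature.
X = rng.normal(size=(500, 4))
# Hypothetical target: daily count of shallow, low-magnitude earthquakes.
y = 3.0 + 0.8 * X[:, 0] - 0.4 * X[:, 1] + rng.normal(scale=0.5, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "KNN": KNeighborsRegressor(n_neighbors=5),
    "SVR": SVR(kernel="rbf", C=1.0),
    "Random forest": RandomForestRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: MAE = {mean_absolute_error(y_te, model.predict(X_te)):.3f}")
```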

Keywords: K-Nearest Neighbour, Support Vector Regression, Random Forest Regression, Long Short-Term Memory Network, earthquakes, solar activity, sunspot number, solar wind, solar flares.

505 Classification of Business Models of Italian Bancassurance by Balance Sheet Indicators

Authors: Andrea Bellucci, Martina Tofi

Abstract:

The aim of this paper is to analyze the business models of bancassurance in Italy for the life business. The life insurance business is very developed in the Italian market, and bank branches hold 80% of the market share. Given its maturity, the life insurance market needs to consolidate its organizational form to allow for the development of the non-life business, which nowadays collects few premiums but represents a great opportunity to enlarge the market share of bancassurance, using its strength in the distribution channel while the market share of independent agents is decreasing. Starting with the main business model of bancassurance for the life business, this paper analyzes the performances of life companies in the Italian market through balance sheet indicators and the main discriminant variables of business models. The study observes trends from 2013 to 2015 for the Italian market by exploiting a database managed by the Associazione Nazionale delle Imprese di Assicurazione (ANIA). The applied approach is based on a bottom-up analysis, starting with variables and indicators to define the classification of business models. The statistical classification algorithm proposed by Ward is employed to design business model profiles. The results of the analysis provide a representation of the main business models, each built from its profile of indicators. In this way, an unsupervised analysis is developed; it is limited by its judgmental dimension based on the researchers' opinion, but it makes it possible to obtain a design of effective business models.
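
A minimal sketch of the Ward-based classification step is given below; the balance sheet indicators and the choice of three profiles are invented for illustration and are not the ANIA data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)

# Hypothetical balance-sheet indicators for 20 life insurers:
# columns = premium growth, expense ratio, return on equity, reserves-to-premiums.
indicators = rng.normal(size=(20, 4))

# Standardise the indicators so no single ratio dominates the Euclidean distance.
z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)

# Ward's minimum-variance hierarchical clustering, cut into three business-model profiles.
tree = linkage(z, method="ward")
profiles = fcluster(tree, t=3, criterion="maxclust")
print(profiles)
```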

Keywords: Balance sheet indicators, Bancassurance, business models, Ward algorithm.

504 Full Potential Study of Electronic and Optical Properties of NdF3

Authors: Sapan Mohan Saini

Abstract:

We report the electronic structure and optical properties of the NdF3 compound. Our calculations are based on density functional theory (DFT) using the full potential linearized augmented plane wave (FPLAPW) method with the inclusion of spin orbit coupling. We employed the local spin density approximation (LSDA) and the Coulomb-corrected local spin density approximation (LSDA + U). We find that the standard LSDA approach is incapable of correctly describing the electronic properties of such materials, since it positions the f-bands incorrectly, resulting in an incorrect metallic ground state. On the other hand, the LSDA + U approximation, known for treating the highly correlated 4f electrons properly, is able to reproduce the correct insulating ground state. Interestingly, however, we do not find any significant differences in the optical properties calculated using LSDA and LSDA + U, suggesting that the 4f electrons do not play a decisive role in the optical properties of these compounds. The reflectivity of the NdF3 compound stays low up to 7 eV, which is consistent with its large energy gap. The calculated energy gaps are in good agreement with experiments. Our calculated reflectivity compares well with the experimental data, and the results are analyzed in the light of band-to-band transitions.

Keywords: FPLAPW method, optical properties, rare earth trifluorides, LSDA+U.

503 Prediction of Compressive Strength of Concrete from Early Age Test Result Using Design of Experiments (RSM)

Authors: Salem Alsanusi, Loubna Bentaher

Abstract:

Response Surface Methods (RSM) provide statistically validated predictive models that can then be manipulated to find optimal process configurations. Variation transmitted to responses from poorly controlled process factors can be accounted for by the mathematical technique of propagation of error (POE), which facilitates ‘finding the flats’ on the surfaces generated by RSM. The dual response approach to RSM captures the standard deviation of the output as well as the average, and it accounts for unknown sources of variation. Dual response plus propagation of error (POE) provides a more useful model of overall response variation. In our case, we implemented this technique to predict the compressive strength of concrete at 28 days of age, since waiting 28 days is quite time consuming while it is important to ensure the quality control process. This paper investigates the potential of using design of experiments (DOE-RSM) to predict the compressive strength of concrete at the 28th day. Data used for this study were obtained from experimental schemes at the Civil Engineering Department, University of Benghazi. A total of 114 sets of data were used. The ACI mix design method was utilized for the mix design. No admixtures were used; only the main concrete mix constituents such as cement, coarse aggregate, fine aggregate and water were utilized in all mixes. Different mix proportions of the ingredients and different water-cement ratios were used. The proposed mathematical models are capable of predicting the required compressive strength of concrete from early ages.
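
The sketch below illustrates the RSM idea with a second-order (quadratic) response surface fitted to synthetic mix data; the factor names and values are assumptions for illustration, not the Benghazi dataset.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)

# Hypothetical mix factors: cement content (kg/m3), water-cement ratio, 7-day strength (MPa).
n = 114
cement = rng.uniform(300, 450, n)
wc_ratio = rng.uniform(0.40, 0.60, n)
early_strength = 0.08 * cement - 30 * wc_ratio + rng.normal(0, 1.5, n)
X = np.column_stack([cement, wc_ratio, early_strength])

# Hypothetical 28-day strength used only to make the sketch runnable.
y28 = 0.11 * cement - 45 * wc_ratio + 0.6 * early_strength + rng.normal(0, 2.0, n)

# Second-order (quadratic) response surface, the usual RSM model form.
rsm = make_pipeline(PolynomialFeatures(degree=2, include_bias=False), LinearRegression())
rsm.fit(X, y28)
print("R^2 on the fitted surface:", round(rsm.score(X, y28), 3))
```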

Keywords: Mix proportioning, response surface methodology, compressive strength, optimal design.

502 Numerical and Experimental Investigation of Airflow inside a Car Cabin

Authors: Mokhtar Djeddou, Amine Mehel, Georges Fokoua, Anne Tanière, Patrick Chevrier

Abstract:

Commuters’ exposure to air pollution, particularly to particulate matter inside vehicles, is a significant health issue. Assessing particle concentrations and characterizing their distribution is an important first step in understanding and proposing solutions to improve car cabin air quality. It is known that particle dynamics are intimately driven by particle-turbulence interactions. In order to analyze and model pollutant distribution inside car cabins, it is crucial to first examine the single-phase flow topology and its associated turbulence characteristics. Within this context, Computational Fluid Dynamics (CFD) simulations were conducted to model airflow inside a full-scale car cabin using the Reynolds Averaged Navier-Stokes (RANS) approach combined with the first-order Realizable k-ε model to close the RANS equations. To assess the numerical model, a campaign of velocity field measurements at different locations in the front and back of the car cabin was carried out using the hot-wire anemometry technique. Comparison between numerical and experimental results shows good agreement of velocity profiles. Additionally, visualization of streamlines shows the formation of jet flow developing out of the dashboard air vents and the formation of large vortex structures, particularly between the front and back-seat compartments. These vortical structures could play a key role in the accumulation and clustering of particles in a turbulent flow.

Keywords: Car cabin, CFD, hot-wire anemometry, vortical flow.

501 Business Process Management and Organizational Culture in Big Companies: Cross-Country Analysis

Authors: Dalia Suša Vugec

Abstract:

Business process management (BPM) is a widely used approach focused on designing, mapping, changing, managing and analyzing the business processes of an organization, which eventually leads to better performance and derives many other benefits. Since every organization strives to improve its performance in order to be sustainable and to remain competitive on the market in the long term, numerous organizations are nowadays adopting and implementing BPM. However, not all organizations are equally successful in this. One of the ways of measuring BPM success is to measure its maturity by calculating the Process Performance Index (PPI) from ten BPM success factors. Still, although BPM is a holistic concept, organizational culture is not taken into consideration in calculating the PPI. Hence, the aim of this paper is twofold: first, to explore and analyze the current state of the BPM success factors within big organizations from Slovenia, Croatia, and Austria; and second, to analyze the structure of organizational culture within the observed companies, focusing on its link with the BPM success factors. The presented study is based on the results of a questionnaire conducted as part of the PROSPER project (IP-2014-09-3729), financed by the Croatian Science Foundation. The results of the questionnaire reveal differences in the achieved levels of the BPM success factors, and therefore in overall BPM maturity, between the three observed countries. Moreover, the structure of organizational culture also differs across the three countries. This paper discusses the revealed differences between the countries as well as the link between organizational culture and the BPM success factors.
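
A minimal sketch of one plausible PPI aggregation is given below; the factor names, the 1-5 scoring scale, the plain summation and the maturity bands are illustrative assumptions, not the published PPI formula or the PROSPER questionnaire.

```python
# Aggregating ten BPM success-factor scores into a single index (illustrative only).
success_factors = {
    "alignment_with_strategy": 4,
    "holistic_approach": 3,
    "process_awareness": 4,
    "portfolio_of_initiatives": 2,
    "improvement_methodology": 3,
    "skills_and_expertise": 4,
    "stakeholder_involvement": 3,
    "communication": 4,
    "measurement": 2,
    "executive_sponsorship": 5,
}

ppi = sum(success_factors.values())          # 10 (lowest) .. 50 (highest) on this assumed scale
maturity = "high" if ppi >= 40 else "medium" if ppi >= 25 else "low"
print(f"PPI = {ppi} -> {maturity} BPM maturity")
```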

Keywords: Business process management, BPM maturity, BPM success factors, organizational culture, process performance index.

500 Managing Uncertainty in Unmanned Aircraft System Safety Performance Requirements Compliance Process

Authors: Achim Washington, Reece Clothier, Jose Silva

Abstract:

System Safety Regulations (SSR) are a central component of the airworthiness certification of Unmanned Aircraft Systems (UAS). There is significant debate on the setting of appropriate SSR for UAS. Putting this debate aside, the challenge lies in how to apply the system safety process to UAS, which lack the data and operational heritage of conventionally piloted aircraft. The limited knowledge and lack of operational data result in uncertainty in the system safety assessment of UAS. This uncertainty can lead to incorrect compliance findings and the potential certification and operation of UAS that do not meet minimum safety performance requirements. The existing system safety assessment and compliance processes, as used for conventionally piloted aviation, do not adequately account for this uncertainty, limiting the suitability of their application to UAS. This paper discusses the challenges of undertaking system safety assessments for UAS and presents current and envisaged research towards addressing these challenges. It aims to highlight the main advantages of adopting a risk-based framework for the System Safety Performance Requirement (SSPR) compliance process, one capable of taking into consideration the uncertainty associated with each of the outputs of the system safety assessment process. Based on this study, it is clear that developing a framework tailored to UAS would allow for a more rational, transparent and systematic approach to decision making. This would reduce the need for conservative assumptions and take the risk posed by each UAS into consideration while determining its state of compliance with the SSR.
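
The sketch below illustrates, in a highly simplified way, what a risk-based treatment of this uncertainty can look like: a Monte Carlo check of whether an uncertain failure-rate estimate meets a quantitative safety target. The lognormal uncertainty model and the 1e-9 per-flight-hour target are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical: estimated catastrophic-failure rate with roughly an order-of-magnitude
# uncertainty, modelled as lognormal around a best estimate of 3e-10 per flight hour.
best_estimate = 3e-10
log_sd = np.log(10) / 2          # about one decade at two sigma
samples = rng.lognormal(mean=np.log(best_estimate), sigma=log_sd, size=100_000)

target = 1e-9                     # hypothetical SSPR target (per flight hour)
p_compliant = np.mean(samples <= target)

# A risk-based finding reports the probability of meeting the target,
# rather than a yes/no answer based on the point estimate alone.
print(f"P(failure rate <= target) = {p_compliant:.2%}")
```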

Keywords: Part 1309 regulations, unmanned aircraft systems, system safety, uncertainty.

499 Development of an Intelligent Decision Support System for Smart Viticulture

Authors: C. M. Balaceanu, G. Suciu, C. S. Bosoc, O. Orza, C. Fernandez, Z. Viniczay

Abstract:

The Internet of Things (IoT) represents the best option for smart vineyard applications, even if it is necessary to integrate the technologies required for the development. This article is based on the research and the results obtained in the DISAVIT project. For smart agriculture, the project aims to provide a trustworthy, intelligent, integrated vineyard management solution that is based on the IoT. To achieve interoperability through the use of a multiprotocol technology (the future of connected wireless IoT), it is necessary to adopt an agnostic approach, providing a reliable environment that addresses cyber security and IoT-based threats and ensures traceability through a blockchain-based design, while also creating a concept for long-term implementations (modular, scalable). These represent the main innovative technical aspects of this project. The DISAVIT project studies and promotes the incorporation of better management tools based on objective, data-based decisions, which are necessary for an agriculture adapted to and more resistant to climate change. It also exploits the opportunities generated by the digital services market for smart agriculture management stakeholders. The project's final result aims to improve decision-making, performance, and viticultural infrastructure and to increase real-time data accuracy and interoperability. Innovative aspects such as end-to-end solutions, adaptability, scalability, security and traceability place our product in a favorable position over competitors. No solution currently on the market meets all of these requirements in a single innovative product.

Keywords: Blockchain, IoT, smart agriculture, vineyard.

498 Computational Fluid Dynamics Simulation Approach for Developing a Powder Dispensing Device

Authors: Rallapalli Revanth, Shivakumar Bhavi, Vijay Kumar Turaga

Abstract:

Dispensing powders manually can be difficult, as it requires gradually pouring the powder and checking the dispensed amount on a scale. Current systems are manual and non-continuous in nature, are user dependent, and make it difficult to control powder dispensation. Recurrent dosing of powdered medicines in precise amounts quickly and accurately has been an all-time challenge. Various powder dispensing mechanisms are being designed to overcome these challenges. A battery-operated screw conveyor mechanism is being developed to overcome the problems described above. These inventions are numerically evaluated at the concept development level by employing Computational Fluid Dynamics (CFD) of gas-solids multiphase flow systems. CFD has been very helpful in the development of such devices, saving time and money by reducing the number of prototypes and tests. In this study, powder dispensation from the trocar's end is simulated by using the Dense Discrete Phase Model technique along with the Kinetic Theory of Granular Flow, in which the powder is viewed as a secondary flow in air (DDPM-KTGF). By considering the volume fraction of powder as 50%, transport is achieved by rotation of the screw conveyor. The performance is calculated for a 1 s time frame in an unsteady computation manner. This methodology will help designers develop design concepts to improve dispensation and the effective area within a quick turnaround time frame.

Keywords: Multiphase flow, screw conveyor, transient, DDPM - KTGF.

497 Urban Growth Analysis Using Multi-Temporal Satellite Images, Non-stationary Decomposition Methods and Stochastic Modeling

Authors: Ali Ben Abbes, Imed Riadh Farah, Vincent Barra

Abstract:

Remotely sensed data are a significant source for monitoring and updating land use/cover databases. Nowadays, change detection in urban areas is a subject of intensive research. Timely and accurate data on the spatio-temporal changes of urban areas are therefore required. The data extracted from multi-temporal satellite images are usually non-stationary; in fact, the changes evolve in time and space. This paper is an attempt to propose a methodology for change detection in urban areas by combining a non-stationary decomposition method and stochastic modeling. We consider as input of our methodology a sequence of satellite images I1, I2, …, In at different periods (t = 1, 2, ..., n). Firstly, a preprocessing of the multi-temporal satellite images (e.g., radiometric, atmospheric and geometric corrections) is applied. The systematic study of global urban expansion in our methodology can be approached in two ways: the first considers the urban area as a single object as opposed to non-urban areas (e.g., vegetation, bare soil and water), with the objective of extracting the urban mask; the second aims to obtain more detailed knowledge of the urban area by distinguishing different types of tissue within it. In order to validate our approach, we used a database of Tres Cantos-Madrid in Spain, derived from Landsat over the period from January 2004 to July 2013 by collecting two frames per year at a spatial resolution of 25 meters. The obtained results show the effectiveness of our method.
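
A toy version of the first way described above (urban mask extraction and differencing between two dates) is sketched below; the images, the 0.7 threshold and the simulated patch are synthetic stand-ins, not the Tres Cantos-Madrid data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Two co-registered images of the same scene at t1 and t2 (synthetic stand-ins for
# preprocessed Landsat frames); higher values represent more built-up surfaces.
img_t1 = rng.random((100, 100))
img_t2 = img_t1.copy()
img_t2[40:60, 40:60] += 0.5      # simulated new urban patch

# Extract a binary urban mask per date (the threshold is an illustrative assumption).
urban_t1 = img_t1 > 0.7
urban_t2 = img_t2 > 0.7

# Pixels that became urban between the two dates.
new_urban = urban_t2 & ~urban_t1
print("Newly urbanised pixels:", int(new_urban.sum()))
```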

Keywords: Multi-temporal satellite image, urban growth, Non-stationarity, stochastic modeling.

496 Cross Signal Identification for PSG Applications

Authors: Carmen Grigoraş, Victor Grigoraş, Daniela Boişteanu

Abstract:

The standard investigational method for obstructive sleep apnea syndrome (OSAS) diagnosis is polysomnography (PSG), which consists of a simultaneous, usually overnight recording of multiple electro-physiological signals related to sleep and wakefulness. This is an expensive, encumbering and not readily repeated protocol, and therefore there is a need for simpler and easily implemented screening and detection techniques. Identification of apnea/hypopnea events in the screening recordings is the key factor for the diagnosis of OSAS. The analysis of a single-lead electrocardiographic (ECG) signal alone for OSAS diagnosis, which may be done with portable devices at the patient's home, has been the challenge of recent years. A novel artificial neural network (ANN) based approach for feature extraction and automatic identification of respiratory events in ECG signals is presented in this paper. A nonlinear principal component analysis (NLPCA) method was considered for feature extraction and a support vector machine for classification/recognition. An alternative representation of the respiratory events by means of a Kohonen-type neural network is discussed. Our prospective study was based on OSAS patients, male and female, of the Clinical Hospital of Pneumology in Iaşi, Romania, as well as on investigated non-OSAS human subjects. Our computational analysis includes a learning phase based on cross-signal PSG annotation.
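
A simplified sketch of the feature-extraction-plus-classification pipeline is given below; it substitutes ordinary linear PCA for the paper's nonlinear PCA and uses synthetic ECG-derived features, so it illustrates the workflow rather than the reported system.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)

# Synthetic stand-in features derived from single-lead ECG epochs
# (e.g. RR-interval statistics); 1 = epoch annotated as apnea/hypopnea.
X = rng.normal(size=(300, 12))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=300) > 0).astype(int)

# Linear PCA is used here as a simplified stand-in for the NLPCA feature extractor,
# followed by the SVM classifier named in the abstract.
clf = make_pipeline(PCA(n_components=4), SVC(kernel="rbf", C=1.0))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))
```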

Keywords: Artificial neural networks, feature extraction, obstructive sleep apnea syndrome, pattern recognition, signal processing.

495 Web Proxy Detection via Bipartite Graphs and One-Mode Projections

Authors: Zhipeng Chen, Peng Zhang, Qingyun Liu, Li Guo

Abstract:

With the Internet becoming the dominant channel for business and life, many IPs are increasingly masked using web proxies for illegal purposes such as propagating malware, impersonating phishing pages to steal sensitive data, or redirecting victims to other malicious targets. Moreover, as Internet traffic continues to grow in size and complexity, it has become an increasingly challenging task to detect proxy services due to their dynamic updates and high anonymity. In this paper, we present an approach based on behavioral graph analysis to study the behavior similarity of web proxy users. Specifically, we use bipartite graphs to model host communications from network traffic and build one-mode projections of the bipartite graphs for discovering the social-behavior similarity of web proxy users. Based on the similarity matrices of end-users from the derived one-mode projection graphs, we apply a simple yet effective spectral clustering algorithm to discover the inherent behavior clusters of web proxy users. The web proxy URLs may vary from time to time, but the inherent interest does not. Based on this intuition, using our private tools implemented with WebDriver, we examine whether the top URLs visited by the web proxy users are themselves web proxies. Our experimental results based on real datasets show that the behavior clusters not only reduce the number of URLs to analyze but also provide an effective way to detect web proxies, especially unknown ones.
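
A toy version of the graph pipeline (bipartite host-URL graph, one-mode projection onto hosts, spectral clustering of the resulting similarity matrix) is sketched below; the host-URL edges are invented for illustration.

```python
import networkx as nx
import numpy as np
from sklearn.cluster import SpectralClustering

# Toy host-to-URL communications (stand-ins for flows parsed from traffic).
edges = [("h1", "u1"), ("h1", "u2"), ("h2", "u1"), ("h2", "u2"),
         ("h3", "u3"), ("h3", "u4"), ("h4", "u3"), ("h4", "u4")]
B = nx.Graph()
hosts = sorted({h for h, _ in edges})
B.add_nodes_from(hosts, bipartite=0)
B.add_nodes_from({u for _, u in edges}, bipartite=1)
B.add_edges_from(edges)

# One-mode projection onto hosts: edge weight = number of shared URLs.
P = nx.bipartite.weighted_projected_graph(B, hosts)

# Dense similarity matrix of end-users, then spectral clustering into behavior groups.
S = nx.to_numpy_array(P, nodelist=hosts, weight="weight")
np.fill_diagonal(S, S.max() if S.max() > 0 else 1.0)
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(S)
print(dict(zip(hosts, labels)))
```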

Keywords: Bipartite graph, clustering, one-mode projection, web proxy detection.

494 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)

Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton

Abstract:

Cold-start is a notoriously difficult problem which can occur in recommendation systems, and arises when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm – the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST) – is proposed, which is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), which has slower growth of computational cost with data size but requires more data to obtain an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, it is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented, and it is systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
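
The EP and ADF updates at the core of FAB-COST are beyond a short sketch, but the loop they plug into is the standard Thompson-sampling logistic contextual bandit; the sketch below uses a diagonal-Gaussian, Laplace-style online approximation of the kind the paper benchmarks against, with entirely synthetic data and parameters.

```python
import numpy as np

rng = np.random.default_rng(6)
d, T = 5, 2000
true_w = rng.normal(size=d)                      # synthetic click model

# Diagonal-Gaussian posterior over logistic weights (Laplace/ADF-style online update).
m = np.zeros(d)                                   # posterior means
q = np.ones(d)                                    # posterior precisions

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
clicks = 0
for t in range(T):
    # Two candidate items, each with a random context vector.
    contexts = rng.normal(size=(2, d))
    # Thompson sampling: draw one weight vector from the posterior, pick the best item.
    w_sample = m + rng.normal(size=d) / np.sqrt(q)
    arm = int(np.argmax(contexts @ w_sample))
    x = contexts[arm]
    y = float(rng.random() < sigmoid(x @ true_w))  # simulated click
    clicks += y
    # One-step online update of the diagonal Gaussian (moment-matching flavour).
    p = sigmoid(x @ m)
    q += p * (1 - p) * x**2                        # precision grows with information
    m += (y - p) * x / q                           # gradient step scaled by variance

print(f"click-through rate over {T} rounds: {clicks / T:.3f}")
```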

Keywords: Cold-start, expectation propagation, multi-armed bandits, Thompson sampling, variational inference.

493 Systematic Examination of Methods Supporting the Social Innovation Process

Authors: Mariann Veresne Somosi, Zoltan Nagy, Krisztina Varga

Abstract:

Innovation is the key element of economic development and a key factor in social processes. Technical innovations can be identified as prerequisites and causes of social change and cannot be created without the renewal of society. The study of social innovation can be characterised as one of the significant research areas of our day. The study’s aim is to identify the process of social innovation, which can be defined by input, transformation, and output factors. This approach divides the social innovation process into three parts: situation analysis, implementation, and follow-up. The methods associated with each stage of the process are illustrated along the chronological line of social innovation. In this study, we have sought to present methodologies that support long- and short-term decision-making, that are easy to apply, that have different complementary content, and that are well visualised for different user groups. When applying the methods, the reference objects differ: county, district, settlement, or specific organisation. The solution proposed by the study supports the development of a methodological combination adapted to different situations. Having reviewed metric and conceptualisation issues, we wanted to develop a methodological combination along with a change management logic suitable for structured support to the generation of social innovation in the case of a locality or a specific organisation. In addition to a theoretical summary, in the second part of the study we give a non-exhaustive picture of the two counties located in the north-eastern part of Hungary through specific analyses and case descriptions.

Keywords: Factors of social innovation, methodological combination, social innovation process, supporting decision-making.

492 Pilot-Assisted Direct-Current Biased Optical Orthogonal Frequency Division Multiplexing Visible Light Communication System

Authors: Ayad A. Abdulkafi, Shahir F. Nawaf, Mohammed K. Hussein, Ibrahim K. Sileh, Fouad A. Abdulkafi

Abstract:

Visible light communication (VLC) is a new approach to optical wireless communication proposed to relieve the congested radio frequency (RF) spectrum. VLC systems are combined with orthogonal frequency division multiplexing (OFDM) to achieve high-rate transmission and high spectral efficiency. In this paper, we investigate Pilot-Assisted Channel Estimation for DC-biased Optical OFDM (PACE-DCO-OFDM) systems to reduce the effects of distortion on the transmitted signal. Least-squares (LS) and linear minimum mean-squared error (LMMSE) estimators are implemented in MATLAB/Simulink to enhance the bit-error-rate (BER) of PACE-DCO-OFDM. Results show that the DCO-OFDM system based on the PACE scheme achieves better BER performance compared to the conventional system without pilot-assisted channel estimation. Simulation results show that the proposed PACE-DCO-OFDM based on the LMMSE algorithm can more accurately estimate the channel and achieves better BER performance when compared to the LS-based PACE-DCO-OFDM and the traditional system without PACE. For the same signal-to-noise ratio (SNR) of 25 dB, the achieved BER is about 5×10⁻⁴ for LMMSE-PACE and 4.2×10⁻³ with LS-PACE, while it is about 2×10⁻¹ for the system without the PACE scheme.
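
A minimal numerical sketch of the two pilot-based estimators is given below; it works on a generic OFDM pilot block (the DC bias and Hermitian symmetry of DCO-OFDM are omitted) and assumes the channel frequency-correlation matrix is known, so the channel model, pilot values and SNR are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
N, L = 64, 4                     # pilot subcarriers, channel taps
snr_db = 25
noise_var = 10 ** (-snr_db / 10)

# Synthetic multipath channel: L i.i.d. complex taps, unit total power.
h = (rng.normal(size=L) + 1j * rng.normal(size=L)) / np.sqrt(2 * L)
H = np.fft.fft(h, N)

# Unit pilots; received pilot observations with AWGN.
X = np.ones(N)
noise = np.sqrt(noise_var / 2) * (rng.normal(size=N) + 1j * rng.normal(size=N))
Y = X * H + noise

# LS estimate: divide received pilots by transmitted pilots.
H_ls = Y / X

# LMMSE estimate: smooth the LS estimate with the channel frequency-correlation matrix.
diff = np.subtract.outer(np.arange(N), np.arange(N))
R_hh = sum((1 / L) * np.exp(-2j * np.pi * diff * l / N) for l in range(L))
H_lmmse = R_hh @ np.linalg.solve(R_hh + noise_var * np.eye(N), H_ls)

mse = lambda est: np.mean(np.abs(est - H) ** 2)
print(f"LS MSE = {mse(H_ls):.5f}, LMMSE MSE = {mse(H_lmmse):.5f}")
```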

Keywords: Channel estimation, OFDM, pilot-assist, VLC.

491 Hi-Fi Traffic Clearance Technique for Life Saving Vehicles using Differential GPS System

Authors: N. Yuvaraj, V. B. Prakash, D. Venkatraj

Abstract:

This paper may be considered a combination of pervasive computing and Differential GPS (global positioning system), which relates to controlling automatic traffic signals in such a way as to pre-empt normal signal operation and permit lifesaving vehicles to pass. By knowing the arrival of the lifesaving vehicle in advance, the signal has a chance to clear the traffic. The traffic signal preemption system includes a vehicle equipped with an onboard computer system capable of capturing diagnostic information and the estimated location of the lifesaving vehicle using the information provided by a GPS receiver connected to the onboard computer system, and of transmitting this information using a wireless transmitter via a wireless network. The fleet management system, connected to a wireless receiver, is capable of receiving the information transmitted by the lifesaving vehicle. A computer located at the intersection uses corrected vehicle position, speed and direction measurements, in conjunction with previously recorded data defining approach routes to the intersection, to determine the optimum time to switch a traffic light controller to preemption mode so that lifesaving vehicles can pass safely. For the case when the ambulance needs to take a U-turn in a heavy traffic area, we suggest a solution: a computerized median which uses linked, removable blocks.
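
A minimal sketch of the intersection-side decision is given below: it estimates the vehicle's time of arrival from DGPS position and speed and switches to preemption mode inside a lead-time window. The coordinates, speed and 30 s threshold are invented for illustration and are not from the paper.

```python
import math

def eta_seconds(vehicle_lat, vehicle_lon, signal_lat, signal_lon, speed_mps):
    """Rough ETA of the lifesaving vehicle to the intersection from DGPS fixes."""
    # Equirectangular approximation is adequate over short approach distances.
    R = 6371000.0
    x = math.radians(signal_lon - vehicle_lon) * math.cos(math.radians((vehicle_lat + signal_lat) / 2))
    y = math.radians(signal_lat - vehicle_lat)
    distance_m = R * math.hypot(x, y)
    return distance_m / max(speed_mps, 0.1)

# Hypothetical fixes: ambulance roughly 800 m from the junction, travelling at 15 m/s.
eta = eta_seconds(12.9720, 80.0440, 12.9790, 80.0450, 15.0)
PREEMPT_LEAD_TIME = 30.0          # illustrative threshold, not from the paper
if eta <= PREEMPT_LEAD_TIME:
    print(f"ETA {eta:.0f} s: switch controller to preemption mode")
else:
    print(f"ETA {eta:.0f} s: keep normal signal operation")
```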

Keywords: Ubiquitous computing, differential GPS, fleet management system, wireless transmitter and receiver, computerized median with linked removable blocks.

490 Impact of Changes of the Conceptual Framework for Financial Reporting on the Indicators of the Financial Statement

Authors: Nadezhda Kvatashidze

Abstract:

The International Accounting Standards Board updated the conceptual framework for financial reporting. The main reason behind it is to resolve accounting tasks caused by market development and business transactions of a new economic content. Investors also call for higher transparency of information and responsibility for results in order to make more accurate risk assessments and forecasts. All this makes it necessary to further develop the conceptual framework for financial reporting so that users get useful information. Market development and certain shortcomings of the conceptual framework revealed in practice require its reconsideration and the finding of new solutions. Some issues and concepts, such as disclosure and supply of information, its qualitative characteristics, assessment, and measurement uncertainty, had to be supplemented and perfected. The criteria for recognition of certain reporting elements (assets and liabilities) also had to be updated, and all this is set out in the updated edition of the conceptual framework for financial reporting, a comprehensive collection of concepts underlying the preparation of financial statements. The main objective of the conceptual framework revision is to improve financial reporting and to develop a clear package of concepts. This will support the International Accounting Standards Board (IASB) in setting a common “Approach & Reflection” for similar transactions on the basis of mutually accepted concepts. As a result, companies will be able to develop coherent accounting policies for those transactions or events that arise from particular deals to which no standard applies or where a standard allows a choice of accounting policy.

Keywords: Conceptual framework, measurement basis, measurement uncertainty, neutrality, prudence, stewardship.

489 Image Transmission via Iterative Cellular-Turbo System

Authors: Ersin Gose, Kenan Buyukatak, Onur Osman, Osman N. Ucan

Abstract:

To compress 2-D images while improving bit error performance and also enhancing them, a new scheme, called the Iterative Cellular-Turbo System (IC-TS), is introduced. In IC-TS, the original image is partitioned into 2^N quantization levels, where N denotes the number of bit planes. Each of the N bit planes is then coded by a Turbo encoder and transmitted over an Additive White Gaussian Noise (AWGN) channel. At the receiver side, the bit planes are re-assembled taking into consideration the neighborhood relationship of pixels in 2-D images. Each of the noisy bit-plane values of the image is evaluated iteratively using the IC-TS structure, which is composed of an equalization block, the Iterative Cellular Image Processing Algorithm (ICIPA) and a Turbo decoder. In IC-TS, there is an iterative feedback link between ICIPA and the Turbo decoder. ICIPA uses the mean and standard deviation of the estimated values of each pixel neighborhood. It yields highly satisfactory results in both Bit Error Rate (BER) and image enhancement performance for Signal-to-Noise Ratio (SNR) values below -1 dB, compared to the traditional turbo coding scheme and 2-D filtering applied separately. Compression can also be achieved with IC-TS: less memory storage is used and the data rate is increased by up to N-1 times by simply choosing any number of bit slices, sacrificing resolution. Hence, it is concluded that the IC-TS system will be a promising approach for 2-D image transmission, recovery of noisy signals and image compression.
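
The bit-plane partitioning and re-assembly step can be sketched as below; the 16×16 synthetic image and the choice of dropping four planes are for illustration only, and the turbo encoding/decoding and ICIPA stages are not modelled.

```python
import numpy as np

rng = np.random.default_rng(8)
N = 8                                   # bit planes, i.e. 2**N quantization levels

# Synthetic 8-bit grayscale image standing in for the original 2-D input.
image = rng.integers(0, 2**N, size=(16, 16), dtype=np.uint8)

# Partition into N bit planes; in IC-TS each plane would be turbo-encoded separately.
planes = [(image >> b) & 1 for b in range(N)]

# Re-assemble from the planes (here all N planes, i.e. lossless reconstruction).
reconstructed = sum((plane.astype(np.uint8) << b) for b, plane in enumerate(planes))
print("lossless reassembly:", bool(np.array_equal(reconstructed, image)))

# Keeping only the top 4 planes: coarser image, roughly half the data,
# trading resolution for compression as the abstract describes.
coarse = sum((planes[b].astype(np.uint8) << b) for b in range(4, N))
print("max error with 4 planes dropped:", int(np.max(image.astype(int) - coarse.astype(int))))
```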

Keywords: Iterative Cellular Image Processing Algorithm (ICIPA), Turbo Coding, Iterative Cellular Turbo System (IC-TS), Image Compression.

488 Ownership, Management Responsibility and Corporate Performance of the Listed Firms in Kazakhstan

Authors: Gulnara Moldasheva

Abstract:

The research explores the relationship between management responsibility and corporate governance of listed companies in Kazakhstan. It employs firm-level data on selected listed non-financial firms and on the “operational” financial sector, consisting of the banking sector, insurance companies and accumulated pension funds, using multivariate regression analysis under a fixed effect model approach. The ownership structure includes institutional ownership, managerial ownership and private investors’ ownership. The management responsibility of the firm is expressed by the firm’s decision on the amount of leverage. Results of the cross-sectional panel study for non-financial firms showed that only institutional shareholding is significantly negatively correlated with the debt to equity ratio. Findings from the “operational” financial sector show that leverage is significantly affected only by CEO/Chair duality and the size of the financial institutions, and is insignificantly affected by ownership structure. The findings also show that there is a significant negative relationship between profitability and the debt to equity ratio for non-financial firms, which is consistent with pecking order theory. Generally, the results suggest that corporate governance and management responsibility play an important role in the corporate performance of listed firms in Kazakhstan.

Keywords: Corporate governance, corporate performance, debt to equity ratio, ownership.

487 A Community Compromised Approach to Combinatorial Coalition Problem

Authors: Laor Boongasame, Veera Boonjing, Ho-fung Leung

Abstract:

A buyer coalition with a combination of items is a group of buyers joining together to purchase a combination of items at a larger discount. The primary aim of existing research on buyer coalitions with a combination of items is to generate a large total discount. However, this aim is hard to achieve because such research is based on the assumption that each buyer completely knows the other buyers' information, or that at least one buyer in a coalition knows the other buyers' information through an exchange of information. These assumptions contrast with the real-world environment, where buyers join a coalition with incomplete information, i.e., they are concerned only with their expected discounts. Therefore, this paper proposes a new buyer community coalition formation scheme with a combination of items, called the Community Compromised Combinatorial Coalition scheme, for such an environment of incomplete information. In order to generate a larger total discount, after the buyers who want to join a coalition propose their minimum required savings, a coalition structure that gives a maximum total retail price is formed. Then, the total discount of the coalition is divided among the buyers in the coalition depending on their minimum required savings, and the division is Pareto optimal. In a mathematical analysis, we compare the concepts of this scheme with those of the existing buyer coalition scheme. Our results show that the total discount of the coalition in this scheme is larger than that in the existing buyer coalition scheme.

Keywords: Group decision and negotiations, group buying, game theory, combinatorial coalition formation, Pareto optimality.

486 Using Environmental Sensitivity Index (ESI) to Assess and Manage Environmental Risks of Pipelines in GIS Environment: A Case Study of a Pipeline Located Near a Coastline and a Fragile Ecosystem

Authors: Jahangir Jafari, Nematollah Khorasani, Afshin Danehkar

Abstract:

With a great number of pipelines all over the country, Iran comprises various ecosystems with variable degrees of fragility and robustness as well as diverse geographical conditions. This study presents a state-of-the-art method to estimate the environmental risks of pipelines by recommending rational equations, including FES, URAS, SRS, RRS, DRS, LURS and IRS as well as FRS, to calculate the risks. The study was carried out using a relative semi-quantitative approach based on land uses and HVAs (High-Value Areas). GIS was used as a tool to create proper maps regarding the environmental risks, land uses and distances. The main logic behind the formulas was distance-based approaches and the ESI, as well as intersections. Summarizing the results of the study, a geographical risk map based on the ESIs and the final risk score (FRS) was created. The results showed that the most sensitive, and therefore highest-risk, area comprises the mangrove forests located in the pipeline's neighborhood, while salty lands were the most robust land use units under pipeline failure circumstances. The study also showed that mapping pipeline risks with the applied method is more reliable, convenient and comparatively comprehensive than present non-holistic methods for assessing the environmental risks of pipelines. The focus of the present study is assessment rather than management. It is suggested that new policies be implemented to reduce the negative effects of the pipeline, which has not yet been constructed completely.

Keywords: ERM, ESI, ERA, Pipeline, Assalouyeh

485 A Corpus-Based Approach to Understanding Market Access in Fisheries and Aquaculture: A Systematic Literature Review

Authors: Cheryl Marie Cordeiro

Abstract:

Although fisheries and aquaculture studies might seem marginal to international business (IB) studies in general, fisheries and aquaculture IB (FAIB) management is currently facing increasing pressure to meet global demand and consumption for fish in the coming decades. In part to address this challenge, the purpose of this systematic literature review (SLR) study is to investigate the use of the term ‘market access’ in its context of use in the generic literature and business sector discourse, in comparison to the more specific literature and discourse in fisheries, aquaculture and seafood. This SLR aims to uncover the knowledge/interest gaps between the academic subject discourses and business sector practices. Corpus-driven in methodology and using a triangulation method of three different text analysis software tools, including AntConc, VOSviewer and Web of Science (WoS) analytics, the SLR results indicate a gap in conceptual knowledge and business practices in how ‘market access’ is conceived and used in the context of the pharmaceutical healthcare industry versus FAIB research and practice. While it is acknowledged that the product orientations of different business sectors might differ, this SLR study works with the assumption that both business sectors are global in orientation. These business sectors are complex in their operations from product to market. This SLR suggests a conceptual model for understanding the challenges, the potential barriers as well as the avenues for solutions to developing market access for FAIB.
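
As a toy illustration of the corpus-driven step, the sketch below counts occurrences of 'market access' and prints keyword-in-context windows, roughly what a concordancer such as AntConc displays; the two-sentence 'corpora' are invented and merely stand in for the reviewed literature.

```python
import re

# Tiny invented corpus standing in for abstracts from the two discourses.
corpus = {
    "pharma": "Market access strategy determines pricing and reimbursement. Payers shape market access.",
    "faib": "Export certification constrains market access for seafood. Market access depends on tariff barriers.",
}

term = "market access"
for sector, text in corpus.items():
    lowered = text.lower()
    hits = [m.start() for m in re.finditer(re.escape(term), lowered)]
    print(f"{sector}: {len(hits)} occurrence(s) of '{term}'")
    for i in hits:
        # Keyword-in-context window around each hit.
        window = lowered[max(0, i - 25): i + len(term) + 25]
        print("   ...", " ".join(window.split()), "...")
```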

Keywords: Market access, fisheries and aquaculture, international business, systematic literature review.

484 Expert Based System Design for Integrated Waste Management

Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy

Abstract:

Recently, an increasing number of researchers have been focusing on working out realistic solutions to sustainability problems. As sustainability issues gain higher importance for organisations, the management of such decisions becomes critical. Knowledge representation is a fundamental issue of complex knowledge based systems. Many types of sustainability problems would benefit from models based on experts’ knowledge. Cognitive maps have been used for analyzing and aiding decision making, and a cognitive map can be made of almost any system or problem. A fuzzy cognitive map (FCM) can successfully represent knowledge and human experience, introducing concepts to represent the essential elements and the cause and effect relationships among the concepts to model the behaviour of any system. Integrated waste management systems (IWMS) are complex systems that can be decomposed into non-related and related subsystems and elements, where many factors have to be taken into consideration that may be complementary, contradictory, and competitive; these factors influence each other and determine the overall decision process of the system. The goal of the present paper is to construct an efficient IWMS which considers various factors. The authors’ intention is to propose an expert based system design approach for implementing expert decision support in the area of IWMSs and to introduce an appropriate methodology for the development and analysis of group FCMs. A framework for such a methodology, consisting of the development and application phases, is presented.
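
A minimal sketch of FCM inference is given below; the four concepts, the weight matrix and the sigmoid update rule are illustrative assumptions rather than the experts' actual map.

```python
import numpy as np

# Toy fuzzy cognitive map for an integrated waste management system.
# Concepts (illustrative): 0 public awareness, 1 separate collection rate,
# 2 landfill load, 3 recycling revenue.
W = np.array([
    [0.0,  0.6,  0.0,  0.0],   # awareness promotes separate collection
    [0.0,  0.0, -0.7,  0.5],   # collection lowers landfill load, raises revenue
    [0.0,  0.0,  0.0, -0.3],   # landfill load erodes revenue
    [0.2,  0.0,  0.0,  0.0],   # revenue feeds back into awareness campaigns
])

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Standard FCM inference: A(t+1) = f(A(t) + A(t) W), iterated until the state settles.
A = np.array([0.8, 0.3, 0.7, 0.2])
for _ in range(30):
    A = sigmoid(A + A @ W)
print(np.round(A, 3))
```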

Keywords: Factors, fuzzy cognitive map, group decision, integrated waste management system.

483 EZW Coding System with Artificial Neural Networks

Authors: Saudagar Abdul Khader Jilani, Syed Abdul Sattar

Abstract:

Image compression plays a vital role in today's communication. The limitation in allocated bandwidth leads to slower communication; to improve the rate of transmission within the limited bandwidth, the image data must be compressed before transmission. Basically, there are two types of compression: 1) lossy compression and 2) lossless compression. Although lossy compression gives more compression than lossless compression, the accuracy of retrieval is lower for lossy compression than for lossless compression. The JPEG and JPEG2000 image compression systems follow Huffman coding for image compression. The JPEG2000 coding system uses the wavelet transform, which decomposes the image into different levels, where the coefficients in each sub-band are uncorrelated with the coefficients of other sub-bands. Embedded Zerotree Wavelet (EZW) coding exploits the multi-resolution properties of the wavelet transform to give a computationally simple algorithm with better performance compared to existing wavelet transforms. For further improvement of compression applications, other coding methods have recently been suggested; an ANN-based approach is one such method. Artificial Neural Networks have been applied to many problems in image processing and have demonstrated their superiority over classical methods when dealing with noisy or incomplete data in image compression applications. A performance analysis of different images is proposed, with an analysis of the EZW coding system with the error backpropagation algorithm. The implementation and analysis show approximately 30% more accuracy in the retrieved image compared to the existing EZW coding system.

Keywords: Accuracy, Compression, EZW, JPEG2000, Performance.

482 A Supervised Learning Data Mining Approach for Object Recognition and Classification in High Resolution Satellite Data

Authors: Mais Nijim, Rama Devi Chennuboyina, Waseem Al Aqqad

Abstract:

Advances in the spatial and spectral resolution of satellite images have led to tremendous growth in large image databases. The data we acquire through satellites, radars, and sensors consist of important geographical information that can be used for remote sensing applications such as region planning and disaster management. Spatial data classification and object recognition are important tasks for many applications. However, classifying objects and identifying them manually from images is a difficult task. Object recognition is often considered a classification problem, and this task can be performed using machine-learning techniques. Among the many machine-learning algorithms, the classification is done using supervised classifiers such as Support Vector Machines (SVM), since the area of interest is known. We propose a classification method which considers neighboring pixels in a region for feature extraction and evaluates classifications precisely according to neighboring classes for semantic interpretation of the region of interest (ROI). A dataset has been created for training and testing purposes; we generated the attributes by considering pixel intensity values and mean values of reflectance. We demonstrate the benefits of using knowledge discovery and data-mining techniques, which can be applied to image data for accurate information extraction and classification from high spatial resolution remote sensing imagery.
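
A toy version of the neighborhood-based feature extraction and SVM classification is sketched below; the synthetic water/land scene, the 3x3 neighborhood mean and the two-class labels are illustrative assumptions, not the satellite dataset of the study.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(9)

# Synthetic single-band scene: left half "water" (darker), right half "land" (brighter).
scene = np.where(np.indices((60, 60))[1] < 30, 0.2, 0.7) + rng.normal(0, 0.05, (60, 60))
labels = (np.indices((60, 60))[1] >= 30).astype(int)          # 0 = water, 1 = land

# Feature vector per pixel: its intensity plus the mean of its 3x3 neighborhood,
# mirroring the neighborhood-based feature extraction described in the abstract.
neighborhood_mean = uniform_filter(scene, size=3)
X = np.column_stack([scene.ravel(), neighborhood_mean.ravel()])
y = labels.ravel()

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("pixel classification accuracy:", round(clf.score(X_te, y_te), 3))
```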

Keywords: Remote sensing, object recognition, classification, data mining, waterbody identification, feature extraction.

481 Hybrid Advanced Oxidative Pretreatment of Complex Industrial Effluent for Biodegradability Enhancement

Authors: K. Paradkar, S. N. Mudliar, A. Sharma, A. B. Pandit, R. A. Pandey

Abstract:

The study explores a hybrid combination of Hydrodynamic Cavitation (HC) and subcritical Wet Air Oxidation (WAO) based pretreatment of complex industrial effluent to selectively enhance biodegradability (without major COD destruction) and thereby facilitate enhanced downstream processing via anaerobic or aerobic biological treatment. Advanced oxidation based techniques can be less efficient as standalone options, and a hybrid approach combining HC and WAO can lead to a synergistic effect, since both options are based on a common free radical mechanism. HC can be used for initial turbulence and generation of hotspots which begin the free radical attack, and this agitated mixture can then be subjected to less intense WAO, since the initial heat (to raise the activation energy) can be provided by HC alone. A lab-scale venturi-based hydrodynamic cavitation and wet air oxidation reactor with biomethanated distillery wastewater (BMDWW) as a model effluent was examined to establish the proof of concept. The results indicated that for a desirable biodegradability index (BOD:COD, BI) enhancement (up to 0.4), the cavitation (standalone) pretreatment condition was 5 bar and 88 min reaction time, with a COD reduction of 36% and a BI enhancement of up to 0.27 (initial BI 0.17). The optimum WAO condition (standalone) was 150 °C, 6 bar and 30 minutes, with 31% COD reduction and a BI of 0.33. The hybrid pretreatment (combined cavitation + WAO) worked out to be 23.18 min of HC (at 5 bar) followed by 30 min of WAO at 150 °C and 6 bar, at which around 50% of the COD was retained, yielding a BI of 0.55. FTIR and NMR analyses of the pretreated effluent indicated dissociation and/or reorientation of the complex organic compounds in the untreated effluent into simpler organic compounds after pretreatment.

Keywords: BI, hybrid, hydrodynamic cavitation, wet air oxidation.

480 Characterisation of Wind-Driven Ventilation in Complex Terrain Conditions

Authors: Daniel Micallef, Damien Bounaudet, Robert N. Farrugia, Simon P. Borg, Vincent Buhagiar, Tonio Sant

Abstract:

The physical effects of upstream flow obstructions such as vegetation on the cross-ventilation phenomena of a building are important for issues such as indoor thermal comfort. Modelling such effects in Computational Fluid Dynamics (CFD) simulations may also be challenging. The aim of this work is to establish the cross-ventilation jet behaviour in such complex terrain conditions as well as to provide guidelines on the implementation of CFD numerical simulations in order to model complex terrain features such as vegetation in an efficient manner. The methodology consists of on-site measurements on a test cell coupled with numerical simulations. It was found that the cross-ventilation flow is highly turbulent despite the very low velocities encountered internally within the test cells. While no direct measurement of the jet direction was made, the measurements indicate that the flow tends to be reversed from the leeward to the windward side. Modelling such a phenomenon proves challenging and is strongly influenced by how the vegetation is modelled. A solid vegetation model tends to predict the direction and magnitude of the flow better than a porous vegetation approach. A simplified terrain model was also shown to provide good comparisons with observations. The findings have important implications for the study of cross-ventilation in complex terrain conditions, since the flow direction does not remain trivial, as it does in the traditional isolated building case.

Keywords: Complex terrain, cross-ventilation, wind driven ventilation, Computational Fluid Dynamics (CFD), wind resource.
