Search results for: business data type
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9756


7086 Wasting Human and Computer Resources

Authors: Mária Csernoch, Piroska Biró

Abstract:

The legends about “user-friendly” and “easy-to-use” birotical tools (computer-related office tools) have been spreading and misleading end-users. This has led to an extremely high number of incorrect documents, causing serious financial losses during their creation, modification, and retrieval. Our research shows that there are at least two sources of this underachievement. (1) The lack of a definition of correctly edited and formatted documents: end-users do not know whether their methods and results are correct, and their ignorance prevents them from recognizing their own lack of knowledge. (2) The end-users’ problem-solving methods: we have found that in non-traditional programming environments end-users apply, almost exclusively, surface-approach metacognitive methods to carry out their computer-related activities, which are proved less effective than deep-approach methods. Based on these findings, we have developed deep-approach methods adapted from traditional programming languages. In this study, we focus on the most popular type of birotical document, the text-based document. We provide a definition of correctly edited text and, based on this definition, adapt the debugging method known from programming. According to the method, before any real text editing takes place, a thorough debugging of already existing texts and a categorization of their errors are carried out. In this way, prior to real text editing, users learn the requirements of text-based documents and of correctly formatted text. The method has proved much more effective than the previously applied surface-approach methods. Its advantages are that real text handling requires far fewer human and computer resources than clicking aimlessly in the GUI (Graphical User Interface), and data retrieval is much more effective than from error-prone documents.

Keywords: Deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources.

7085 Investigation on Behavior of Fixed-Ended Reinforced Concrete Deep Beams

Authors: Y. Heyrani Birak, R. Hizaji, J. Shahkarami

Abstract:

Reinforced Concrete (RC) deep beams are special structural elements because of their geometry and behavior under load. For example, the stress-strain distribution across the cross section cannot be assumed to be linear. These beams may have simple supports or fixed supports. A great deal of research has been conducted on simply supported deep beams, but little work has addressed the behavior of fixed-ended RC deep beams. Recently, the use of fixed-ended deep beams in structures has increased widely. In this study, the behavior of fixed-ended deep beams is investigated, and the parameters that govern the capacity of this type of beam are identified.

Keywords: Deep beam, capacity, reinforced concrete, fixed-ended.

7084 A Proposed Trust Model for the Semantic Web

Authors: Hoda Waguih

Abstract:

A serious problem on the WWW is finding reliable information. Not everything found on the Web is true, and the Semantic Web does not change that in any way. The problem will be even more crucial for the Semantic Web, where agents will integrate and use information from multiple sources: if an incorrect premise is used because of a single faulty source, then any conclusion drawn may be in error. Statements published on the Semantic Web therefore have to be seen as claims rather than as facts, and there should be a way to decide which among many possibly inconsistent sources is most reliable. In this work, we propose a trust model for the Semantic Web. The proposed model is inspired by the use of trust in human society. Trust is a type of social knowledge that encodes evaluations about which agents can be taken as reliable sources of information or services. Our model allows agents to decide which among different sources of information to trust and thus to act rationally on the Semantic Web.

Keywords: Semantic Web, Trust, Web of Trust, WWW.

7083 Frequency Modulation in Vibro-Acoustic Modulation Method

Authors: D. Liu, D. M. Donskoy

Abstract:

The vibro-acoustic modulation method is based on the modulation of a high-frequency ultrasonic wave (carrier) by a low-frequency vibration in the presence of various defects, primarily contact-type defects such as cracks, delamination, etc. The presence and severity of a defect are measured by the ratio of the spectral sidebands to the carrier in the spectrum of the modulated signal. This approach, however, does not differentiate between amplitude and frequency modulation, AM and FM, respectively. This paper attempts to explain the generation mechanisms of FM and its correlation with the flaw properties. We propose two possible mechanisms leading to FM, based on nonlinear local defect resonance and dynamic acoustoelastic models.
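
For illustration only (not the authors' implementation), a minimal sketch of how the sideband-to-carrier ratio described above could be estimated from a measured signal; the sample rate, carrier frequency f_c and vibration frequency f_v below are assumed values:

```python
import numpy as np

# Assumed illustrative parameters, not taken from the paper
fs, f_c, f_v = 1_000_000, 100_000, 1_000   # sample rate, carrier, vibration (Hz)
t = np.arange(0, 0.1, 1 / fs)
# Synthetic amplitude-modulated response standing in for a measured signal
x = (1 + 0.05 * np.sin(2 * np.pi * f_v * t)) * np.sin(2 * np.pi * f_c * t)

spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), 1 / fs)

def peak(f):
    """Spectral magnitude at the bin closest to frequency f."""
    return spectrum[np.argmin(np.abs(freqs - f))]

carrier = peak(f_c)
sidebands = peak(f_c - f_v) + peak(f_c + f_v)
ratio_db = 20 * np.log10(sidebands / carrier)
print(f"sideband-to-carrier ratio: {ratio_db:.1f} dB")
```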

Keywords: Non-destructive testing, nonlinear acoustics, structural health monitoring, acoustoelasticity, local defect resonance.

7082 Stability Bound of Ruin Probability in a Reduced Two-Dimensional Risk Model

Authors: Zina Benouaret, Djamil Aissani

Abstract:

In this work, we introduce the qualitative and quantitative concepts of the strong stability method for a risk process modeling two lines of business of the same insurance company, or an insurance company and a re-insurance company that share claims and premiums in a given proportion. The proposed approach is based on identifying the ruin probability associated with the considered model with the stationary distribution of a Markov random process called the reversed process. Our objective, after clarifying the stability condition and the perturbation domain of the parameters, is to obtain a stability inequality for the ruin probability, which is then applied to estimate the error made when approximating a model with perturbed parameters by the considered model. In the stability bound obtained, all constants are written explicitly.
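
Purely as an illustration of the kind of bound the abstract refers to (the exact norms and constants are those of the paper and are not reproduced here), a strong stability inequality typically takes the generic form

\[
\sup_{u \ge 0}\bigl|\psi(u) - \tilde{\psi}(u)\bigr| \;\le\; C\,\|P - \tilde{P}\|,
\]

where \(\psi\) and \(\tilde{\psi}\) are the ruin probabilities of the nominal and perturbed models, \(P\) and \(\tilde{P}\) the corresponding transition kernels of the reversed process, \(\|\cdot\|\) a suitable weighted norm, and \(C\) an explicit constant valid on the stated perturbation domain.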

Keywords: Markov chain, risk models, ruin probabilities, strong stability analysis.

7081 Supplier Selection Criteria and Methods in Supply Chains: A Review

Authors: Om Pal, Amit Kumar Gupta, R. K. Garg

Abstract:

An effective supplier selection process is very important to the success of any manufacturing organization. The main objectives of supplier selection are to reduce purchase risk, maximize overall value to the purchaser, and develop close, long-term relationships between buyers and suppliers in today’s competitive industrial scenario. The literature on supplier selection criteria and methods is full of analytical and heuristic approaches, and some researchers have developed hybrid models by combining more than one type of selection method. Since supplier selection criteria and methods remain a critical issue for the manufacturing industries, in the present paper the literature is thoroughly reviewed and critically analyzed to address the issue.
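
As a minimal illustration of one family of methods covered by such reviews (a simple weighted-score model; AHP, ANP and TOPSIS are more elaborate), the following sketch ranks hypothetical suppliers on hypothetical criteria and weights:

```python
import numpy as np

# Hypothetical criteria scores (rows: suppliers, cols: price, quality, delivery), 1-10 scale
scores = np.array([
    [7, 9, 6],   # supplier A
    [9, 6, 8],   # supplier B
    [6, 8, 9],   # supplier C
], dtype=float)
weights = np.array([0.5, 0.3, 0.2])          # importance of each criterion (sums to 1)

normalized = scores / scores.max(axis=0)     # scale each criterion to [0, 1]
ranking = normalized @ weights               # weighted sum per supplier
for name, s in sorted(zip("ABC", ranking), key=lambda p: -p[1]):
    print(f"supplier {name}: score {s:.3f}")
```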

Keywords: Supplier selection, AHP, ANP, TOPSIS, Mathematical Programming

7080 An Approach for a Bidding Process Knowledge Capitalization

Authors: R. Chalal, A. R. Ghomari

Abstract:

Preparation and negotiation of innovative and future projects can be characterized as a strategic-type decision situation, involving many uncertainties and an unpredictable environment. In this paper we focus on the bidding process, which includes cooperative and strategic decisions. Our approach to bidding process knowledge capitalization is aimed at information management in project-oriented organizations and is based on the MUSIC (Management and Use of Co-operative Information Systems) model. We show how to capitalize the company’s strategic knowledge and how to organize the corporate memory. The result of the adopted approach is an improvement in corporate memory quality.

Keywords: Bidding process, corporate memory, Knowledge capitalization, knowledge acquisition, strategic decisions.

7079 Logistical Optimization of Nuclear Waste Flows during Decommissioning

Authors: G. Dottavio, M. F. Andrade, F. Renard, V. Cheutet, A.-L. L. S. Vercraene, P. Hoang, S. Briet, R. Dachicourt, Y. Baizet

Abstract:

A large amount of technological equipment and many highly skilled workers have to be mobilized over long periods of time during nuclear decommissioning processes. The related operations generate complex waste flows and high inventory levels, associated with information flows of heterogeneous types. Considering that more than 10 decommissioning operations are ongoing in France and about 50 are expected by 2025, a big challenge must be addressed today. The management of decommissioning and dismantling of nuclear installations represents an important part of the nuclear-based energy lifecycle, since it has an environmental impact as well as an important influence on the electricity cost and therefore on the price for end-users. Bringing new technologies and new solutions into decommissioning methodologies is thus mandatory to improve the quality, cost and delay efficiency of these operations. The purpose of our project is to improve decommissioning management efficiency by developing a decision-support framework dedicated to planning nuclear facility decommissioning operations and to optimizing waste evacuation by means of a logistic approach. The target is to create an easy-to-handle tool capable of (i) predicting waste flows and proposing the best decommissioning logistics scenario and (ii) managing information during all steps of the process and following its progress: planning, resources, delays, authorizations, saturation zones, waste volume, etc. In this article we present our results from simulating nuclear waste flows during the decommissioning process, including discrete-event simulation supported by the FLEXSIM 3-D software. This approach was successfully tested, and our work confirms its ability to improve this type of industrial process by identifying the critical points of the chain and the corresponding improvement actions. This type of simulation, executed before the start of the operations on the basis of a first design, allows ‘what-if’ process evaluation and helps to ensure the quality of the process in an uncertain context. Simulating nuclear waste flows before evacuation from the site will help reduce the cost and duration of the decommissioning process by optimizing the planning and the use of resources, transitional storage and expensive radioactive waste containers. Additional benefits are expected for the governance of waste evacuation, since it will enable a shared responsibility for the waste flows.
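
The following toy discrete-event sketch (plain Python, not the FLEXSIM model used in the study; all rates and capacities are invented) illustrates the kind of waste-flow simulation described above, tracking how a transitional storage area fills and empties as waste is produced and containers are evacuated:

```python
import heapq
import random

random.seed(0)
SIM_HOURS = 500
PRODUCE_EVERY = 4        # hypothetical: one waste package produced every ~4 h on average
EVACUATE_EVERY = 30      # hypothetical: one container (8 packages) shipped every 30 h
STORAGE_CAPACITY = 40    # hypothetical transitional storage limit (packages)

events = [(random.expovariate(1 / PRODUCE_EVERY), "produce"),
          (EVACUATE_EVERY, "evacuate")]
heapq.heapify(events)
stored, peak, blocked = 0, 0, 0

while events:
    t, kind = heapq.heappop(events)
    if t > SIM_HOURS:
        break
    if kind == "produce":
        if stored < STORAGE_CAPACITY:
            stored += 1
        else:
            blocked += 1                       # dismantling would have to pause
        peak = max(peak, stored)
        heapq.heappush(events, (t + random.expovariate(1 / PRODUCE_EVERY), "produce"))
    else:  # evacuate a container of up to 8 packages
        stored = max(0, stored - 8)
        heapq.heappush(events, (t + EVACUATE_EVERY, "evacuate"))

print(f"peak storage: {peak} packages, blocked productions: {blocked}")
```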

Keywords: Nuclear decommissioning, logistical optimization, decision-support framework, waste management.

7078 Literature-Based Discoveries in Lupus Treatment

Authors: Oluwaseyi Jaiyeoba, Vetria Byrd

Abstract:

Systemic lupus erythematosus (lupus) is a chronic disease known for its chameleon-like ability to mimic the symptoms of other diseases, rendering it hard to detect, diagnose and treat. The heterogeneous nature of the disease generates disparate data that are often multifaceted and multi-dimensional. Musculoskeletal involvement is one of the most common clinical manifestations of lupus. This research links disparate literature on the treatment of lupus as it affects the musculoskeletal system, using research articles available in the PubMed database. Several natural language processing (NLP) tools exist to connect disjointed but related literature, such as Connected Papers, Bitola, and Gopalakrishnan. Literature-based discovery (LBD) has been used to bridge unconnected disciplines based on text mining procedures. The technical and medical literature consists of many concepts, each with its own sub-literature. This approach has been used to link treatments for Parkinson’s disease, Raynaud’s syndrome, and multiple sclerosis across separate bodies of literature. Literature-based discovery methods can connect two or more related but disjointed literature concepts to produce a novel and plausible approach to a research problem. Data visualization techniques, with the help of natural language processing tools, are used to visually represent the results of literature-based discoveries. Literature search results can be voluminous, but data visualization can provide insight and detect subtle patterns in large data sets. These insights and patterns can lead to discoveries that would otherwise remain hidden in disjointed literature. In this research, literature data are mined and combined with visualization techniques for heterogeneous data to discover viable treatments reported in the literature for lupus expression in the musculoskeletal system. The research addresses the question of whether literature-based discovery can identify potential treatments for a multifaceted disease like lupus. A three-pronged methodology is used: text mining, natural language processing, and data visualization. These three fields are employed to identify patterns in lupus-related data that, when visually represented, could aid research on the treatment of lupus. This work introduces a method for visually representing the interconnections of various lupus-related literature and is a first step toward literature-based research and treatment planning for the musculoskeletal manifestation of lupus. The results also outline the interconnection of complex, disparate data associated with the manifestation of lupus in the musculoskeletal system. The societal impact of this work is broad: advances in it will improve the quality of life for millions of people in the workforce currently diagnosed with, and silently living with, a musculoskeletal disease associated with lupus.
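
As an illustration only (the paper's own pipeline and term lists are not reproduced here), the classical "ABC" style of literature-based discovery can be sketched as finding intermediate terms B that co-occur with a start concept A in one set of abstracts and with a target concept C in a disjoint set; all documents and stopwords below are invented:

```python
from collections import Counter

# Hypothetical toy corpora: snippets mentioning the start and target concepts
lupus_docs = [
    "lupus musculoskeletal pain linked to inflammation and cytokine activity",
    "fatigue and joint inflammation common in lupus patients",
]
treatment_docs = [
    "cytokine inhibitors reduce inflammation in arthritis treatment",
    "physical therapy improves joint function and reduces pain",
]
STOPWORDS = {"and", "in", "to", "of", "the", "common"}

def terms(docs):
    """Count content words across a set of documents."""
    return Counter(w for d in docs for w in d.split() if w not in STOPWORDS)

a_terms, c_terms = terms(lupus_docs), terms(treatment_docs)
# B-terms: vocabulary shared by both literatures, ranked by combined frequency
bridges = sorted(set(a_terms) & set(c_terms),
                 key=lambda w: a_terms[w] + c_terms[w], reverse=True)
print("candidate bridging terms:", bridges)
```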

Keywords: Systemic lupus erythematosus, LBD, Data Visualization, musculoskeletal system, treatment.

7077 Big Bang – Big Crunch Learning Method for Fuzzy Cognitive Maps

Authors: Engin Yesil, Leon Urbas

Abstract:

Modeling complex dynamic systems for which it is very complicated to establish mathematical models requires new and modern methodologies that exploit existing expert knowledge, human experience and historical data. Fuzzy cognitive maps are very suitable, simple, and powerful tools for the simulation and analysis of these kinds of dynamic systems. However, human experts are subjective and can handle only relatively simple fuzzy cognitive maps; therefore, there is a need to develop new approaches for the automated generation of fuzzy cognitive maps from historical data. In this study, a new learning algorithm, called Big Bang-Big Crunch, is proposed for the first time in the literature for the automated generation of fuzzy cognitive maps from data. Two real-world examples, namely a process control system and a radiation therapy process, and one synthetic model are used to emphasize the effectiveness and usefulness of the proposed methodology.
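
A minimal sketch of the Big Bang-Big Crunch optimization loop itself (generic single-objective form, not the FCM weight-learning setup of the paper; the fitness function and all parameters below are placeholders):

```python
import numpy as np

rng = np.random.default_rng(42)

def fitness(x):
    """Placeholder objective: distance to an assumed target vector."""
    return np.sum((x - 0.3) ** 2)

dim, pop_size, iterations = 5, 50, 100
center = rng.uniform(-1, 1, dim)            # initial centre of mass

for k in range(1, iterations + 1):
    # Big Bang: scatter candidates around the current centre, shrinking over time
    spread = 1.0 / k
    population = center + spread * rng.standard_normal((pop_size, dim))
    costs = np.array([fitness(p) for p in population])
    # Big Crunch: collapse to the fitness-weighted centre of mass
    weights = 1.0 / (costs + 1e-12)
    center = (weights[:, None] * population).sum(axis=0) / weights.sum()

print("best point found:", np.round(center, 3), "cost:", round(fitness(center), 6))
```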

Keywords: Big Bang-Big Crunch optimization, Dynamic Systems, Fuzzy Cognitive Maps, Learning.

7076 Wavelet Based Residual Method of Detecting GSM Signal Strength Fading

Authors: Danladi Ali, Onah Festus Iloabuchi

Abstract:

In this paper, GSM signal strength was measured in order to detect the type of signal fading phenomenon, using a one-dimensional multilevel wavelet residual method and neural network clustering to determine the average GSM signal strength received in the study area. The wavelet residual method predicted that the GSM signal experienced slow fading and attenuated with an MSE of 3.875 dB. The neural network clustering revealed that mostly -75 dB, -85 dB and -95 dB were received, which means that the signal strength received in the study area is weak.
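
For illustration (not the authors' exact procedure; the synthetic trace, wavelet and decomposition level below are assumptions), a multilevel wavelet decomposition can separate a slowly fading trend from fast fluctuations, with the residual indicating the fading type:

```python
import numpy as np
import pywt  # PyWavelets

# Synthetic received-signal-strength trace (dB): slow fading trend plus noise
n = 1024
t = np.linspace(0, 1, n)
rss = -85 + 8 * np.sin(2 * np.pi * 1.5 * t) + np.random.default_rng(1).normal(0, 2, n)

level = 4
coeffs = pywt.wavedec(rss, "db4", level=level)
# Keep only the approximation (slow trend); zero the detail coefficients
smooth_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
trend = pywt.waverec(smooth_coeffs, "db4")[:n]

residual = rss - trend
mse = np.mean(residual ** 2)
print(f"residual MSE: {mse:.3f}  (small residual -> slow fading dominates)")
```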

Keywords: One-dimensional multilevel wavelets, path loss, GSM signal strength, propagation and urban environment.

7075 Post ERP Feral System and Use of ‘Feral System’ as Coping Mechanism

Authors: Tajul Urus, S., Molla, A., Teoh, S.Y.

Abstract:

A number of studies have highlighted problems related to ERP systems, yet most of these studies focus on the problems during the project and implementation stages rather than during the post-implementation use process. Problems encountered in using ERP hinder the effective exploitation as well as the extended and continued use of ERP systems and their value to organisations. This paper investigates the different types of problems users (operational, supervisory and managerial) face in using ERP and how the 'feral system' is used as the coping mechanism. The paper adopts a qualitative method and uses data collected from two cases and 26 interviews to inductively develop a causal network model of ERP usage problems and their coping mechanisms. This model classifies post-ERP usage problems as data quality, system quality, interface and infrastructure problems. The model also categorises the different coping mechanisms based on the use of the 'feral system', including feral information systems, feral data and feral use of technology.

Keywords: Case Studies, Coping Mechanism, Post Implementation ERP system, Usage Problem

7074 Entrepreneur Features as a Competence in the Design of the European Higher Education Area Degrees

Authors: Herruzo E., Espejo R.A., Moreno R., González C., Benavides J.I., Plata, O.

Abstract:

This paper describes the project carried out at the University of Cordoba, specifically at the High Polytechnic School, in collaboration with two other organizations belonging to the Andalusian Ministry of Innovation, Science and Business: the Andalusian Innovation and Development Agency (IDEA agency) [1] and the Territorial Net of Entrepreneurship Support (in Spanish, Red Territorial de Apoyo al Emprendedor) [11]. The project is being developed in several stages, of which only the first has been completed so far. However, several important preliminary results already derive from it, based mainly on a description of the nature of entrepreneurship in the field of university education and its impact on students’ competences as recommended by the European Higher Education Area. Some problems holding back its future development, derived from the specific context in which the project is applied, are also discussed.

Keywords: EHEA, Entrepreneurship, Innovation, Transversal Competence.

7073 Pt(IV) Complexes with Polystyrene-Bound Schiff Bases as Antimicrobial Agents: Synthesis and Characterization

Authors: Dilek Nartop, Nurşen Sarı, Hatice Öğütçü

Abstract:

Novel polystyrene-bound Schiff bases and their Pt(IV) complexes have been prepared from the condensation reaction of polystyrene-A-NH2 with 2-hydroxybenzaldehyde and 5-fluoro-3-bromo-2-hydroxybenzaldehyde. The structures of the Pt(IV) complexes with the polystyrene-bound Schiff bases have been determined by elemental analysis, magnetic susceptibility, IR, 1H-NMR, UV-vis, TG/DTA and AAS. The antibacterial and antifungal activities of the synthesized compounds have been studied by the well-diffusion method against selected microorganisms (Bacillus cereus spp., Listeria monocytogenes 4b, Micrococcus luteus, Staphylococcus aureus, Staphylococcus epidermidis, Brucella abortus, Escherichia coli, Pseudomonas putida spp., Shigella dysenteriae type 10, Salmonella typhi H).

Keywords: Polymer-bound Schiff bases, polystyrene-A-NH2, Pt(IV) complexes, biological activity.

7072 Highly Flexible Modularized Sensor Platform

Authors: Kai-Chao Yang, Chun-Ming Huang, Chih-Chiao Yang, Chien-Ming Wu

Abstract:

Sensors have been used in various academic fields and applications. In this article, we propose the idea of modularized sensors that combine multiple sensor modules into a single sensor. We divide a sensor into several units according to functionality. Each unit has different sensor modules, which share the same type of connector and can be serially and arbitrarily connected to each other. A user can combine different sensor modules into a sensor platform according to requirements. Compared with current modularized sensors, the proposed sensor platform is highly flexible and reusable. We have implemented a prototype of the proposed sensor platform, and the experimental results show that it works correctly.
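
A minimal software sketch of the chaining idea described above (hypothetical module names and readings; the actual platform is hardware with shared physical connectors):

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SensorModule:
    """One functional unit; all modules expose the same 'connector' (read interface)."""
    name: str
    def read(self) -> Dict[str, float]:
        raise NotImplementedError

@dataclass
class TemperatureModule(SensorModule):
    def read(self) -> Dict[str, float]:
        return {"temperature_c": 24.8}        # placeholder reading

@dataclass
class HumidityModule(SensorModule):
    def read(self) -> Dict[str, float]:
        return {"humidity_pct": 51.0}         # placeholder reading

@dataclass
class SensorPlatform:
    """Modules can be connected serially and in any combination."""
    chain: List[SensorModule] = field(default_factory=list)
    def attach(self, module: SensorModule) -> "SensorPlatform":
        self.chain.append(module)
        return self                           # allow fluent chaining
    def read_all(self) -> Dict[str, float]:
        merged: Dict[str, float] = {}
        for module in self.chain:
            merged.update(module.read())
        return merged

platform = SensorPlatform().attach(TemperatureModule("T1")).attach(HumidityModule("H1"))
print(platform.read_all())
```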

Keywords: Sensor device, sensor fusion.

7071 Fermentative Production of Dextran using Food Industry Wastes

Authors: Marzieh Moosavi-Nasab, Mohsen Gavahian, Ali R. Yousefi, Hamed Askari

Abstract:

Dextran is a D-glucose polymer produced by Leuconostoc mesenteroides grown in a sucrose-rich medium. The organism was obtained from the Persian Type Culture Collection (PTCC) and was transferred into MRS broth medium at 30°C and pH 6.8 for 24 h. After preparation of the inoculum, the organism was inoculated into five liquid fermentation media containing either molasses, cheese whey, or different combinations of cheese whey and molasses. After a certain fermentation period, the produced dextran was separated and dried. The dextran yield was calculated, and significant differences between the media were observed. Furthermore, FT-IR analysis was performed, and the results showed no significant differences in the structures of the produced dextran.

Keywords: Dextran, Leuconostoc mesenteroides, Molasses, Whey

7070 A Survey on Lossless Compression of Bayer Color Filter Array Images

Authors: Alina Trifan, António J. R. Neves

Abstract:

Although most digital cameras acquire images in a raw format, based on a Color Filter Array that arranges RGB color filters on a square grid of photosensors, most image compression techniques do not use the raw data; instead, they use the RGB result of an interpolation algorithm applied to the raw data. This approach is inefficient: by performing a lossless compression of the raw data, followed by pixel interpolation, digital cameras could be more power efficient and provide images with increased resolution, given that the interpolation step could be shifted to an external processing unit. In this paper, we conduct a survey on the use of lossless compression algorithms with raw Bayer images. Moreover, in order to reduce the effect of the transitions between colors, which increase the entropy of the raw Bayer image, we split the image into three new images corresponding to each channel (red, green and blue) and study the same compression algorithms applied to each one individually. This simple pre-processing stage allows an improvement of more than 15% in predictive-based methods.
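
A minimal sketch of the channel-splitting pre-processing step described above, assuming an RGGB Bayer layout (the actual layout and compressors used in the survey may differ):

```python
import numpy as np

def split_bayer_rggb(raw: np.ndarray):
    """Split a single-channel RGGB Bayer mosaic into per-colour planes.

    Grouping same-colour photosites removes the red/green/blue transitions
    that inflate the entropy of the interleaved mosaic, so each plane
    compresses better with predictive lossless coders.
    """
    r  = raw[0::2, 0::2]          # red sites
    g1 = raw[0::2, 1::2]          # green sites on red rows
    g2 = raw[1::2, 0::2]          # green sites on blue rows
    b  = raw[1::2, 1::2]          # blue sites
    return r, np.concatenate([g1, g2], axis=0), b

# Toy 12-bit raw frame standing in for real sensor data
raw = np.random.default_rng(0).integers(0, 4096, size=(8, 8), dtype=np.uint16)
r, g, b = split_bayer_rggb(raw)
print(r.shape, g.shape, b.shape)   # (4, 4) (8, 4) (4, 4)
```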

Keywords: Bayer images, CFA, lossless compression, image coding standards.

7069 Solution of First Kind Fredholm Integral Equation by Sinc Function

Authors: Khosrow Maleknejad, Reza Mollapourasl, Parvin Torabi, Mahdiyeh Alizadeh

Abstract:

The Sinc-collocation scheme is one of the new techniques used in solving numerical problems involving integral equations. This method has been shown to be a powerful numerical tool for finding fast and accurate solutions. In this paper, some properties of the Sinc-collocation method required for our subsequent development are given and are utilized to reduce the integral equation of the first kind to a system of algebraic equations. Then, convergence with an exponential rate is proved by a theorem to guarantee the applicability of the numerical technique. Finally, numerical examples are included to demonstrate the validity and applicability of the technique.
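
For reference (standard definitions, not the paper's specific derivation), a Fredholm integral equation of the first kind and the sinc basis functions used in Sinc-collocation have the form

\[
\int_a^b k(s,t)\,f(t)\,dt = g(s), \qquad
S(j,h)(x) = \operatorname{sinc}\!\Bigl(\frac{x - jh}{h}\Bigr), \quad
\operatorname{sinc}(x) = \frac{\sin(\pi x)}{\pi x},
\]

where the unknown \(f\) is approximated by a finite combination \(f(t)\approx\sum_{j=-N}^{N} c_j\,S(j,h)(\phi(t))\) for a suitable conformal map \(\phi\), and collocation at the sinc points turns the equation into a linear algebraic system for the coefficients \(c_j\).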

Keywords: Integral equation, Fredholm type, Collocation method, Sinc approximation.

7068 A Study of Behaviors in Using Social Networks of Corporate Personnel of Suan Sunandha Rajabhat University

Authors: Wipada Chiawchan

Abstract:

This study found that most corporate personnel use social media to communicate with colleagues in order to make working processes more efficient. Complete satisfaction was found with the security of the University’s computer network. Social network usage for communication, collaboration, entertainment and expressing concerns accounted for fifty percent of the variance in predicting the interpersonal relationships of corporate personnel. This evaluation of the effectiveness of social networking involved 213 members of the corporate personnel. The data were collected by questionnaires and analyzed using percentage, mean, and standard deviation. The results on the effectiveness of using online social networks were derived from the attitudes of private users and safety data within the security system. The results showed that the effectiveness of using an online social network for the corporate personnel of Suan Sunandha Rajabhat University was at a good level overall, and the overall effect across the aspects was (x̄ = 3.11).

Keywords: Behaviors, Social Media, Social Network.

7067 Investigation of 5,10,15,20-Tetrakis(3,5-Di-tert-butylphenyl)porphyrinatocopper(II) for Electronics Applications

Authors: Zubair Ahmad, M. H. Sayyad, M. Yaseen, M. Ali

Abstract:

In this work, an organic compound, 5,10,15,20-tetrakis(3,5-di-tert-butylphenyl)porphyrinatocopper(II) (TDTBPPCu), is studied as an active material for thin-film electronic devices. To investigate the electrical properties of TDTBPPCu, a junction of TDTBPPCu with heavily doped n-Si and Al was fabricated, with the TDTBPPCu film sandwiched between the Al and n-Si electrodes. Various electrical parameters of TDTBPPCu were determined. The current-voltage characteristics of the junction are nonlinear, asymmetric and show rectification behavior, which indicates the formation of a depletion region. This behavior demonstrates the potential of TDTBPPCu for electronics applications. Current-voltage and capacitance-voltage techniques are used to extract the different electronic parameters.

Keywords: P-type, organic semiconductor, electrical characteristics.

7066 A Study on the Least Squares Reduced Parameter Approximation of FIR Digital Filters

Authors: S. Seyedtabaii, E. Seyedtabaii

Abstract:

Rounding of coefficients is a common practice in the hardware implementation of digital filters. Where some coefficients are very close to zero or one, as assumed in this paper, this rounding also leads to a reduction in computation. Furthermore, if the discarded coefficient is of high order, a reduced-order filter is obtained; otherwise, the order does not change but the computation is reduced. In this paper, the least squares approximation to a rounded (or discarded) coefficient FIR filter is investigated. The result is also succinctly extended to general types of FIR filters.
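
A small sketch of the idea under assumed values (zero out near-zero taps, then re-fit the remaining taps by least squares so the reduced filter's frequency response stays close to the original); this is an illustration, not the paper's exact formulation:

```python
import numpy as np

h = np.array([0.002, -0.031, 0.118, 0.247, 0.318, 0.247, 0.118, -0.031, 0.002])  # original FIR taps
keep = np.abs(h) > 0.01                     # discard taps that round to zero

# Frequency-response matrix on a dense grid: F[k, n] = exp(-j * w_k * n)
w = np.linspace(0, np.pi, 256)
n = np.arange(len(h))
F = np.exp(-1j * np.outer(w, n))
target = F @ h                              # response of the full filter

# Least-squares fit of the surviving taps to the original response
h_reduced = np.zeros_like(h)
coef, *_ = np.linalg.lstsq(F[:, keep], target, rcond=None)
h_reduced[keep] = coef.real

err = np.max(np.abs(F @ h_reduced - target))
print(f"kept {keep.sum()} of {len(h)} taps, max response error {err:.4f}")
```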

Keywords: Digital filter, filter approximation, least squares, model order reduction.

7065 Exploring Social Impact of Emerging Technologies from Futuristic Data

Authors: Heeyeul Kwon, Yongtae Park

Abstract:

Despite their highly touted benefits, emerging technologies have raised pervasive concerns regarding unintended and unforeseen social impacts. Thus, those wishing to create safe and socially acceptable products need to identify such side effects and mitigate them prior to market proliferation. Various methodologies in the field of technology assessment (TA), namely Delphi, impact assessment, and scenario planning, have been widely employed in such circumstances. However, the literature faces a major limitation in its sole reliance on participatory workshop activities; it misses the massive untapped source of future-oriented information available on the Internet. This research therefore seeks to gain insights into the utilization of futuristic data, i.e. future-oriented documents from the Internet, as a supplementary method to generate social impact scenarios while capturing the perspectives of experts from a wide variety of disciplines. To this end, network analysis is conducted on the social keywords extracted from futuristic documents by text mining, which is then used as a guide to produce a comprehensive set of detailed scenarios. Our proposed approach facilitates harmonized depictions of the possible hazardous consequences of emerging technologies and thereby makes decision makers more aware of, and responsive to, broad qualitative uncertainties.
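
As an illustration of the keyword co-occurrence step described above (toy documents and keyword list; the paper's actual extraction and network analysis are more elaborate), the sketch below counts how often social keywords appear together in the same document, which yields the edges of a keyword network:

```python
from collections import Counter
from itertools import combinations

# Hypothetical future-oriented snippets standing in for crawled documents
docs = [
    "autonomous vehicles raise liability and unemployment concerns",
    "gene editing raises privacy and inequality concerns",
    "autonomous vehicles and privacy regulation debated",
]
keywords = {"privacy", "liability", "unemployment", "inequality", "regulation"}

edges = Counter()
for doc in docs:
    present = sorted(keywords & set(doc.split()))
    for a, b in combinations(present, 2):
        edges[(a, b)] += 1          # co-occurrence within one document

for (a, b), weight in edges.most_common():
    print(f"{a} -- {b}  (weight {weight})")
```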

Keywords: Emerging technologies, futuristic data, scenario, text mining.

7064 Volatility Model with Markov Regime Switching to Forecast Baht/USD

Authors: N. Sopipan, A. Intarasit, K. Chuarkham

Abstract:

In this paper, we forecast the volatility of the Baht/USD exchange rate using Markov Regime Switching GARCH (MRS-GARCH) models. These models allow volatility to have different dynamics according to unobserved regime variables. The main purpose of this paper is to find out whether MRS-GARCH models are an improvement on GARCH-type models in terms of modeling and forecasting Baht/USD volatility. The MRS-GARCH model performs best for Baht/USD volatility in the short term, but the GARCH model performs best in the long term.
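
For reference, a standard GARCH(1,1) recursion with regime-dependent parameters (the generic form such MRS-GARCH models build on; the paper's exact specification may differ) is

\[
\sigma_t^2(s_t) = \omega_{s_t} + \alpha_{s_t}\,\varepsilon_{t-1}^2 + \beta_{s_t}\,\sigma_{t-1}^2, \qquad
P(s_t = j \mid s_{t-1} = i) = p_{ij},
\]

where \(s_t\) is an unobserved regime following a Markov chain with transition probabilities \(p_{ij}\), and the parameters \((\omega_{s_t}, \alpha_{s_t}, \beta_{s_t})\) switch with the regime.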

Keywords: Volatility, Markov Regime Switching, Forecasting.

7063 A Vehicle Monitoring System Based on the LoRa Technique

Authors: Chao-Linag Hsieh, Zheng-Wei Ye, Chen-Kang Huang, Yeun-Chung Lee, Chih-Hong Sun, Tzai-Hung Wen, Jehn-Yih Juang, Joe-Air Jiang

Abstract:

Air pollution and climate warming are becoming more and more intense in many areas, especially urban areas. Environmental parameters are critical information for air pollution and weather monitoring. Thus, it is necessary to develop a suitable air pollution and weather monitoring system for urban areas. In this study, a vehicle monitoring system (VMS) based on IoT techniques is developed. Cars are selected as the research tool because they can reach a greater number of streets to collect data. The VMS can monitor environmental parameters, including ambient temperature and humidity, and air quality parameters, including PM2.5, NO2, CO, and O3. The VMS also provides other information, including GPS signals and vibration information gathered while driving on the street. Different sensor modules are used to measure the parameters, collect the data and transmit them to a cloud server through the LoRa protocol. A user interface shows the sensing data stored on the cloud server. To examine the performance of the system, a researcher drove a Nissan X-Trail 1998 to the area close to the Da’an District office in Taipei to collect monitoring data. The collected data are instantly shown on the user interface, which provides four kinds of information: GPS positions, weather parameters, vehicle information, and air quality information. With the VMS, users can obtain information regarding air quality and weather conditions when they drive their car in an urban area. Also, government agencies can make traffic planning decisions based on the information provided by the proposed VMS.

Keywords: Vehicle, monitoring system, LoRa, smart city.

7062 Relation of Optimal Pilot Offsets in the Shifted Constellation-Based Method for the Detection of Pilot Contamination Attacks

Authors: Dimitriya A. Mihaylova, Zlatka V. Valkova-Jarvis, Georgi L. Iliev

Abstract:

One possible approach to maintaining the security of communication systems relies on Physical Layer Security mechanisms. However, in wireless time division duplex systems, where the uplink and downlink channels are reciprocal, the channel estimation procedure is exposed to attacks known as pilot contamination, whose aim is to have an enhanced data signal sent to the malicious user. The Shifted 2-N-PSK method involves two random legitimate pilots in the training phase, each belonging to a constellation shifted from the original N-PSK symbols by a certain angle. In this paper, the legitimate pilots’ offset values and their influence on the detection capabilities of the Shifted 2-N-PSK method are investigated. As the implementation of the technique depends on the relation between the shift angles rather than on their specific values, the optimal interconnection between the two legitimate constellations is investigated. The results show that no regularity exists in the relation between the pilot contamination attack (PCA) detection probability and the choice of offset values. Therefore, an adversary who aims to obtain the exact offset values can only employ a brute-force attack, but the large number of possible combinations for the shifted constellations makes this type of attack difficult to mount successfully. For this reason, the number of optimal shift-value pairs is also studied for both 100% and 98% probabilities of detecting pilot contamination attacks. Although the Shifted 2-N-PSK method has been broadly studied in different signal-to-noise ratio scenarios, in multi-cell systems the interference from signals in other cells should also be taken into account. Therefore, the impact of inter-cell interference on the performance of the method is investigated by means of a large number of simulations. The results show that the detection probability of the Shifted 2-N-PSK decreases inversely with the signal-to-interference-plus-noise ratio.
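
A minimal sketch of the shifted-constellation idea (generating an N-PSK pilot set rotated by an offset angle and counting the brute-force search space an attacker would face); all values below are illustrative, not the paper's parameters:

```python
import numpy as np

def shifted_npsk(n: int, offset_deg: float) -> np.ndarray:
    """Unit-energy N-PSK constellation rotated by offset_deg degrees."""
    angles = 2 * np.pi * np.arange(n) / n + np.deg2rad(offset_deg)
    return np.exp(1j * angles)

N = 8
theta1, theta2 = 10.0, 37.0                 # hypothetical legitimate offsets
pilots_a = shifted_npsk(N, theta1)          # first legitimate pilot constellation
pilots_b = shifted_npsk(N, theta2)          # second legitimate pilot constellation

# If offsets are drawn from a 0.1-degree grid over [0, 360), a brute-force
# attacker must test every ordered pair of candidate offsets.
grid_points = int(360 / 0.1)
print(f"relative shift: {(theta2 - theta1) % 360:.1f} deg, "
      f"brute-force pairs to test: {grid_points ** 2:,}")
```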

Keywords: Channel estimation, inter-cell interference, pilot contamination attacks, wireless communications.

7061 A User Friendly Tool for Performance Evaluation of Different Reference Evapotranspiration Methods

Authors: Vijay Shankar

Abstract:

Evapotranspiration (ET) is a major component of the hydrologic cycle and its accurate estimation is essential for hydrological studies. In the past, various estimation methods have been developed for different climatological data, and the accuracy of these methods varies with climatic conditions. Reference crop evapotranspiration (ET0) is a key variable in procedures established for estimating the evapotranspiration rates of agricultural crops. Values of ET0 are used with crop coefficients for many aspects of irrigation and water resources planning and management. Numerous methods are used for estimating ET0. As per the internationally accepted procedures outlined in the United Nations Food and Agriculture Organization's Irrigation and Drainage Paper No. 56 (FAO-56), the Penman-Monteith equation is recommended for computing ET0 from ground-based climatological observations. In the present study, seven methods have been selected for performance evaluation. User-friendly software has been developed using the programming language Visual Basic, which makes it possible to create a graphical environment with little coding. For the given data availability, the developed software estimates reference evapotranspiration for any area and period for which data are available. The accuracy of the software has been checked against the examples given in FAO-56. The developed software is a user-friendly tool for estimating ET0 under different data availability and climatic conditions.
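
For reference, the FAO-56 Penman-Monteith equation recommended above for daily ET0 (standard form; symbols follow FAO-56) is

\[
ET_0 = \frac{0.408\,\Delta\,(R_n - G) + \gamma\,\dfrac{900}{T + 273}\,u_2\,(e_s - e_a)}{\Delta + \gamma\,(1 + 0.34\,u_2)},
\]

where \(R_n\) is net radiation, \(G\) soil heat flux, \(T\) mean air temperature, \(u_2\) wind speed at 2 m height, \(e_s - e_a\) the vapour pressure deficit, \(\Delta\) the slope of the saturation vapour pressure curve, and \(\gamma\) the psychrometric constant.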

Keywords: Crop coefficient, Crop evapotranspiration, Field moisture, Irrigation Scheduling.

7060 Levenberg-Marquardt Algorithm for Karachi Stock Exchange Share Rates Forecasting

Authors: Syed Muhammad Aqil Burney, Tahseen Ahmed Jilani, C. Ardil

Abstract:

Financial forecasting is an example of a signal processing problem. A number of ways to train the network are available. We have used the Levenberg-Marquardt algorithm for error back-propagation weight adjustment. Pre-processing of the data has reduced much of the large-scale variation to small scale, reducing the variation of the training data.
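
For reference, the standard Levenberg-Marquardt weight update used in this kind of network training (generic form, not the paper's specific configuration) is

\[
\Delta \mathbf{w} = -\bigl(\mathbf{J}^{\mathsf T}\mathbf{J} + \mu\,\mathbf{I}\bigr)^{-1}\mathbf{J}^{\mathsf T}\mathbf{e},
\]

where \(\mathbf{J}\) is the Jacobian of the network errors with respect to the weights, \(\mathbf{e}\) the error vector, and \(\mu\) a damping factor that blends between Gauss-Newton behaviour (\(\mu \to 0\)) and gradient descent (large \(\mu\)).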

Keywords: Gradient descent method, Jacobian matrix, Levenberg-Marquardt algorithm, quadratic error surfaces.

7059 Analysis of Public Transport Trip Generation in Developing Countries: A Case Study in Yogyakarta, Indonesia

Authors: S. Priyanto, E. P. Friandi

Abstract:

Yogyakarta, the capital city of Yogyakarta Province, plays important roles in various sectors that require good provision of a public transportation system. Ideally, a good transportation system should be able to accommodate the amount of travel demand. This research attempts to develop a trip generation model to predict the number of public transport passengers in Yogyakarta city. The model is built using multiple linear regression analysis, which establishes a relationship between the number of trips and socioeconomic attributes. The data consist of primary and secondary data; the primary data were collected through randomly selected household surveys. The resulting model is then applied to evaluate the shelters of the existing TransJogja, a new Bus Rapid Transit system serving Yogyakarta and the surrounding cities.
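
A minimal sketch of the multiple linear regression step (invented household-survey attributes and trip counts, fitted by ordinary least squares; the study's actual variables are not reproduced here):

```python
import numpy as np

# Hypothetical survey rows: household size, income (million IDR/month), car ownership
X = np.array([[4, 3.5, 1],
              [2, 2.0, 0],
              [5, 4.0, 1],
              [3, 2.5, 0],
              [6, 5.0, 2]], dtype=float)
y = np.array([6, 4, 7, 5, 8], dtype=float)   # weekly public-transport trips per household

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and coefficients:", np.round(beta, 3))

new_household = np.array([1, 3, 3.0, 1])     # intercept term plus attributes
print("predicted trips:", round(float(new_household @ beta), 2))
```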

Keywords: Multiple linear regression, shelter evaluation, travel demand, trip generation.

7058 PZ: A Z-based Formalism for Modeling Probabilistic Behavior

Authors: Hassan Haghighi

Abstract:

Probabilistic techniques in computer programs are becoming more and more widely used. Therefore, there is great interest in the formal specification, verification, and development of probabilistic programs. In our work-in-progress project, we are attempting to build a constructive framework for developing probabilistic programs formally. The main contribution of this paper is to introduce an intermediate artifact of our work, a Z-based formalism called PZ, with which one can build set-theoretical models of probabilistic programs. We propose to use a constructive set theory, called CZ set theory, to interpret the specifications written in PZ. Since CZ has an interpretation in Martin-Löf's theory of types, this idea enables us to derive probabilistic programs from correctness proofs of their PZ specifications.

Keywords: formal specification, formal program development, probabilistic programs, CZ set theory, type theory.

7057 Clustering Protein Sequences with Tailored General Regression Model Technique

Authors: G. Lavanya Devi, Allam Appa Rao, A. Damodaram, GR Sridhar, G. Jaya Suma

Abstract:

Cluster analysis divides data into groups that are meaningful, useful, or both. Analysis of biological data is creating a new generation of epidemiologic, prognostic, diagnostic and treatment modalities. Clustering of protein sequences is one of the current research topics in the field of computer science. Linear relations are valuable in rule discovery for a given data set, such as "if value X goes up by 1, value Y will go down by 3". Classical linear regression models the linear relation of two sequences perfectly. However, if we need to cluster a large repository of protein sequences into groups where sequences have a strong linear relationship with each other, it is prohibitively expensive to compare sequences one by one. In this paper, we propose a new technique, named the General Regression Model Technique Clustering Algorithm (GRMTCA), to handle the problem of clustering linear sequences. GRMT gives a measure, GR*, that tells the degree of linearity of multiple sequences without having to compare each pair of them.
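
To make the notion of pairwise linearity concrete (a naive all-pairs check, i.e. exactly the expensive baseline the abstract says GR* avoids; GR* itself is not reproduced here), a short sketch with toy data:

```python
import numpy as np
from itertools import combinations

# Toy numeric 'sequences' standing in for encoded protein sequences
sequences = {
    "s1": np.array([1.0, 2.0, 3.0, 4.0, 5.0]),
    "s2": np.array([2.1, 4.0, 6.2, 7.9, 10.1]),   # roughly 2 * s1
    "s3": np.array([5.0, 1.0, 4.0, 2.0, 3.0]),
}

# Naive baseline: squared Pearson correlation for every pair (O(n^2) comparisons)
for (a, xa), (b, xb) in combinations(sequences.items(), 2):
    r = np.corrcoef(xa, xb)[0, 1]
    print(f"{a} vs {b}: linearity r^2 = {r * r:.3f}")
```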

Keywords: Clustering, General Regression Model, Protein Sequences, Similarity Measure.
