Search results for: Data Reduction
6831 CVOIP-FRU: Comprehensive VoIP Forensics Report Utility
Authors: Alejandro Villegas, Cihan Varol
Abstract:
Voice over Internet Protocol (VoIP) products are an emerging technology that can contain forensically important information about criminal activity. Even without user names and passwords, investigators can still gather this information. Although a few VoIP forensic investigative applications are available in the literature, most of them are designed specifically to collect evidence from the Skype product. Therefore, in order to assist law enforcement with collecting forensically important information from a variety of Betamax VoIP tools, the CVOIP-FRU framework was developed. CVOIP-FRU provides a data gathering solution that retrieves usernames, contact lists, and call and SMS logs from Betamax VoIP products. It is a scripting utility that searches for data within the registry, logs, and user roaming profiles on the Windows and Mac OSX operating systems, and then parses the output into readable text and HTML formats. One advantage of CVOIP-FRU over other applications is that, thanks to its intelligent data filtering capabilities and cross-platform scripting back end, it can be extended to cover other VoIP solutions as well. Overall, this paper presents the exploratory analysis performed to find the key data paths and locations, the development stages of the framework, and the empirical testing and quality assurance of CVOIP-FRU.
Keywords: Betamax, digital forensics, report utility, VoIP, VoIP Buster, VoIPWise.
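By way of illustration, a minimal Python sketch of the kind of roaming-profile sweep the abstract describes is given below; the product folder names and the AppData path are placeholder assumptions, not the actual artifact locations established by the paper's exploratory analysis.

```python
import os
from pathlib import Path

# Hypothetical Betamax product folder names -- the real artifact locations
# used by CVOIP-FRU are not given in the abstract; these are placeholders.
PRODUCTS = ["VoipBuster", "VoipWise"]

def scan_roaming_profiles(profile_root: str):
    """Search each user's roaming profile for per-product data folders."""
    hits = []
    for user_dir in Path(profile_root).iterdir():
        if not user_dir.is_dir():
            continue
        for product in PRODUCTS:
            candidate = user_dir / "AppData" / "Roaming" / product
            if candidate.exists():
                # Record every file (configs, call/SMS logs) for later parsing.
                hits.extend(str(p) for p in candidate.rglob("*") if p.is_file())
    return hits

if __name__ == "__main__":
    for artifact in scan_roaming_profiles(r"C:\Users"):
        print(artifact)
```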
6830 An Improved Preprocessing for Biosonar Target Classification
Authors: Turgay Temel, John Hallam
Abstract:
An improved preprocessing description for biosonar signal processing in a cochlea model is proposed and examined. It is compared with conventional models using a modified discriminant analysis, and both are tested. Their performance is evaluated with echo data captured from natural targets (trees). Results indicate that the phase characteristics of the low-pass filters employed in the echo processing have a significant effect on class separability for these data.
Keywords: Cochlea model, discriminant analysis, neural spike coding, classification.
6829 Developing Improvements to Multi-Hazard Risk Assessments
Authors: A. Fathianpour, M. B. Jelodar, S. Wilkinson
Abstract:
This paper outlines the approaches taken to assess multi-hazard impacts. There is currently confusion about how to assess such impacts, so this study aims to determine which of the available options are the most useful. The paper uses an international literature search, an analysis of current multi-hazard assessments, and a case study to illustrate the effectiveness of the chosen method. Findings from this study will help those wanting to assess multi-hazards to undertake a straightforward approach. The paper is significant as it helps to interpret the various approaches and concludes with the preferred method. Many people in the world live in hazardous environments and are susceptible to disasters. Unfortunately, when a disaster strikes it is often compounded by additional cascading hazards, so that people confront more than one hazard simultaneously. Hazards include natural hazards (earthquakes, floods, etc.) and cascading human-made hazards, for example Natural Hazard Triggering Technological disasters (Natech) such as fire, explosion, and toxic release. Multi-hazards have a more destructive impact on urban areas than any single hazard alone. In addition, climate change is creating links between different disasters, for instance by causing landslide dams and debris flows that lead to more destructive incidents. Much of the prevailing literature deals with only one hazard at a time; only recently have more sophisticated multi-hazard assessments started to appear. Given that multi-hazards occur, it is essential to take multi-hazard risk assessment into consideration. This paper reviews the multi-hazard assessment methods published to date and categorizes the strengths and weaknesses of using these methods in risk assessment. Napier City is selected as a case study to demonstrate the necessity of using multi-hazard risk assessments. First, the current multi-hazard risk assessment methods are described; next, their drawbacks are outlined; finally, the improvements made to current multi-hazard risk assessments to date are summarized. Generally, the main problem of multi-hazard risk assessment is making valid assumptions about the risk arising from the interactions of different hazards. Risk assessment studies have started to address multi-hazard situations, but drawbacks such as uncertainty and lack of data show the need for more precise risk assessment. It should be noted that ignoring or only partially considering multi-hazards in risk assessment leads to overestimates or oversights in resilience and recovery action management.
Keywords: Cascading hazards, multi-hazard, risk assessment, risk reduction.
6828 Optimum Operating Conditions for Direct Oxidation of H2S in a Fluidized Bed Reactor
Authors: Fahimeh Golestani, Mohammad Kazemeini, Moslem Fattahi, Ali Amjadian
Abstract:
In this research, a mathematical model for the direct oxidation of hydrogen sulfide into elemental sulfur in a fluidized bed reactor with external circulation was developed. As the catalyst is deactivated in the fluidized bed, it might be placed in a reduction tank in order to remove the sulfur by heating above its dew point. The reactor model was implemented in MATLAB. It was shown that the variations of H2S conversion, as well as of the products formed, were reasonable in comparison with the corresponding results for a fixed bed reactor. By analyzing the results of this model, it became possible to propose optimized operating conditions for the process: a temperature range of 100-130 °C, and loading as much catalyst as economically practical so as to obtain the highest bed density, relative to the bed dimensions, at which the bed still remains fluidized. A highly active and stable catalyst exhibited 100% conversion in the fluidized bed reactor under these optimum conditions.
Keywords: Direct oxidization, Fluidized bed, H2S, Mathematical modeling, Optimum conditions.
6827 Phytotoxicity of Daphne Gnidium L. Occurring in Tunisia
Authors: Ladhari A., Omezzine F., Rinez A., Haouala R.
Abstract:
The phytotoxicity of Daphne gnidium L. was evaluated through the effect of incorporating leaf, stem, and root biomass into soil (at 12.5, 25, and 50 g/kg) and of irrigation with their aqueous extracts (50 g/L) on the growth of two crops (Lactuca sativa L. and Raphanus sativus L.) and two weeds (Peganum harmala L. and Scolymus maculatus L.). Results revealed a perceptible phytotoxic effect which increased with dose and concentration. At the highest dose, root and leaf residues were the most toxic, causing total inhibition of lettuce and thistle seedling growth, respectively. Irrigation with aqueous extracts of the different organs of D. gnidium also decreased the seedling length of all test species. The stem extract was more inhibitory to thistle than to Peganum seedling growth; it induced significant reductions of 80% and 67% for roots and shoots, respectively. The results of the present study suggest that the different organs of D. gnidium could be exploited in the management of agro-ecosystems.
Keywords: Biomass, Daphne gnidium L., phytotoxicity, seedling growth.
6826 Peace through Environmental Stewardship
Authors: Elizabeth D. Ramos
Abstract:
Peace education supports a holistic appreciation for the value of life and the interdependence of all living systems. Peace education aims to build a culture of peace. One way of building a culture of peace is through environmental stewardship. This study sought to find out the environmental stewardship practices in selected Higher Education Institutions (HEIs) in the Philippines and how these environmental stewardship practices lead to building a culture of peace. The findings revealed that there is still room for improvement in implementing environmental stewardship in schools through academic service learning. In addition, the following manifestations are implemented very satisfactorily in schools: 1) waste reduction, reuse, and recycling, 2) community service, and 3) clean and green surroundings. Administrators of schools in the study lead their staff and students in implementing environmental stewardship. It could be concluded that those involved in environmental stewardship display an acceptable culture of peace, particularly solidarity, respect for persons, and inner peace.
Keywords: Academic service learning, environmental stewardship, leadership support, peace, solidarity.
6825 A Consideration of the Achievement of Productive Level Parallel Programming Skills
Authors: Tadayoshi Horita, Masakazu Akiba, Mina Terauchi, Tsuneo Kanno
Abstract:
This paper considers the achievement of productive-level parallel programming skills, based on data from the graduation studies at the Polytechnic University of Japan. The data show that most students can acquire parallel programming skills during the graduation study (about 600 to 700 hours) if the programming environment is limited to GPGPUs. However, the data also show that achieving productive-level parallel programming skills within the graduation study alone is a very demanding task. In addition, the data indicate that parallel programming environments for GPGPUs, such as CUDA and OpenCL, may be more suitable for parallel computing education than other environments such as MPI on a cluster system and the Cell.B.E. These results should be useful not only for software development, but also for the development of hardware products using computer technologies.
Keywords: Parallel computing, programming education, GPU, GPGPU, CUDA, OpenCL, MPI, Cell.B.E.
6824 Impediments to Female Sports Management and Participation: The Experience in the Selected Nigeria South West Colleges of Education
Authors: Saseyi Olaitan Olaoluwa, Osifeko Olalekan Remigious
Abstract:
The study was meant to identify the impediments to female sports management and participation in the selected colleges. Seven colleges of education in the south west part of the country were selected for the study. A total of one hundred and five subjects were sampled to supply data, but only the one hundred adequately completed and returned copies of the questionnaire were used for data analysis. The collected data were analysed descriptively. The results showed that inadequate funding, personnel, facilities, equipment, supplies, sports management, supervision, and coaching were some of the impediments to female sports management and participation, and that athletes were not encouraged to participate. Based on the findings, it was recommended that the government should come to the aid of the colleges by providing funds and other needs that will make sports attractive for enhanced participation.
Keywords: Female sports, impediments, management, Nigeria, south west, colleges.
6823 New Strategy Agents to Improve Power System Transient Stability
Authors: Mansour A. Mohamed, George G. Karady, Ali M. Yousef
Abstract:
This paper proposes transient angle stability agents to enhance power system stability. The proposed scheme is divided into two strategy agents. The first is a prediction agent that predicts power system instability. According to the prediction agent's output, the second, a control agent, automatically calculates the amount of active power reduction that can stabilize the system and initiates a control action; the control action considered is turbine fast valving. The proposed strategies are applied to a realistic power system, the IEEE 50-generator system. Results show that the proposed technique can be used on-line for power system instability prediction and control.
Keywords: Multi-agents, fast valving, power system transient stability, prediction methods.
6822 Combined Sewer Overflow Forecasting with Feed-forward Back-propagation Artificial Neural Network
Authors: Achela K. Fernando, Xiujuan Zhang, Peter F. Kinley
Abstract:
A feed-forward, back-propagation Artificial Neural Network (ANN) model has been used to forecast the occurrence of wastewater overflows in a combined sewerage reticulation system. This approach was tested to evaluate its applicability as an alternative to the common practice of developing a complete conceptual, mathematical hydrological-hydraulic model of the sewerage system to enable such forecasts. The ANN approach obviates the need for an a-priori understanding and mathematical representation of the underlying hydrological-hydraulic phenomena, instead learning the characteristics of a sewer overflow from historical data. The performance of the standard feed-forward, back-propagation-of-error algorithm was enhanced by a modified data normalizing technique that enabled the ANN model to extrapolate into territory unseen in the training data. The algorithm and the data normalizing method are presented along with the ANN model output results, which indicate good accuracy in the forecasted sewer overflow rates. However, it was revealed that accurate forecasting of the overflow rates depends heavily on the availability of real-time flow monitoring at the overflow structure to provide antecedent flow rate data; the ability of the ANN to forecast the overflow rates without the antecedent flow rates (as is the case with traditional conceptual reticulation models) was found to be quite poor.
Keywords: Artificial Neural Networks, back-propagation learning, combined sewer overflows, forecasting.
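The modified normalization is the key detail here: mapping the training extremes into an interior sub-range leaves headroom for a sigmoid-output network to express values beyond the training envelope. A minimal Python sketch of one such scheme follows; the [0.1, 0.9] target range is an assumption for illustration, not necessarily the paper's exact transform.

```python
import numpy as np

def fit_scaler(x, lo=0.1, hi=0.9):
    """Map the training range [x.min(), x.max()] onto [lo, hi].

    Leaving headroom below lo and above hi lets a sigmoid-output network
    represent flows larger (or smaller) than anything seen in training --
    one plausible reading of the modified normalizing technique.
    """
    xmin, xmax = x.min(), x.max()
    scale = (hi - lo) / (xmax - xmin)
    forward = lambda v: lo + (v - xmin) * scale
    inverse = lambda s: xmin + (s - lo) / scale
    return forward, inverse

flow = np.array([0.0, 2.5, 7.1, 12.4, 30.0])   # training flow rates (L/s)
to_net, from_net = fit_scaler(flow)
print(to_net(flow))           # values inside [0.1, 0.9]
print(from_net(0.95))         # a network output beyond the training maximum
print(from_net(to_net(7.1)))  # round trip recovers the original value
```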
6821 Phytotoxicity of Lead on the Physiological Parameters of Two Varieties of Broad Bean (Vicia faba)
Authors: El H. Bouziani, H. A. Reguieg Yssaad
Abstract:
The phytotoxicity of heavy metals can be expressed in the roots and visible parts of plants and is characterized by molecular and metabolic responses at various levels of organization of the whole plant. The present study was undertaken on two varieties of broad bean, Vicia faba (Sidi Aïch and Super Aguadulce). The experiment was mounted on a substrate prepared by mixing sand, soil, and compost; the substrate was artificially contaminated with three doses of lead nitrate [Pb(NO3)2]: 0, 500, and 1000 ppm. Our objective was to follow the behavior of the plant under this stress by evaluating physiological parameters. The results reveal a reduction in the productivity parameters (chlorophyll and protein production) together with an increase in the osmoregulators (soluble sugars and proline). These results show that the production of broad bean is strongly modified by the disturbance of its internal physiology under lead exposure.
Keywords: Broad bean, lead, stress, physiological parameters, phytotoxicity.
6820 Global Security Using Human Face Understanding under Vision Ubiquitous Architecture System
Abstract:
Different methods containing biometric algorithms are presented for eigenface-based face detection, recognition, identification, and verification. The theme of this research is to manage the critical processing stages (accuracy, speed, security, and monitoring) of face activities with the flexibility of searching and editing the secure authorized database. In this paper we implement several techniques, such as eigenface vector reduction using texture and shape vector phenomena for complexity removal, while density matching scores with Face Boundary Fixation (FBF) extract the most likely characteristics in this media processing content. We examine the development and performance efficiency of the database by applying our algorithms in both the recognition and detection phases. Our results show gains in accuracy and security, with better achievement than a number of previous approaches in all of the above processes, in an encouraging mode.
Keywords: Ubiquitous architecture, verification, identification, recognition.
6819 Preparation of Metallic Copper Nanoparticles by Reduction of Copper Ions in Aqueous Solution and Their Metal-Metal Bonding Properties
Authors: Y. Kobayashi, T. Shirochi, Y. Yasuda, T. Morita
Abstract:
This paper describes a method for preparing metallic Cu nanoparticles in aqueous solution, and a metal-metal bonding technique using the Cu particles. Preparation of the Cu particle colloid solution was performed in water at room temperature in air using a copper source (0.01 M Cu(NO3)2), a reducing reagent (0.2-1.0 M hydrazine), and stabilizers (0.5×10-3 M citric acid and 5.0×10-3 M cetyltrimethylammonium bromide). Metallic Cu nanoparticles with sizes of ca. 60 nm were prepared at all the hydrazine concentrations examined. A stage and a plate of metallic Cu were successfully bonded, with the help of the metallic Cu particles, by annealing at 400 °C and pressurizing at 1.2 MPa for 5 min in H2 gas. The shear strength required to separate the bonded Cu substrates reached its maximum value at a hydrazine concentration of 0.8 M and decreased beyond that concentration. Consequently, the largest shear strength of 22.9 MPa was achieved at the 0.8 M hydrazine concentration.
Keywords: Aqueous solution, Bonding, Colloid, Copper, Nanoparticle.
6818 Seismic Response Reduction of Structures using Smart Base Isolation System
Authors: H.S. Kim
Abstract:
In this study, the control performance of a smart base isolation system consisting of a friction pendulum system (FPS) and a magnetorheological (MR) damper has been investigated. A fuzzy logic controller (FLC) is used to modulate the MR damper so as to minimize structural acceleration while maintaining acceptable base displacement levels. To this end, a multi-objective optimization scheme is used to optimize the parameters of the membership functions and to find appropriate fuzzy rules. To demonstrate the effectiveness of the proposed multi-objective genetic algorithm for the FLC, a numerical study of a smart base isolation system is conducted using several historical earthquakes. It is shown that the proposed method can find optimal fuzzy rules and that the optimized FLC outperforms not only a passive control strategy but also a human-designed FLC and a conventional semi-active control algorithm.
Keywords: Fuzzy logic controller, genetic algorithm, MR damper, smart base isolation system.
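For intuition, a toy Python sketch of a single-input fuzzy controller of the kind being optimized is shown below; the triangular membership functions and rule consequents are invented placeholders, whereas in the paper both are tuned by the multi-objective genetic algorithm.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def flc_voltage(base_disp_cm):
    """Map base displacement to an MR damper command voltage.

    The membership functions and rule consequents below are illustrative
    placeholders; in the paper they are tuned by a multi-objective GA.
    """
    x = abs(base_disp_cm)
    # Fuzzify: Small / Medium / Large displacement.
    mu = [tri(x, -5, 0, 5), tri(x, 0, 5, 10), tri(x, 5, 10, 15)]
    # Rule consequents: Small disp -> low voltage, Large -> high voltage.
    v = [0.0, 2.5, 5.0]
    # Weighted-average (centroid-like) defuzzification.
    return sum(m * vi for m, vi in zip(mu, v)) / (sum(mu) + 1e-12)

for d in [1.0, 4.0, 8.0, 12.0]:
    print(d, "cm ->", round(flc_voltage(d), 2), "V")
```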
6817 Remote Sensing, GIS, and AHP for Assessing Physical Vulnerability to Tsunami Hazard
Authors: Abu Bakar Sambah, Fusanori Miura
Abstract:
Remote sensing image processing, spatial data analysis through a GIS approach, and the analytical hierarchy process (AHP) were applied in this study to assess the vulnerable and inundated areas due to tsunami hazard in Rikuzentakata, Iwate Prefecture, Japan. Appropriate input parameters were derived from GSI DEM data, ALOS AVNIR-2 imagery, and field data: elevation, slope, shoreline distance, and vegetation density. Five classes of vulnerability were defined and weighted via a pairwise comparison matrix. The assessment results show that 14.35 km² of the study area lies within the tsunami vulnerability zone; the inundation areas are those of high and slightly high vulnerability. The farthest area reached by the tsunami was about 7.50 km from the shoreline, showing that rivers act as flooding strips that transport tsunami waves into the hinterland. This study can be used to determine priorities for land-use planning within the scope of tsunami hazard risk management.
Keywords: AHP, GIS, remote sensing, tsunami vulnerability.
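The pairwise-comparison weighting step can be reproduced compactly: a Saaty-scale matrix over the four parameters, its principal eigenvector as the weights, and the consistency ratio as a sanity check. A short Python sketch follows; the judgments in the matrix are illustrative, not those of the paper.

```python
import numpy as np

# Illustrative pairwise comparison matrix over the four parameters used in
# the paper (elevation, slope, shoreline distance, vegetation density); the
# actual expert judgments are not reproduced in the abstract.
A = np.array([[1,   3,   5,   7],
              [1/3, 1,   3,   5],
              [1/5, 1/3, 1,   3],
              [1/7, 1/5, 1/3, 1]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                 # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
ri = 0.90                                    # Saaty's random index for n = 4
print("weights:", np.round(w, 3), "CR:", round(ci / ri, 3))  # CR < 0.1 is acceptable
```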
6816 Treatment of Simulated Textile Wastewater Containing Reactive Azo Dyes Using Laboratory Scale Trickling Filter
Authors: A. Irum, S. Mumtaz, A. Rehman, I. Naz, S. Ahmed
Abstract:
The present study was conducted to evaluate the potential applicability of a biological trickling filter system for the treatment of simulated textile wastewater containing reactive azo dyes, using a bacterial consortium under non-sterile conditions. The percentage decolorization for the treatment of wastewater containing structurally different dyes was found to be higher than 95% in all trials. The stable bacterial count of the biofilm on the stone media of the trickling filter during treatment confirmed the presence, proliferation, dominance, and involvement of the added microbial consortium in the treatment of the textile wastewater. The physicochemical results revealed reductions in chemical oxygen demand (58.5-75.1%), sulphates (18.9-36.5%), and phosphates (63.6-73.0%). UV-Visible and FTIR spectroscopy confirmed that the decolorization of the dye-containing wastewater was the ultimate consequence of biodegradation. Toxicological studies revealed the nontoxic nature of the degradation metabolites.
Keywords: Biodegradation, textile dyes, waste water, trickling filters.
6815 Investigation of Inert Gas Injection in Steam Reforming of Methane: Energy
Authors: Amjad Riaz, Ali Farsi, Gholamreza Zahedi, Zainuddin Abdul Manan
Abstract:
Synthesis gas manufacturing by steam reforming of hydrocarbons is an important industrial process. The highly endothermic nature of the process makes it one of the most cost- and heat-intensive processes. In the present work, the combined effect of different inert gases on synthesis gas yield, feed gas conversion, and temperature distribution along the reactor length has been studied using a heterogeneous model. The mathematical model was developed first and validated against existing process models. With the addition of inert gases, a higher yield of synthesis gas is observed, while the reactor outlet temperature drops to as low as 810 K. It was found that xenon gives the highest yield and conversion, while helium gives the lowest temperature. With xenon as the inert gas, a 20 percent reduction in outlet temperature was observed compared to the traditional case.
Keywords: Energy savings, Inert gas, Methane, Modeling, Steam reforming.
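The endothermic duty behind the heat-intensity remark comes from the standard reforming and water-gas shift reactions (textbook values, not restated in the abstract):

```latex
\begin{align}
  \mathrm{CH_4 + H_2O} &\rightleftharpoons \mathrm{CO} + 3\,\mathrm{H_2},
      & \Delta H^{\circ}_{298} &= +206\ \mathrm{kJ\,mol^{-1}} \\
  \mathrm{CO + H_2O} &\rightleftharpoons \mathrm{CO_2 + H_2},
      & \Delta H^{\circ}_{298} &= -41\ \mathrm{kJ\,mol^{-1}}
\end{align}
```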
6814 Energy Management System and Interactive Functions of Smart Plug for Smart Home
Authors: Win Thandar Soe, Innocent Mpawenimana, Mathieu Di Fazio, Cécile Belleudy, Aung Ze Ya
Abstract:
Intelligent electronic equipment and automation networks are the brains of the high-tech energy management systems that play a critical role in smart homes. A smart home integrates technology for greater comfort, autonomy, reduced cost, and energy saving. These services can be provided to home owners for managing their home appliances locally or remotely, and consequently allow them to automate their consumption intelligently and responsibly through individual or collective control systems. In this study, three smart plugs are described and one of them is tested on typical household appliances. This article proposes to collect the data over wireless technology and to extract smart data for an energy management system. This smart data quantifies three kinds of load: intermittent load, phantom load, and continuous load. The phantom load is the power wasted by an appliance, often unnoticed, while it is connected to the mains but not in active use. The intermittent and continuous loads take into consideration the power and usage time of the home appliances. By analyzing this classification of loads, the smart data can be used to reduce the communication load of the wireless sensor network supporting the energy management system.
Keywords: Energy management, load profile, smart plug, wireless sensor network.
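A minimal Python sketch of how the three load classes might be separated from smart plug power samples follows; the sampling rate, thresholds, and class rules are assumptions for illustration, not the paper's calibrated values.

```python
import numpy as np

def classify_load(power_w, on_threshold_w=5.0):
    """Classify a day of 1-minute power samples for one appliance.

    Thresholds are illustrative; the paper derives its classes (phantom,
    intermittent, continuous) from measured smart plug data.
    """
    on = power_w > on_threshold_w
    duty = on.mean()                      # fraction of time spent "on"
    standby = power_w[~on].mean() if (~on).any() else 0.0
    if duty > 0.95:
        label = "continuous"              # e.g. a fridge, always drawing power
    elif duty > 0.0:
        label = "intermittent"            # e.g. a kettle, short bursts
    else:
        label = "phantom only"            # never used, standby draw only
    return label, duty, standby           # standby = phantom-load estimate

kettle = np.r_[np.full(1430, 1.2), np.full(10, 2000.0)]   # W, 1-min samples
print(classify_load(kettle))              # ('intermittent', ~0.007, 1.2 W phantom)
```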
6813 Model-Based Software Regression Test Suite Reduction
Authors: Shiwei Deng, Yang Bao
Abstract:
In this paper, we present a model-based regression test suite reduction approach that uses EFSM model dependence analysis and a probability-driven greedy algorithm to reduce software regression test suites. The approach automatically identifies the difference between the original model and the modified model as a set of elementary model modifications. EFSM dependence analysis is performed for each elementary modification to reduce the regression test suite, and then the probability-driven greedy algorithm is adopted to select, from the reduced regression test suite, the minimum set of test cases that covers all interaction patterns. Our initial experience shows that the approach may significantly reduce the size of regression test suites.
Keywords: Dependence analysis, EFSM model, greedy algorithm, regression test.
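A compact Python sketch of greedy selection over interaction patterns is given below; the probability-based tie-breaking rule and the toy data are assumptions, standing in for the paper's actual probability-driven criterion.

```python
def greedy_reduce(test_patterns, probability):
    """Pick a small test set covering all interaction patterns.

    test_patterns: dict test_id -> set of interaction patterns it covers.
    probability:   dict test_id -> fault-detection probability, used to break
                   ties (a stand-in for the paper's probability-driven rule).
    """
    uncovered = set().union(*test_patterns.values())
    selected = []
    while uncovered:
        # Prefer the test covering the most uncovered patterns; among ties,
        # the one with the highest detection probability.
        best = max(test_patterns,
                   key=lambda t: (len(test_patterns[t] & uncovered),
                                  probability[t]))
        if not test_patterns[best] & uncovered:
            break                          # remaining patterns are uncoverable
        selected.append(best)
        uncovered -= test_patterns[best]
    return selected

tests = {"t1": {"a", "b"}, "t2": {"b", "c"}, "t3": {"c"}, "t4": {"a", "b", "c"}}
prob = {"t1": 0.6, "t2": 0.7, "t3": 0.9, "t4": 0.5}
print(greedy_reduce(tests, prob))          # ['t4'] covers every pattern
```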
6812 Reliability Analysis of Press Unit using Vague Set
Authors: S. P. Sharma, Monica Rani
Abstract:
In conventional reliability assessment, the reliability data of system components are treated as crisp values. The collected data contain uncertainties due to errors by human beings or machines, or from other sources; these uncertainty factors limit the understanding of system component failure because the data are incomplete. In such situations, we need to generalize classical methods to a fuzzy environment for studying and analyzing the systems of interest. Fuzzy set theory has been proposed to handle such vagueness by generalizing the notion of membership in a set. Essentially, in a Fuzzy Set (FS) each element is associated with a point value selected from the unit interval [0, 1], which is termed the grade of membership in the set. A Vague Set (VS), like an Intuitionistic Fuzzy Set (IFS), is a further generalization of an FS: instead of the point-based membership used in an FS, interval-based membership is used in a VS, which is more expressive in capturing the vagueness of data. In the present paper, vague set theory coupled with the conventional Lambda-Tau method is presented for the reliability analysis of repairable systems. The methodology uses Petri nets (PN) to model the system instead of a fault tree because PN allow efficient simultaneous generation of minimal cut and path sets. The presented method is illustrated with the press unit of a paper mill.
Keywords: Lambda-Tau methodology, Petri nets, repairable system, vague fuzzy set.
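The interval-based membership that distinguishes a VS from an FS can be captured in a few lines of Python; the failure-rate example at the end is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class VagueValue:
    """Interval-based membership [t, 1 - f] of a vague set element.

    t = degree of truth (evidence for), f = degree of falsity (evidence
    against); 1 - t - f is the remaining hesitation. Requires t + f <= 1.
    """
    t: float   # lower bound of membership
    f: float   # evidence against membership

    def __post_init__(self):
        assert 0 <= self.t and 0 <= self.f and self.t + self.f <= 1

    @property
    def interval(self):
        return (self.t, 1.0 - self.f)

# A component failure judgment with some hesitation: supported with degree
# 0.7, refuted with degree 0.2, leaving 0.1 undecided.
belief = VagueValue(t=0.7, f=0.2)
print(belief.interval)   # (0.7, 0.8) -- wider intervals express more vagueness
```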
6811 Effect of Open Burning on Soil Carbon Stock in Sugarcane Plantation in Thailand
Authors: Wilaiwan Sornpoon, Sébastien Bonnet, Savitri Garivait
Abstract:
Open burning of sugarcane fields is recognized to have a negative impact on soil by degrading its properties, especially the soil organic carbon (SOC) content. Better understanding the effect of open burning on soil carbon dynamics is crucial for documenting the carbon sequestration capacity of agricultural soils. In this study, experiments were conducted to investigate soil carbon stocks under burned and unburned sugarcane plantation systems in Thailand. The results showed that cultivation without open burning during 5 consecutive years increased the SOC content at a rate of 1.37 Mg ha-1 y-1. It was also found that burning the sugarcane fields led to about a 15% reduction in the total carbon stock of the 0-30 cm soil layer. The overall increase in SOC under the unburned practice is mainly due to the large input of organic material through the use of sugarcane residues.
Keywords: Soil organic carbon, Soil inorganic carbon, Carbon sequestration, Open burning, Sugarcane.
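For reference, the per-layer stock computation such inventories rely on (not restated in the abstract) is the standard product of carbon concentration, bulk density, and layer depth:

```latex
% Standard per-layer soil organic carbon stock:
\[
  \mathrm{SOC\ stock}\ (\mathrm{Mg\,ha^{-1}})
    = \mathrm{OC}\,(\%) \times \rho_b\,(\mathrm{g\,cm^{-3}}) \times d\,(\mathrm{cm})
\]
% e.g. 1% OC at a bulk density of 1.3 g cm^{-3} over the 0-30 cm layer gives
% 1 x 1.3 x 30 = 39 Mg C ha^{-1} (illustrative numbers, not the paper's data).
```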
6810 A New Scheme for Improving the Quality of Service in Heterogeneous Wireless Network for Data Stream Sending
Authors: Ebadollah Zohrevandi, Rasoul Roustaei, Omid Moradtalab
Abstract:
In this paper, we first consider the quality of service problems that arise in heterogeneous wireless networks when sending video data, for which the real-time requirement is pronounced. We then present a method for ensuring end-to-end quality of service at the application layer level for adaptive sending of video data over heterogeneous wireless networks. To do this, mechanisms in different layers are used: the stop mechanism, the adaptation mechanism, and graceful degradation at the application layer; the multi-level congestion feedback mechanism in the network layer; and the connection cut-off decision mechanism in the link layer. Finally, the presented method and the achieved improvement are simulated and presented in the NS-2 software.
Keywords: Congestion, handoff, heterogeneous wireless networks, adaptation mechanism, stop mechanism, graceful degradation.
6809 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data
Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton
Abstract:
The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized. This is the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing 54 papers identified in this area, it was noted that 23 of the papers originated in Germany. This is an unsurprising finding as Industry 4.0 is originally a German strategy with supporting strong policy instruments being utilized in Germany to support its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing Institution of the research papers with three papers published. The literature suggested future research directions and highlighted one specific gap in the area. There exists an unresolved gap between the data science experts and the manufacturing process experts in the industry. The data analytics expertise is not useful unless the manufacturing process information is utilized. A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. There lies a gap between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test the concept of this gap existing, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 of the papers contributed a framework, another 12 of the papers were based on a case study, and 11 of the papers focused on theory. However, there were only three papers that contributed a methodology. This provides further evidence for the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.
Keywords: Analytics, digitization, industry 4.0, manufacturing.
6808 Time Series Simulation by Conditional Generative Adversarial Net
Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto
Abstract:
Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose to use Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN has the capability to learn different types of normal and heavy-tailed distributions, as well as dependent structures of different time series. It also has the capability to generate conditional predictive distributions consistent with training data distributions. We also provide an in-depth discussion on the rationale behind GAN and the neural networks as hierarchical splines to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of the market risk factors. We present a real data analysis including a backtesting to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis to calculate VaR. CGAN can also be applied in economic time series modeling and forecasting. In this regard, we have included an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
Keywords: Conditional Generative Adversarial Net, market and credit risk management, neural network, time series.
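A minimal PyTorch sketch of the CGAN training loop and a VaR-style use of the simulated paths follows; the network sizes, data, and conditions are toy placeholders, not the architecture or data used in the paper.

```python
import torch
import torch.nn as nn

SEQ_LEN, NOISE_DIM, COND_DIM = 24, 16, 4        # illustrative sizes

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM + COND_DIM, 64), nn.ReLU(),
            nn.Linear(64, SEQ_LEN))             # one simulated path
    def forward(self, z, cond):
        return self.net(torch.cat([z, cond], dim=1))

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(SEQ_LEN + COND_DIM, 64), nn.ReLU(),
            nn.Linear(64, 1))                   # real/fake logit
    def forward(self, x, cond):
        return self.net(torch.cat([x, cond], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(32, SEQ_LEN)                 # stand-in for return series
cond = torch.randn(32, COND_DIM)                # auxiliary conditions

for _ in range(3):                              # a few toy steps
    # Discriminator: real vs. generated, both paired with their conditions.
    fake = G(torch.randn(32, NOISE_DIM), cond).detach()
    loss_d = bce(D(real, cond), torch.ones(32, 1)) + \
             bce(D(fake, cond), torch.zeros(32, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator: fool the discriminator under the same conditions.
    fake = G(torch.randn(32, NOISE_DIM), cond)
    loss_g = bce(D(fake, cond), torch.ones(32, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

paths = G(torch.randn(1000, NOISE_DIM), cond[:1].repeat(1000, 1))
var_99 = torch.quantile(paths.sum(dim=1), 0.01)  # 99% VaR of simulated P&L
print(float(var_99))
```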
6807 Revisiting the Concept of Risk Analysis within the Context of Geospatial Database Design: A Collaborative Framework
Authors: J. Grira, Y. Bédard, S. Roche
Abstract:
The aim of this research is to design a collaborative framework that integrates risk analysis activities into the geospatial database design (GDD) process. Risk analysis is rarely undertaken iteratively as part of present GDD methods, in conformance with requirements engineering (RE) guidelines and risk standards. Accordingly, when risk analysis is performed during GDD, some foreseeable risks may be overlooked and never reach the output specifications, especially when user intentions are not systematically collected. This may lead to ill-defined requirements and, ultimately, to higher risks of geospatial data misuse. The adopted approach consists of 1) reviewing the risk analysis process within the scope of RE and GDD, 2) analyzing the challenges of risk analysis within the context of GDD, and 3) presenting the components of a risk-based collaborative framework that improves the collection of the intended/forbidden usages of the data and helps geo-IT experts to discover implicit requirements and risks.
Keywords: Collaborative risk analysis, intention of use, geospatial database design, geospatial data misuse.
6806 Grammatically Coded Corpus of Spoken Lithuanian: Methodology and Development
Authors: L. Kamandulytė-Merfeldienė
Abstract:
The paper deals with the main methodological issues of the Corpus of Spoken Lithuanian, the development of which began in 2006. At present, the corpus consists of 300,000 grammatically annotated word forms. The creation of the corpus comprises three main stages: collecting the data, transcribing the recorded data, and grammatical annotation. Data collection was based on the principles of balance and naturalness. The recorded speech was transcribed according to the CHAT requirements of CHILDES. The transcripts were double-checked and annotated grammatically using CHILDES. The development of the Corpus of Spoken Lithuanian has led to a constant increase in studies of spontaneous communication, and various papers have dealt with the distribution of parts of speech, the use of different grammatical forms, variation in inflectional paradigms, the distribution of fillers, the syntactic functions of adjectives, and the mean length of utterances.
Keywords: CHILDES, Corpus of Spoken Lithuanian, grammatical annotation, grammatical disambiguation, lexicon, Lithuanian.
6805 Data Envelopment Analysis under Uncertainty and Risk
Authors: P. Beraldi, M. E. Bruni
Abstract:
Data Envelopment Analysis (DEA) is one of the most widely used techniques for evaluating the relative efficiency of a set of homogeneous decision making units. Traditionally, it assumes that input and output variables are known in advance, ignoring the critical issue of data uncertainty. In this paper, we deal with the problem of efficiency evaluation under uncertain conditions by adopting the general framework of stochastic programming. We assume that the output parameters are represented by discretely distributed random variables, and we propose two different models defined according to a risk-neutral and a risk-averse perspective. The models have been validated on a real case study concerning the evaluation of the technical efficiency of a sample of individual firms operating in the Italian leather manufacturing industry. Our findings show the validity of the proposed approach as an ex-ante evaluation technique, providing the decision maker with useful insights depending on his degree of risk aversion.
Keywords: DEA, stochastic programming, ex-ante evaluation technique, Conditional Value at Risk.
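The deterministic building block being generalized is the input-oriented CCR linear program; a small Python/SciPy sketch on toy data is shown below (the paper's stochastic, risk-averse extensions are not reproduced here).

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: rows = DMUs (firms), X = inputs, Y = outputs.
X = np.array([[2.0, 4.0], [3.0, 2.0], [4.0, 5.0]])
Y = np.array([[1.0], [1.0], [1.5]])

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of DMU o: min theta subject to
    sum_j lam_j * X_j <= theta * X_o  and  sum_j lam_j * Y_j >= Y_o."""
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                # minimize theta
    A_in = np.c_[-X[o], X.T]                   # inputs:  X^T lam - theta*X_o <= 0
    A_out = np.c_[np.zeros(Y.shape[1]), -Y.T]  # outputs: -Y^T lam <= -Y_o
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

for o in range(3):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```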
6804 C-V Characterization and Analysis of Temperature and Channel Thickness Effects on Threshold Voltage of Ultra-thin SOI MOSFET by Self-Consistent Model
Authors: Shuvro Chowdhury, Esmat Farzana, Rizvi Ahmed, A. T. M. Golam Sarwar, M. Ziaur Rahman Khan
Abstract:
The threshold voltage and capacitance-voltage characteristics of ultra-thin Silicon-on-Insulator (SOI) MOSFETs are greatly influenced by the thickness and doping concentration of the silicon film. In this work, the capacitance-voltage characteristics and threshold voltage of the device have been analyzed, including quantum mechanical effects, using a self-consistent model. Reducing the channel thickness and adding doping impurities cause an increase in the threshold voltage. Moreover, temperature effects cause a significant threshold voltage shift. The temperature dependence of the threshold voltage has also been examined with the self-consistent approach, and the results are well supported by the experimental performance of practical devices.
Keywords: C-V characteristics, Self-Consistent Analysis, Silicon-on-Insulator, Ultra-thin film.
6803 Signed Approach for Mining Web Content Outliers
Authors: G. Poonkuzhali, K.Thiagarajan, K.Sarukesi, G.V.Uma
Abstract:
The emergence of the Internet has revolutionized information storage and retrieval. As most of the data on the web is unstructured and contains a mix of text, video, audio, etc., there is a need to mine information that caters to the specific needs of users without loss of important hidden information. Developing user-friendly, automated tools for providing relevant information quickly thus becomes a major challenge in web mining research. Most existing web mining algorithms have concentrated on finding frequent patterns while neglecting the less frequent ones, which are likely to contain outlying data such as noise and irrelevant or redundant data. This paper focuses on a Signed approach with full word matching against an organized domain dictionary for mining web content outliers. The Signed approach identifies the relevant web documents as well as the outlying web documents. As the dictionary is organized based on the number of characters in a word, searching and retrieval of documents takes less time and less space.
Keywords: Outliers, relevant document, Signed approach, web content mining, web documents.
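A simplified Python sketch of full word matching against a length-organized domain dictionary is given below; the relevance score and outlier threshold are illustrative stand-ins for the paper's Signed approach, whose exact scoring rule is not given in the abstract.

```python
import re
from collections import defaultdict

def build_dictionary(domain_terms):
    """Organize the domain dictionary by word length, as the paper describes,
    so lookups only scan words of the same length."""
    by_len = defaultdict(set)
    for term in domain_terms:
        by_len[len(term)].add(term.lower())
    return by_len

def relevance(document, by_len):
    """Fraction of document words fully matching the domain dictionary.
    This scoring rule is a simplified stand-in for the Signed approach."""
    words = re.findall(r"[a-z]+", document.lower())
    hits = sum(1 for w in words if w in by_len[len(w)])
    return hits / max(len(words), 1)

dictionary = build_dictionary(["mining", "web", "content", "outlier", "data"])
docs = {"d1": "web content mining of data", "d2": "cheap watches buy now"}
for name, text in docs.items():
    score = relevance(text, dictionary)
    print(name, score, "outlier" if score < 0.3 else "relevant")
```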
6802 Disaggregation of the Daily Rainfall Dataset into Sub-Daily Resolution in the Temperate Oceanic Climate Region
Authors: Mohammad Bakhshi, Firas Al Janabi
Abstract:
High-resolution rain data are very important as input for hydrological models. Among the models for generating high-resolution rainfall data, temporal disaggregation was chosen for this study. The paper attempts to generate rainfall at three different resolutions (4-hourly, hourly, and 10-minute) from daily data over a record period of around 20 years. The process was carried out with the DiMoN tool, which is based on a random cascade model and the method of fragments. Differences between the observed and simulated rain datasets are evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov (K-S) test, the usual statistics, and exceedance probability. The tool worked well at preserving the daily rainfall values on wet days; however, the generated rainfall is concentrated in shorter time periods, producing stronger storms. It is demonstrated that the difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passes the K-S test criteria, while for the hourly and 10-minute datasets the p-value must be invoked to show that their differences are reasonable. The results are encouraging, considering the overestimation in the generated high-resolution rainfall data.
Keywords: DiMoN tool, disaggregation, exceedance probability, Kolmogorov-Smirnov Test, rainfall.
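The K-S evaluation step can be reproduced in a few lines of Python; the gamma-distributed samples are synthetic stand-ins for the observed and disaggregated rainfall series.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
observed = rng.gamma(shape=0.6, scale=3.0, size=2000)    # stand-in hourly rain
simulated = rng.gamma(shape=0.55, scale=3.3, size=2000)  # disaggregated output

# Two-sample K-S test: are the observed and generated intensity
# distributions statistically indistinguishable at the 5% level?
stat, p = ks_2samp(observed, simulated)
print(f"D = {stat:.3f}, p = {p:.3f}",
      "-> distributions differ" if p < 0.05 else "-> no significant difference")
```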