Search results for: health information
1474 Sparsity-Based Unsupervised Unmixing of Hyperspectral Imaging Data Using Basis Pursuit
Authors: Ahmed Elrewainy
Abstract:
Mixing in hyperspectral imaging occurs because of the low spatial resolution of the cameras used. The pure materials present in the scene, the “endmembers”, share each pixel's spectrum in different proportions called “abundances”. Unmixing the data cube is an important task for identifying the endmembers present in the cube and for analyzing these images. Unsupervised unmixing is performed with no prior information about the given data cube. Sparsity is one of the recent approaches used in source recovery and unmixing techniques. The l1-norm optimization problem “basis pursuit” can be used as a sparsity-based approach to solve this unmixing problem, where the endmembers are assumed to be sparse in an appropriate domain known as a dictionary. This optimization problem is solved using the proximal method “iterative thresholding”. The l1-norm basis pursuit optimization problem was used as a sparsity-based unmixing technique to unmix real and synthetic hyperspectral data cubes.
Keywords: Basis pursuit, blind source separation, hyperspectral imaging, spectral unmixing, wavelets.
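To make the proximal step concrete, the following is a minimal sketch of iterative soft-thresholding (ISTA) applied to a synthetic linear mixing model; the dictionary, problem sizes, and regularization weight are assumptions chosen for illustration, not values from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1-norm (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(D, y, lam=0.1, n_iter=500):
    """Iterative soft-thresholding for min_a 0.5*||D a - y||^2 + lam*||a||_1."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - y)           # gradient of the data-fit term
        a = soft_threshold(a - grad / L, lam / L)
    return a

# Synthetic example: 200 spectral bands, dictionary of 50 candidate endmembers,
# one pixel mixed from 3 of them (sparse abundance vector).
rng = np.random.default_rng(0)
D = rng.standard_normal((200, 50))
a_true = np.zeros(50)
a_true[[3, 17, 42]] = [0.5, 0.3, 0.2]
y = D @ a_true + 0.01 * rng.standard_normal(200)

a_hat = ista(D, y, lam=0.05)
print("recovered support:", np.nonzero(np.abs(a_hat) > 0.05)[0])
```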
1473 An Algorithm for Secure Visible Logo Embedding and Removing in Compression Domain
Authors: Hongyuan Li, Guang Liu, Yuewei Dai, Zhiquan Wang
Abstract:
Digital watermarking is the process of embedding information into a digital signal and can be used in DRM (digital rights management) systems. A visible watermark (often called a logo) indicates the owner of the copyright, is frequently seen in TV programs, and protects the copyright in an active way. However, most schemes do not consider the process of removing the visible watermark. To solve this problem, a visible watermarking scheme with both embedding and removal processes is proposed under the control of a secure template. The template generates different versions of the watermark that look visually identical to different users. Users with the right key can completely remove the watermark and recover the original image, while unauthorized users are prevented from removing it. Experimental results show that our watermarking algorithm achieves good visual quality and is hard for unauthorized users to remove, while authorized users can completely remove the visible watermark and recover the original image with good quality.
Keywords: Digital watermarking, visible and removable watermark, secure template, JPEG compression.
1472 Towards a Compliance Reporting using a Balanced Scorecard
Authors: Michael Amberg, Dipl. Kfm. Johannes C. Panitz
Abstract:
Compliance requires effective communication within an enterprise as well as towards a company's external environment. This requirement begins with the implementation of compliance within large-scale compliance projects and persists in the compliance reporting of standard operations. On the one hand, the understanding of compliance necessities within the organization is promoted; on the other hand, asymmetric information between compliance stakeholders is reduced. To reach this goal, a central reporting must provide a consolidated view of the statuses of the different compliance efforts. A concept that could be adapted for this purpose is the balanced scorecard by Kaplan and Norton. This concept has not been analyzed in detail concerning its adequacy for holistic compliance reporting, from compliance projects through to later use in regular compliance operations. First, this paper evaluates whether a holistic compliance reporting can be designed using the balanced scorecard concept. The current status of compliance reporting clearly shows that scorecards are generally accepted as a compliance reporting tool and are already used for corporate governance reporting; additionally, specialized compliance IT solutions exist in the market. After the scorecard's adequacy is thoroughly examined and proven, an example strategy map is defined as the basis for deriving a compliance balanced scorecard. This definition answers the question of how to proceed in designing a compliance reporting tool.
Keywords: Balanced Scorecard, Compliance, Compliance Reporting, Compliance Scorecard.
1471 Object Detection Based on Plane Segmentation and Features Matching for a Service Robot
Authors: António J. R. Neves, Rui Garcia, Paulo Dias, Alina Trifan
Abstract:
With the aging of the world population and the continuous growth in technology, service robots are increasingly explored as alternatives to healthcare givers or personal assistants for elderly or disabled people. Any service robot should be capable of interacting with its human companion, receiving commands, navigating through known or unknown environments, and recognizing objects. This paper proposes an approach to object recognition for a service robot based on the use of depth information and color images. We present a study of two of the most used methods for object detection, in which 3D data are used to detect the position of the objects to be classified that lie on horizontal surfaces. Since most objects of interest accessible to service robots lie on such surfaces, the proposed 3D segmentation reduces the processing time and simplifies the scene for object recognition. The first approach to object recognition is based on color histograms, while the second is based on the SIFT and SURF feature descriptors. We present comparative experimental results obtained with a real service robot.
Keywords: Service Robot, Object Recognition, 3D Sensors, Plane Segmentation.
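As a rough illustration of the first, histogram-based recognition approach, the following sketch compares the normalized HSV color histogram of a segmented object crop against stored model histograms using OpenCV; the file names, bin counts, and similarity threshold are assumptions for the example, not settings from the paper.

```python
import cv2
import numpy as np

def hsv_histogram(bgr_image, bins=(8, 8, 8)):
    """Normalized 3D HSV color histogram of an image crop."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1, 2], None, list(bins),
                        [0, 180, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()

# Hypothetical model database: object name -> reference crop taken from a table plane.
models = {name: hsv_histogram(cv2.imread(path))
          for name, path in [("mug", "mug.png"), ("bottle", "bottle.png")]}

query = hsv_histogram(cv2.imread("segmented_object.png"))

# Correlation in [-1, 1]; higher means more similar.
scores = {name: cv2.compareHist(ref, query, cv2.HISTCMP_CORREL)
          for name, ref in models.items()}
best = max(scores, key=scores.get)
print(best if scores[best] > 0.7 else "unknown object", scores)
```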
1470 Fault Localization and Alarm Correlation in Optical WDM Networks
Authors: G. Ramesh, S. Sundara Vadivelu
Abstract:
For high-speed networks, providing resilience against failures is an essential requirement. The main concern in designing next-generation optical networks is protecting and restoring high-capacity WDM networks from failures. Quick detection, identification and restoration make networks stronger and more reliable, even though the failures themselves cannot be avoided. Hence, it is necessary to develop fast, efficient and dependable fault localization or detection mechanisms. In this paper we propose a new fault localization algorithm for WDM networks which can identify the location of a failure on a failed lightpath. Our algorithm detects the failed connection and then attempts to reroute the data stream through an alternate path. In addition, we develop an algorithm to analyze the information in the alarms generated by the components of an optical network in the presence of a fault. It uses alarm correlation to reduce the list of suspected components shown to the network operators. Our simulation results show that the proposed algorithms achieve lower blocking probability and delay while attaining higher throughput.
Keywords: Alarm correlation, blocking probability, delay, fault localization, WDM networks.
1469 Microwave Sintering and Its Application on Cemented Carbides
Authors: Rumman Md Raihanuzzaman, Lee Chang Chuan, Zonghan Xie, Reza Ghomashchi
Abstract:
Cemented carbides, owing to their excellent mechanical properties, have been of immense interest in the field of hard materials for the past few decades. A number of processing techniques have been developed to obtain high-quality carbide tools with a wide range of grain sizes, depending on the application and requirements. Microwave sintering is one of the heating processes that has been used to prepare a wide range of materials, including ceramics. A deep understanding of microwave sintering and of its contribution to controlling grain growth and deformation in the resulting carbide materials requires further study and attention. In addition, the effect of binder materials and their behavior during microwave sintering is another area that requires clear understanding. This review focuses on microwave sintering, providing information on how the process works and what types of materials it is best suited for. In addition, a closer look is taken at some microwave-sintered tungsten carbide-cobalt samples, highlighting some of the key issues and challenges faced in this research area.
Keywords: Cemented carbides, consolidation, microwave sintering, mechanical properties.
1468 An Intelligent System for Phish Detection, using Dynamic Analysis and Template Matching
Authors: Chinmay Soman, Hrishikesh Pathak, Vishal Shah, Aniket Padhye, Amey Inamdar
Abstract:
Phishing, or the stealing of sensitive information on the web, has dealt a major blow to Internet security in recent times. Most existing anti-phishing solutions fail to handle the fuzziness involved in phish detection, leading to a large number of false positives. This fuzziness is attributed to the use of the highly flexible and, at the same time, highly ambiguous HTML language. We introduce a new perspective on phishing that tries to systematically prove whether a given page is phished or not, using the corresponding original page as the basis of comparison. It analyzes the layout of the pages under consideration to determine the percentage distortion between them, indicative of any form of malicious alteration. The system design represents an intelligent system employing dynamic assessment, which accurately identifies brand-new phishing attacks and will prove effective in reducing the number of false positives. This framework could potentially be used as a knowledge base for educating Internet users about phishing.
Keywords: World Wide Web, Phishing, Internet security, data mining.
1467 Role of Environmental Focus in Legal Protection and Efficient Management of Wetlands in the Republic of Kazakhstan
Authors: K. R. Balabiyev, A. O. Kaipbayeva
Abstract:
The article discusses the legal framework of the government's environmental function and analyzes the role of national policy in the protection of wetlands. The problem is of interest because it deals with the most important branch of the economy: the utilization of Kazakhstan's natural resources and the protection of the health and environmental wellbeing of the population. Development of a long-term environmental program addressing the protection of wetlands represents the final stage of the government's environmental policy and is a relatively new function for the public administration system. It emerged because environmental measures require immediate decisions to be taken. It is an integral part of the effort in the field of management of state-owned natural resources, as well as of the measures aimed at efficient management of natural resources to avoid their early depletion or contamination.
Keywords: Environmental focus, government’s environmental function, protection of wetlands.
1466 Degraded Document Analysis and Extraction of Original Text Document: An Approach without Optical Character Recognition
Authors: L. Hamsaveni, Navya Prakash, Suresha
Abstract:
Document Image Analysis recognizes text and graphics in documents acquired as images. An approach without Optical Character Recognition (OCR) for degraded document image analysis is adopted in this paper. The technique involves document imaging methods such as image fusing and Speeded Up Robust Features (SURF) detection to identify and extract the degraded regions from a set of document images and obtain an original document with complete information. If the captured degraded document image is skewed, it has to be straightened (deskewed) before further processing. A special image storage format known as YCbCr is used as a tool to convert the grayscale image to the RGB image format. The presented algorithm is tested on various types of degraded documents, such as printed documents, handwritten documents, old script documents and handwritten image sketches in documents. The purpose of this research is to obtain an original document for a given set of degraded documents of the same source.
Keywords: Grayscale image format, image fusing, SURF detection, YCbCr image format.
1465 Model to Support Synchronous and Asynchronous in the Learning Process with An Adaptive Hypermedia System
Authors: Francisca Grimón, Marylin Giugni, Josep Monguet F., Joaquín Fernández, Luis León G.
Abstract:
In blended learning environments, the Internet can be combined with other technologies. The aim of this research was to design, introduce and validate a model to support synchronous and asynchronous activities by managing content domains in an Adaptive Hypermedia System (AHS). The application is based on information recovery techniques, clustering algorithms and adaptation rules to adjust the user's model to contents and objects of study. This system was applied to blended learning in higher education. The research strategy used was the case study method. Empirical studies were carried out on courses at two universities to validate the model. The results of this research show that the model had a positive effect on the learning process. The students indicated that the synchronous and asynchronous scenario is a good option, as it involves a combination of work with the lecturer and the AHS. In addition, they gave positive ratings to the system and stated that the contents were adapted to each user profile.
Keywords: Blended Learning, Adaptive System, Model, Clustering Algorithms.
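As a loose illustration of how clustering can feed an adaptive hypermedia user model, the following sketch groups learners by simple interaction features with k-means; the feature names and cluster count are assumptions for the example, not details taken from the study.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-learner interaction features:
# [minutes online per week, forum posts, self-test score, content pages visited]
X = np.array([
    [120, 15, 0.85, 40],
    [ 30,  2, 0.55, 12],
    [ 90, 10, 0.75, 35],
    [ 20,  1, 0.40,  8],
    [150, 20, 0.90, 50],
    [ 45,  3, 0.60, 15],
])

# Standardize so no single feature dominates the distance metric.
X_std = StandardScaler().fit_transform(X)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_std)
print("cluster labels:", kmeans.labels_)

# An adaptation rule could then map each cluster to a content presentation style,
# e.g. cluster 0 -> condensed material, cluster 1 -> guided step-by-step material.
```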
1464 Robust Camera Calibration using Discrete Optimization
Authors: Stephan Rupp, Matthias Elter, Michael Breitung, Walter Zink, Christian Küblbeck
Abstract:
Camera calibration is an indispensable step for augmented reality or image-guided applications where quantitative information should be derived from the images. Usually, a camera calibration is obtained by taking images of a special calibration object and extracting the image coordinates of the projected calibration marks, enabling the calculation of the projection from the 3D world coordinates to the 2D image coordinates. Such a procedure exhibits typical steps, including feature point localization in the acquired images, camera model fitting, correction of the distortion introduced by the optics and, finally, an optimization of the model's parameters. In this paper we propose to extend this list by a further step concerning the identification of the optimal subset of images yielding the smallest overall calibration error. For this, we present a Monte Carlo based algorithm, along with a deterministic extension, that automatically determines the images yielding an optimal calibration. Finally, we present results proving that the calibration can be significantly improved by automated image selection.
Keywords: Camera Calibration, Discrete Optimization, Monte Carlo Method.
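The following is a minimal sketch of the Monte Carlo idea behind the image-subset search, assuming the calibration marks have already been localized; the subset size, number of trials, and the use of OpenCV's calibrateCamera RMS value as the error measure are assumptions for the example, not the authors' exact procedure.

```python
import random
import cv2

def calibration_error(obj_points, img_points, image_size):
    """RMS reprojection error reported by OpenCV for one image subset."""
    rms, _, _, _, _ = cv2.calibrateCamera(obj_points, img_points,
                                          image_size, None, None)
    return rms

def monte_carlo_select(obj_points, img_points, image_size,
                       subset_size=10, n_trials=200, seed=0):
    """Randomly sample image subsets and keep the one with the smallest RMS error."""
    rng = random.Random(seed)
    indices = list(range(len(img_points)))
    best_subset, best_rms = None, float("inf")
    for _ in range(n_trials):
        subset = rng.sample(indices, subset_size)
        rms = calibration_error([obj_points[i] for i in subset],
                                [img_points[i] for i in subset],
                                image_size)
        if rms < best_rms:
            best_subset, best_rms = subset, rms
    return best_subset, best_rms

# obj_points / img_points would come from detected calibration marks, e.g. from
# cv2.findChessboardCorners on each acquired image; image_size is (width, height).
```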
1463 Advantages of Fuzzy Control Application in Fast and Sensitive Technological Processes
Authors: Radim Farana, Bogdan Walek, Michal Janosek, Jaroslav Zacek
Abstract:
This paper presents the advantages of using fuzzy control in the control of technological processes. It presents a real application of Linguistic Fuzzy-Logic Control (LFLC), developed at the University of Ostrava for the control of physical models in the Intelligent Systems Laboratory. The paper presents an example of a sensitive non-linear model, a magnetic levitation model, and the obtained results, which show how modern information technologies can help solve real technical problems. A special method based on the LFLC controller with partial components is presented, followed by a method of automatic context change, which is very helpful for achieving more accurate control results. The main advantage of the system used is its robustness under changing conditions, demonstrated by comparison with a conventional PID controller. This technology and the real models are also used as a background for problem-oriented teaching, realized at the department for master's students and their collaborative as well as individual final projects.
Keywords: Control, fuzzy logic, sensitive system, technological processes.
1462 Nuts Composition and their Health Benefits
Authors: S. Azadmard-Damirchi, Sh. Emami, J. Hesari, S.H. Peighambardoust, M. Nemati
Abstract:
Nuts are part of a healthy diet such as the Mediterranean diet. The benefits of nuts in reducing the risk of heart disease have been reasonably attributed to their composition of vitamins, minerals, unsaturated fatty acids, fiber and phytochemicals such as polyphenols, tocopherols, squalene and phytosterols. More than 75% of the total fatty acids of nuts are unsaturated. α-Tocopherol is the main tocopherol isomer present in most nuts, while walnuts, Brazil nuts, cashews, peanuts, pecans and pistachios are rich in γ-tocopherol. β-Sitosterol is the dominant sterol in nuts. Pistachios and pine nuts have the highest total phytosterol content, and Brazil nuts and English walnuts the lowest. Walnuts also contain a large amount of phenolic compounds compared with other nuts. Nuts are rich in compounds with antioxidant properties, and their consumption can help prevent many diseases, including cardiovascular disease.
Keywords: Nuts, phenols, phytosterols, squalene, vitamin E.
1461 Analytical Study of Applying the Account Aggregation Approach in E-Banking Services
Authors: A. Al Drees, A. Alahmari, R. Almuwayshir
Abstract:
Advanced information technology is becoming an important factor in the development of the financial services industry, especially the banking industry. It has introduced new ways of delivering banking to the customer, such as Internet banking. Banks began to look at electronic banking (e-banking) as a means to replace some of their traditional branch functions, using the Internet as a new distribution channel. Some consumers have more than one account, often across several banks, and access these accounts using e-banking services. To view their current net worth position, customers have to log in to each of their accounts, get the details and consolidate them. This not only takes considerable time but is also a repetitive activity performed at a given frequency. To address this point, the account aggregation concept is added as a solution. E-banking account aggregation, one of the e-banking service types, emerged to build a stronger relationship with customers. An account aggregation service generally refers to a service that allows customers to manage their bank accounts maintained at different institutions through a common Internet banking operating platform, with a strong concern for security and privacy. This paper presents an overview of the e-banking account aggregation approach as a new service in the e-banking field.
Keywords: E-banking, security, account aggregation, enterprise application development.
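As a purely illustrative sketch of the aggregation idea described above, the following consolidates balances pulled from several mock bank connectors into one net-worth view; the class and method names are hypothetical and not part of any real banking API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Account:
    bank: str
    account_id: str
    balance: float   # positive for deposits, negative for loans/credit

class BankClient:
    """Hypothetical per-institution connector; a real one would call the bank's API."""
    def __init__(self, bank: str, accounts: List[Account]):
        self.bank = bank
        self._accounts = accounts

    def fetch_accounts(self) -> List[Account]:
        return list(self._accounts)

class AccountAggregator:
    """Consolidates accounts held at different institutions into one view."""
    def __init__(self, clients: List[BankClient]):
        self.clients = clients

    def net_worth(self) -> float:
        return sum(acc.balance
                   for client in self.clients
                   for acc in client.fetch_accounts())

clients = [
    BankClient("Bank A", [Account("Bank A", "chk-1", 2500.0)]),
    BankClient("Bank B", [Account("Bank B", "sav-9", 8000.0),
                          Account("Bank B", "loan-3", -3000.0)]),
]
print("consolidated net worth:", AccountAggregator(clients).net_worth())
```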
1460 Wind Farm Power Performance Verification Using Non-Parametric Statistical Inference
Authors: M. Celeska, K. Najdenkoski, V. Dimchev, V. Stoilkov
Abstract:
Accurate determination of wind turbine performance is necessary for the economic operation of a wind farm. At present, the procedure for carrying out the power performance verification of wind turbines is based on a standard of the International Electrotechnical Commission (IEC). In this paper, non-parametric statistical inference is applied to design a simple, inexpensive method of verifying the power performance of a wind turbine. A statistical test is explained and examined, and its adequacy is tested on real data. The methods use the information collected by the SCADA (Supervisory Control and Data Acquisition) system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. The study used data on the monthly output of a wind farm in the Republic of Macedonia, with measurements covering January 1, 2016, to December 31, 2016. At the end, it is concluded whether the power performance of a wind turbine differed significantly from what would be expected. The results of applying the proposed methods showed that the power performance of the specific wind farm under assessment was acceptable.
Keywords: Canonical correlation analysis, power curve, power performance, wind energy.
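To illustrate the kind of non-parametric check the abstract refers to, the following sketch compares measured SCADA power values inside one wind-speed bin against the values expected from the warranted power curve using a Wilcoxon signed-rank test; the bin, sample values, and significance level are assumptions for the example, not data from the study.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical 10-minute SCADA power readings (kW) in the 8.0-8.5 m/s wind-speed bin
measured = np.array([1480, 1502, 1465, 1510, 1490, 1475, 1498, 1485, 1470, 1505])

# Power expected from the manufacturer's (warranted) power curve for the same bin
expected = np.full_like(measured, 1500)

# Paired non-parametric test: does the measured output deviate systematically
# from the warranted curve in this bin?
stat, p_value = wilcoxon(measured - expected)
print(f"statistic={stat:.1f}, p-value={p_value:.3f}")
if p_value < 0.05:
    print("Power performance differs significantly from the warranted curve in this bin.")
else:
    print("No significant deviation detected in this bin.")
```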
1459 Modeling the Symptom-Disease Relationship by Using Rough Set Theory and Formal Concept Analysis
Authors: Mert Bal, Hayri Sever, Oya Kalıpsız
Abstract:
Medical Decision Support Systems (MDSSs) are sophisticated, intelligent systems that can perform inference despite missing information and uncertainty. In such systems, various soft computing methods are used to model uncertainty, such as Bayesian networks, rough sets, artificial neural networks, fuzzy logic, inductive logic programming, genetic algorithms, and hybrid methods formed from combinations of these. In this study, symptom-disease relationships are presented in a framework modeled with formal concept analysis and rough set theory, with diseases as objects and symptoms as attributes. After a concept lattice is formed, Bayes' theorem can be used to determine the relationships between attributes and objects. A discernibility relation, which forms the basis of rough sets, can be applied to attribute data sets in order to reduce attributes and decrease the complexity of computation.
Keywords: Formal Concept Analysis, Rough Set Theory, Granular Computing, Medical Decision Support System.
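As a small illustration of the discernibility relation mentioned above, the following sketch builds a discernibility matrix for a toy symptom-disease decision table and lists which symptoms distinguish each pair of cases; the symptom names and table values are invented for the example.

```python
from itertools import combinations

# Toy decision table: each row is a case, columns are symptoms (condition
# attributes), and "disease" is the decision attribute.
symptoms = ["fever", "cough", "rash"]
table = [
    {"fever": 1, "cough": 1, "rash": 0, "disease": "flu"},
    {"fever": 1, "cough": 0, "rash": 1, "disease": "measles"},
    {"fever": 0, "cough": 1, "rash": 0, "disease": "cold"},
    {"fever": 1, "cough": 1, "rash": 1, "disease": "measles"},
]

# Discernibility matrix: for every pair of cases with different decisions,
# record the set of condition attributes on which they differ.
discernibility = {}
for i, j in combinations(range(len(table)), 2):
    if table[i]["disease"] != table[j]["disease"]:
        diff = {a for a in symptoms if table[i][a] != table[j][a]}
        discernibility[(i, j)] = diff

for pair, attrs in discernibility.items():
    print(f"cases {pair}: distinguished by {sorted(attrs)}")

# An attribute whose removal leaves every pair still distinguishable is
# dispensable; keeping only indispensable attributes yields a reduct.
```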
1458 Gender Differences in E-Society: The Case of Slovenia
Authors: Mitja Dečman
Abstract:
The ever-increasing presence and use of information and communication technology (ICT) influences the different social relationships of today's society. Gender differences are especially important from the viewpoint of modern society, since ICT can either deepen existing inequalities or diminish them. In the developed Western world, gender equality has been a focus area for decades in many parts of society, including education, employment and politics, and efforts have led to a decrease in inequality between women and men in these and other areas. Digital equality, or inequality for that matter, is one of the areas where gender differences still exist in many countries of the world. The research presented in this paper focuses on Slovenia, one of the smallest EU member states and an average achiever in the area of e-society according to many different European benchmarking indexes. At the same time, Slovenia is working in alignment with many European gender equality guidelines and showing good results. The results of our research are based on the analysis of survey data from 2014 to 2017 dealing with Slovenian citizens, their households and their use of ICT. Considering gender issues, the synthesis showed that cultural differences influence some measured ICT indicators, but the differences are small and only sometimes statistically significant.
Keywords: Digital divide, e-society, gender inequality, Slovenia.
1457 Exploiting Non Circularity for Angle Estimation in Bistatic MIMO Radar Systems
Authors: Ebregbe David, Deng Weibo
Abstract:
The traditional second-order statistics approach of using only the Hermitian covariance of non-circular signals does not take advantage of the information contained in their complementary covariance. Radar systems often use non-circular signals such as Binary Phase Shift Keying (BPSK) signals. Their non-circular property can be exploited together with the dual centrosymmetry of the bistatic MIMO radar system to improve angle estimation performance. We construct an augmented matrix from the received data vectors using both the positive definite Hermitian covariance matrix and the complementary covariance matrix. The Unitary ESPRIT technique is then applied to the signal subspace of the augmented covariance matrix for automatically paired direction-of-arrival (DOA) and direction-of-departure (DOD) angle estimates. The number of targets that can be detected is twice that obtainable with the conventional ESPRIT approach. Simulation results show the effectiveness of this method in terms of the increase in resolution and in the number of targets that can be detected.
Keywords: Bistatic MIMO radar, Unitary ESPRIT, non-circular signals.
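To make the augmented-statistics step concrete, the following sketch estimates both the Hermitian covariance and the complementary covariance of a batch of non-circular (BPSK-modulated) snapshots and stacks them into the augmented covariance matrix; the array sizes and signal model are simplified assumptions, not the paper's full bistatic MIMO model.

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 8, 200              # number of virtual array channels, number of snapshots

# Simplified non-circular snapshots: real BPSK symbols through a complex mixing
# matrix plus circular white noise.
A = rng.standard_normal((M, 3)) + 1j * rng.standard_normal((M, 3))
s = rng.choice([-1.0, 1.0], size=(3, N))          # BPSK symbols (real-valued)
X = A @ s + 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))

R = X @ X.conj().T / N     # Hermitian covariance      E[x x^H]
C = X @ X.T / N            # complementary covariance  E[x x^T] (non-zero here)

# Augmented covariance of the extended vector [x; conj(x)]
R_aug = np.block([[R,        C],
                  [C.conj(), R.conj()]])

# Subspace methods (e.g. Unitary ESPRIT) are then applied to the signal
# subspace of R_aug, doubling the usable aperture for non-circular signals.
eigvals = np.linalg.eigvalsh(R_aug)
print("largest augmented-covariance eigenvalues:", np.round(eigvals[-4:], 2))
```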
1456 Development of Greenhouse Analysis Tools for Home Agriculture Project
Authors: M. Amir Abas, M. Dahlui
Abstract:
This paper presents the development of analysis tools for the Home Agriculture project. The tools are required for monitoring the condition of a greenhouse and involve two components: measurement hardware and a data analysis engine. The measurement hardware measures environmental parameters such as temperature, humidity, air quality and dust, while the analysis tool is used to analyse and interpret the integrated data against weather conditions, quality of health, irradiance, soil quality and so on. The current development of the tools covers an off-line data recording technique. The data is saved on an MMC card and transferred via ZigBee to the Environment Data Manager (EDM) for analysis. The EDM converts the raw data and plots three combined graphs. It has been applied in monitoring three months of irradiance, temperature and humidity measurements of the greenhouse.
Keywords: Monitoring, Environment, Greenhouse, Analysis tools.
1455 Ensuring Data Security and Consistency in FTIMA - A Fault Tolerant Infrastructure for Mobile Agents
Authors: Umar Manzoor, Kiran Ijaz, Wajiha Shamim, Arshad Ali Shahid
Abstract:
Transaction management is one of the most crucial requirements for enterprise application development, which often requires concurrent access to distributed data shared among multiple applications/nodes. Transactions guarantee the consistency of data records when multiple users or processes perform concurrent operations. The existing Fault Tolerance Infrastructure for Mobile Agents (FTIMA) provides fault-tolerant behavior in distributed transactions and uses a multi-agent system for distributed transaction processing. In the existing FTIMA architecture, data flows through the network and contains personal, private or confidential information. In banking transactions, a minor change in the transaction can cause a great loss to the user. In this paper we have modified the FTIMA architecture to ensure that the user request reaches the destination server securely and without any change. We have used Triple DES for encryption/decryption and the MD5 algorithm for message validity.
Keywords: Distributed Transaction, Security, Mobile Agents, FTIMA Architecture.
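A minimal sketch of the described protection of a request payload, assuming the PyCryptodome library for Triple DES and the standard hashlib module for MD5; the key handling and message format here are illustrative only, not the FTIMA wire format.

```python
import hashlib
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

key = DES3.adjust_key_parity(get_random_bytes(24))   # 3-key Triple DES key

def protect(message: bytes, key: bytes):
    """Encrypt the message with Triple DES (CBC) and attach an MD5 digest."""
    digest = hashlib.md5(message).hexdigest()
    cipher = DES3.new(key, DES3.MODE_CBC)
    ciphertext = cipher.encrypt(pad(message, DES3.block_size))
    return cipher.iv, ciphertext, digest

def verify(iv: bytes, ciphertext: bytes, digest: str, key: bytes) -> bytes:
    """Decrypt and check that the payload was not modified in transit."""
    cipher = DES3.new(key, DES3.MODE_CBC, iv)
    message = unpad(cipher.decrypt(ciphertext), DES3.block_size)
    if hashlib.md5(message).hexdigest() != digest:
        raise ValueError("message integrity check failed")
    return message

iv, ct, md5_digest = protect(b"transfer 100 to account 42", key)
print(verify(iv, ct, md5_digest, key))
```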
1454 Performance Analysis of Expert Systems Incorporating Neural Network for Fault Detection of an Electric Motor
Authors: M. Khatami Rad, N. Jamali, M. Torabizadeh, A. Noshadi
Abstract:
In this paper, an artificial neural network simulator is employed to carry out diagnosis and prognosis on an electric motor, as rotating machinery, in a predictive maintenance setting. Vibration data of the failed motor, covering unbalance, misalignment and bearing faults, were collected for training the neural network. Neural network training was performed for a variety of inputs, and the motor condition was used as the expert training information. The main purpose of applying the neural network as an expert system was to detect the type of failure and to apply preventive maintenance. This study benefits machinery industries by supporting appropriate maintenance, an essential activity for keeping the production process running. Proper maintenance, based on analyzing vibration monitoring data and developing an expert system, is pivotal for preventing possible failures in the operating system and for increasing the availability and effectiveness of the system.
Keywords: Condition based monitoring, expert system, neural network, fault detection, vibration monitoring.
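The following sketch shows a neural-network fault classifier of the kind described, trained on vibration features; the feature set (RMS, kurtosis, dominant frequency), class labels, and network size are assumptions for illustration, not the authors' configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical vibration features per measurement: [RMS, kurtosis, dominant frequency (Hz)]
X = np.array([
    [0.20, 3.0,  50], [0.22, 3.1,  50],    # healthy
    [0.55, 3.2,  50], [0.60, 3.3,  50],    # unbalance (1x running speed)
    [0.40, 3.1, 100], [0.45, 3.0, 100],    # misalignment (2x running speed)
    [0.30, 8.5, 320], [0.35, 9.0, 330],    # bearing fault (impulsive, high kurtosis)
])
y = ["healthy", "healthy", "unbalance", "unbalance",
     "misalignment", "misalignment", "bearing", "bearing"]

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X, y)

# Classify a new vibration measurement
print(model.predict([[0.58, 3.2, 50]]))   # expected: unbalance
```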
1453 Rapid Study on Feature Extraction and Classification Models in Healthcare Applications
Authors: S. Sowmyayani
Abstract:
The advancement of computer-aided design helps the medical and security forces. Some applications include biometric recognition, elderly fall detection, face recognition, cancer recognition, tumor recognition, etc. This paper deals with different machine learning algorithms that are generically applicable to any healthcare system. The problems most focused on are classification and regression. With the rise of big data, machine learning has become particularly important for solving problems. Machine learning uses two types of techniques: supervised learning and unsupervised learning. The former trains a model on known input and output data and predicts future outputs; classification and regression are supervised learning techniques. Unsupervised learning finds hidden patterns in input data; clustering is one such unsupervised learning technique. The above-mentioned models are discussed briefly in this paper.
Keywords: Supervised learning, unsupervised learning, regression, neural network.
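To make the supervised/unsupervised distinction concrete, the following sketch trains a classifier on labeled data and, separately, clusters the same feature matrix without labels using scikit-learn; the example dataset is a stand-in for illustration, not a dataset used in the paper.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)   # example tabular health dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Supervised learning: labels are used during training, future outputs are predicted.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)
print("classification accuracy:", round(accuracy_score(y_test, clf.predict(X_test)), 3))

# Unsupervised learning: no labels, the algorithm finds hidden structure (clusters).
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", [int((clusters == k).sum()) for k in (0, 1)])
```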
1452 Financial Statement Fraud: The Need for a Paradigm Shift to Forensic Accounting
Authors: Ifedapo Francis Awolowo
Abstract:
The unrelenting series of embarrassing audit failures should stimulate a paradigm shift in accounting. In this age of information revolution, there is a need to constantly improve the products or services one offers to the market in order to remain relevant. This study explores the perceptions of external auditors, forensic accountants and accounting academics on whether a paradigm shift to forensic accounting can reduce financial statement fraud. Through a neo-empiricist/inductive analytical approach, the findings reveal that a paradigm shift to forensic accounting might be the right step in the right direction to increase the chances of fraud prevention and detection in financial statements. This research has implications for accounting education and the need to incorporate forensic accounting into the present-day accounting curriculum. Accounting professional bodies, accounting standard setters and accounting firms all have roles to play in incorporating forensic accounting education into the accounting curriculum. In particular, there is a need to alter ISA 240 to make the prevention and detection of fraud the responsibility of both those charged with the management and governance of companies and statutory auditors.
Keywords: Financial statement fraud, forensic accounting, fraud prevention and detection, auditing, audit expectation gap, corporate governance.
1451 Study of Compaction in Hot-Mix Asphalt Using Computer Simulations
Authors: Kasthurirangan Gopalakrishnan, Naga Shashidhar, Xiaoxiong Zhong
Abstract:
During the process of compaction in Hot-Mix Asphalt (HMA) mixtures, the distance between aggregate particles decreases as they come together and air voids are eliminated. By measuring the inter-particle distances in a cut section of an HMA sample, the degree of compaction can be estimated. For this, a calibration curve is generated by a computer simulation technique when the gradation and asphalt content of the HMA mixture are known. A two-dimensional cross section of an HMA specimen was simulated using the mixture design information (gradation, asphalt content and air-void content). Nearest neighbor distance methods such as Delaunay triangulation were used to study the changes in inter-particle distance and area distribution during the process of compaction in HMA. Such computer simulations enable several hundred repetitions in a short period of time, without the need to compact and analyze laboratory specimens, in order to obtain good statistics on the parameters defined. The distributions of the statistical parameters based on computer simulations showed trends similar to those of laboratory specimens.
Keywords: Computer simulations, Hot-Mix Asphalt (HMA), inter-particle distance, image analysis, nearest neighbor.
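The following sketch shows the kind of nearest-neighbor and Delaunay measurement described above on a simulated 2D particle cross-section; the particle coordinates are random stand-ins, not a generated HMA gradation.

```python
import numpy as np
from scipy.spatial import Delaunay, cKDTree

# Simulated 2D centers of aggregate particles in a cross-section (arbitrary units)
rng = np.random.default_rng(0)
points = rng.uniform(0, 100, size=(300, 2))

# Nearest-neighbor distances: k=2 because the closest point to each particle is itself
tree = cKDTree(points)
dists, _ = tree.query(points, k=2)
nearest = dists[:, 1]
print(f"mean nearest-neighbor distance: {nearest.mean():.2f}")

# Delaunay triangulation: edge lengths give another inter-particle distance measure
tri = Delaunay(points)
edges = set()
for simplex in tri.simplices:
    for i in range(3):
        a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
        edges.add((a, b))
edge_lengths = np.array([np.linalg.norm(points[a] - points[b]) for a, b in edges])
print(f"mean Delaunay edge length: {edge_lengths.mean():.2f}")

# As compaction proceeds (higher density, fewer air voids), both distributions
# shift toward smaller distances; comparing them against calibrated simulations
# gives an estimate of the degree of compaction.
```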
1450 Adaptive Network Intrusion Detection Learning: Attribute Selection and Classification
Authors: Dewan Md. Farid, Jerome Darmont, Nouria Harbi, Nguyen Huu Hoa, Mohammad Zahidur Rahman
Abstract:
In this paper, a new learning approach for network intrusion detection using a naïve Bayesian classifier and the ID3 algorithm is presented, which identifies effective attributes from the training dataset, calculates the conditional probabilities for the best attribute values, and then correctly classifies all the examples of the training and testing datasets. Most current intrusion detection datasets are dynamic, complex and contain a large number of attributes. Some of the attributes may be redundant or contribute little to detection. It has been shown that careful attribute selection is important for designing a real-world intrusion detection system (IDS). The purpose of this study is to identify effective attributes from the training dataset to build a classifier for network intrusion detection using data mining algorithms. The experimental results on the KDD99 benchmark intrusion detection dataset demonstrate that this new approach achieves high classification rates and reduces false positives using limited computational resources.
Keywords: Attribute selection, conditional probabilities, information gain, network intrusion detection.
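As a rough sketch of the attribute-selection-plus-classification idea, the following ranks attributes by information gain (mutual information with the class) and then trains a naïve Bayes classifier on the top-ranked ones; the synthetic data, feature count, and number of kept attributes are assumptions, not the authors' KDD99 preprocessing.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical numeric connection features and attack/normal labels
# (in practice these would come from a preprocessed KDD99-style dataset).
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 20))
y = (X[:, 3] + 0.5 * X[:, 7] + 0.1 * rng.standard_normal(1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Information gain (mutual information) of each attribute with the class label
gain = mutual_info_classif(X_train, y_train, random_state=0)
top = np.argsort(gain)[::-1][:5]            # keep the 5 most informative attributes
print("selected attribute indices:", top)

clf = GaussianNB().fit(X_train[:, top], y_train)
print("accuracy with selected attributes:",
      accuracy_score(y_test, clf.predict(X_test[:, top])))
```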
1449 An Evaluation Method of Accelerated Storage Life Test for Typical Mechanical and Electronic Products
Authors: Jinyong Yao, Hongzhi Li, Chao Du, Jiao Li
Abstract:
The reliability of long-term storage products is related to the availability of the whole system, and the evaluation of storage life is therefore of great necessity. These products are usually highly reliable, and little failure information can be collected. In this paper, an analytical method based on data from an accelerated storage life test is proposed to evaluate the reliability index of long-term storage products. First, singularities are eliminated by data normalization and residual analysis. Second, with the preprocessed data, a degradation path model is built to obtain the pseudo-life values. Then, under a life distribution hypothesis, the parameters at the high stress levels are estimated and failure mechanism consistency is verified. Finally, the life distribution under the normal stress level is extrapolated via the acceleration model, making the evaluation of the actual average life available. An application example with a camera stabilization device is provided to illustrate the proposed methodology.
Keywords: Accelerated storage life test, failure mechanism consistency, life distribution, reliability.
1448 Preemptive Possibilistic Linear Programming: Application to Aggregate Production Planning
Authors: Phruksaphanrat B.
Abstract:
This research proposes a Preemptive Possibilistic Linear Programming (PPLP) approach for solving a multi-objective Aggregate Production Planning (APP) problem with interval demand and imprecise unit price and related operating costs. The proposed approach attempts to maximize profit and minimize changes in the workforce. It transforms the total profit objective, which carries imprecise information, into three crisp objective functions: maximizing the most possible value of profit, minimizing the risk of obtaining a lower profit, and maximizing the opportunity of obtaining a higher profit. The workforce change objective is also converted. Then, the problem is solved according to the objective priorities, which is easier than solving the multi-objective problem simultaneously, as is done in existing approaches. The possible range of the interval demand is also used to increase the flexibility of obtaining a better production plan. A practical application to an electronics company is illustrated to show the effectiveness of the proposed model.
Keywords: Aggregate production planning, fuzzy set theory, possibilistic linear programming, preemptive priority.
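The following is a minimal sketch of the preemptive (lexicographic) idea of solving objectives in priority order with linear programming: the higher-priority objective is optimized first and then fixed as a constraint before the next objective is solved. The two-variable model, coefficients, and bounds are toy assumptions, not the paper's APP formulation.

```python
from scipy.optimize import linprog

# Toy two-variable model: x0 = units produced, x1 = workforce change.
# Priority 1: maximize production x0 (linprog minimizes, so use c = [-1, 0]).
# Priority 2: minimize the workforce change x1.
A_ub = [[1, 0],      # plant capacity:             x0 <= 100
        [1, -5]]     # workforce-dependent limit:  x0 <= 50 + 5*x1
b_ub = [100, 50]
bounds = [(0, None), (0, 30)]

# Step 1: optimize the highest-priority objective alone.
res1 = linprog(c=[-1, 0], A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
best_production = -res1.fun                       # 100 units

# Step 2: fix priority 1 at its optimum (x0 >= best_production) and then
# optimize the next objective over the remaining alternative optima.
A_ub2 = A_ub + [[-1, 0]]
b_ub2 = b_ub + [-best_production + 1e-9]          # tiny tolerance for numerics
res2 = linprog(c=[0, 1], A_ub=A_ub2, b_ub=b_ub2, bounds=bounds, method="highs")

print("production:", best_production, "workforce change:", res2.x[1])  # ~100, ~10
```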
1447 Didactic Material Resources in the Teaching of National History and Geography: Selected Results of a Qualitative Survey
Authors: Martin Skutil, Klára Havlíčková, Renata Matějíčková
Abstract:
The paper is the first output of a larger research project conducted at the Faculty of Education of the University of Hradec Králové, which deals with an improved understanding of teachers' work in the subject of National History and Geography. Partial findings focusing on the use of didactic material resources in teaching are presented in this phase. With regard to promoting students' independent activity within learner-based education, the material equipment of schools with didactic aids is becoming increasingly important. This paper is based on qualitative research in which the possibilities and, above all, the reasons for the use of material didactic resources in teaching were investigated through semi-structured interviews. Attention was focused on the ways of working with different teaching aids and their implementation in the educational process. It turns out that teachers accept current constructivist and humanistic approaches to education associated with the requirement to prepare students for life in an information society, and they adjust their teaching accordingly.
Keywords: Primary education, National History and Geography, didactic material resources, qualitative research.
1446 Multidimensional Compromise Programming Evaluation of Digital Commerce Websites
Authors: C. Ardil
Abstract:
Multidimensional compromise programming evaluation of digital commerce websites is essential not only to obtain recommendations for improvement, but also to make comparisons with global business competitors. This research provides a multidimensional decision-making model that prioritizes the objective criteria weights of various commerce websites using a multidimensional compromise solution. The evaluation of digital commerce website quality can be considered a complex information system structure, involving qualitative and quantitative factors, for a multicriteria decision-making problem. The proposed multicriteria decision-making approach mainly consists of three sequential steps for the selection problem. In the first step, three major evaluation criteria are characterized for the website ranking problem. In the second step, the identified critical criteria are weighted using the standard deviation procedure. In the third step, multidimensional compromise programming is applied to rank the digital commerce websites.
Keywords: Standard deviation, commerce website, website evaluation, multicriteria decision making, multicriteria compromise programming, website quality, multidimensional decision analysis.
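A minimal sketch of the three-step procedure described above: normalize a decision matrix, derive criterion weights from the standard deviation of each normalized column, and rank alternatives by their Lp-distance to the ideal solution. The website names, criteria, scores, and the choice p = 2 are assumptions for the example, not data from the study.

```python
import numpy as np

# Decision matrix: rows = websites, columns = criteria (all benefit criteria here),
# e.g. [usability score, content quality score, service quality score].
sites = ["site A", "site B", "site C"]
X = np.array([
    [7.0, 8.5, 6.0],
    [9.0, 6.5, 7.5],
    [6.5, 7.0, 9.0],
])

# Step 1: linear normalization of each criterion to [0, 1].
norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# Step 2: standard deviation weighting - criteria that spread the alternatives
# more widely receive larger weights.
std = norm.std(axis=0)
weights = std / std.sum()

# Step 3: compromise programming - distance of each alternative to the ideal
# point (all normalized criteria equal to 1), here with p = 2.
p = 2
distance = (weights * (1.0 - norm) ** p).sum(axis=1) ** (1.0 / p)

ranking = np.argsort(distance)                 # smaller distance = better
for rank, idx in enumerate(ranking, start=1):
    print(rank, sites[idx], round(float(distance[idx]), 3))
```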
1445 Response Quality Evaluation in Heterogeneous Question Answering System: A Black-box Approach
Authors: Goh Ong Sing, C. Ardil, Wilson Wong, Shahrin Sahib
Abstract:
The evaluation of question answering systems is a major research area that needs much attention. Before the rise of domain-oriented question answering systems based on natural language understanding and reasoning, evaluation was never a problem, as information retrieval-based metrics were readily available for use. However, when question answering systems became more domain-specific, evaluation became a real issue. This is especially true when understanding and reasoning are required to cater for a wider variety of questions and, at the same time, achieve higher-quality responses. This paper discusses the inappropriateness of the existing measures for response quality evaluation, and in a later part the call for new standard measures and the related considerations is brought forward. As a short-term solution for evaluating the response quality of heterogeneous systems, and to demonstrate the challenges in evaluating systems of a different nature, this research presents a black-box approach using observation, a classification scheme and a scoring mechanism to assess and rank three example systems (i.e. AnswerBus, START and NaLURI).
Keywords: Evaluation, question answering, response quality.