Search results for: Mutual Information based Least dependent Component Analysis (MILCA)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14475

13815 Breast Cancer Survivability Prediction via Classifier Ensemble

Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia

Abstract:

This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups and then searches for the most important features among the four groups that maximize the weighted average F-score of a given classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER selected through the feature selection module. On top of them, another classifier gives the final decision based on the output decisions and confidence scores of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated against the exact same SEER data (period 1973-2002). It gives an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.
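A minimal sketch of the stacked-ensemble idea described above, using scikit-learn on synthetic data. This is an assumption-laden stand-in, not the paper's system: GaussianNB substitutes for both the Bayesian network and Naïve Bayes learners (scikit-learn has no Bayesian-network classifier), and the SEER feature groups are simulated by column slices.

```python
# Sketch of a stacked classifier ensemble in which each base learner models a
# different feature group, and a final classifier combines their probabilities.
from sklearn.compose import ColumnTransformer
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def on_features(cols, clf):
    """Restrict a classifier to one feature group (stand-in for the SEER groups)."""
    return make_pipeline(ColumnTransformer([("select", "passthrough", cols)]), clf)

base = [
    ("tree", on_features(list(range(0, 8)), DecisionTreeClassifier(random_state=0))),
    ("bayes_net", on_features(list(range(6, 14)), GaussianNB())),   # stand-in
    ("naive_bayes", on_features(list(range(12, 20)), GaussianNB())),
]
ensemble = StackingClassifier(estimators=base, final_estimator=GaussianNB(),
                              stack_method="predict_proba")
ensemble.fit(X_train, y_train)
print("weighted F-score:", f1_score(y_test, ensemble.predict(X_test), average="weighted"))
```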

Keywords: Classifier ensemble, breast cancer survivability, data mining, SEER.

13814 A Methodology to Analyze Technology Convergence: Patent-Citation Based Technology Input-Output Analysis

Authors: Jeeeun Kim, Sungjoo Lee

Abstract:

This research proposes a methodology for patent-citation-based technology input-output analysis by applying patent information to the input-output analysis originally developed for the dependencies among different industries. For this analysis, a technology relationship matrix and its components, as well as input and technology inducement coefficients, are constructed using patent information. A technology inducement coefficient is then calculated by normalizing the degree of citation from a given IPC (International Patent Classification) class to different IPCs or to the same IPC. Finally, we construct a Dependency Structure Matrix (DSM) based on the technology inducement coefficients to suggest a useful application for this methodology.
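A small sketch of the normalization step described above: a citation-count matrix between IPC classes is row-normalized into inducement coefficients, from which a simple dependency structure matrix is derived. The IPC labels, counts, and threshold are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Illustrative citation-count matrix C[i, j]: citations from patents in IPC
# class i to patents in IPC class j (values made up for the sketch).
ipc = ["G06F", "H04L", "G06N"]
C = np.array([[40., 10., 5.],
              [ 8., 30., 2.],
              [12.,  6., 9.]])

# Technology inducement coefficient: citations normalized by the citing class's
# total outgoing citations (row normalization), analogous to IO input coefficients.
A = C / C.sum(axis=1, keepdims=True)

# A simple Dependency Structure Matrix (DSM): flag relationships whose
# inducement coefficient exceeds an assumed threshold.
DSM = (A > 0.15).astype(int)
print(A.round(2))
print(DSM)
```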

Keywords: Technology spillover effect, technology relationship, IO table, technology inducement coefficients, patent analysis, patent citation.

13813 A High Bitrate Information Hiding Algorithm for Video in Video

Authors: Wang Shou-Dao, Xiao Chuang-Bai, Lin Yu

Abstract:

In high-bitrate information hiding techniques, 1 bit is embedded within each 4 x 4 Discrete Cosine Transform (DCT) coefficient block by means of vector quantization, and the hidden bit can then be effectively extracted at the terminal end. In this paper, high-bitrate information hiding algorithms are summarized, and a video-in-video scheme is implemented. Experimental results show that a host video embedded with a large amount of auxiliary information suffers little visible quality decline. The luminance Peak Signal-to-Noise Ratio (PSNR-Y) of the host video degrades by only 0.22 dB on average, while the hidden information has a high survival rate and remains highly robust under H.264/AVC compression, with an average Bit Error Rate (BER) of 0.015%.
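A toy sketch of embedding one bit per 4 x 4 DCT block. Note the simplification: the paper embeds via vector quantization, whereas this sketch uses parity (QIM-style) quantization of a single mid-frequency coefficient as a stand-in; the coefficient position and quantization step are assumptions.

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_bit(block, bit, coeff=(2, 1), step=8.0):
    """Embed one bit in a 4x4 pixel block by forcing a mid-frequency DCT
    coefficient onto an even/odd multiple of `step` (stand-in for the paper's
    vector-quantization embedding)."""
    d = dctn(block.astype(float), norm="ortho")
    q = np.round(d[coeff] / step)
    if int(q) % 2 != bit:          # adjust parity so it carries the bit
        q += 1
    d[coeff] = q * step
    return idctn(d, norm="ortho")

def extract_bit(block, coeff=(2, 1), step=8.0):
    d = dctn(block.astype(float), norm="ortho")
    return int(np.round(d[coeff] / step)) % 2

rng = np.random.default_rng(0)
block = rng.integers(0, 256, (4, 4))
marked = embed_bit(block, 1)
assert extract_bit(marked) == 1
```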

Keywords: Information Hiding, Embed, Quantification, Extract

13812 Product-Based Industrial Information Systems (Application to the Steel Industry)

Authors: Daniel F. Garcia, Diego Gonzalez

Abstract:

This paper shows a simple and effective approach to the design and implementation of Industrial Information Systems (IIS) oriented to control the characteristics of each individual product manufactured in a production line and also their manufacturing conditions. The particular products considered in this work are large steel strips that are coiled just after their manufacturing. However, the approach is directly applicable to coiled strips in other industries, like paper, textile, aluminum, etc. These IIS provide very detailed information of each manufactured product, which complement the general information managed by the ERP system of the production line. In spite of the high importance of this type of IIS to guarantee and improve the quality of the products manufactured in many industries, there are very few works about them in the technical literature. For this reason, this paper represents an important contribution to the development of this type of IIS, providing guidelines for their design, implementation and exploitation.

Keywords: Data storage, industrial information systems, measurement systems integration, signal acquisition.

13811 Intention to Use Digital Library based on Modified UTAUT Model: Perspectives of Malaysian Postgraduate Students

Authors: Abd Latif Abdul Rahman, Adnan Jamaludin, Zamalia Mahmud

Abstract:

The Unified Theory of Acceptance and Use of Technology (UTAUT) model has demonstrated the influencing factors for generic information systems use such as tablet personal computers (TPC) and mobile communication. However, in the context of digital library systems, there has been very little effort to determine the factors affecting the intention to use a digital library based on the UTAUT model. This paper investigates factors that are expected to influence the intention of postgraduate students to use a digital library based on a modified UTAUT model. The modified model comprises constructs represented by several latent variables, namely performance expectancy (PE), effort expectancy (EE), information quality (IQ) and service quality (SQ), moderated by age, gender and experience in using the digital library. Results show that performance expectancy, effort expectancy and information quality are positively related to the intention to use the digital library, while service quality is negatively related to it. Age and gender show no evidence of any significant interactions, while experience in using the digital library significantly interacts with effort expectancy and the intention to use the digital library. This provides evidence of a moderating effect of experience on the intention to use the digital library. It is expected that this research will shed new light on research into acceptance and intention to use the library in a digital environment.
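A minimal sketch of how a moderating effect like the one reported above can be tested: a regression with an interaction term between effort expectancy and experience, fitted with statsmodels. The construct scores below are synthetic stand-ins, not the study's survey data, and the coefficients are assumptions used only to generate the toy dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic construct scores: PE, EE, IQ, SQ, experience, and intention to use (ITU).
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "PE": rng.normal(size=n), "EE": rng.normal(size=n),
    "IQ": rng.normal(size=n), "SQ": rng.normal(size=n),
    "experience": rng.integers(0, 6, n),
})
df["ITU"] = (0.4 * df.PE + 0.3 * df.EE + 0.2 * df.IQ - 0.1 * df.SQ
             + 0.15 * df.EE * df.experience + rng.normal(scale=0.5, size=n))

# EE:experience is the interaction term capturing the moderating effect of
# experience on effort expectancy described in the abstract.
model = smf.ols("ITU ~ PE + EE + IQ + SQ + experience + EE:experience", data=df).fit()
print(model.summary().tables[1])
```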

Keywords: Intention to use digital library, UTAUT model, performance expectancy, effort expectancy, information quality, service quality.

13810 ISCS (Information Security Check Service) for the Safety and Reliability of Communications

Authors: Jong-Whoi Shin, Jin-Tae Lee, Sang-Soo Jang, Jae-II Lee

Abstract:

The recent widespread use of information and communication technology has greatly changed the information security risks that businesses and institutions encounter. In this situation, in order to ensure security and have confidence in electronic trading, it has become important for organizations to take competent information security measures that provide international confidence that sensitive information is secure. Against this backdrop, the approach to information security checking has become an important issue, which is believed to be common to all countries. The purpose of this paper is to introduce the new information security checking program in Korea and to propose synthetic information security countermeasures under domestic circumstances in order to protect physical equipment, security management and technology, and the operation of security checks for securing services on ISPs (Internet Service Providers), IDCs (Internet Data Centers), and e-commerce (shopping malls, etc.).

Keywords: Information Security Check Service, safety criteria, object enterpriser.

13809 Acoustic Source Localization Based On the Extended Kalman Filter for an Underwater Vehicle with a Pair of Hydrophones

Authors: ByungHoon Kang, Jeawook Shin, Ju-man Song, Hyun-Taek Choi, PooGyeon Park

Abstract:

In this study, we consider a special situation in which only a pair of hydrophones on a moving underwater vehicle is available to localize a fixed acoustic source at a far distance. Trigonometry can be used in this situation with two different DOAs measured at different locations. Note that the distance between the two locations must be measured; we therefore assume that the vehicle sails straight and that the distance moved per unit time is measured continuously. However, the accuracy of the trigonometric localization is highly dependent on the accuracy of the DOAs and of the measured moving distances. Therefore, we propose another method based on the extended Kalman filter that gives a more robust and accurate localization result.
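A minimal bearing-only EKF sketch of the idea above: the state is the fixed source position, the vehicle sails straight with a known displacement per step, and each measurement is the DOA (bearing) from the vehicle to the source. The geometry, noise levels, and process-noise choice are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def ekf_bearing_only(z_bearings, vehicle_xy, x0, P0,
                     R=np.deg2rad(2.0) ** 2, Q=np.eye(2) * 1.0):
    """EKF estimating a fixed 2-D source position from bearing (DOA)
    measurements taken along the vehicle track (illustrative sketch)."""
    x, P = x0.copy(), P0.copy()
    for z, v in zip(z_bearings, vehicle_xy):
        P = P + Q                                     # small process noise keeps the filter responsive
        dx, dy = x[0] - v[0], x[1] - v[1]
        r2 = dx * dx + dy * dy
        h = np.arctan2(dy, dx)                        # predicted bearing
        H = np.array([[-dy / r2, dx / r2]])           # Jacobian of h w.r.t. source position
        S = H @ P @ H.T + R
        K = P @ H.T / S                               # Kalman gain (S is 1x1)
        innov = np.arctan2(np.sin(z - h), np.cos(z - h))  # wrapped angle difference
        x = x + (K * innov).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P

# Vehicle sails straight along +x, 10 m per unit time; the true source is fixed.
truth = np.array([200.0, 150.0])
track = np.array([[10.0 * k, 0.0] for k in range(60)])
rng = np.random.default_rng(0)
meas = [np.arctan2(truth[1] - p[1], truth[0] - p[0]) + rng.normal(0, np.deg2rad(2))
        for p in track]
est, cov = ekf_bearing_only(meas, track, x0=np.array([100.0, 100.0]), P0=np.eye(2) * 1e4)
print("estimated source position:", est.round(1))
```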

Keywords: Localization, acoustic, underwater, extended Kalman filter.

13808 Research on Landscape Pattern Evolution of Land Use in Fuxian Lake Basin Based on RS and GIS

Authors: Jing Zhou, Li Wu

Abstract:

Based on remote sensing image data of land use in the four periods of 1980, 1995, 2005 and 2015, this study quantitatively analyzed the dynamic variation of landscape transfer and landscape pattern in the Fuxian Lake basin by constructing a land use dynamic variation model and using ArcGIS 10.5 and Fragstats 4.2. The results indicate that: (1) From the perspective of land use landscape transfer, the intensity of land use rose slowly from 1980 to 2015; the main decreasing landscape type is farmland, whose net transfer-out of 788.85 hm² is the largest among all transfer-outs, while the main increasing landscape type is construction land, whose net transfer-in of 475.23 hm² is the largest. Meanwhile, the land use landscape variation in the 2005-2015 stage was the most severe of the three periods. (2) From the perspective of land use landscape variation, significant spatial differences are shown: the changes in the north of the basin are significantly greater than those in the south, and the changes on the west coast are apparently greater than those on the east. (3) From the perspective of landscape pattern indices, the number of patches increased over the 35-year period in the basin, and there is little mutual interference between landscape patterns because the patches are relatively discrete. Cultivated land showed a trend of fragmentation, while construction land showed a trend of relative concentration. Sustainable development and biodiversity in this basin are under threat because of the fragmented landscape pattern and poor connectivity.
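A small sketch of how a land-use transfer (transition) matrix between two classified rasters can be tallied with NumPy. The class codes, cell size, and toy rasters below are assumptions for illustration, not the study's 1980 and 2015 maps.

```python
import numpy as np

classes = {0: "farmland", 1: "forest", 2: "water", 3: "construction"}

# Toy classified rasters for two dates (stand-ins for the real land-use maps).
rng = np.random.default_rng(0)
lu_1980 = rng.integers(0, 4, size=(100, 100))
lu_2015 = lu_1980.copy()
change = rng.random(lu_1980.shape) < 0.1            # ~10% of cells change class
lu_2015[change] = rng.integers(0, 4, size=change.sum())

# Transfer matrix T[i, j]: number of cells that moved from class i to class j.
k = len(classes)
T = np.bincount(lu_1980.ravel() * k + lu_2015.ravel(), minlength=k * k).reshape(k, k)

cell_area_hm2 = 0.09                                 # assumed 30 m pixels = 0.09 hm^2
net_change = (T.sum(axis=0) - T.sum(axis=1)) * cell_area_hm2  # transfer-in minus transfer-out
for i, name in classes.items():
    print(f"{name:12s} net change: {net_change[i]:+8.2f} hm^2")
```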

Keywords: Land use, landscape pattern evolution, landscape pattern index, Fuxian Lake basin.

13807 The Role of Online Videos in Undergraduate Casual-Leisure Information Behaviors

Authors: Nei-Ching Yeh

Abstract:

This study describes undergraduate casual-leisure information behaviors relevant to online videos. Diaries and in-depth interviews were used to collect data. Twenty-four undergraduates participated in this study (9 men, 15 women; all were aged 18–22 years). This study presents a model of casual-leisure information behaviors and contributes new insights into user experience in casual-leisure settings, such as online video programs, with implications for other information domains.

Keywords: Casual-leisure information behaviors, information behavior, online videos, role.

13806 Selective Solvent Extraction of Calcium and Magnesium from Concentrate Nickel Solutions Using Mixtures of Cyanex 272 and D2EHPA

Authors: Alexandre S. Guimarães, Marcelo B. Mansur

Abstract:

The performance of the organophosphorus extractants Cyanex 272 and D2EHPA in the purification of concentrated nickel sulfate solutions was evaluated. Batch-scale tests were carried out in the pH range of 2 to 7 using a laboratory solution simulating concentrated nickel liquors such as those typically obtained when sulfate intermediates from nickel laterite are re-leached and treated for the selective removal of cobalt, zinc, manganese and copper with Cyanex 272 ([Ca] = 0.57 g/L, [Mg] = 3.2 g/L, and [Ni] = 88 g/L). The increase in the concentration of D2EHPA favored calcium extraction. The extraction of magnesium depends on the pH and on the ratio of the extractants D2EHPA and Cyanex 272 in the organic phase. The composition of the investigated organic phase did not affect nickel extraction. The number of stages depends on the magnesium extraction. The most favorable operating condition to selectively remove calcium and magnesium was determined.

Keywords: Solvent extraction, organophosphorus extractants, alkaline earth metals, nickel.

13805 iCCS: Development of a Mobile Web-Based Student Integrated Information System Using Hill Climbing Algorithm

Authors: Maria Cecilia G. Cantos, Lorena W. Rabago, Bartolome T. Tanguilig III

Abstract:

This paper describes a conducive and structured information exchange environment for the students of the College of Computer Studies at Manuel S. Enverga University Foundation. The system was developed to help students check their academic results, manage their profiles, and perform self-enlistment, and to assist them in managing their academic status, which can also be viewed on mobile phones. Developing class schedules in the traditional way is a long process that involves making a large number of choices. With the hill climbing algorithm, however, class scheduling, particularly the selection of the courses to be taken by the student in line with the curriculum, can be performed automatically and end up with an optimal solution. The proponent used Rapid Application Development (RAD) as the system development method, PHP as the programming language, and MySQL as the database.
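A minimal hill-climbing sketch of the scheduling idea: start from a candidate course selection and repeatedly accept single-course swaps that do not worsen a conflict-and-curriculum penalty. The course list, timeslots, and scoring function are illustrative assumptions, not the iCCS data or rules.

```python
import random

# Toy data: course -> timeslot, and the set of courses required by the curriculum.
timeslot = {"Math1": 1, "Prog1": 2, "Phys1": 3, "Eng1": 4, "PE1": 1, "Ethics": 2}
required = {"Math1", "Prog1", "Phys1", "Eng1"}

def cost(schedule):
    """Penalty = timeslot clashes + required courses left out (assumed scoring)."""
    slots = [timeslot[c] for c in schedule]
    clashes = len(slots) - len(set(slots))
    missing = len(required - set(schedule))
    return clashes + 2 * missing

def hill_climb(courses, size=4, iters=1000, seed=0):
    rng = random.Random(seed)
    current = rng.sample(courses, size)
    for _ in range(iters):
        neighbour = current.copy()
        neighbour[rng.randrange(size)] = rng.choice(courses)   # swap one course
        if len(set(neighbour)) == size and cost(neighbour) <= cost(current):
            current = neighbour                                # accept non-worsening move
    return current

best = hill_climb(list(timeslot))
print(best, "cost:", cost(best))
```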

Keywords: Hill climbing algorithm, integrated system, mobile web-based, student information system.

13804 An Improved C-Means Model for MRI Segmentation

Authors: Ying Shen, Weihua Zhu

Abstract:

Medical images are important in helping to identify different diseases; for example, magnetic resonance imaging (MRI) can be used to investigate the brain, spinal cord, bones, joints, breasts, blood vessels, and heart. Image segmentation, in medical image analysis, is usually the first step in finding regions with similar color, intensity or texture so that the diagnosis can be further carried out based on these features. This paper introduces an improved C-means model to segment MRI images. The model is based on information entropy and evaluates the segmentation results while achieving global optimization. The contributions are significant. Firstly, a Genetic Algorithm (GA) is used to achieve global optimization in this model, which the fuzzy C-means clustering algorithm (FCMA) is not capable of doing. Secondly, the information entropy after segmentation is used to measure the effectiveness of the MRI image processing. Experimental results show that the proposed model outperforms traditional approaches.
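A minimal sketch of the entropy-based evaluation idea: after clustering an image, compute the Shannon entropy of the resulting label map. Here a plain intensity C-means (k-means-like) loop stands in for the paper's GA-optimized model, and treating lower label-map entropy as "more compact" is an assumption of this sketch.

```python
import numpy as np

def segmentation_entropy(labels):
    """Shannon entropy (bits) of a segmentation label map."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def c_means_like(image, k, iters=20, seed=0):
    """Plain intensity clustering, a stand-in for the paper's GA-optimized C-means."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(image.ravel(), size=k, replace=False).astype(float)
    for _ in range(iters):
        labels = np.argmin(np.abs(image[..., None] - centers), axis=-1)
        centers = np.array([image[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

rng = np.random.default_rng(1)
fake_mri = np.clip(rng.normal(120, 40, size=(64, 64)), 0, 255)   # synthetic image
labels = c_means_like(fake_mri, k=3)
print("segmentation entropy (bits):", round(segmentation_entropy(labels), 3))
```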

Keywords: Magnetic Resonance Image, C-means model, image segmentation, information entropy.

13803 Embedding a Large Amount of Information Using High Secure Neural Based Steganography Algorithm

Authors: Nameer N. EL-Emam

Abstract:

In this paper, we construct and implement a new steganography algorithm based on a learning system to hide a large amount of information in a color BMP image. We use adaptive image filtering and adaptive non-uniform image segmentation with bit replacement on the appropriate pixels. These pixels are selected randomly rather than sequentially, using a new concept defined by main cases with sub-cases for each byte in one pixel. Following the design steps, we have defined 16 main cases with their sub-cases that cover all aspects of embedding the input information into a color bitmap image. High security is achieved through four layers of security that make it difficult to break the encryption of the input information and also confuse steganalysis. A learning system is introduced at the fourth layer of security through a neural network; this layer is used to increase the difficulty of statistical attacks. Our results against statistical and visual attacks are discussed before and after using the learning system, and we make a comparison with a previous steganography algorithm. We show that our algorithm can efficiently embed a large amount of information, reaching 75% of the image size (replacing up to 18 bits per pixel), with high output quality.
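A bare-bones sketch of the random-pixel bit-replacement idea: a keyed PRNG (a stand-in for the paper's case-based selection and four security layers) picks pixel positions, and message bits replace their least significant bits. The neural-network layer and the 16 main cases are not reproduced here.

```python
import numpy as np

def embed(image, message_bits, key=1234):
    """Replace the LSB of randomly chosen pixels; the keyed PRNG stands in for
    the paper's case-based pixel selection and security layers."""
    flat = image.ravel().copy()
    order = np.random.default_rng(key).permutation(flat.size)[: len(message_bits)]
    flat[order] = (flat[order] & ~np.uint8(1)) | np.asarray(message_bits, dtype=np.uint8)
    return flat.reshape(image.shape)

def extract(image, n_bits, key=1234):
    flat = image.ravel()
    order = np.random.default_rng(key).permutation(flat.size)[:n_bits]
    return (flat[order] & 1).astype(np.uint8)

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
bits = rng.integers(0, 2, size=100, dtype=np.uint8)
stego = embed(cover, bits)
assert np.array_equal(extract(stego, 100), bits)
```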

Keywords: Adaptive image segmentation, hiding with high capacity, hiding with high security, neural networks, Steganography.

13802 Expert System for Sintering Process Control Based on the Information about Solid-Fuel Flow Composition

Authors: Yendiyarov Sergei, Zobnin Boris, Petrushenko Sergei

Abstract:

Usually, the solid-fuel flow of an iron ore sinter plant consists of several types of solid fuels, which differ from each other. Information about the composition of the solid-fuel flow usually arrives every 8-24 hours, so it clearly cannot be used to control the sintering process in real time. For this reason, we propose an expert system which uses indirect measurements from the process to obtain the composition of the solid-fuel flow by solving an optimization task. This information can then be used to control the sintering process. The proposed technique can be successfully used to improve sinter quality and reduce the amount of solid fuel used by the process.
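A toy illustration of recovering blend fractions from indirect measurements by optimization, using a tiny particle swarm (the keyword list mentions particle swarm optimization). The fuel properties, the "measured" blend values, and the PSO parameters are assumptions for the sketch, not plant data.

```python
import numpy as np

# Assumed properties of three candidate solid fuels: (ash %, calorific value MJ/kg).
props = np.array([[12.0, 27.0], [18.0, 23.0], [30.0, 18.0]])
measured = np.array([17.0, 23.8])       # assumed indirect measurement of the blend

def loss(fracs):
    """Mismatch between measured blend properties and those predicted by the fractions."""
    f = np.clip(fracs, 0, None)
    f = f / f.sum() if f.sum() > 0 else np.full(3, 1 / 3)
    return float(np.sum((f @ props - measured) ** 2))

def pso(n_particles=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.random((n_particles, 3)); v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([loss(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)   # inertia + cognitive + social
        x = x + v
        vals = np.array([loss(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        g = pbest[pbest_val.argmin()].copy()
    g = np.clip(g, 0, None)
    return g / g.sum()

print("estimated blend fractions:", pso().round(3))
```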

Keywords: sintering process, particle swarm optimization, optimal control, expert system, solid-fuel

13801 The Study on Mechanical Properties of Graphene Using Molecular Mechanics

Authors: I-Ling Chang, Jer-An Chen

Abstract:

The elastic properties and fracture of two-dimensional graphene were calculated purely from the atomic bonding (stretching and bending) based on the molecular mechanics method. Considering the representative unit cell of graphene under various loading conditions, the deformations of carbon bonds and the variations of the interlayer distance could be obtained numerically under the geometry constraints and the minimum-energy assumption. In the elastic region, graphene was found to be in-plane isotropic. Meanwhile, the in-plane deformation of the representative unit cell is not uniform along the armchair direction due to the discrete and non-uniform distribution of the atoms. The fracture of graphene could be predicted using fracture criteria based on the critical bond length, beyond which the bond breaks. It was noticed that the fracture behavior was direction dependent, which is consistent with molecular dynamics simulation results.
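A toy molecular-mechanics relaxation in the spirit of the abstract: a harmonic bond-stretch plus angle-bend energy for a three-atom carbon fragment is minimized under the minimum-energy assumption. The force constants and geometry are assumed values for illustration, not the paper's parameter set or unit cell.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed harmonic force constants and equilibrium geometry for a C-C-C fragment.
K_R, R0 = 652.0, 0.142                  # stretch constant and bond length (nm)
K_THETA, THETA0 = 0.876, np.deg2rad(120.0)  # bend constant and bond angle

def energy(coords):
    """Bond-stretch + angle-bend energy of a 3-atom planar fragment."""
    p = coords.reshape(3, 2)
    b1, b2 = p[0] - p[1], p[2] - p[1]
    r1, r2 = np.linalg.norm(b1), np.linalg.norm(b2)
    theta = np.arccos(np.clip(b1 @ b2 / (r1 * r2), -1.0, 1.0))
    return (0.5 * K_R * ((r1 - R0) ** 2 + (r2 - R0) ** 2)
            + 0.5 * K_THETA * (theta - THETA0) ** 2)

# Start from a distorted geometry and relax it.
x0 = np.array([[-0.15, 0.02], [0.0, 0.0], [0.13, 0.05]]).ravel()
res = minimize(energy, x0, method="BFGS")
p = res.x.reshape(3, 2)
print("relaxed bond lengths (nm):",
      round(np.linalg.norm(p[0] - p[1]), 4), round(np.linalg.norm(p[2] - p[1]), 4))
```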

Keywords: Energy minimization, fracture, graphene, molecular mechanics.

13800 Evaluation of Urban Development Proposals: An ANP Approach

Authors: T. Gómez-Navarro, M. García-Melón, D. Díaz-Martín, S. Acuna-Dutra

Abstract:

In this paper, a new approach to prioritizing urban planning projects in an efficient and reliable way is presented. It is based on environmental pressure indices and multicriteria decision methods. The paper introduces a rigorous method, of acceptable complexity, for rank-ordering urban development proposals according to their environmental pressure. The technique combines the use of Environmental Pressure Indicators, the aggregation of the indicators into an Environmental Pressure Index by means of the Analytic Network Process (ANP) method, and the interpretation of the information obtained from the experts during the decision-making process. The ANP method allows the aggregation of the experts' judgments on each of the indicators into one Environmental Pressure Index. In addition, ANP is based on utility ratio functions, which are the most appropriate for the analysis of uncertain data such as experts' estimations. Finally, unlike other multicriteria techniques, ANP allows the decision problem to be modelled using the relationships among dependent criteria. The method has been applied to the proposal for the urban development of La Carlota airport in Caracas (Venezuela). The Venezuelan Government would like to see a recreational project developed on the abandoned area, which would mean a significant improvement for the capital. There are three options on the table, currently under evaluation: a Health Club, a Residential Area and a Theme Park. The participating experts agreed that the method proposed in this paper is useful and an improvement over traditional techniques such as environmental impact studies, life-cycle analysis, etc. They find the results obtained coherent, the process sufficiently rigorous and precise, and the use of resources significantly less than in other methods.
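The core aggregation step in AHP/ANP derives a priority vector from a pairwise comparison matrix as its principal eigenvector. The sketch below does this by power iteration and reports a consistency ratio; the comparison values are illustrative assumptions, and the full ANP supermatrix limit used to handle dependent criteria is not reproduced.

```python
import numpy as np

def priority_vector(A, iters=100):
    """Principal right eigenvector of a pairwise comparison matrix (power iteration)."""
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):
        w = A @ w
        w /= w.sum()
    lam = float((A @ w / w).mean())          # estimate of the principal eigenvalue
    return w, lam

# Assumed expert comparison of three environmental-pressure indicators.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, lam = priority_vector(A)
n = A.shape[0]
CI = (lam - n) / (n - 1)                     # consistency index
CR = CI / 0.58                               # Saaty random index for n = 3
print("weights:", w.round(3), " consistency ratio:", round(CR, 3))
```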

Keywords: Environmental pressure indicators, multicriteria decision analysis, analytic network process.

13799 CBCTL: A Reasoning System of Temporal Epistemic Logic with Communication Channel

Authors: Suguru Yoshioka, Satoshi Tojo

Abstract:

This paper introduces a temporal epistemic logic, CBCTL, that updates agents' belief states through communications among them, based on computational tree logic (CTL). In practical environments, communication channels between agents may not be secure, and in the worst cases agents might suffer blackouts. In this study, we provide the inform* protocol based on the FIPA ACL and declare the presence of secure channels between two agents, dependent on time. Thus, the belief state of each agent is updated along with the progress of time. We present a prover, that is, a reasoning system which, for a given formula in a given situation of an agent, returns the proof if the formula is directly provable or can be validated through chains of communications.

Keywords: communication channel, computational tree logic, reasoning system, temporal epistemic logic.

13798 Perception-Oriented Model Driven Development for Designing Data Acquisition Process in Wireless Sensor Networks

Authors: K. Indra Gandhi

Abstract:

Wireless Sensor Networks (WSNs) have always been characterized by application-specific sensing, relaying and collection of information for further analysis. However, software development has not been considered a separate entity in this process of data collection, which has posed severe limitations on software development for WSNs. Software development for WSNs is a complex process, since the components involved are data-driven, network-driven and application-driven in nature. This implies that there is a tremendous need for separation of concerns from the software development perspective. A layered approach to developing the data acquisition design based on Model Driven Development (MDD) is proposed, as the sensed data collection process itself varies depending upon the application under consideration. This work focuses on a layered view of the data acquisition process so as to ease software development. A metamodel is proposed that enables reusability and realization of the software as an adaptable component for WSN systems. Furthermore, observation of users' perception indicates that the proposed model helps improve the programmer's productivity by realizing the collaborative system involved.

Keywords: Model-driven development, wireless sensor networks, data acquisition, separation of concern, layered design.

13797 A Secure Auditing Framework for Load Balancing in Cloud Environment

Authors: R. Geetha, T. Padmavathy

Abstract:

Security auditing is an important aspect to be considered by cloud service customers. It is basically a certification process to audit the controls that deliver the security requirements. Security audits are conducted by trained and qualified staff belonging to an independent auditing organization, and they must be carried out against a standard set of security controls. A proper check must be made that the cloud user has adequate reporting and logging facilities within the customer's system, ensuring an appropriate business and operational flow of data through the cloud service. We propose a cloud-based secure auditing framework, which enables a trusted authority to safely store its secret data on semi-trusted cloud service providers and to selectively share that secret data with a wide scope of data recipients, in order to reduce the key management complexity for data owners and data recipients. Unlike previous cloud-based data frameworks, data owners upload their secret data into the cloud using a static and dynamic auditing scheme. A further feature is that, if any data recipient needs to download an individual file, the recipient sends the request to the authority; the data owner holds the access control, and only if the owner agrees to share the original file with the data recipient is the recipient's request accepted. Once the request for the file is acknowledged, the recipient downloads the original file, and the file transfer time and date and the download time and date are monitored by the auditor. In addition to the deduplication concept, reduced cloud memory usage through dynamic document distribution is proposed.

Keywords: Cloud computing, cloud storage auditing, data integrity, key exposure.

13796 Performance Analysis of MUSIC, Root-MUSIC and ESPRIT DOA Estimation Algorithm

Authors: N. P. Waweru, D. B. O. Konditi, P. K. Langat

Abstract:

Direction of Arrival (DOA) estimation refers to defining a mathematical function, called a pseudospectrum, that gives an indication of the angle at which a signal impinges on the antenna array. This estimation is an efficient method of improving the quality of service in a communication system by focusing reception and transmission only in the estimated direction, thereby increasing fidelity with a provision to suppress interferers. This improvement is largely dependent on the performance of the algorithm employed in the estimation. Many DOA algorithms exist, amongst which are MUSIC, Root-MUSIC and ESPRIT. In this paper, the performance of these three algorithms is analyzed in terms of complexity, accuracy (as assessed and characterized by the CRLB) and memory requirements in various environments and for various array sizes. It is found that the three algorithms are high resolution and dependent on the operating environment and the array size.
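A compact sketch of the MUSIC pseudospectrum for a uniform linear array: form the autocorrelation matrix, take its noise subspace by eigendecomposition, and scan steering vectors over candidate angles. The array size, source angles, SNR, and grid are illustrative assumptions, not the paper's simulation setup.

```python
import numpy as np

def music_spectrum(X, n_sources, d=0.5, angles=np.arange(-90, 90.5, 0.5)):
    """MUSIC pseudospectrum for a ULA. X: snapshot matrix (elements x snapshots),
    d: element spacing in wavelengths."""
    M = X.shape[0]
    R = X @ X.conj().T / X.shape[1]                    # sample autocorrelation matrix
    eigval, eigvec = np.linalg.eigh(R)                 # eigenvalues ascending
    En = eigvec[:, : M - n_sources]                    # noise subspace
    theta = np.deg2rad(angles)
    k = np.arange(M)[:, None]
    steering = np.exp(-2j * np.pi * d * k * np.sin(theta))   # M x n_angles
    denom = np.sum(np.abs(En.conj().T @ steering) ** 2, axis=0)
    return angles, 1.0 / denom

# Two narrowband sources at -20 and 35 degrees, 8-element half-wavelength ULA.
rng = np.random.default_rng(0)
M, N, doas = 8, 200, np.deg2rad([-20.0, 35.0])
A = np.exp(-2j * np.pi * 0.5 * np.arange(M)[:, None] * np.sin(doas))
S = (rng.normal(size=(2, N)) + 1j * rng.normal(size=(2, N))) / np.sqrt(2)
X = A @ S + 0.1 * (rng.normal(size=(M, N)) + 1j * rng.normal(size=(M, N)))

ang, P = music_spectrum(X, n_sources=2)
peaks = np.where((P[1:-1] > P[:-2]) & (P[1:-1] > P[2:]))[0] + 1   # local maxima
top = peaks[np.argsort(P[peaks])[-2:]]
print("estimated DOAs (deg):", np.sort(ang[top]))
```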

Keywords: Direction of Arrival, Autocorrelation matrix, Eigenvalue decomposition, MUSIC, ESPRIT, CRLB.

13795 Component Based Framework for Authoring and Multimedia Training in Mathematics

Authors: Ion Smeureanu, Marian Dardala, Adriana Reveiu

Abstract:

The new programming technologies allow for the creation of components which can be automatically or manually assembled to reach a new experience in knowledge understanding and mastering or in acquiring skills for a specific knowledge area. The project proposes an interactive framework that permits the creation, combination and utilization of components that are specific to mathematical training in high schools. The main framework objectives are:
• authoring lessons by the teacher or the students; all they need are simple operating skills for Equation Editor (or something similar, or LaTeX); the rest are just drag & drop operations, inserting data into a grid, or navigating through menus
• allowing audio presentations of mathematical texts and solving hints (more easily understood by the students)
• offering graphical representations of a mathematical function edited in Equation Editor
• storing learning objects in a database
• storing predefined lessons (efficient for expressions and commands, the rest being calculations; allows a high compression)
• viewing and/or modifying predefined lessons, according to the curricula
The whole framework is focused on a minicompiler for mathematical expressions, storing code that is later used for different purposes (tables, graphics, and optimisations). Regarding programming technologies, a Visual C# .NET implementation is proposed. New and innovative digital learning objects for mathematics will be developed; they are capable of interpreting, contextualizing and reacting depending on the architecture where they are assembled.

Keywords: Adaptor, automatic assembly learning component and user control.

13794 Collective Redress in Consumer Protection in South East Europe: Cross-National Comparisons, Issues of Commonality and Difference

Authors: Veronika Efremova

Abstract:

In recent decades, there have been significant developments in the European Union in the field of collective consumer redress. The South East European (SEE) countries covered by this paper, in line with their EU accession priorities and their duties under the Stabilisation and Association Agreements, have to harmonize their national laws with the relevant EU acquis for consumer protection (Chapter 28: Consumer and Health Protection). In these countries, only minimal compliance has been achieved. SEE countries have introduced rudimentary collective redress mechanisms, with modest enforcement of collective redress and little case law. This paper is based on comprehensive interdisciplinary research conducted for SEE countries on common principles for injunctive and compensatory collective redress mechanisms, emphasizing cross-national comparisons and underlining issues of commonality and difference, with the aim of developing recommendations for adequate enforcement of collective redress. SEE countries follow a sectoral approach to regulating collective redress, contrary to the majority of EU Member States, which have adopted a horizontal approach. In most SEE countries, the laws recognize only injunctive, not compensatory, collective redress in consumer protection. All the stakeholders responsible for the implementation of collective redress in SEE countries lack information and awareness of collective redress mechanisms and the way they function in practice. Therefore, specific actions are needed in these countries to make the whole system of collective redress for consumer protection operational and efficient. Taking into consideration the various designated stakeholders in collective redress in each SEE country, their mutual coordination and cooperation are needed in order to develop consumer protection systems and policies. By putting the national collective redress mechanisms into practice, effective access to justice for all consumers and the principle of the rule of law will be secured, and appropriate procedural guarantees to avoid abusive litigation will be ensured.

Keywords: Collective redress mechanism, consumer protection, commonality and difference, South East Europe.

13793 Structure of Covering-based Rough Sets

Authors: Shiping Wang, Peiyong Zhu, William Zhu

Abstract:

Rough set theory is a very effective tool for dealing with granularity and vagueness in information systems. Covering-based rough set theory is an extension of classical rough set theory. In this paper, we first present the characteristics of the reducible element and the minimal description in covering-based rough sets through down-sets. Then we establish lattices and topological spaces in covering-based rough sets through down-sets and up-sets. In this way, covering-based rough sets can be investigated from both algebraic and topological points of view.

Keywords: Covering, poset, down-set, lattice, topological space, topological base.

13792 Financial Information and Collective Bargaining: Conflicting or Complementing?

Authors: Humayun Murshed, Shibly Abdullah

Abstract:

The research conducted in the early seventies apparently assumed the existence of a universal decision model for union negotiators and, furthermore, tended to regard financial information as a 'neutral' input into a rational decision-making process. However, research in the eighties began to question the neutrality of financial information as an input in collective bargaining, viewing it instead as a potentially effective means of controlling the labour force. Furthermore, this later research also started challenging the simplistic assumptions, relating particularly to union objectives, which had underpinned the earlier search for universal union decision models. Despite the above developments, there seems to be a dearth of studies in developing countries concerning the use of financial information in collective bargaining. This paper seeks to begin to remedy this deficiency. Utilising a case study approach based on two enterprises, one in the public sector and the other a multinational, the universal decision model is rejected, and it is argued that the decision whether or not to use financial information is a contingent one, with such a contingency largely defined by the context and environment in which both union and management negotiators work. An attempt is also made to identify the factors constraining as well as promoting the use of financial information in collective bargaining, these being regarded as unique to the organisations within which the case studies were conducted.

Keywords: Collective Bargaining, Developing Countries, Disclosures, Financial Information.

13791 The Optimal Equilibrium Capacity of Information Hiding Based on Game Theory

Authors: Ziquan Hu, Kun She, Shahzad Ali, Kai Yan

Abstract:

Game theory can be used to analyze conflicting issues in the field of information hiding. In this paper, a 2-phase game is used to build an embedder-attacker system to analyze the limits of the hiding capacity of embedding algorithms: the embedder minimizes the expected damage and the attacker maximizes it. In the system, the embedder first consumes its resource to build embedded units (EUs) and inserts the secret information into the EUs. Then the attacker distributes its resource evenly among the attacked EUs. The expected equilibrium damage, which is the maximum damage from the point of view of the attacker and the minimum from the point of view of the embedder, is evaluated for the case in which the attacker attacks a subset of all the EUs. Furthermore, the optimal equilibrium capacity of hiding information is calculated through the optimal number of EUs carrying the embedded secret information. Finally, illustrative examples of the optimal equilibrium capacity are presented.

Keywords: 2-Phase Game, Expected Equilibrium Damage, Information Hiding, Optimal Equilibrium Capacity.

13790 A Context-Sensitive Algorithm for Media Similarity Search

Authors: Guang-Ho Cha

Abstract:

This paper presents a context-sensitive media similarity search algorithm. One of the central problems in media search is the semantic gap between the low-level features computed automatically from media data and the human interpretation of them. This is because the notion of similarity is usually based on high-level abstraction, but the low-level features sometimes do not reflect human perception. Many media search algorithms have used the Minkowski metric to measure similarity between image pairs. However, such functions cannot adequately capture the characteristics of the human visual system or the nonlinear relationships in the contextual information given by the images in a collection. Our search algorithm tackles this problem by employing a similarity measure and a ranking strategy that reflect the nonlinearity of human perception and the contextual information in a dataset. Similarity search in an image database based on this contextual information shows encouraging experimental results.

Keywords: Context-sensitive search, image search, media search, similarity ranking, similarity search.

13789 DHCP Message Authentication with an Effective Key Management

Authors: HongIl Ju, JongWook Han

Abstract:

In this paper, we describe an authentication scheme for DHCP (Dynamic Host Configuration Protocol) messages which provides efficient key management and reduces the danger of replay attacks without requiring an additional packet for replay-attack protection. The scheme supports mutual authentication and provides both entity authentication and message authentication. We applied the DHCP message authentication to home network environments and tested it through a home gateway.
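A minimal sketch of keyed message authentication with replay detection, the general mechanism the abstract relies on: the sender appends a monotonically increasing counter and an HMAC tag, and the receiver verifies both. This illustrates the idea only; it does not follow the actual DHCP authentication option format, and the shared key and counter handling are assumptions.

```python
import hashlib
import hmac
import struct

SHARED_KEY = b"home-gateway-shared-secret"   # assumed pre-shared key

def protect(dhcp_payload: bytes, counter: int) -> bytes:
    """Append a replay-detection counter and an HMAC-SHA256 tag to the message."""
    body = dhcp_payload + struct.pack("!Q", counter)
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).digest()
    return body + tag

def verify(message: bytes, last_counter: int):
    """Return (payload, counter) if the tag checks out and the counter is fresh."""
    body, tag = message[:-32], message[-32:]
    if not hmac.compare_digest(tag, hmac.new(SHARED_KEY, body, hashlib.sha256).digest()):
        raise ValueError("bad authentication tag")
    payload, counter = body[:-8], struct.unpack("!Q", body[-8:])[0]
    if counter <= last_counter:
        raise ValueError("replayed message")
    return payload, counter

msg = protect(b"DHCPREQUEST ...", counter=42)
payload, ctr = verify(msg, last_counter=41)
print(payload, ctr)
```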

Keywords: DHCP, authentication, key management, replay attack, home network.

13788 Knowledge Based Wear Particle Analysis

Authors: Mohammad S. Laghari, Qurban A. Memon, Gulzar A. Khuwaja

Abstract:

The paper describes a knowledge based system for analysis of microscopic wear particles. Wear particles contained in lubricating oil carry important information concerning machine condition, in particular the state of wear. Experts (Tribologists) in the field extract this information to monitor the operation of the machine and ensure safety, efficiency, quality, productivity, and economy of operation. This procedure is not always objective and it can also be expensive. The aim is to classify these particles according to their morphological attributes of size, shape, edge detail, thickness ratio, color, and texture, and by using this classification thereby predict wear failure modes in engines and other machinery. The attribute knowledge links human expertise to the devised Knowledge Based Wear Particle Analysis System (KBWPAS). The system provides an automated and systematic approach to wear particle identification which is linked directly to wear processes and modes that occur in machinery. This brings consistency in wear judgment prediction which leads to standardization and also less dependence on Tribologists.
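A minimal sketch of the knowledge-based idea: morphological attributes measured from a particle are matched against expert rules to name a wear mode. The attribute set loosely follows the abstract, but the thresholds and rules are illustrative assumptions, not the KBWPAS knowledge base.

```python
from dataclasses import dataclass

@dataclass
class Particle:
    size_um: float        # major dimension in micrometres
    aspect_ratio: float   # length / width
    edge_detail: str      # "smooth", "jagged", "curved"
    thickness_ratio: float

def classify(p: Particle) -> str:
    """Toy expert rules mapping particle morphology to a wear mode (assumed thresholds)."""
    if p.edge_detail == "curved" and p.aspect_ratio > 5:
        return "cutting wear (abrasive swarf)"
    if p.size_um > 15 and p.edge_detail == "jagged":
        return "severe sliding wear"
    if p.thickness_ratio < 0.1 and p.edge_detail == "smooth":
        return "rubbing wear (normal platelet)"
    if p.aspect_ratio < 1.5 and p.size_um > 20:
        return "fatigue spall"
    return "unclassified - refer to tribologist"

print(classify(Particle(size_um=25, aspect_ratio=1.2,
                        edge_detail="jagged", thickness_ratio=0.3)))
```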

Keywords: Computer vision, knowledge based systems, morphology, wear particles.

13787 Lean Production to Increase Reproducibility and Work Safety in the Laser Beam Melting Process Chain

Authors: C. Bay, A. Mahr, H. Groneberg, F. Döpper

Abstract:

Additive Manufacturing processes are becoming increasingly established in the industry for the economic production of complex prototypes and functional components. Laser beam melting (LBM), the most frequently used Additive Manufacturing technology for metal parts, has been gaining in industrial importance for several years. The LBM process chain – from material storage to machine set-up and component post-processing – requires many manual operations. These steps often depend on the manufactured component and are therefore not standardized. These operations are often not performed in a standardized manner, but depend on the experience of the machine operator, e.g., levelling of the build plate and adjusting the first powder layer in the LBM machine. This lack of standardization limits the reproducibility of the component quality. When processing metal powders with inhalable and alveolar particle fractions, the machine operator is at high risk due to the high reactivity and the toxic (e.g., carcinogenic) effect of the various metal powders. Faulty execution of the operation or unintentional omission of safety-relevant steps can impair the health of the machine operator. In this paper, all the steps of the LBM process chain are first analysed in terms of their influence on the two aforementioned challenges: reproducibility and work safety. Standardization to avoid errors increases the reproducibility of component quality as well as the adherence to and correct execution of safety-relevant operations. The corresponding lean method 5S will therefore be applied, in order to develop approaches in the form of recommended actions that standardize the work processes. These approaches will then be evaluated in terms of ease of implementation and their potential for improving reproducibility and work safety. The analysis and evaluation showed that sorting tools and spare parts as well as standardizing the workflow are likely to increase reproducibility. Organizing the operational steps and production environment decreases the hazards of material handling and consequently improves work safety.

Keywords: Additive manufacturing, lean production, reproducibility, work safety.

13786 Improvement of the Evaluation Method for Information Security Levels of CIIP (Critical Information Infrastructure Protection)

Authors: Dong-Young Yoo, Jong-Whoi Shin, Gang Shin Lee, Jae-Il Lee

Abstract:

As the dysfunctions of the information society and social development progress, intrusion problems such as malicious replies, spam mail, private information leakage, phishing and pharming, and side effects such as the spread of unwholesome information and privacy invasion, are becoming serious social problems. Illegal access to information is also becoming a problem as the exchange and sharing of information increases with the extension of communication networks. On the other hand, as the communication network has been constructed as an international, global system, the legal response to invasion and cyber-attack from abroad is facing its limits. In addition, in an environment where important infrastructures are managed and controlled on the basis of the information communication network, such problems pose a threat to national security. Countermeasures to such threats are developed and implemented on a yearly basis to protect the major information communication infrastructures. As a part of such measures, we have developed a methodology for assessing the information protection level, which can be used to establish the quantitative objective-setting method required for improving the information protection level.

Keywords: Information Security Evaluation Methodology, Critical Information Infrastructure Protection.
