Search results for: malware information sharing platform
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4541


2561 Online Partial Discharge Source Localization and Characterization Using Non-Conventional Method

Authors: Ammar Anwar Khan, Nissar R. Wani, Nazar Malik, Abdulrehman Al-Arainy, and Saad Alghuwainem

Abstract:

Power cables are vulnerable to failure due to aging or to defects that develop over time under continuous operation and loading stresses. Partial discharge (PD) detection and characterization provide information on the location, nature, form and extent of the degradation. As a result, PD monitoring has become an important part of condition-based maintenance (CBM) programs among power utilities. Online PD localization of defect sources in a power cable system is possible using the time-of-flight method. The time difference between the main and reflected pulses, together with the cable length, can be used to locate the PD source along the cable. However, if the length of the cable is not known, or the defect source is located at the extreme ends or in the middle of the cable, then double-ended measurement is required to indicate the location of the PD source. The use of multiple sensors can also help in discriminating between cable PD and local/external PD. This paper presents the experience and results from online partial discharge measurements conducted in the laboratory and the challenges in partial discharge source localization.
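
The single-ended time-of-flight relation mentioned above can be made concrete with a short numerical sketch. The cable length, pulse velocity and time difference below are hypothetical values chosen only to illustrate the relation x = L − vΔt/2; they are not measurements from this work.

```python
# Minimal sketch of single-ended time-of-flight PD localization.
# All numbers below are hypothetical illustration values, not data from the paper.

def locate_pd(cable_length_m: float, pulse_velocity_m_per_us: float,
              delta_t_us: float) -> float:
    """Distance of the PD site from the measuring end.

    The direct pulse arrives at t1 = x / v; the pulse reflected from the far
    end arrives at t2 = (2L - x) / v, so delta_t = t2 - t1 = 2 (L - x) / v and
    x = L - v * delta_t / 2.
    """
    return cable_length_m - pulse_velocity_m_per_us * delta_t_us / 2.0


if __name__ == "__main__":
    L = 500.0   # assumed cable length in metres
    v = 160.0   # assumed pulse propagation velocity in m/us
    dt = 2.5    # assumed delay between main and reflected pulse, in us
    print(f"Estimated PD location: {locate_pd(L, v, dt):.1f} m from the measuring end")
    # -> 300.0 m; if L is unknown, a single delta_t no longer determines x,
    #    which is why double-ended measurement is needed in that case.
```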

Keywords: Power cables, partial discharge localization, HFCT, condition based monitoring.

2560 Integration of Educational Data Mining Models to a Web-Based Support System for Predicting High School Student Performance

Authors: Sokkhey Phauk, Takeo Okazaki

Abstract:

The challenging task in educational institutions is to maximize the performance of high-achieving students and minimize the failure rate of poor-performing students. An effective way to approach this task is to understand student learning patterns and their most influential factors, and to obtain an early prediction of student learning outcomes in time to set up policies for improvement. Educational data mining (EDM) is an emerging field at the intersection of data mining, statistics, and machine learning, concerned with extracting useful knowledge and information for improvement and development in the educational environment. The aim of this work is to propose EDM techniques and integrate them into a web-based system for predicting poor-performing students. A comparative study of prediction models is conducted, and the best-performing models are then developed further to obtain higher performance. The hybrid random forest (Hybrid RF) produces the most successful classification. To support intervention and the improvement of learning outcomes, a feature selection method called MICHI, a combination of the mutual information (MI) and chi-square (CHI) algorithms based on ranked feature scores, is introduced to select a dominant feature set that improves prediction performance and serves as information for intervention. Using the proposed EDM techniques, an academic performance prediction system (APPS) is subsequently developed for educational stakeholders to obtain an early prediction of student learning outcomes for timely intervention. Experimental outcomes and evaluation surveys report the effectiveness and usefulness of the developed system, which is used to help educational stakeholders and related individuals intervene and improve student performance.
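
As a rough illustration of the MICHI idea described above (combining mutual information and chi-square scores into a single feature ranking), the sketch below averages the two rank orders using scikit-learn's scorers. The rank-averaging rule, the synthetic data, and the choice of eight retained features are assumptions made for illustration, not the authors' exact implementation.

```python
# Hedged sketch of a MICHI-style feature ranking: combine mutual information (MI)
# and chi-square (CHI) scores by averaging their ranks, then feed a random forest.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif, chi2
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic "student factor" data; chi2 requires non-negative features.
X, y = make_classification(n_samples=400, n_features=20, n_informative=6, random_state=0)
X = X - X.min(axis=0)

mi_scores = mutual_info_classif(X, y, random_state=0)
chi_scores, _ = chi2(X, y)

# Rank features under each criterion (0 = best), then average the two ranks.
mi_rank = np.argsort(np.argsort(-mi_scores))
chi_rank = np.argsort(np.argsort(-chi_scores))
combined_rank = (mi_rank + chi_rank) / 2.0
dominant = np.argsort(combined_rank)[:8]          # keep the 8 dominant features

rf = RandomForestClassifier(n_estimators=200, random_state=0)
full = cross_val_score(rf, X, y, cv=5).mean()
reduced = cross_val_score(rf, X[:, dominant], y, cv=5).mean()
print(f"Accuracy with all features: {full:.3f}, with dominant set: {reduced:.3f}")
```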

Keywords: Academic performance prediction system, prediction model, educational data mining, dominant factors, feature selection methods, student performance.

2559 Speaker Identification Using Admissible Wavelet Packet Based Decomposition

Authors: Mangesh S. Deshpande, Raghunath S. Holambe

Abstract:

Mel Frequency Cepstral Coefficient (MFCC) features are widely used as acoustic features for speech recognition as well as speaker recognition. In the MFCC representation, the Mel frequency scale is used to obtain high resolution in the low-frequency region and low resolution in the high-frequency region. This kind of processing is good for obtaining stable phonetic information, but it is not well suited to speaker features that are located in the high-frequency regions. Speaker-specific information, which is non-uniformly distributed in the high frequencies, is equally important for speaker recognition. Based on this fact, we propose an admissible wavelet packet based filter structure for speaker identification. The multiresolution capabilities of the wavelet packet transform are used to derive the new features. The proposed scheme differs from previous wavelet-based work mainly in the design of the filter structure: unlike others, the proposed filter structure does not follow the Mel scale. Closed-set speaker identification experiments performed on the TIMIT database show improved identification performance compared to other commonly used Mel-scale-based filter structures using wavelets.
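
For orientation, the sketch below shows how wavelet-packet sub-band features can be extracted from a speech frame with PyWavelets. The db4 wavelet, decomposition depth, and log-energy feature are generic illustrative choices; they do not reproduce the admissible, non-Mel filter structure proposed here.

```python
# Rough sketch of wavelet-packet sub-band features for speaker identification
# (not the authors' admissible filter structure; wavelet, depth and feature are assumed).
import numpy as np
import pywt

def wp_subband_features(signal: np.ndarray, wavelet: str = "db4", level: int = 4) -> np.ndarray:
    """Log-energy of every wavelet-packet sub-band at the given depth."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="freq")        # sub-bands ordered by frequency
    energies = np.array([np.sum(np.square(n.data)) for n in nodes])
    return np.log(energies + 1e-12)                  # log compression, as with filter-bank features

if __name__ == "__main__":
    fs = 16000
    t = np.arange(0, 0.5, 1 / fs)
    frame = np.sin(2 * np.pi * 200 * t) + 0.3 * np.random.randn(t.size)  # toy "speech" frame
    feats = wp_subband_features(frame)
    print(feats.shape)   # 2**level = 16 sub-band features per frame
    # In a full system these per-frame features would feed one GMM per speaker,
    # and identification picks the speaker model with the highest likelihood.
```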

Keywords: Speaker identification, Wavelet transform, Feature extraction, MFCC, GMM.

2558 Fuzzy C-Means Clustering for Biomedical Documents Using Ontology Based Indexing and Semantic Annotation

Authors: S. Logeswari, K. Premalatha

Abstract:

Search is the most obvious application of information retrieval. The variety of widely available biomedical data is enormous and is expanding fast. This expansion means that existing techniques are no longer sufficient to extract the most interesting patterns from the collection according to user requirements. Recent research concentrates more on semantic searching than on traditional term-based searches. Algorithms for semantic search are implemented based on the relations that exist between the words of the documents. Ontologies are used as domain knowledge for identifying semantic relations as well as for structuring the data for effective information retrieval. Annotation of data with ontology concepts is one of the widely used practices for clustering documents. In this paper, concept-based indexing and annotation are proposed for clustering biomedical documents. The fuzzy c-means (FCM) clustering algorithm is used to cluster the documents. The performance of the proposed methods is compared with traditional term-based clustering on PubMed articles from five different disease communities. The experimental results show that the proposed methods outperform term-based fuzzy clustering.
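
The clustering step itself can be condensed into the short fuzzy c-means sketch below. Plain TF-IDF indexing and a four-document toy corpus stand in for the MeSH-based concept indexing and annotation used here, so the sketch illustrates only the FCM mechanics, not the full pipeline.

```python
# Minimal fuzzy c-means (FCM) over a TF-IDF document matrix. Toy corpus and plain
# TF-IDF indexing are illustrative stand-ins for the paper's ontology-based indexing.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Return (centers, membership) for data X of shape (n_docs, n_terms)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)                 # fuzzy memberships sum to 1 per document
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        U_new = 1.0 / (dist ** (2 / (m - 1)))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

docs = [
    "insulin therapy for diabetes mellitus patients",
    "blood glucose control in type 2 diabetes",
    "chemotherapy response in breast cancer tumors",
    "tumor suppressor genes and cancer progression",
]
X = TfidfVectorizer().fit_transform(docs).toarray()
_, U = fuzzy_c_means(X, c=2)
print(np.round(U, 2))   # each row: fuzzy membership of a document in the 2 clusters
```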

Keywords: MeSH Ontology, Concept Indexing, Annotation, semantic relations, Fuzzy c-means.

2557 Types of Epilepsies and EEG-LORETA Findings in Epilepsy

Authors: Leila Maleki, Ahmad Esmali Kooraneh, Hossein Taghi Derakhshi

Abstract:

Neural activity in the human brain starts from the early stages of prenatal development. This activity, or the signals generated by the brain, is electrical in nature and represents not only brain function but also the status of the whole body. At present, three methods can record functional and physiological changes within the brain with high temporal resolution of neuronal interactions at the network level: the electroencephalogram (EEG), the magnetoencephalogram (MEG), and functional magnetic resonance imaging (fMRI); each of these has advantages and shortcomings. EEG recording with a large number of electrodes is now feasible in clinical practice. Multichannel EEG recorded from the scalp surface provides very valuable but indirect information about the source distribution, whereas deep electrode measurements yield more reliable information about source locations. Intracranial recordings and scalp EEG are used with source imaging techniques to determine the locations and strengths of the epileptic activity. As a source localization method, Low Resolution Electromagnetic Tomography (LORETA) is solved for a realistic geometry based on two forward methods, the Boundary Element Method (BEM) and the Finite Difference Method (FDM). In this paper, we review EEG-LORETA findings in epilepsy.

Keywords: Epilepsy, EEG, EEG-LORETA, LORETA analysis.

2556 Sustainability Assessment of Agriculture and Biodiversity Issues through an Innovative Knowledge Mediation System Using Deliberation Support Tools and INTEGRAAL Method Based on Stakeholder Involvement

Authors: Ashiquer Rahman

Abstract:

The cutting-edge knowledge mediation system 'ePLANETe' provides a framework for building knowledge, tools, and methods for education, research, and sustainable practices, as well as deliberative assessment support for higher education, research institutions, and elsewhere, e.g., collaborative learning and research on sustainability and biodiversity issues in territorial development sectors. This paper presents an analytical perspective on the 'ePLANETe' concept and its functionalities as an experimental platform contributing to sustainability assessment. 'ePLANETe' can now be seen as an experiment in meeting the challenges of "ICT for Green". Its digital technologies are exploited (i) to facilitate collaborative research, learning tools, and knowledge for sustainability challenges, and (ii) as deliberation support tools in the pursuit of sustainability performance and practices in territorial governance, public policy, and business strategy, as well as in the higher education sector itself. The paper investigates the capacity for qualitative and quantitative assessment of agricultural sustainability through stakeholder-based integrated assessment. Specifically, it focuses on integrating system methodologies with Deliberation Support Tools (DST) and the INTEGRAAL method for collective assessment and decision-making in implementing regional plans. The aim is to identify effective knowledge and tools that enable deliberation methodologies regarding practices on the sustainability of agriculture and biodiversity issues, societal responsibilities, and regional planning, concentrating on the question: "How can resources (knowledge, tools, and methods) from different sources and at different scales be effectively mobilized regarding agriculture and biodiversity issues to address sustainability challenges?" This will create scope for qualitative and quantitative assessments of sustainability as a new landmark for the agriculture sector.

Keywords: Biodiversity, Deliberation Support Tools, INTEGRAAL, stakeholder.

2555 Public Transport Planning System by Dijkstra Algorithm: Case Study Bangkok Metropolitan Area

Authors: Pimploi Tirastittam, Phutthiwat Waiyawuththanapoom

Abstract:

Nowadays, promotion of the public transportation system in the Bangkok Metropolitan Area has increased, for example through the "Free Bus for Thai Citizen" campaign and the prospect of several new MRT routes intended to increase convenience and comfort for Bangkok Metropolitan Area citizens. However, citizens do not make full use of these services because they lack data and information, as well as confidence in the public transportation system of Thailand, especially with respect to time and safety. This research presents a public transport planning system based on the Dijkstra algorithm for the Bangkok Metropolitan Area, focusing on bus, BTS and MRT schedules and routes to give passengers the most complete information. Passengers can easily choose their way and routes using the Dijkstra STAR algorithm from graph theory, which also shows the fare of the trip. The application was evaluated by 30 ordinary users to obtain the mean and standard deviation of satisfaction with the developed system. The evaluation showed that the system is at a good level of satisfaction (mean 4.20, standard deviation 0.40). From these results, we can conclude that the system can be used properly and effectively according to the objective.
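
The route search at the core of such a planner can be illustrated with the short Dijkstra sketch below. The station names, travel times and fares in the toy graph are invented for illustration and are not actual Bangkok network data.

```python
# Minimal Dijkstra sketch over a toy transit graph. Station names, travel times
# (minutes) and fares (baht) are invented for illustration only.
import heapq

# graph[u] = list of (neighbour, minutes, fare)
graph = {
    "Mo Chit":   [("Siam", 15, 30), ("Chatuchak", 3, 16)],
    "Chatuchak": [("Sukhumvit", 12, 28)],
    "Siam":      [("Asok", 8, 25)],
    "Sukhumvit": [("Asok", 1, 0)],     # interchange walk
    "Asok":      [],
}

def dijkstra(src, dst):
    """Shortest travel-time path; the fare is accumulated along that path."""
    pq = [(0, 0, src, [src])]           # (minutes, fare, node, path)
    best = {}
    while pq:
        mins, fare, node, path = heapq.heappop(pq)
        if node == dst:
            return mins, fare, path
        if node in best and best[node] <= mins:
            continue
        best[node] = mins
        for nxt, dt, f in graph[node]:
            heapq.heappush(pq, (mins + dt, fare + f, nxt, path + [nxt]))
    return None

print(dijkstra("Mo Chit", "Asok"))
# -> (16, 44, ['Mo Chit', 'Chatuchak', 'Sukhumvit', 'Asok'])
```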

Keywords: Dijkstra Algorithm, Graph Theory, Shortest Route, Public Transport, Bangkok Metropolitan Area.

2554 Performance Determinants for Convenience Store Suppliers

Authors: Zainah Abdullah, Aznur Hajar Abdullah

Abstract:

This paper examines the impact of information and communication technology (ICT) usage, internal relationships, the supplier-retailer relationship, logistics services and inventory management on convenience store suppliers' performance. Data were collected from 275 convenience store managers in Malaysia using a questionnaire. The multiple linear regression results indicate that inventory management, the supplier-retailer relationship, logistics services and internal relationships are predictors of supplier performance as perceived by convenience store managers. However, ICT usage is not a predictor of supplier performance. The study focuses only on convenience stores and petrol station convenience stores and concentrates only on managers. The results provide insights to suppliers who serve convenience stores, and possibly similar retail formats, on factors to consider in improving their service to retailers. The results also provide insights to the government, in its aspiration to improve the business operations of convenience stores, on ways to enhance the adoption of ICT by retailers and suppliers.
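
The regression step can be sketched with statsmodels as below. The variable names mirror the constructs in this study, but statsmodels is an assumed tool and the data frame is randomly generated, so the fitted coefficients are meaningless; only the procedure of testing each predictor is shown.

```python
# Illustrative multiple linear regression in the spirit of the study: supplier
# performance regressed on the five constructs. Data are randomly generated.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 275  # same sample size as the study; the values themselves are synthetic
df = pd.DataFrame({
    "ict_usage":         rng.normal(3.5, 0.6, n),
    "internal_rel":      rng.normal(3.8, 0.5, n),
    "supplier_retailer": rng.normal(3.9, 0.5, n),
    "logistics":         rng.normal(3.7, 0.6, n),
    "inventory_mgmt":    rng.normal(3.6, 0.6, n),
})
# Synthetic outcome; ICT usage is deliberately left out, echoing the reported finding.
df["performance"] = (0.3 * df["inventory_mgmt"] + 0.25 * df["supplier_retailer"]
                     + 0.2 * df["logistics"] + 0.15 * df["internal_rel"]
                     + rng.normal(0, 0.3, n))

X = sm.add_constant(df.drop(columns="performance"))
model = sm.OLS(df["performance"], X).fit()
print(model.summary())   # t-tests on each coefficient identify the significant predictors
```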

Keywords: Information and communication technology (ICT), internal relationship, inventory management, logistics services, supplier performance, supplier-retailer relationship.

2553 Examining Corporate Tax Evaders: Evidence from the Finalized Audit Cases

Authors: Ming Ling Lai, Zalilawati Yaacob, Normah Omar, Norashikin Abdul Aziz, Bee Wah Yap

Abstract:

This paper aims to (1) analyze the profiles of transgressors (detected evaders); (2) examine the reason(s) that triggered a tax audit, the causes of tax evasion, the audit timeframe and the tax penalty charged; and (3) assess whether tax auditors followed the guidelines stated in the 'Tax Audit Framework' when conducting tax audits. In 2011, the Inland Revenue Board Malaysia (IRBM) audited and finalized 557 company cases. With official permission, data on all 557 cases were obtained from the IRBM. Of these, a total of 421 cases with complete information were analyzed. About 58.1% were small and medium corporations, and the largest share came from the construction industry (32.8%). Selection for tax audit was based on risk analysis (66.8%), information from third parties (11.1%), and firms with low profitability or fluctuating profit patterns (7.8%). The three persistent causes of tax evasion by firms were overclaimed expenses (46.8%), fraudulent reporting of income (38.5%) and overstated purchases (10.5%). These findings are consistent with past literature. Results showed that tax auditors took six to 18 months to close audit cases. More than half of the tax evaders were fined 45% of the additional tax raised during the audit for a first offence. The study found that tax auditors did follow the guidelines in the 'Tax Audit Framework' in audit selection, settlement and penalty imposition.

Keywords: Corporate tax fraud, tax non-compliance, tax evasion, tax audit, fraudulent reporting.

2552 Developments for "Virtual" Monitoring and Process Simulation of the Cryogenic Pilot Plant

Authors: Carmen Maria Moraru, Iuliana Stefan, Ovidiu Balteanu, Ciprian Bucur, Liviu Stefan, Anisia Bornea, Ioan Stefanescu

Abstract:

The implementation of new software and hardware technologies in tritium processing nuclear plants, especially those of an experimental character or involving new technology developments, entails considerable complexity due to the issues raised by integrating high-performance instrumentation and equipment into a unitary monitoring system for the nuclear technological process of tritium removal. Keeping the system flexible is a requirement of nuclear experimental plants, for which changes of configuration, process and parameters are routine. The large amount of data that needs to be processed, stored and accessed for real-time simulation and optimization demands a virtual technological platform in which the data acquisition, control and analysis systems of the technological process can be integrated with a purpose-built technological monitoring system. Thus, the integrated computing and monitoring systems needed for supervising the technological process will be implemented, followed by an optimization system based on new, high-performing methods suited to the technological processes within tritium removal processing plants. The software applications are developed with the support of program packages dedicated to industrial processes and include acquisition and monitoring sub-modules, termed "virtual", as well as a storage sub-module for the process data later required by the optimization and simulation software for the tritium removal process. The system plays an important role in environmental protection and sustainable development through new technologies, that is, the reduction of and fight against industrial accidents in tritium processing nuclear plants. Research into monitoring optimization of nuclear processes is also a major driving force for economic and social development.

Keywords: Monitoring system, process simulation.

2551 Determinants of Students' Intentions to Use a Mobile Messaging Service in Educational Institutions: A Theoretical Model

Authors: Boonlert Watjatrakul

Abstract:

Mobile marketing through mobile messaging services has shown highly impressive growth, as it enables e-business firms to communicate with their customers effectively. Educational institutions have therefore started using this service to enhance communication with their students. Previous studies, however, provide limited understanding of applying mobile messaging services in education. This study proposes a theoretical model to understand the drivers of students' intentions to use a university's mobile messaging service. The model indicates that social influence, perceived control and attitudes affect students' intention to use the university's mobile messaging service. It also proposes five antecedents of students' attitudes: perceived utility (information utility, entertainment utility, and social utility), innovativeness, information seeking, transaction specificity (content specificity, sender specificity, and time specificity) and privacy concern. The proposed model enables universities to understand what concerns students have about the use of a mobile messaging service and to manage the service more effectively. The paper discusses the model development and concludes with limitations and implications of the proposed model.

Keywords: education, intention, mobile marketing, mobile messaging.

2550 A New Traffic Pattern Matching for DDoS Traceback Using Independent Component Analysis

Authors: Yuji Waizumi, Tohru Sato, Yoshiaki Nemoto

Abstract:

Recently, Denial of Service (DoS) attacks and Distributed DoS (DDoS) attacks, a stronger form of DoS attack launched from multiple hosts, have become security threats on the Internet. Identifying the attack source and blocking the attack traffic are important countermeasures against these attacks. In general, it is difficult to identify the source because information about it is falsified, so a method of identifying the attack source by tracing the route of the attack traffic is necessary. A traceback method that uses traffic patterns, i.e., changes in the number of packets over time, as the criteria for attack traceback has been proposed. This method can trace the attack by matching the shapes of input traffic patterns with the shape of the output traffic pattern observed at a network branch point such as a router. A traffic pattern is the shape of the traffic over time and is information that is difficult to falsify. However, the traceback methods proposed to date cannot achieve sufficient tracing accuracy because they directly use traffic patterns that are influenced by non-attack traffic. In this paper, a new traffic pattern matching method using Independent Component Analysis (ICA) is proposed.
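
The separation step at the heart of this idea can be sketched with scikit-learn's FastICA: treat the packet-count series observed on several links as mixtures of underlying flows and recover statistically independent components, one of which should match the attack pattern. The synthetic traffic and the mixing matrix below are assumptions made purely for illustration.

```python
# Sketch of separating an attack traffic pattern from mixed packet-count series
# with FastICA. The synthetic traffic below is illustrative only.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.arange(600)                                   # packets-per-interval time index
normal_a = 50 + 10 * np.sin(2 * np.pi * t / 120) + rng.normal(0, 2, t.size)
normal_b = 40 + 8 * np.cos(2 * np.pi * t / 90) + rng.normal(0, 2, t.size)
attack = np.where((t > 300) & (t < 420), 80.0, 0.0)  # burst-shaped DDoS pattern

# Each observed link carries a different mixture of the underlying flows.
mixing = np.array([[1.0, 0.2, 0.6],
                   [0.3, 1.0, 0.8],
                   [0.5, 0.4, 1.0]])
observed = np.c_[normal_a, normal_b, attack] @ mixing.T   # shape (600, 3 links)

ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(observed)             # estimated independent flows

# Match each recovered component against the burst shape by correlation.
scores = [abs(np.corrcoef(components[:, i], attack)[0, 1]) for i in range(3)]
print("component most similar to the attack pattern:", int(np.argmax(scores)))
```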

Keywords: Distributed Denial of Service, Independent Component Analysis, Traffic pattern

2549 Multidimensional Performance Tracking

Authors: C. Ardil

Abstract:

In this study, a model, together with a software tool that implements it, has been developed to determine the performance ratings of employees in an organization operating in the information technology sector, using indicators obtained from employees' online study data. The Weighted Sum (WS) method and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), both based on a multidimensional decision-making approach, were used in the study. WS and TOPSIS are multidimensional decision-making (MDDM) methods that allow all dimensions to be evaluated together with specific weights, so that the online performance tracking problem can be evaluated objectively. The application of the WS and TOPSIS methods, which can combine alternatives with a large number of dimensions and reach a simultaneous solution, has been implemented through online performance tracking software. In applying WS and TOPSIS, objective dimension weights were calculated using the entropy information (EI) and standard deviation (SD) methods from the data obtained by the online performance tracking method, a decision matrix was formed from the performance scores of each employee, and a single performance score was calculated for each employee. Based on the calculated performance score, a performance evaluation decision was made for each employee. The results of the Pareto set evidence and comparative mathematical analysis validate that the employees' performance preference rankings produced by the WS and TOPSIS methods are closely related. This suggests the compatibility, applicability, and validity of the proposed method for MDDM problems in which a large number of alternatives and dimensions are taken into account. This study demonstrates an objective, realistic, feasible and understandable mathematical method, together with a software tool that implements it. This is considered preferable given the subjectivity, limitations and high cost of the methods traditionally used for measurement and performance appraisal in the information technology sector.
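
The entropy-weighted TOPSIS computation described above can be condensed into the sketch below. The 4x3 decision matrix of employee scores is invented purely to show the mechanics, and all dimensions are treated as benefit criteria; for comparison, a WS-style score can be obtained as the row sums of the weighted normalized matrix.

```python
# Condensed sketch of entropy-weighted TOPSIS for ranking employees on several
# performance dimensions. The 4x3 decision matrix is invented for illustration.
import numpy as np

X = np.array([            # rows: employees, columns: performance dimensions (benefit criteria)
    [75., 80., 60.],
    [90., 70., 85.],
    [60., 95., 70.],
    [85., 85., 90.],
])

# --- entropy weights -------------------------------------------------------
P = X / X.sum(axis=0)                       # column-wise proportions
k = 1.0 / np.log(X.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)
weights = (1 - entropy) / (1 - entropy).sum()

# --- TOPSIS ----------------------------------------------------------------
R = X / np.sqrt((X ** 2).sum(axis=0))       # vector normalisation
V = R * weights                             # weighted normalised matrix
ideal, anti_ideal = V.max(axis=0), V.min(axis=0)   # all criteria treated as benefits
d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
d_minus = np.sqrt(((V - anti_ideal) ** 2).sum(axis=1))
closeness = d_minus / (d_plus + d_minus)    # single performance score per employee

print("weights:", np.round(weights, 3))
print("ranking (best first):", np.argsort(-closeness))
print("WS-style scores:", np.round(V.sum(axis=1), 3))
```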

Keywords: Weighted sum, entropy information, standard deviation, online performance tracking, performance evaluation, performance management, multidimensional decision making.

2548 Development of Manufacturing Simulation Model for Semiconductor Fabrication

Authors: Syahril Ridzuan Ab Rahim, Ibrahim Ahmad, Mohd Azizi Chik, Ahmad Zafir Md. Rejab, and U. Hashim

Abstract:

This research presents the development of a simulation model for WIP management in semiconductor fabrication. Manufacturing simulation modeling is needed for productivity optimization analysis because of the complex process flows, in which more than 35 percent of the processing steps are re-entrant, revisiting the same equipment more than 15 times. Furthermore, semiconductor fabrication must produce a high product mix, with total processing steps varying from 300 to 800 and cycle times between 30 and 70 days. Besides this complexity, the high wafer cost, which can affect the company's profit margin once a due date is missed, is another motivation to explore simulation-based analysis. In this work, the simulation model is developed using the existing commercial software platform AutoSched AP, with customized integration with the Manufacturing Execution System (MES) and Advanced Productivity Family (APF) for the data collection used to configure the model parameters and data sources. Model parameters such as processing step cycle times, equipment performance, handling times and operator efficiency are collected through this customization. Once the parameters are validated, a few further customizations are made to ensure that the model executes. The accuracy of the simulation model is validated against the actual daily output of all equipment; the comparison achieved 95 percent accuracy over 30 days. The model was later used to perform various what-if analyses to understand impacts on cycle time and overall output. With this simulation model, complex manufacturing environments like semiconductor fabrication (fab) now have an alternative source of validation for the impact analysis of any new requirements.

Keywords: Advanced Productivity Family (APF), Complementary Metal Oxide Semiconductor (CMOS), Manufacturing Execution Systems (MES), Work In Progress (WIP).

2547 A Methodology for Creating Energy Sustainability in an Enterprise

Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala

Abstract:

As we enter the new era of Artificial Intelligence (AI) and cloud computing, we rely heavily on the machine and natural language processing capabilities of AI and on energy-efficient hardware and software devices in almost every industry sector. In these sectors, much emphasis is placed on developing new and innovative methods for producing and conserving energy and on countering the depletion of natural resources. The core pillars of sustainability are Economic, Environmental, and Social, informally referred to as the 3 P's (People, Planet and Profits). The 3 P's play a vital role in creating a core sustainability model in the enterprise. Natural resources are continually being depleted, so there is a growing focus on and demand for renewable energy. With this growing demand there is also growing concern in many industries about how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In this paper, we discuss driving forces such as climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models and emerging social media and AI platforms that influence the three main pillars of sustainability (3 P's). Through this paper, we offer an overall perspective on enterprise strategies, with a primary focus on bringing about cultural shifts toward energy-efficient operational models. Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and reducing server footprint and compute resources (shared IT services, cloud computing and application modernization), with the vision of a sustainable environment.

Keywords: AI, cloud computing, machine learning, social media platform.

2546 Semantically Enriched Web Usage Mining for Personalization

Authors: Suresh Shirgave, Prakash Kulkarni, José Borges

Abstract:

The continuous growth in the size of the World Wide Web has resulted in intricate Web sites, demanding enhanced user skills and more sophisticated tools to help the Web user find the desired information. In order to make the Web more user friendly, it is necessary to provide personalized services and recommendations to the Web user. Many Web usage mining techniques have been applied to discover interesting and frequent navigation patterns from Web server logs. The recommendation accuracy of usage-based techniques can be improved by integrating Web site content and site structure into the personalization process.

Herein, we propose a semantically enriched Web Usage Mining method for Personalization (SWUMP), an extension of the solely usage-based technique. The approach combines the fields of Web Usage Mining and Semantic Web. In the proposed method, we envisage enriching the undirected graph derived from usage data with rich semantic information extracted from the Web pages and the Web site structure. The experimental results show that SWUMP generates accurate recommendations and achieves 10-20% better accuracy than the solely usage-based model. SWUMP also addresses the new-item problem inherent in solely usage-based techniques.

Keywords: Prediction, Recommendation, Semantic Web Usage Mining, Web Usage Mining.

2545 Hospital Administration for Humanized Healthcare in Thailand

Authors: Niwatchai Namwichisirikul

Abstract:

Since the emergence of "Humanized Healthcare", introduced by Professor Dr. Prawase Wasi in 2003 [1], this paradigm has been increasingly widely implemented. Organizations including the Healthcare Accreditation Institute (a public organization), the National Health Foundation, Mahidol University in cooperation with the Thai Health Promotion Foundation, and the National Health Security Office (Thailand) selected hospitals and infirmaries qualified for humanized healthcare between 2008 and 2010, and 35 of them were chosen as outstanding lead organizations for the development of humanized healthcare and received the humanized healthcare award [2]. This research aims to study the current issues, characteristics and patterns of hospital administration contributing to a humanized healthcare system in Thailand. The selected case studies are four hospitals: Dansai Crown Prince Hospital, Loei; Ubolrattana Hospital, Khon Kaen; Kapho Hospital, Pattani; and Prathai Hospital, Nakhon Ratchasima. The methodology consists of in-depth interviews with 10 staff members in each hospital, working as hospital executive directors and representatives from leadership groups including directors, multidisciplinary hospital committees, personnel development committees, physicians and nurses (Total=40). In addition, focus group discussions between hospital staff and the general public (including patients and their relatives, community leaders, and others) were held in four groups of eight people at each hospital (Total=128), and observation of the work in each hospital was also carried out. The findings reveal five important aspects in each hospital: (1) quality improvement under a mental and spiritual development policy from the chief executives and lead teams, with leaders acting as role models and exercising visionary leadership; (2) a participatory hospital administration system focusing on learning processes and stakeholder needs, and on spiritual human resource management and development; (3) good relationships among people, especially staff, including teamwork skills, mutual understanding, effective communication and personal inner development; (4) an organizational culture attentive to patients' rights and to the participation policy, including spiritual growth toward shared goals, a shared vision, public-mindedness and caring; and (5) healing structures or environments providing warmth and convenience for hospital staff, patients, their relatives and visitors.

Keywords: Hospital administration, Humanized healthcare.

2544 Automatic Extraction of Roads from High Resolution Aerial and Satellite Images with Heavy Noise

Authors: Yan Li, Ronald Briggs

Abstract:

Aerial and satellite images are information rich, but they are also complex to analyze. For GIS systems, many features require fast and reliable extraction of roads and intersections. In this paper, we study efficient and reliable automatic extraction algorithms that address difficult issues commonly seen in high resolution aerial and satellite images yet not well addressed by existing solutions, such as blurring, broken or missing road boundaries, lack of road profiles, heavy shadows, and interfering surrounding objects. The new scheme is based on a new method, the reference circle, to properly identify the pixels that belong to the same road and to use this information to recover the whole road network. This feature is invariant to the shape and direction of roads and tolerates heavy noise and disturbances. Road extraction based on reference circles is much more noise tolerant and flexible than previous edge-detection based algorithms. The scheme is able to extract roads reliably from images with complex content and heavy obstructions, such as the high resolution aerial/satellite images available from Google Maps.

Keywords: Automatic road extraction, Image processing, Feature extraction, GIS update, Remote sensing, Geo-referencing

2543 An Analytical Study on the Politics of Defection in India

Authors: Diya Sarkar, Prafulla C. Mishra

Abstract:

In a parliamentary system, party discipline is the impulse; when it falls short, the government usually falls. The platform of Indian politics arguably suffers from innumerable practical disorders, and the politics of defection is one such species, entailing a gross miscarriage of fair conduct and turning politics into a game of thrones (powers). This practice of political nomadism can trace its seed to the British House of Commons, where a legislator who crossed the floor was considered disloyal by his party; in other words, the legislator lost his allegiance to his former party by joining another. In practice, this phenomenon runs in both directions, i.e., from the ruling party to the opposition or vice versa. Democracies such as the USA, Australia and Canada have also been familiar with this fashion of swapping loyalties, and there have been several instances of great politicians changing party allegiance, for example Winston Churchill, Ramsay MacDonald and William Gladstone. Nevertheless, it is interesting to note that, despite this practice, none of the Western democracies ever desired or felt the need to legislatively ban defections. India, exceptionally, has passed anti-defection laws. The politics of defection has been a peculiarly widespread phenomenon on the floor of the Indian parliamentary system, gradually eroding the democratic essence and cohesion of the federation. This study is both analytical and doctrinal and examines whether representative democracy has lost its essence due to political nomadism. It also analyzes the classical as well as contemporary pulse of floor crossing amidst dynastic politics in a representative democracy. It briefly discusses the panorama of defections under the Indian federal structure in the light of the anti-defection law, and an attempt has been made to offer suggestions to streamline remedies for the still prevalent practice of political defection.

Keywords: Constitutional law, defection, democracy, political anti-trust.

2542 Parkinson's Disease Classification Using Neural Network and Feature Selection

Authors: Anchana Khemphila, Veera Boonjing

Abstract:

In this study, a Multi-Layer Perceptron (MLP) with the back-propagation learning algorithm is used for the effective diagnosis of Parkinson's disease (PD), a challenging problem for the medical community. Typically characterized by tremor, PD occurs due to the loss of dopamine in the brain's thalamic region, which results in involuntary or oscillatory movement of the body. A feature selection algorithm is used along with biomedical test values to diagnose Parkinson's disease. Clinical diagnosis is done mostly through the doctor's expertise and experience, but cases of wrong diagnosis and treatment are still reported. Patients are asked to take a number of tests for diagnosis, and in many cases not all of the tests contribute towards an effective diagnosis. Our work classifies the presence of Parkinson's disease with a reduced number of attributes. Originally, 22 attributes are involved in classification. We use Information Gain to determine the relevant attributes, which reduces the number of tests that need to be taken from patients. An artificial neural network is then used to classify the diagnosis of patients. The twenty-two attributes are reduced to sixteen. The accuracy on the training data set is 82.051% and on the validation data set is 83.333%.
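
The pipeline (rank the 22 attributes, keep 16, then train a back-propagation MLP) can be sketched with scikit-learn as below. mutual_info_classif stands in for the Information Gain computation, and randomly generated data replace the real biomedical test values, so the printed accuracies will not match those reported here.

```python
# Sketch of the pipeline: rank attributes with an information-gain-style score,
# keep the top 16 of 22, then train an MLP. Data are synthetic.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=195, n_features=22, n_informative=10, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

clf = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=16),          # 22 -> 16 attributes
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),  # back-prop MLP
)
clf.fit(X_tr, y_tr)
print(f"training accuracy:   {clf.score(X_tr, y_tr):.3f}")
print(f"validation accuracy: {clf.score(X_va, y_va):.3f}")
```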

Keywords: Data mining, classification, Parkinson disease, artificial neural networks, feature selection, information gain.

2541 Influence of Ambiguity Cluster on Quality Improvement in Image Compression

Authors: Safaa Al-Ali, Ahmad Shahin, Fadi Chakik

Abstract:

Image coding based on clustering provides immediate access to targeted features of interest in a high quality decoded image. This approach is useful for intelligent devices, as well as for multimedia content-based description standards. The result of image clustering cannot be precise in some positions, especially for pixels carrying edge information, which produces ambiguity among the clusters. Even with a good PDE-based enhancement operator, the quality of the decoded image will depend highly on the clustering process. In this paper, we introduce an ambiguity cluster in image coding to represent pixels with vagueness properties. The presence of such a cluster allows details inherent to edges, as well as uncertain pixels, to be preserved. It is also very useful during the decoding phase, in which an anisotropic diffusion operator, such as Perona-Malik, enhances the quality of the restored image. This work also offers a comparative study demonstrating the effectiveness of a fuzzy clustering technique in detecting the ambiguity cluster without losing much of the essential image information. Several experiments have been carried out to demonstrate the usefulness of the ambiguity concept in image compression. The coding results and the performance of the proposed algorithms are discussed in terms of the peak signal-to-noise ratio and the quantity of ambiguous pixels.
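
The decoding-phase enhancement operator mentioned above, Perona-Malik anisotropic diffusion, can be written in a few lines of NumPy. The conduction function, parameter values and toy test image below are standard textbook choices, not necessarily the settings used in this work.

```python
# Minimal Perona-Malik anisotropic diffusion sketch: smooth flat regions while
# preserving edges. Parameters are generic textbook choices.
import numpy as np

def perona_malik(img, n_iter=20, kappa=30.0, lam=0.2):
    """Edge-preserving smoothing: diffuse strongly in flat areas, weakly across edges."""
    u = img.astype(float).copy()
    cond = lambda d: np.exp(-(d / kappa) ** 2)   # conduction coefficient, small at strong edges
    for _ in range(n_iter):
        # Differences with the four neighbours (replicated border).
        dN = np.vstack([u[:1], u[:-1]]) - u
        dS = np.vstack([u[1:], u[-1:]]) - u
        dW = np.hstack([u[:, :1], u[:, :-1]]) - u
        dE = np.hstack([u[:, 1:], u[:, -1:]]) - u
        u = u + lam * (cond(dN) * dN + cond(dS) * dS + cond(dW) * dW + cond(dE) * dE)
    return u

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = np.zeros((64, 64))
    img[:, 32:] = 200.0                                    # toy two-region image with one edge
    noisy = img + rng.normal(0, 20, img.shape)
    smooth = perona_malik(noisy)
    print(f"noise std before: {noisy[:, :30].std():.1f}, after: {smooth[:, :30].std():.1f}")
```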

Keywords: Ambiguity Cluster, Anisotropic Diffusion, Fuzzy Clustering, Image Compression.

2540 Multilevel Activation Functions For True Color Image Segmentation Using a Self Supervised Parallel Self Organizing Neural Network (PSONN) Architecture: A Comparative Study

Authors: Siddhartha Bhattacharyya, Paramartha Dutta, Ujjwal Maulik, Prashanta Kumar Nandi

Abstract:

The paper describes a self supervised parallel self organizing neural network (PSONN) architecture for true color image segmentation. The proposed architecture is a parallel extension of the standard single self organizing neural network architecture (SONN) and comprises an input (source) layer of image information, three single self organizing neural network architectures for segmentation of the different primary color components in a color image scene and one final output (sink) layer for fusion of the segmented color component images. Responses to the different shades of color components are induced in each of the three single network architectures (meant for component level processing) by applying a multilevel version of the characteristic activation function, which maps the input color information into different shades of color components, thereby yielding a processed component color image segmented on the basis of the different shades of component colors. The number of target classes in the segmented image corresponds to the number of levels in the multilevel activation function. Since the multilevel version of the activation function exhibits several subnormal responses to the input color image scene information, the system errors of the three component network architectures are computed from some subnormal linear index of fuzziness of the component color image scenes at the individual level. Several multilevel activation functions are employed for segmentation of the input color image scene using the proposed network architecture. Results of the application of the multilevel activation functions to the PSONN architecture are reported on three real life true color images. The results are substantiated empirically with the correlation coefficients between the segmented images and the original images.

Keywords: Colour image segmentation, fuzzy set theory, multi-level activation functions, parallel self-organizing neural network.

2539 Palmprint Recognition by Wavelet Transform with Competitive Index and PCA

Authors: Deepti Tamrakar, Pritee Khanna

Abstract:

This manuscript presents palmprint recognition that combines different texture extraction approaches with high accuracy. The Region of Interest (ROI) is decomposed into different frequency-time sub-bands by wavelet transform up to two levels, and only the level-two approximation image, known as the Approximate Image ROI (AIROI), is selected. The AIROI carries information about the principal lines of the palm. The Competitive Index is used as the palmprint feature: six Gabor filters of different orientations are convolved with the palmprint image to extract orientation information, and a winner-take-all strategy selects the dominant orientation for each pixel, which is known as the Competitive Index. Further, PCA is applied to select highly uncorrelated Competitive Index features, to reduce the dimensionality of the feature vector, and to project the features onto the eigenspace. The similarity of two palmprints is measured by the Euclidean distance metric. The algorithm is tested on the Hong Kong PolyU palmprint database. AIROIs obtained with different wavelet filter families are also tested with the Competitive Index and PCA. The AIROI of the db7 wavelet filter achieves an Equal Error Rate (EER) of 0.0152% and a Genuine Acceptance Rate (GAR) of 99.67% on the Hong Kong PolyU palm database.
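
The Competitive Index step (six oriented Gabor filters followed by a winner-take-all choice of the dominant orientation at each pixel) can be sketched as below. The Gabor parameters, the use of the most negative response as the winner (palm lines are dark), and the random stand-in for the AIROI are illustrative assumptions.

```python
# Sketch of the competitive-index step: convolve the palm ROI with six oriented
# Gabor filters and keep, per pixel, the index of the winning (strongest) response.
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(theta, ksize=17, sigma=3.0, lambd=8.0, gamma=0.5, psi=0.0):
    """Real-valued Gabor kernel oriented at angle theta (radians)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(x_t**2 + (gamma * y_t)**2) / (2 * sigma**2)) * np.cos(2 * np.pi * x_t / lambd + psi)

def competitive_index(roi, n_orientations=6):
    """Winner-take-all dominant-orientation index for every pixel of the ROI."""
    thetas = [k * np.pi / n_orientations for k in range(n_orientations)]
    responses = np.stack([convolve(roi, gabor_kernel(t)) for t in thetas])
    return np.argmin(responses, axis=0)   # palm lines are dark: most negative response wins

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    roi = rng.random((64, 64))            # stand-in for the wavelet-approximation AIROI
    ci = competitive_index(roi)
    print(ci.shape, ci.min(), ci.max())   # per-pixel orientation labels in [0, 5]
```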

Keywords: DWT, EER, Euclidean Distance, Gabor filter, PCA, ROI.

2538 An Overview of the Islamic Banking Development in the United Kingdom, Malaysia, Saudi Arabia, Iran, Nigeria, Kenya and Uganda

Authors: Pradeep Kulshrestha, Maulana Ayoub Ali

Abstract:

The penetration of Islamic banking products and services has recorded considerable growth in many parts of the world. Many factors have contributed to this growth, including, but not limited to, the rapid growth in the number of Muslims who are uncomfortable with conventional ways of banking, the interest and higher interest rates charged by conventional banks and financial institutions, and the financial inclusion campaigns conducted in many countries. The system faces legal challenges, which open the research door for practitioners and academics to find solutions to those challenges. This paper investigates the development of the Islamic banking system in the United Kingdom (UK), Saudi Arabia, Malaysia, Iran, Kenya, Nigeria and Uganda in order to understand the modalities that have been employed to run an Islamic banking system in these countries. The methodology employed in this research is doctrinal: legislation, policies and other legal tools have been carefully studied and analysed. In addition, papers from academic journals, books and financial reports have been analysed in depth in order to enrich the paper and reach tangible results. The paper found that, in Asia, Malaysia has created the smoothest legal platform for an Islamic banking system to work properly. The United Kingdom has tried hard to accommodate the banking system without affecting conventional banking methods and without favouring the operations of Islamic banks, and it also strives to make the UK an Islamic banking and finance hub in Europe. The entire banking system in Iran is Islamic, while Nigeria has undergone several legal reforms to suit the Islamic banking system. Kenya and Uganda are at a different pace in making the Islamic banking system work alongside the conventional banking system.

Keywords: Shariah, Islamic banking, law, alternative banking.

2537 Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories

Authors: Mariana Lima, Rodrigo Silva, Victor Stange, Teodiano Bastos

Abstract:

Due to the high reliability achieved by DNA tests since the 1980s, this kind of test has allowed the identification of a growing number of criminal cases, including old unsolved cases that now have a chance to be solved with this technology. Currently, the use of genetic profiling databases is a typical way to increase the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles for a growing number of samples, which requires time and large storage capacity. It is therefore essential to develop methodologies, supported by software tools, capable of organizing the workflow and minimizing the time spent on both biological sample processing and the analysis of genetic profiles. Thus, the present work aims to develop a software system for forensic genetics laboratories that allows the management of samples, criminal cases and a local database, minimizing the time spent in the workflow and helping to compare genetic profiles. For the development of this software system, all data related to the storage and processing of samples, the workflows and the requirements incorporated in the system have been considered. The system uses the following languages: HTML, CSS, and JavaScript for Web technology, with the NodeJS platform as the server, which offers great efficiency for data input and output. In addition, the data are stored in a relational database (MySQL), which is free, facilitating acceptance by users. The software system developed here brings more agility to the workflow and to sample analysis, contributing to the rapid insertion of genetic profiles into the national database and to increasing the resolution of crimes. The next step of this research is its validation, so that it operates in accordance with current Brazilian national legislation.

Keywords: Database, forensic genetics, genetic analysis, sample management, software solution.

2536 Effectual Reversible Watermarking Method for Hide the Patient Details in Brain Tumor Image

Authors: K. Amudha, C. Nelson Kennedy Babu, S. Balu

Abstract:

The security of medical images and their related data is a major research area today. Security here means that the physician may hide patient-related data in the medical image and transfer it safely to a defined location using reversible watermarking. Many reversible watermarking methods have been proposed over the past decade. This paper enhances the security level of brain tumor images in order to hide the patient's details, which may have to be shared for other physicians' suggestions. The details, or information, are hidden in the Non-ROI area of the image using a block cipher algorithm. The block cipher uses different keys, so it is difficult for an intruder to detect all the keys and to spot the details, which is the key advantage of this method. The ROI is the tumor area and the Non-ROI is the rest of the image. The Non-ROI must not be spoiled in any case, and the details in the Non-ROI must be extracted correctly. The reversible watermarking method proposed in this paper performs well compared to existing methods in recovering the original image and providing information security.
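
The overall idea, encrypt the patient details and embed them only in Non-ROI pixels, can be illustrated with the toy sketch below. A keyed XOR stream stands in for a real block cipher and plain LSB replacement stands in for the reversible embedding proposed here, so this shows the concept only, not the authors' method.

```python
# Toy concept sketch: encrypt patient details (keyed XOR as a stand-in for a block
# cipher) and hide them in the LSBs of Non-ROI pixels only. Not the paper's scheme.
import numpy as np

def xor_stream(data: bytes, key: int) -> bytes:
    rng = np.random.default_rng(key)                        # toy keystream, NOT secure
    stream = rng.integers(0, 256, len(data), dtype=np.uint8)
    return bytes(np.frombuffer(data, dtype=np.uint8) ^ stream)

def embed(image: np.ndarray, roi_mask: np.ndarray, payload: bytes) -> np.ndarray:
    stego = image.copy()
    flat = stego.ravel()                                     # view into the copy
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    non_roi_idx = np.flatnonzero(~roi_mask.ravel())          # only pixels outside the tumor ROI
    assert bits.size <= non_roi_idx.size, "payload too large for the Non-ROI area"
    targets = non_roi_idx[: bits.size]
    flat[targets] = (flat[targets] & 0xFE) | bits            # overwrite the LSB only
    return stego

def extract(image: np.ndarray, roi_mask: np.ndarray, n_bytes: int) -> bytes:
    non_roi_idx = np.flatnonzero(~roi_mask.ravel())[: n_bytes * 8]
    bits = image.ravel()[non_roi_idx] & 1
    return bytes(np.packbits(bits))

if __name__ == "__main__":
    img = np.random.default_rng(1).integers(0, 256, (128, 128), dtype=np.uint8)
    roi = np.zeros_like(img, dtype=bool)
    roi[40:80, 40:80] = True                                 # assumed tumor region, left untouched
    secret = xor_stream(b"Patient: A123; Dx: glioma", key=42)
    stego = embed(img, roi, secret)
    print(xor_stream(extract(stego, roi, len(secret)), key=42))  # recovered patient details
```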

Keywords: Brain tumor images, Block Cipher, Reversible watermarking, ROI.

2535 The Libyan Accounting Profession

Authors: Bubaker F. Shareia

Abstract:

The aim of this paper is to trace the historical development of the accounting profession in Libya, in order to identify challenges facing the profession as the country moves from a closed to emerging economy. The study is based on a literature review and archival research. Accounting information has a vital role to play in the achievement of economic goals in developing and emerging economies, but a well qualified accounting profession is required. In the context of institutional instability and unique cultural factors, the accounting profession in Libya faces educational and legal challenges if it is to achieve its potential in assisting the country to reach its economic goals. This study focuses on one country, which does limit its generalisability. However, it also suggests fruitful research areas in considering the impact and challenge of historic factors on the accounting profession in emerging economies. Centrally planned economies require a body of well trained professional accountants if they are to emerge onto the global economic arena. Studies on the accounting profession have focused primarily on those in developed economies, where the need for meaningful accounting information for decision making is taken for granted and there is a well trained, professional workforce. This study of the profession in an emerging economy highlights the efforts that will be needed to ensure the contribution of the profession to the economic wellbeing of other emerging economies.

Keywords: Accounting profession, developing countries, culture, planned economy, emerging economy.

2534 Telecommunications Access, Social Capital and Sustainable Development

Authors: Susan Bandias

Abstract:

This paper examines the role of telecommunications in the sustainable development of urban, rural and remote communities in the Northern Territory of Australia through the theoretical lens of Social Capital. Social Capital is a relatively new construct and is rapidly gaining interest among policy makers, politicians and researchers as a means to both describe and understand social and economic development. Increasingly, the concept of Social Capital, as opposed to traditional economic indicators, is seen as a more accurate measure of well-being. Whilst the essence of Social Capital is quality social relations, the concept intersects with telecommunications and Information and Communications Technology (ICT) in a number of ways. The potential of ICT to disseminate information quickly, to reach vast numbers of people simultaneously and to include the previously excluded is immense. However, the exact nature of the relationship is not clearly defined. This paper examines the nexus between social relations of mutual benefit, telecommunications access and sustainable development. A mixed-methods approach was used to test the hypothesis that no relationship exists between Social Capital and access to telecommunications services and facilities. Four communities, comprising two urban communities, a rural community and a remote Indigenous community in the Northern Territory of Australia, are the focus of this research paper.

Keywords: Indigenous disadvantage, Social Capital, sustainable development, telecommunications.

2533 A Theory-Based Analysis on Implications of Democracy in Cambodia

Authors: Puthsodary Tat

Abstract:

Democracy has been categorically accepted and used as a foreign and domestic policy agenda in the hope of peace, economic growth and prosperity for more than 25 years in Cambodia. However, the country is now in the grip of dictatorship, human rights violations, and prospective economic sanctions. This paper examines different perceptions and experiences of democratic assistance. The author employs discourse theory, idealism and realism as a theory-based methodology for debating and assessing the implications of democratization. Discourse theory is used to establish a platform for understanding discursive formations, the body of knowledge and the games of truth of democracy. Idealist approaches give rational arguments for adopting key tenets that work well on the ground. In contrast, realism allows for sweeping critiques of the utopian ideal and offers particular views on why Western hegemonic missions do not work well. From the idealist view, the research finds that Cambodian people still believe that democracy is a prima facie universality for peace, growth and prosperity. From the realist view, democratization is on the brink of death for three reasons. Firstly, there are tensions between Western and local discourses about democratic values and norms. Secondly, democratic tenets have been undermined by ruling party-controlled courts, corruption, structural oppression and political patronage-based institutions. The third pitfall is partly associated with foreign aid dependency and geopolitical power struggles in the region. Finally, the study offers a precise mosaic of democratic principles that may be used to avoid a future geopolitical and economic crisis.

Keywords: Corruption, democracy, democratic principles, discourse theory, discursive formations, foreign aid dependency, games of truth, geopolitical and economic crisis, geopolitical power struggle, hegemonic mission, idealism, realism, utopian ideal.

2532 Theoretical Analysis of Capacities in Dynamic Spatial Multiplexing MIMO Systems

Authors: Imen Sfaihi, Noureddine Hamdi

Abstract:

In this paper, we study techniques for scheduling users for resource allocation in multiple-input multiple-output (MIMO) packet transmission systems. In these systems, transmit antennas are assigned to one user, or dynamically to different users, using spatial multiplexing. Allocating all transmit antennas to one user cannot take full advantage of multi-user diversity; therefore, we consider the case in which resources are allocated dynamically. In each time slot, users feed back their channel information on an uplink feedback channel. The channel information assumed available to the schedulers is the zero-forcing (ZF) post-detection signal-to-interference-plus-noise ratio (SINR). Our analysis concerns the round-robin and opportunistic schemes. We present an overview and a complete capacity analysis of these schemes. The main result of our study is an analytical form of the system capacity when the ZF receiver is used at the user terminal. Simulations have been carried out to validate all the proposed analytical solutions and to compare the performance of the schemes.
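
A simulation along these lines can be sketched as follows: compute the ZF post-detection SINR of each spatial stream for every user and compare round-robin with opportunistic (best-SINR) assignment. The i.i.d. Rayleigh channel model, antenna counts, SNR and the per-stream assignment rule are illustrative assumptions, not the paper's analytical setup.

```python
# Sketch comparing round-robin and opportunistic per-stream scheduling using the
# zero-forcing (ZF) post-detection SINR. Channel model and parameters are assumed.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_tx, n_rx, snr = 8, 2, 2, 10.0     # linear SNR per stream
n_slots = 2000

def zf_sinr(H, snr):
    """Post-detection SINR of each spatial stream for a ZF receiver."""
    G = np.linalg.inv(H.conj().T @ H)        # (H^H H)^{-1}
    return snr / np.real(np.diag(G))

rr_capacity, opp_capacity = 0.0, 0.0
for slot in range(n_slots):
    # One i.i.d. Rayleigh channel per user in this slot.
    H_all = (rng.normal(size=(n_users, n_rx, n_tx)) +
             1j * rng.normal(size=(n_users, n_rx, n_tx))) / np.sqrt(2)
    sinr_all = np.array([zf_sinr(H, snr) for H in H_all])      # (users, streams)

    # Round robin: both streams go to the same user, chosen cyclically.
    u = slot % n_users
    rr_capacity += np.log2(1 + sinr_all[u]).sum()

    # Opportunistic: each transmit antenna/stream goes to the user with the best SINR.
    opp_capacity += np.log2(1 + sinr_all.max(axis=0)).sum()

print(f"round robin:   {rr_capacity / n_slots:.2f} bit/s/Hz")
print(f"opportunistic: {opp_capacity / n_slots:.2f} bit/s/Hz")
```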

Keywords: MIMO, scheduling, ZF receiver, spatial multiplexing, round robin scheduling, opportunistic.
