Search results for: heterogeneous massive data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25411

24601 A Modular Framework for Enabling Analysis for Educators with Different Levels of Data Mining Skills

Authors: Kyle De Freitas, Margaret Bernard

Abstract:

Enabling data mining analysis among a wider audience of educators is an active area of research within the educational data mining (EDM) community. This paper proposes a framework for developing an environment that caters both for educators with little technical data mining skill and for more advanced users with some data mining expertise. The framework architecture was developed through a review of the strengths and weaknesses of existing models in the literature. The proposed framework provides a modular architecture that allows future researchers to focus on developing specific areas within the EDM process. Finally, the paper highlights a strategy for enabling analysis through either predefined questions or a guided data mining process, and shows how the questions developed and the analyses conducted can be reused and extended over time.

Keywords: educational data mining, learning management system, learning analytics, EDM framework

Procedia PDF Downloads 316
24600 Using Audit Tools to Maintain Data Quality for ACC/NCDR PCI Registry Abstraction

Authors: Vikrum Malhotra, Manpreet Kaur, Ayesha Ghotto

Abstract:

Background: Cardiac registries such as the ACC Percutaneous Coronary Intervention (PCI) Registry require high-quality abstraction of data elements covering nuclear cardiology, diagnostic coronary angiography, and PCI. Introduction: The audit tool described here is used by data abstractors to audit abstraction and assess its accuracy and inter-rater reliability (IRR) for a health system. The solution has been developed across 13 registries, including the ACC/NCDR registries, PCI, STS, and Get with the Guidelines. Methodology: The audit tool was used to audit internal registry abstraction of all data elements for the ACC/NCDR PCI registry, including whether a stress test was performed, the type, date, and results of the stress test, the risk/extent of ischemia, diagnostic catheterization detail, and the PCI data elements. It is in use across 20 hospital systems, for which abstraction and audit services are provided internally. Results: In 50 PCI registry cases in 2021, the audited abstraction achieved accuracy and IRR scores greater than 95%. Conclusion: The tool is used internally for surgical societies and across hospital systems. It enables an abstractor's work to be assessed by an external abstractor and covers all data dictionary fields for each registry.
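The accuracy and inter-rater reliability figures reported above rest on field-by-field agreement between two abstractions of the same case. A minimal sketch of such an agreement check follows; the field names are illustrative, not the registry's actual data dictionary:

```python
# Sketch: percent-agreement audit between an abstractor and an auditor.
# Field names are illustrative, not the actual NCDR data dictionary.

def audit_agreement(abstractor: dict, auditor: dict) -> float:
    """Fraction of shared fields on which the two abstractions agree."""
    fields = abstractor.keys() & auditor.keys()
    if not fields:
        return 0.0
    matches = sum(abstractor[f] == auditor[f] for f in fields)
    return matches / len(fields)

case = {"stress_test_performed": "yes", "stress_test_type": "nuclear",
        "ischemia_risk": "intermediate", "pci_performed": "yes"}
audit = {"stress_test_performed": "yes", "stress_test_type": "nuclear",
         "ischemia_risk": "low", "pci_performed": "yes"}

print(audit_agreement(case, audit))  # 3 of 4 fields agree -> 0.75
```

In practice, per-field agreement would be aggregated over all audited cases to obtain the registry-level accuracy and IRR scores.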

Keywords: abstraction, cardiac registry, cardiovascular registry, registry, data

Procedia PDF Downloads 96
24599 Artificial Intelligence Based Comparative Analysis for Supplier Selection in Multi-Echelon Automotive Supply Chains via GEP and ANN Models

Authors: Seyed Esmail Seyedi Bariran, Laysheng Ewe, Amy Ling

Abstract:

Since supplier selection is a vital decision, selecting suppliers in the best and most accurate way is of great importance for enterprises. In this study, a new artificial intelligence approach is applied to address the weaknesses of conventional supplier selection. The paper has three parts. The first is choosing appropriate criteria for assessing supplier performance. The second is collecting a data set based on expert judgment. The data set is then divided into two parts: a training set and a testing set. The training set is used to select the best structures of the GEP and ANN models, and the testing set is used to evaluate the predictive power of both methods. The results show that GEP is more accurate than ANN. Moreover, unlike ANN, GEP yields an explicit mathematical equation for supplier selection.
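The train/test procedure described here is standard regardless of the model fitted. A minimal sketch with synthetic supplier data and a simple linear stand-in model (the paper fits GEP and ANN models at this step; the criteria, weights, and sizes below are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative supplier data: three criteria scores (e.g. quality,
# delivery, price) per supplier and an expert-assigned overall rating.
# The data are synthetic; the paper's data set came from experts.
X = rng.uniform(0, 10, size=(40, 3))
y = X @ np.array([0.5, 0.3, 0.2]) + rng.normal(0, 0.2, size=40)

# Divide the data set into a training part and a testing part,
# following the paper's procedure.
idx = rng.permutation(len(X))
train, test = idx[:30], idx[30:]

# Fit a simple linear stand-in model on the training set (GEP/ANN
# would be fitted here instead), then evaluate on the held-out test
# set to measure predictive power.
w, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
rmse = np.sqrt(np.mean((X[test] @ w - y[test]) ** 2))
print(f"held-out RMSE: {rmse:.3f}")
```

Comparing the held-out error of two candidate models, as done here for one, is exactly the criterion the paper uses to rank GEP against ANN.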

Keywords: supplier selection, automotive supply chains, ANN, GEP

Procedia PDF Downloads 620
24598 Increasing the Apparent Time Resolution of Tc-99m Diethylenetriamine Pentaacetic Acid Galactosyl Human Serum Albumin Dynamic SPECT by Use of a 180-Degree Interpolation Method

Authors: Yasuyuki Takahashi, Maya Yamashita, Kyoko Saito

Abstract:

In general, dynamic SPECT data acquisition needs a few minutes for one rotation, so the time-activity curve (TAC) derived from dynamic SPECT is relatively coarse. In order to effectively shorten the interval between data points, we adopted a 180-degree interpolation method, which is already used in the reconstruction of X-ray CT data. In this study, we applied this method to SPECT and investigated its effectiveness. To briefly describe the 180-degree interpolation method: the 180-degree data in the second half of one rotation are combined with the 180-degree data in the first half of the next rotation to generate a 360-degree data set appropriate for the time halfway between the two rotations. In both a phantom study and a patient study, the data points from the interpolated images agreed well with the data points tracking the accumulation of 99mTc activity over time for the appropriate regions of interest. We conclude that data derived from interpolated images improve the apparent time resolution of dynamic SPECT.
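The half-rotation combination step can be sketched directly on projection arrays. A minimal sketch, assuming projections indexed by gantry angle; the array shapes (projection angles per rotation, detector bins) are illustrative:

```python
import numpy as np

# Sketch of the 180-degree interpolation: the projections acquired
# over the second half (180°-360°) of rotation k are combined with
# those from the first half (0°-180°) of rotation k+1 to form a full
# 360° data set for the time halfway between the two rotations.

def interpolate_half_rotations(rot_k, rot_k1):
    """Synthesise a 360° projection set between two successive rotations."""
    half = rot_k.shape[0] // 2
    # angles 0°-180° come from rotation k+1, angles 180°-360° from rotation k
    return np.concatenate([rot_k1[:half], rot_k[half:]], axis=0)

rot1 = np.random.rand(120, 64)   # 120 projection angles, 64 detector bins
rot2 = np.random.rand(120, 64)
mid = interpolate_half_rotations(rot1, rot2)
print(mid.shape)  # (120, 64)
```

The synthesised set is then reconstructed like any ordinary 360-degree acquisition, which is what doubles the effective number of TAC data points.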

Keywords: dynamic SPECT, time resolution, 180-degree interpolation method, 99mTc-GSA

Procedia PDF Downloads 489
24597 A Near Ambient Pressure X-Ray Photoelectron Spectroscopy Study on Platinum Nanoparticles Supported on Zr-Based Metal Organic Frameworks

Authors: Reza Vakili, Xiaolei Fan, Alex Walton

Abstract:

The first near ambient pressure (NAP)-XPS study of CO oxidation over Pt nanoparticles (NPs) incorporated into Zr-based UiO (UiO for Universitetet i Oslo) MOFs was carried out. For this purpose, the MOF-based catalysts were prepared by wetness impregnation (WI-PtNPs@UiO-67) and linker design (LD-PtNPs@UiO-67) methods, along with PtNPs@ZrO₂ as the control catalyst. Firstly, the as-synthesized catalysts were reduced in situ prior to the operando XPS analysis. The existence of Pt(II) species in UiO-67 was proved by observing Pt 4f core level peaks at a high binding energy (BE) of 72.6 ± 0.1 eV. However, on heating the WI-PtNPs@UiO-67 catalyst in situ to 200 °C under vacuum, the higher BE components disappeared, leaving only the metallic Pt 4f doublet and confirming the formation of Pt NPs. The complete reduction of LD-PtNPs@UiO-67 was achieved at 250 °C and 1 mbar H₂. To understand the chemical state of the Pt NPs in UiO-67 during catalytic turnover, we analyzed the Pt 4f region using operando NAP-XPS in temperature-programmed measurements (100-260 °C) with reference to the PtNPs@ZrO₂ catalyst. CO conversion during NAP-XPS experiments with the stoichiometric mixture shows that LD-PtNPs@UiO-67 has a better CO turnover frequency (TOF, 0.066 s⁻¹ at 260 °C) than the other two (ca. 0.055 s⁻¹). The Pt 4f peaks show only one chemical species present at all temperatures, but the core level BE shifts as a function of reaction temperature, i.e., the Pt 4f peak moves from 71.8 eV at T < 200 °C to 71.2 eV at T > 200 °C. As this higher BE state of 71.8 eV was not observed after in situ reduction of the catalysts but only once the CO/O₂ mixture was introduced, we attribute it to the surface saturation of the Pt NPs with adsorbed CO. In general, the quantitative analysis of Pt 4f data from the operando NAP-XPS experiments shows that the surface chemistry of the Pt active phase in the two PtNPs@UiO-67 catalysts is the same, comparable to that of PtNPs@ZrO₂. The observed difference in catalytic activity can therefore be attributed to the particle sizes of the Pt NPs, as well as to the dispersion of the active phase in the support, which differ between the three catalysts.

Keywords: CO oxidation, heterogeneous catalysis, MOFs, Metal Organic Frameworks, NAP-XPS, Near Ambient Pressure X-ray Photoelectron Spectroscopy

Procedia PDF Downloads 128
24596 Genetic Data of Deceased People: Solving the Gordian Knot

Authors: Inigo de Miguel Beriain

Abstract:

Genetic data of deceased persons are of great interest for both biomedical research and clinical use. This is due to several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. It would therefore be possible to improve our response to these pathologies considerably if we could use these data. Unfortunately, at present, the status of data on the deceased is far from satisfactorily resolved by EU data protection regulation. Indeed, the General Data Protection Regulation (GDPR) has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework, and each EU member state offers a very different solution. For instance, Denmark treats the data as personal data of the deceased person for a set period of time, while others, such as Spain, do not consider the data as such but have introduced specifically focused regulations on this type of data and its access by relatives. This is an extremely dysfunctional scenario from multiple angles, not least that of scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that, in reality, health data are, in a sense, a rara avis within data in general because they do not refer to one person but to several. Hence, it is possible to consider all of them data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain personal data of his or her biological relatives, and the general regime provided for in the GDPR may apply to them. As these are personal data, we could go back to thinking in terms of a general prohibition of data processing, with the exceptions provided for in Article 9.2 and on the legal bases included in Article 6. This may be complicated in practice: since the data refer to several data subjects, it may be complex to rely on some of these bases, such as consent. Furthermore, there are theoretical arguments that may oppose this hypothesis. This contribution shows, however, that none of these objections is of sufficient substance to delegitimize the argument presented. The conclusion is therefore that we can indeed build a general framework for the processing of personal data of deceased persons in the context of the GDPR. This would constitute a considerable improvement over the current regulatory framework, although some clarifications will be necessary for its practical application.

Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people

Procedia PDF Downloads 146
24595 A Method Development for Improving the Efficiency of Solid Waste Collection System Using Network Analyst

Authors: Dhvanidevi N. Jadeja, Daya S. Kaul, Anurag A. Kandya

Abstract:

Municipal solid waste (MSW) collection in a city is often performed in an ineffective manner, which results in poor management of the environment and natural resources. Municipal corporations often lack efficient waste management and recycling programs because of the complexity of the task and the many factors involved. A solid waste collection system depends upon factors such as manpower, the number and size of vehicles, transfer station size, dustbin size and weight, on-road traffic, and many others. These factors affect the collection cost, the energy used, and the overall municipal tax for the city. Generally, different types of waste are scattered throughout the city in a heterogeneous way, which poses challenges for efficient collection of solid waste. An efficient waste collection and transportation strategy must be undertaken that includes optimization of routes, waste volume, and manpower. Once these are optimized, the overall cost can be reduced, as the fuel and energy requirements would be lower, and so would the municipal waste taxes levied. To carry out the optimization study of the collection system, various data need to be collected from the Ahmedabad Municipal Corporation, such as the amount of waste generated per day, the number of workers, the collection schedule, road maps, the number and location of transfer stations, the amount of equipment (tractors, machinery), the number of zones, and the collection routes. ArcGIS Network Analyst is introduced to identify the best routing for municipal waste collection. The simulation consists of scenarios of visiting loading spots in the municipality of Ahmedabad, considering dynamic factors like network traffic changes and roads closed due to natural or technical causes. Different routes were selected in a particular area of Ahmedabad city, and the present routes were optimized to reduce their length using ArcGIS Network Analyst. The result indicates up to 35% length reduction in the routes.
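ArcGIS Network Analyst is proprietary, but the core routing step rests on shortest-path search over a weighted road network. A minimal stdlib sketch of that step, with a toy network whose node names and distances are purely illustrative:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra shortest path; graph maps node -> [(neighbour, km), ...]."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Toy road network: depot, two collection spots, one transfer station.
roads = {
    "depot":  [("spot_A", 2.0), ("spot_B", 5.0)],
    "spot_A": [("spot_B", 1.5), ("transfer", 6.0)],
    "spot_B": [("transfer", 2.0)],
}
path, km = shortest_route(roads, "depot", "transfer")
print(path, km)  # ['depot', 'spot_A', 'spot_B', 'transfer'] 5.5
```

Network Analyst layers vehicle capacities, time windows, and dynamic closures on top of this kind of search; closing a road corresponds to removing its edge from the graph before re-solving.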

Keywords: collection routes, efficiency, municipal solid waste, optimization

Procedia PDF Downloads 126
24594 Engineering Method to Measure the Impact Sound Improvement with Floor Coverings

Authors: Katarzyna Baruch, Agata Szelag, Jaroslaw Rubacha, Bartlomiej Chojnacki, Tadeusz Kamisinski

Abstract:

The methodology used to measure the reduction of transmitted impact sound by floor coverings situated on a massive floor is described in ISO 10140-3:2010. To carry out such tests, a standardised reverberation room, separated by a standard floor from a second measuring room, is required. The need for a special laboratory results in high cost and low accessibility of this measurement. The authors propose their own engineering method to measure the impact sound improvement of floor coverings, which requires neither standard rooms nor a standard floor. This paper describes the measurement procedure of the proposed engineering method. Verification tests were then performed. Validation of the proposed method was based on an analytical model, a Statistical Energy Analysis (SEA) model, and empirical measurements. The results were compared with corresponding ones obtained from ISO 10140-3:2010 measurements. The study confirmed the usefulness of the engineering method.

Keywords: building acoustic, impact noise, impact sound insulation, impact sound transmission, reduction of impact sound

Procedia PDF Downloads 316
24593 New Heterogenous α-Diimine Nickel (II)/ MWCNT Catalysts for Ethylene Polymerization

Authors: Sasan Talebnezhad, Saeed Pourmahdian, Naghi Assali

Abstract:

Homogeneous α-diimine nickel (II) catalyst complexes, with and without amino functionality at the para-aryl position, were synthesized. These complexes were immobilized on carboxyl-, hydroxyl-, and acyl chloride-functionalized multi-walled carbon nanotubes to form five novel heterogeneous α-diimine nickel catalysts. Immobilization was performed by covalent or electrostatic bonding via a methylaluminoxane (MAO) linker or an amide linkage. Both the nature of the α-diimine ligands and the kind of interaction between the anchored catalyst complexes and the multi-walled carbon nanotube surface influenced the catalytic performance, microstructure, and morphology of the obtained polyethylenes. The catalyst prepared by amide bonding showed the lowest relative weight loss in thermogravimetric analysis and the highest activity, up to 5863 g PE mmol⁻¹ Ni h⁻¹. This catalyst produced polyethylene with a dense botryoidal morphology.

Keywords: α-diimine nickel (II) complexes, immobilization, multi-walled carbon nanotubes, ethylene polymerization

Procedia PDF Downloads 402
24591 Steps towards the Development of National Health Data Standards in Developing Countries

Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray

Abstract:

The proliferation of health data standards today is somewhat overlapping and conflicting, resulting in market confusion and growing proprietary interests. The government's role and support in the standardization of health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has not explored the steps a government must undertake to develop national health data standards. Based on the lessons learned from a qualitative study investigating issues in the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on opinions and feedback from experts in data exchange, standards, and medical informatics in Saudi Arabia and the UK, a list of the steps required to develop national health data standards was constructed. The main steps are the existence of a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; more important still is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners in planning health data standards, in particular in developing countries.

Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia

Procedia PDF Downloads 329
24590 A Proposal for U-City (Smart City) Service Method Using Real-Time Digital Map

Authors: SangWon Han, MuWook Pyeon, Sujung Moon, DaeKyo Seo

Abstract:

Recently, technologies based on three-dimensional (3D) space information are being developed and quality of life is improving as a result. Research on real-time digital map (RDM) is being conducted now to provide 3D space information. RDM is a service that creates and supplies 3D space information in real time based on location/shape detection. Research subjects on RDM include the construction of 3D space information with matching image data, complementing the weaknesses of image acquisition using multi-source data, and data collection methods using big data. Using RDM will be effective for space analysis using 3D space information in a U-City and for other space information utilization technologies.

Keywords: RDM, multi-source data, big data, U-City

Procedia PDF Downloads 425
24589 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-

Authors: Nieto Bernal Wilson, Carmona Suarez Edgar

Abstract:

Organizations hold structured and unstructured information in different formats, sources, and systems. Part of this comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level these organizations present some deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as in the absence of the operational capabilities needed to tackle such projects. Data warehouses and their applications are considered non-proprietary tools of great interest for business intelligence, since they are the repository basis for creating models or patterns (behavior of customers, suppliers, products, social networks, and genomics) and facilitate corporate decision making and research. This paper presents a simple, structured methodology inspired by agile development models such as Scrum, XP, and AUP. It also draws on object-relational and spatial data models, and on the baseline of data modeling under UML and big data. In this way, it seeks to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization, and data mining, particularly for generating patterns and models derived from the structured fact objects.

Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse

Procedia PDF Downloads 396
24588 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis

Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee

Abstract:

In South Korea, it is difficult to obtain data for statistical pipe assessment. To address this issue, we examine how the various statistical models presented in previous studies behave when the data are mixed with noise, and whether they are applicable in South Korea. Three major types of model are studied; where data are presented in the original papers, we add noise to those data and observe how the model response changes. Moreover, we generate data from the published models and analyse the effect of noise on them. From this, we assess the robustness of each model and its applicability in Korea.
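The generate-perturb-refit loop described above can be sketched compactly. The exponential survival model below is an illustrative stand-in for the published pipe-deterioration models, and the rate, ages, and noise levels are assumptions:

```python
import numpy as np

# Sketch of the robustness check: generate data from an assumed
# deterioration model, contaminate it with noise, refit, and see how
# far the estimate drifts from the true parameter.
rng = np.random.default_rng(1)
true_rate = 0.05
age = np.arange(1.0, 41.0)              # pipe age in years
survival = np.exp(-true_rate * age)     # model-generated (clean) data

estimates = []
for noise_sd in (0.0, 0.01, 0.05):
    noisy = np.clip(survival + rng.normal(0, noise_sd, age.size), 1e-6, 1.0)
    # Refit the rate by linear regression on the log-survival curve.
    est = -np.polyfit(age, np.log(noisy), 1)[0]
    estimates.append(est)
    print(f"noise sd {noise_sd:.2f}: estimated rate {est:.4f}")
```

A model whose estimate stays close to the generating parameter as the noise level grows is, in the paper's sense, robust enough to apply to the noisier Korean data.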

Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences

Procedia PDF Downloads 734
24587 The Behavior of Masonry Wall Constructed Using Biaxial Interlocking Concrete Block, Solid Concrete Block and Cement Sand Brick Subjected to the Compressive Load

Authors: Fauziah Aziz, Mohd.fadzil Arshad, Hazrina Mansor, Sedat Kömürcü

Abstract:

Masonry is an anisotropic and heterogeneous material due to the presence of different components within the assembly. Normally, the mortar plays a significant role in the compressive behavior of traditional masonry structures. The biaxial interlocking concrete block is a masonry unit built around an interlocking concept. This masonry unit can improve the quality of the construction process, reduce labor cost, reduce the need for highly skilled workmanship, and speed up construction. Interlocking concrete block masonry units currently on the market are typically designed to interlock along only one axis (x or y), are shorter in length, and have low compressive strength. The biaxial interlocking concrete block introduced in this research is a dry-stack concept that differs from the interlocking blocks available on the market in its length and in the geometry of its groove and tongue. The material can be used as a non-load-bearing or load-bearing wall, depending on the application, but there is a lack of previously published technical data. This paper presents findings on the compressive resistance of the biaxial interlocking concrete block masonry wall compared to other traditional masonry walls. Two series of biaxial interlocking concrete block masonry walls, M1 and M2, and a series of solid concrete block and cement sand brick walls, M3 and M4, were tested for compressive resistance. M1 is a masonry wall of hollow biaxial interlocking concrete blocks; M2 is its grouted counterpart; M3 is a solid concrete block masonry wall; and M4 is a cement sand brick masonry wall. All the samples were tested under static compressive load. The results show that M2 has higher compressive resistance than M1, M3, and M4, indicating that the compressive strength of the concrete masonry units plays a significant role in the capacity of the masonry wall.

Keywords: interlocking concrete block, compressive resistance, concrete masonry unit, masonry

Procedia PDF Downloads 158
24586 Automated Testing to Detect Instance Data Loss in Android Applications

Authors: Anusha Konduru, Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai

Abstract:

The number of mobile applications is increasing significantly, each addressing the requirements of many users. However, quick development and enhancement cycles result in many underlying defects. Android apps create and handle a large variety of 'instance' data that must persist across runs, such as the current navigation route, workout results, antivirus settings, or game state. Due to the nature of Android, an app can be paused, sent into the background, or killed at any time. If the instance data is not saved and restored between runs, then in addition to data loss, partially saved or corrupted data can crash the app upon resume or restart. It is difficult for the programmer to manually test this issue for all activities, which leads to data loss: the data entered by the user are not saved when there is an interruption. This can degrade the user experience, because the user needs to re-enter the information after each interruption. Automated testing to detect such data loss is therefore important. This research proposes DroidDL, a data loss detector for Android, which detects instance data loss in a given Android application. We tested 395 applications and found 12 with data loss issues. The approach proved highly accurate and reliable in finding apps with this defect, and it can be used by Android developers to avoid such errors.

Keywords: Android, automated testing, activity, data loss

Procedia PDF Downloads 226
24585 Unraveling the Political Complexities of the Textile and Clothing Waste Ecosystem; A Case Study on Melbourne Metropolitan Civic Waste Management Practices

Authors: Yasaman Samie

Abstract:

The ever-increasing rate of textile and clothing (T&C) waste generation and commonly ineffective waste management practices have long been a challenge for civic waste management. This challenge stems not only from the complexity of T&C material components but also from the heterogeneous nature of the T&C waste management sector and the disconnection between its stakeholders. To date, there is little research investigating the importance of governmental structures and their role in T&C waste management practices and decision-making. This paper reflects on the impact and involvement of governments, acts, and legislation on the effectiveness of T&C waste management practices carried out by multiple players in a city context. In doing so, the study first develops a methodical framework for holistically analyzing a city's T&C waste ecosystem. Central to this framework are six dimensions: social, environmental, economic, political, cultural, and educational, as well as the connections between these dimensions, such as socio-political and cultural-political. Second, it delves into the political dimension and its interconnections with varying aspects of T&C waste. The case study takes metropolitan Melbourne as its setting and draws on Actor-Network Theory and the principles of supply chain design and planning. Data were collected through two rounds of semi-structured interviews with 18 key players in the T&C waste ecosystem (including charities, city councils, private-sector providers, and producers), mainly within metropolitan Melbourne but also in other Australian and European cities. The research findings expand on the role of the politics of waste in facilitating a proactive approach to T&C waste management in cities. That is achieved through a revised definition of T&C waste and its characteristics, a discussion of varying perceptions of value in waste, a prioritization of waste types in civic waste management practices, and an account of how all these aspects should be reflected in the acts and legislation in place.

Keywords: civic waste management, multi-stakeholder ecosystem, textile and clothing waste, waste and governments

Procedia PDF Downloads 105
24584 Big Data: Appearance and Disappearance

Authors: James Moir

Abstract:

The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data, ranging from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and explain their appearance through causal mechanisms, while twentieth-century science attempted to save the appearances and relinquish causal explanation. Now twenty-first-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back towards a rule- or law-governed model of reality and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes any context-specific conceptual sensitivity disappear.

Keywords: big data, appearance, disappearance, surface, epistemology

Procedia PDF Downloads 405
24583 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images

Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann

Abstract:

FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique to determine the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data processing part is still under development. In this paper, we formulate and solve the problem of data selection, which enhances the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., the data which barely reduce the confidence interval of the estimated parameters and thus can be neglected. Based on sensitivity analysis, we both solve the problem of optimal data space selection and find specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, a theorem stating that the integrated-data approach is less precise than the full-data case is proven; i.e., we show that the data set represented by the FRAP recovery curve leads to a larger confidence interval than the spatio-temporal (full) data.
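The full-data versus integrated-data comparison can be illustrated numerically: the confidence interval of an estimated rate parameter scales like 1/sqrt(Fisher information), computed from the sensitivities of the observations. The model, grids, and parameter values below are assumptions for illustration, not the paper's actual FRAP model:

```python
import numpy as np

# Illustrative check: compare the Fisher information carried by the
# full spatio-temporal data with that of the spatially integrated
# recovery curve, for a toy recovery model f(x,t) = g(x) * exp(-k t).
k = 0.3
x = np.linspace(-1.0, 1.0, 21)
t = np.linspace(0.1, 10.0, 50)
X, T = np.meshgrid(x, t)

profile = np.exp(-X**2)                  # spatial bleach profile g(x)
dfdk = -T * profile * np.exp(-k * T)     # per-pixel sensitivity to k

fisher_full = np.sum(dfdk**2)            # every pixel observed
# Integrated data: only F(t) = sum_x f(x, t) is observed, and its
# noise variance grows with the number of summed pixels.
dFdk = dfdk.sum(axis=1)
fisher_integrated = np.sum(dFdk**2) / len(x)

ci_full = 1.0 / np.sqrt(fisher_full)
ci_integrated = 1.0 / np.sqrt(fisher_integrated)
print(ci_full < ci_integrated)  # True: full data give the tighter interval
```

The inequality follows from Cauchy-Schwarz, since the squared sum of the per-pixel sensitivities cannot exceed the pixel count times the sum of their squares, which is the statement of the paper's theorem in miniature.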

Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design

Procedia PDF Downloads 268
24582 Representation Data without Lost Compression Properties in Time Series: A Review

Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan

Abstract:

Uncertain data are believed to be an important issue in building a prediction model. The main objective in time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and fit a low-dimensional model prior to a prediction task. This paper discusses the performance of a number of techniques for dealing with uncertain data, specifically those which handle the uncertain-data condition while minimizing the loss of compression properties.
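To make "compression properties" concrete, here is one widely used compressed representation for time series, piecewise aggregate approximation (PAA), shown purely as an illustrative member of the family the review covers, not as a technique from a specific surveyed paper:

```python
import numpy as np

# Piecewise aggregate approximation (PAA): split the series into
# (near-)equal-width segments and replace each segment by its mean,
# giving a low-dimensional representation of the original series.

def paa(series, n_segments):
    """Compress a 1-D series to n_segments segment means."""
    return np.array([seg.mean() for seg in np.array_split(series, n_segments)])

series = np.array([1.0, 2.0, 3.0, 4.0, 10.0, 10.0, 2.0, 2.0])
print(paa(series, 4))  # segment means: [1.5, 3.5, 10.0, 2.0]
```

A technique "preserves compression properties" in the review's sense when operations such as distance computation on the compressed representation remain faithful to the original, uncertain series.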

Keywords: compression properties, uncertainty, uncertain time series, mining technique, weather prediction

Procedia PDF Downloads 420
24581 Credit Cooperatives: A Factor for Improving the Sustainable Management of Private Forests

Authors: Todor Nickolov Stoyanov

Abstract:

Cooperatives are present in all countries and in almost all sectors, including agriculture, forestry, food, finance, health, marketing, insurance and credit. Strong cooperatives are able to overcome many of the difficulties faced by private owners. Cooperatives follow seven principles, including the 'Concern for Community' principle, which enables cooperatives to work for the sustainable development of the community. The members of cooperatives may use different systems for generating year-round employment and for receiving sustainable income by performing different forestry activities. Various methods were used in preparing the report, including literature reviews, statistics, secondary data and expert interviews. The members of the cooperatives benefit from the increased efficiency of the various products and from the overall yield of the harvest, and ultimately achieve better profit through cooperative efforts. Cooperatives also pursue other types of activities as an additional source of cooperative income. There are many heterogeneous activities in the production and service sectors of the forest cooperatives under consideration. Some cooperatives operate dairies, distilleries, woodworking enterprises, tourist homes, hotels and motels, shops, ski slopes, sheep breeding, etc. Through the revenue generated by these activities, cooperatives have the opportunity to carry out various environmental and protective activities - recreation, water protection, protection of endangered and endemic species, etc. - which cannot be achieved in the case of small-scale forests, where management is not sustainable. The conclusions summarize the results of the analysis. Cooperative management of forests and forest lands gives higher incomes to individual owners. Managing forests and forest lands through cooperatives helps to carry out different environmental and protective activities. Cooperative forest management provides additional means of subsistence to the owners of poor forest lands. Cooperative management of forests and forest lands supports owners in implementing forest management plans and applying sustainable management to these territories.

Keywords: cooperative, forestry, forest owners, principles of cooperation

Procedia PDF Downloads 232
24580 Data Mining as a Tool for Knowledge Management: A Review

Authors: Maram Saleh

Abstract:

Knowledge has become an essential resource in today’s economy and the most important asset for maintaining competitive advantage in organizations. The importance of knowledge has led organizations to manage their knowledge assets and resources through all the knowledge management stages: knowledge creation, knowledge storage, knowledge sharing and knowledge use. Research on data mining has continued to grow in recent years in both the business and educational fields. Data mining is one of the most important steps of the knowledge discovery in databases process, aiming to extract implicit, unknown but useful knowledge, and it is considered a significant subfield of knowledge management. Data mining has great potential to help organizations focus on extracting the most important information from their data warehouses. Data mining tools and techniques can predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. This review paper explores the applications of data mining techniques in supporting the knowledge management process as an effective knowledge discovery technique. In this paper, we identify the relationship between data mining and knowledge management and then introduce some applications of data mining techniques in knowledge management for several real-life domains.

Keywords: data mining, knowledge management, knowledge discovery, knowledge creation

Procedia PDF Downloads 198
24579 Role of Institutional Quality as a Key Determinant of FDI Flows in Developing Asian Economies

Authors: Bikash Ranjan Mishra, Lopamudra D. Satpathy

Abstract:

In the wake of the phenomenal surge in international business over the last few decades, both the developed and developing economies around the world are in massive competition to attract more and more FDI flows. While the developed countries have marched ahead in the race, the developing countries, especially the Asian economies, have followed them at a rapid pace. While most previous studies have analysed the role of institutional quality in the promotion of FDI flows in developing countries, very few have taken an integrated approach examining the combined impact of institutional quality, globalization pattern and domestic financial development on FDI flows. In this context, the paper contributes to the literature in two important ways. Firstly, two composite indices of institutional quality and domestic financial development for the Asian countries are constructed, in contrast to earlier studies that resort to a single variable to indicate institutional quality or domestic financial development. Secondly, the impact of these variables on FDI flows through their interaction with geographical region is investigated. The study uses panel data covering the period 1996 to 2012 for twenty Asian developing countries from the geographical regions of eastern, south-eastern, southern and western Asia, with emphasis on the quality of institutions. Control of corruption, rule of law, regulatory quality, government effectiveness, political stability, and voice and accountability are used as indicators of institutional quality. Besides these, the study takes into account domestic credit to the public and private sectors and stock market credit as domestic financial indicators. First, in the model specification, a factor analysis is performed to reduce the large set of determinants, which are highly correlated with each other, to a manageable size. Afterwards, a reduced version of the model is estimated with the extracted factors, in the form of indices, as independent variables along with a set of control variables. It is found that the institutional quality index and the index of globalization exert a significant effect on FDI inflows of the host countries; in contrast, the domestic financial index does not seem to play a significant role. Finally, some robustness tests are performed to make sure that the results are not sensitive to temporal and spatial unobserved heterogeneity. On the basis of this study, one general policy inference can be drawn: the governments of these developing countries should strengthen their domestic institutions, both financial and non-financial. In addition, welfare policies should also target rapid globalization. If the financial and non-financial institutions of these developing countries become sound and grow more globalized in the economic, social and political domains, they can attract greater FDI inflows, which will subsequently advance these economies.
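A common way to collapse several correlated indicators into a single composite index, as in the factor-analysis step described above, is to standardize the indicators and take the first principal component as the index. The sketch below illustrates that generic technique on made-up indicator values; the paper's actual data, loadings and software are not reproduced here.

```python
import numpy as np

def composite_index(X):
    """First-principal-component index from a (countries x indicators)
    matrix of correlated indicators."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each column
    corr = np.corrcoef(Z, rowvar=False)        # indicator correlations
    eigvals, eigvecs = np.linalg.eigh(corr)    # eigenvalues ascending
    w = eigvecs[:, -1]                         # leading eigenvector
    if w.sum() < 0:                            # fix the arbitrary sign
        w = -w
    return Z @ w

# Hypothetical scores for 5 countries on 3 institutional indicators
# (e.g., rule of law, control of corruption, government effectiveness).
rng = np.random.default_rng(0)
base = rng.normal(size=5)                      # shared "quality" signal
X = np.column_stack([base + 0.1 * rng.normal(size=5) for _ in range(3)])
idx = composite_index(X)
# Countries with uniformly stronger indicators receive higher index values.
print(idx.shape)  # (5,)
```

Because the three indicators share one underlying signal, the leading component captures most of their common variance, which is exactly why a single extracted factor can replace a large, highly correlated set of determinants.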

Keywords: Asian developing economies, FDI, institutional quality, panel data

Procedia PDF Downloads 300
24578 Anomaly Detection Based Fuzzy K-Mode Clustering for Categorical Data

Authors: Murat Yazici

Abstract:

Anomalies are irregularities found in data that do not adhere to a well-defined standard of normal behavior. The identification of outliers or anomalies in data has been a subject of study within the statistics field since the 1800s. Over time, a variety of anomaly detection techniques have been developed in several research communities. Cluster analysis can be used to detect anomalies. It is the process of grouping data so that points within a cluster are as similar as possible while points in different clusters are as dissimilar as possible. Many traditional clustering algorithms have limitations in dealing with data sets containing categorical attributes. To detect anomalies in categorical data, a fuzzy clustering approach can be used, with its attendant advantages. The fuzzy k-modes (FKM) clustering algorithm, one of the fuzzy clustering approaches and an extension of the k-means algorithm, is reported for clustering datasets with categorical values. It is a form of soft clustering: each point can be associated with more than one cluster. In this paper, anomaly detection is performed on two simulated datasets using the FKM clustering algorithm. As a significant contribution of the study, the FKM clustering algorithm allows anomalies to be determined together with their degree of abnormality, in contrast to numerous anomaly detection algorithms. According to the results, the FKM clustering algorithm showed good performance in detecting anomalies, in datasets containing both a single anomaly and multiple anomalies.
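A minimal sketch of the general idea on a toy categorical dataset (not the paper's implementation or data): fuzzy k-modes assigns each record a membership degree in every cluster, and a record whose largest membership is low fits no cluster well, so 1 minus that membership can serve as its degree of abnormality. Deterministic initialization from the first distinct records is an assumption made here for reproducibility.

```python
def hamming(a, b):
    """Simple matching dissimilarity: number of mismatched attributes."""
    return sum(x != y for x, y in zip(a, b))

def memberships(data, modes, m=2.0):
    """Fuzzy membership of each record in each cluster (rows sum to 1)."""
    U = []
    for x in data:
        d = [hamming(x, mode) for mode in modes]
        if 0 in d:                       # record coincides with a mode
            row = [1.0 if di == 0 else 0.0 for di in d]
            total = sum(row)
            U.append([r / total for r in row])
        else:
            row = [1.0 / sum((d[j] / d[k]) ** (1.0 / (m - 1))
                             for k in range(len(modes)))
                   for j in range(len(modes))]
            U.append(row)
    return U

def update_modes(data, U, k, m=2.0):
    """Per cluster and attribute, keep the category with the largest
    membership-weighted frequency."""
    modes = []
    for j in range(k):
        mode = []
        for a in range(len(data[0])):
            weight = {}
            for x, row in zip(data, U):
                weight[x[a]] = weight.get(x[a], 0.0) + row[j] ** m
            mode.append(max(weight, key=weight.get))
        modes.append(tuple(mode))
    return modes

def fuzzy_k_modes(data, k, iters=10, m=2.0):
    modes = list(dict.fromkeys(data))[:k]    # first k distinct records
    for _ in range(iters):
        U = memberships(data, modes, m)
        modes = update_modes(data, U, k, m)
    return memberships(data, modes, m), modes

# Two clear categorical clusters plus one record matching neither.
data = [("red", "small", "round")] * 4 + [("blue", "large", "square")] * 4
data += [("green", "tiny", "star")]          # the planted anomaly
U, modes = fuzzy_k_modes(data, k=2)
scores = [1.0 - max(row) for row in U]       # degree of abnormality
print(scores.index(max(scores)))  # 8: the planted anomaly
```

Records matching a mode get abnormality 0, while the planted record, equidistant from both modes, gets the maximal score, which is the graded output the abstract contrasts with hard anomaly detectors.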

Keywords: fuzzy k-mode clustering, anomaly detection, noise, categorical data

Procedia PDF Downloads 43
24577 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme

Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara

Abstract:

This paper addresses the problem of building secure computational services for encrypted information in the Cloud. The goal is to compute on encrypted data without decrypting it; this meets the aspiration of a computational encryption model that can enhance the security of big data with respect to privacy or confidentiality, availability and integrity of the data and the user's security. The cryptographic model applied for the computational processing of the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute a theoretical presentation of high-level computational processes based on number theory, derivable from abstract algebra, which can easily be integrated and leveraged in the cloud computing interface, together with detailed mathematical concepts underlying fully homomorphic encryption models. This contribution supports the full implementation of big data analytics based on cryptographic security algorithms.
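Fully homomorphic schemes support arbitrary computation on ciphertexts, but the flavor of "computing without decrypting" can already be seen in a partially (additively) homomorphic scheme such as Paillier. The toy sketch below, with primes far too small for any real security, is an illustration of that simpler scheme, not of the paper's FHE construction.

```python
import math
import secrets

# Toy Paillier cryptosystem (additively homomorphic, NOT secure at this
# key size). Multiplying two ciphertexts yields an encryption of the
# sum of the plaintexts, so a server can add values it cannot read.
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)                 # Carmichael lambda(n)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)  # inverse of L(g^lam mod n^2)

def encrypt(m):
    while True:
        r = secrets.randbelow(n - 1) + 1     # fresh randomness per call
        if math.gcd(r, n) == 1:
            return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

def add_encrypted(c1, c2):
    """Homomorphic addition: combine ciphertexts, never decrypt."""
    return (c1 * c2) % n2

m1, m2 = 1234, 5678
total = decrypt(add_encrypted(encrypt(m1), encrypt(m2)))
print(total)  # 6912
```

A fully homomorphic scheme extends this to multiplication as well (via bootstrapping, as in the keywords), which is what enables arbitrary analytics over encrypted big data.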

Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme

Procedia PDF Downloads 470
24576 A Preliminary Study of Urban Resident Space Redundancy in the Context of Rapid Urbanization: Based on Urban Research of Hongkou District of Shanghai

Authors: Ziwei Chen, Yujiang Gao

Abstract:

Rapid urbanization has caused massive amounts of physical space in Chinese cities to fall into a state of duplication and dislocation, forming many everyday spaces that cannot be standardized, typed, or identified, such as illegal construction. This phenomenon is known as urban spatial redundancy and is often excluded from mainstream architectural discussion because of its derogatory label as 'leftover' and 'excessive' space. In recent years, some practicing architects have begun to pay attention to this phenomenon and have tried to tap the value behind it. In this context, the author takes the redundancy of residential space as the research object and explores its implications for urban architectural renewal and innovative residential area models, based on an urban survey of redundant living space in Hongkou District of Shanghai. On this basis, the study shows that the changes accumulated over a building's long-term use can be fed back into the goals set before design, which is an important link in the life of a work of architecture.

Keywords: rapid urbanization, living space redundancy, architectural renewal, residential area model

Procedia PDF Downloads 125
24575 Forecasting Stock Indexes Using Bayesian Additive Regression Tree

Authors: Darren Zou

Abstract:

Forecasting the stock market is a very challenging task. Various economic indicators such as GDP, exchange rates, interest rates, and unemployment have a substantial impact on the stock market. Time series models are the traditional methods used to predict stock market changes. In this paper, a machine learning method, Bayesian Additive Regression Trees (BART), is used to predict stock market indexes based on multiple economic indicators. BART can model heterogeneous treatment effects and thereby works well when models are misspecified. It can also handle non-linear main effects and multi-way interactions without much input from financial analysts. In this research, BART is proposed to provide reliable predictions of day-to-day stock market activity. Comparing the results from BART with those of time series methods shows that BART performs well and has better predictive capability than the traditional methods.
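BART itself places priors on tree structures and averages over posterior samples via MCMC, which is too involved for a short sketch. What can be shown compactly is the sum-of-small-trees structure it builds on: the sketch below fits one-split trees (stumps) by greedy backfitting on residuals, a deliberately non-Bayesian stand-in, on synthetic data with a nonlinear jump of the kind a linear time series model would miss.

```python
# Sum-of-trees skeleton (greedy, non-Bayesian stand-in for BART).

def fit_stump(x, r):
    """Best single-split tree on one feature, minimizing squared error."""
    best = None
    for split in sorted(set(x)):
        left = [ri for xi, ri in zip(x, r) if xi <= split]
        right = [ri for xi, ri in zip(x, r) if xi > split]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        err = sum((ri - lmean) ** 2 for ri in left) + \
              sum((ri - rmean) ** 2 for ri in right)
        if best is None or err < best[0]:
            best = (err, split, lmean, rmean)
    _, split, lmean, rmean = best
    return lambda xi: lmean if xi <= split else rmean

def sum_of_stumps(x, y, n_trees=50, shrink=0.2):
    """Additive model f(x) = sum_t shrink * stump_t(x), each stump fit
    to the residuals left by the trees before it (backfitting)."""
    trees = []
    resid = list(y)
    for _ in range(n_trees):
        stump = fit_stump(x, resid)
        trees.append(stump)
        resid = [ri - shrink * stump(xi) for xi, ri in zip(x, resid)]
    return lambda xi: sum(shrink * t(xi) for t in trees)

# Synthetic "indicator -> index level" data with a nonlinear jump.
x = [i / 10 for i in range(40)]
y = [1.0 if xi < 2.0 else 3.0 for xi in x]
f = sum_of_stumps(x, y)
print(round(f(1.0), 2), round(f(3.0), 2))  # close to 1.0 and 3.0
```

Each tree is weak on its own, but their shrunken sum recovers the non-linear jump without any hand-specified functional form, which is the property the abstract credits to BART; the full method additionally quantifies uncertainty through the posterior.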

Keywords: BART, Bayesian, predict, stock

Procedia PDF Downloads 118
24574 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

The research aims to approximate the amount of daily rainfall using a pixel-value data approach. Daily rainfall maps from the Thailand Meteorological Department for the period January to December 2013 were the data used in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
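The abstract does not spell out the mapping, but a pixel-value approach of this kind typically assigns each map pixel the rainfall amount of the nearest colour in the map legend and then validates against gauge observations via RMSE. The sketch below illustrates that generic scheme; the legend colours, pixel samples and gauge values are all invented, not taken from the paper.

```python
import math

# Hypothetical rainfall-map legend: colour (R, G, B) -> rainfall in mm.
LEGEND = {
    (255, 255, 255): 0.0,    # white: no rain
    (150, 210, 255): 5.0,    # light blue: light rain
    (0, 90, 200): 20.0,      # dark blue: moderate rain
    (200, 0, 0): 50.0,       # red: heavy rain
}

def pixel_to_rainfall(rgb):
    """Rainfall of the legend colour nearest (in RGB space) to the pixel."""
    nearest = min(LEGEND,
                  key=lambda c: sum((a - b) ** 2 for a, b in zip(c, rgb)))
    return LEGEND[nearest]

def rmse(estimates, observed):
    return math.sqrt(sum((e - o) ** 2
                         for e, o in zip(estimates, observed)) / len(estimates))

# Pixels sampled at gauge locations vs. the gauges' measured rainfall.
pixels = [(250, 252, 255), (140, 205, 250), (10, 95, 190), (190, 10, 5)]
gauges = [0.0, 6.2, 18.5, 53.0]
estimates = [pixel_to_rainfall(p) for p in pixels]
print(estimates)                       # [0.0, 5.0, 20.0, 50.0]
print(round(rmse(estimates, gauges), 3))  # 1.781
```

Scanning every pixel of a scanned rainfall map this way turns the image into a gridded rainfall estimate, which is then scored against station data exactly as the RMSE figure in the abstract suggests.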

Keywords: daily rainfall, image processing, approximation, pixel value data

Procedia PDF Downloads 382
24573 An Examination of Low Engagement in a Group-Based ACT Intervention for Chronic Pain Management: Highlighting the Need for User-Attainment Focused Digitalised Interventions

Authors: Orestis Kasinopoulos, Maria Karekla, Vasilis Vasiliou, Evangelos Karademas

Abstract:

Acceptance and Commitment Therapy (ACT) is an empirically supported intervention for treating chronic pain patients, yet its effectiveness for some chronic conditions, or when adapted to other languages, has not been explored. An ACT group intervention was designed to explore the effectiveness of treating a Greek-speaking, heterogeneous sample of chronic pain patients, with the aim of increasing quality of life, acceptance of pain and functionality. Sixty-nine patients were assessed and randomly assigned to an ACT group or a control group (relaxation techniques) for eight 90-minute sessions. Results are currently being analysed, and follow-ups (6- and 12-month) are being completed. The low adherence and high attrition rates observed in the study, however, point toward future modified interventions. Such modifications may include web-based and smartphone interventions, which may be more readily taken up by chronic pain patients.

Keywords: chronic pain, ACT, internet-delivered, digitalised intervention, adherence, attrition

Procedia PDF Downloads 358
24572 The Effect of Measurement Distribution on System Identification and Detection of Behavior of Nonlinearities of Data

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

In this paper, we consider and apply parametric modeling to experimental data from dynamical systems. We investigate the different distributions of output measurements from several dynamical systems. Also, by applying variance processing to the experimental data, we obtain the region of nonlinearity in the data, and identification of the output section is then applied in different situations and for different data distributions. Finally, the effect of the spread of the measurements (e.g., the variance) on identification, and the limitations of this approach, are explained.
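The variance-processing step can be illustrated as follows: fit a global linear model, then scan the residual variance in sliding windows; windows whose variance clearly exceeds the typical (median) level mark the region where a linear description breaks down. This is a hedged sketch of the general idea on synthetic data, not the authors' procedure.

```python
def linear_fit(x, y):
    """Ordinary least squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def nonlinear_windows(x, y, width=10, factor=3.0, tol=1e-9):
    """Indices of sliding windows whose residual variance clearly
    exceeds the median window variance (tol guards against
    floating-point jitter in the flat regions)."""
    slope, intercept = linear_fit(x, y)
    resid = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    variances = []
    for start in range(len(x) - width + 1):
        w = resid[start:start + width]
        mean = sum(w) / width
        variances.append(sum((r - mean) ** 2 for r in w) / width)
    med = sorted(variances)[len(variances) // 2]
    return [i for i, v in enumerate(variances) if v > factor * med + tol]

# A record that is linear except for a localized bump in the middle.
x = [i * 0.1 for i in range(100)]
y = [2.0 * xi for xi in x]
for i in range(45, 55):
    y[i] += 5.0                              # localized nonlinear departure
flagged = nonlinear_windows(x, y)
# The first and last flagged windows straddle the edges of the bump.
print(min(flagged), max(flagged))
```

Once the nonlinear region is localized this way, identification can be run separately on the linear and nonlinear sections, which is the spirit of the section-wise identification described in the abstract.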

Keywords: Gaussian process, nonlinearity distribution, particle filter, system identification

Procedia PDF Downloads 502