Search results for: data standardization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25247

25157 The Impact of Different Extra-Linguistic and Intra-Linguistic Factors of Contemporary Albanian Technical Terminology

Authors: Gani Pllana, Sadete Pllana, Albulena Pllana Breznica

Abstract:

The history of the appearance and development of technical fields in our country sheds light on the relationships they have entered into with social factors, indicating what kinds of factors have prevailed in their appearance and development. At the end of the 19th century, for instance, a number of knowledge fields were shaped by political, cultural and linguistic factors that are inextricably linked to our nation's efforts to arouse national consciousness by raising the educational and cultural level of the people. Some sciences, through their fundamental special fields, were likely among the factors that would accomplish this objective. Other factors were the opening of schools and the drafting of relevant textbooks, whose purpose was to be achieved by means of the written language. The first fundamental knowledge fields, such as mathematics, linguistics and geography, were therefore embodied in them.

Keywords: Albanian language, development of terminology, standardization of terminology, technical fields

Procedia PDF Downloads 172
25156 Local Food Movements and Community Building in Turkey

Authors: Derya Nizam

Abstract:

An alternative understanding of "localization" has gained significance as the ecological and social issues associated with the growing pressure of agricultural homogeneity and standardization become more apparent. Through an analysis of a case study of an alternative food network in Turkey, this research seeks to critically examine the localization movement. The results indicate that the idea of localization helps to create new niche markets by creating place-based labels, but it also strengthens local identities through social networks that connect rural and urban areas. In that context, localization manifests as a commodification movement that appropriates local and cultural values to generate capitalist profit, as well as a grassroots movement that strengthens the resilience of local communities. This research addresses the potential of community development approaches in the democratization of global agro-food networks.

Keywords: community building, local food, alternative food movements, localization

Procedia PDF Downloads 79
25155 Comparative Germination Studies in Mature Seeds of Haloxylon salicornicum

Authors: Laila Almulla

Abstract:

As native plants are better adapted to the local environment, can endure long spells of drought, withstand high soil salinity and lend a more natural effect to landscape projects, their use in such projects is gaining popularity. Standardizing seed germination methods and raising hardened plants of selected native species for use in landscape projects will both conserve natural resources and produce sustainable greenery. In the present study, Haloxylon salicornicum, a perennial herb with potential use in urban greenery, was selected for seed germination tests, as there is an urgent need to mass-multiply it for large-scale use. Among the nine treatments tried with different concentrations of gibberellic acid (GA3) and dry heat, the seeds responded to treatment only when the wings were removed. Both the control and the 250 GA3 treatment produced the maximum germination of 86%.

Keywords: dormancy, gibberellic acid, germination trays, vigor index

Procedia PDF Downloads 400
25154 Mining Big Data in Telecommunications Industry: Challenges, Techniques, and Revenue Opportunity

Authors: Hoda A. Abdel Hafez

Abstract:

Mining big data represents a major challenge today. Much research is concerned with mining massive amounts of data and big data streams. Mining big data faces many challenges, including scalability, speed, heterogeneity, accuracy, provenance and privacy. In the telecommunication industry, mining big data is like mining for gold: it represents a big opportunity for maximizing revenue streams. This paper discusses the characteristics of big data (volume, variety, velocity and veracity), data mining techniques and tools for handling very large data sets, mining big data in telecommunication, and the benefits and opportunities gained from them.

Keywords: mining big data, big data, machine learning, telecommunication

Procedia PDF Downloads 409
25153 Standardization of the Roots of Gnidia stenophylla Gilg: A Potential Medicinal Plant of South Eastern Ethiopia Traditionally Used as an Antimalarial

Authors: Mebruka Mohammed, Daniel Bisrat, Asfaw Debella, Tarekegn Birhanu

Abstract:

The lack of quality control standards for medicinal plants and their preparations is considered a major barrier to their integration into effective primary health care in Ethiopia. Poor-quality herbal preparations have led to countless adverse reactions, extending to death. The exclusion of Ethiopian medicinal plants from the world's booming herbal market is another significant loss resulting from the absence of a herbal quality control system. Thus, in the present study, Gnidia stenophylla Gilg (a popular antimalarial plant of south eastern Ethiopia) is standardized and a full monograph is produced that can serve as a guideline for quality control of the crude drug. Morphologically, the roots are cylindrical and taper towards the end. They have a hard, corky and friable touch with a saddle brown color externally and are relatively smooth and pale brown internally, with a characteristic pungent odor and very bitter taste. Microscopically, they showed lignified xylem vessels, wide medullary rays with some calcium oxalate crystals, reddish brown secondary metabolite contents and slender, long fibres. Physicochemical standards were quantified as follows: foreign matter (5.25%), moisture content (6.69%), total ash (40.80%), acid-insoluble ash (8.00%), water-soluble ash (2.30%), alcohol-soluble extractive (15.27%), water-soluble extractive (10.98%), foaming index (100.01 ml/g), swelling index (7.60 ml/g). Phytochemically, phenols, flavonoids, steroids, tannins and saponins were detected in the root extract; TLC and HPLC fingerprints were produced, and an analytical marker was tentatively characterized as 3-(3,4-dihydro-3,5-dihydroxy-2-(4-hydroxy-5-methylhex-1-en-2-yl)-7-methoxy-4-oxo-2H-chromen-8-yl)-5-hydroxy-2-(4-hydroxyphenyl)-7-methoxy-4H-chromen-4-one. Residue-wise, pesticide (DDT, DDE, g-BHC) and radiochemical levels fall below the WHO limits, while heavy metals (Co, Ni, Cr, Pb and Cu), total aerobic count and fungal load lie well above them. In conclusion, the result can be taken as a signal that employing non-standardized medicinal plants could pose many health risks to the Ethiopian people and to Africans at large (as 80% of the continent's inhabitants depend on such plants for primary health care). Therefore, a more universal approach to herbal quality, adopting the WHO guidelines and developing monographs using the various quality parameters, is essential to minimize quality breaches and promote effective herbal drug usage.

Keywords: Gnidia stenophylla Gilg, standardization/monograph, pharmacognostic, residue/impurity, quality

Procedia PDF Downloads 289
25152 JavaScript Object Notation Data against eXtensible Markup Language Data in Software Applications: A Software Testing Approach

Authors: Theertha Chandroth

Abstract:

This paper presents a comparative study on how to check JSON (JavaScript Object Notation) data against XML (eXtensible Markup Language) data from a software testing point of view. JSON and XML are widely used data interchange formats, each with its own syntax and structure. The objective is to explore various techniques and methodologies for validating, comparing, and integrating JSON data with XML data and vice versa. By understanding the process of checking JSON data against XML data, testers, developers and data practitioners can ensure accurate data representation, seamless data interchange, and effective data validation.
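One way such a JSON-against-XML check can be sketched, assuming both documents encode the same simple record with no XML attributes or repeated tags, is to normalize the XML into a nested dictionary and compare it to the parsed JSON. The function names here are illustrative, not from the paper:

```python
import json
import xml.etree.ElementTree as ET

def xml_to_dict(element):
    """Recursively convert an XML element into a plain dict/str structure."""
    children = list(element)
    if not children:
        return element.text or ""
    return {child.tag: xml_to_dict(child) for child in children}

def json_equals_xml(json_text, xml_text):
    """Check that a JSON document and an XML document describe the same record."""
    json_data = json.loads(json_text)
    root = ET.fromstring(xml_text)
    return json_data == {root.tag: xml_to_dict(root)}

json_doc = '{"user": {"name": "Ada", "role": "tester"}}'
xml_doc = "<user><name>Ada</name><role>tester</role></user>"
print(json_equals_xml(json_doc, xml_doc))  # prints True for this equivalent pair
```

A fuller comparison would also need rules for XML attributes, repeated elements (arrays in JSON), and type coercion, since XML text is always a string.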

Keywords: XML, JSON, data comparison, integration testing, Python, SQL

Procedia PDF Downloads 140
25151 Using Machine Learning Techniques to Extract Useful Information from Dark Data

Authors: Nigar Hussain

Abstract:

Dark data is a subset of big data: data that organizations collect but fail to use for future decisions. There are many issues in existing work, and powerful tools are needed for utilizing dark data. Sufficient techniques are needed to deal with dark data, enabling users to exploit its excellence, adaptability, speed, low time consumption, execution, and accessibility. Another issue is how to utilize dark data to extract helpful information for making better choices. In this paper, we propose upgrade strategies to remove the dark side from dark data. Using a supervised model and machine learning techniques, we utilized dark data and achieved an F1 score of 89.48%.
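As a hedged sketch of the supervised approach described (not the paper's actual pipeline or dataset), a random forest can be trained on labelled records and scored with F1; the synthetic data below stands in for real dark data, and the paper's 89.48% figure comes from its own experiments:

```python
# Minimal supervised-learning sketch: random forest + F1 score (scikit-learn).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for labelled records extracted from dark data.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
score = f1_score(y_test, model.predict(X_test))
print(f"F1 score: {score:.4f}")
```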

Keywords: big data, dark data, machine learning, heatmap, random forest

Procedia PDF Downloads 28
25150 Process Optimization and Automation of Information Technology Services in a Heterogenic Digital Environment

Authors: Tasneem Halawani, Yamen Khateeb

Abstract:

With customers’ ever-increasing expectations for fast services provisioning for all their business needs, information technology (IT) organizations, as business partners, have to cope with this demanding environment and deliver their services in the most effective and efficient way. The purpose of this paper is to identify optimization and automation opportunities for the top requested IT services in a heterogenic digital environment and widely spread customer base. In collaboration with systems, processes, and subject matter experts (SMEs), the processes in scope were approached by analyzing four-year related historical data, identifying and surveying stakeholders, modeling the as-is processes, and studying systems integration/automation capabilities. This effort resulted in identifying several pain areas, including standardization, unnecessary customer and IT involvement, manual steps, systems integration, and performance measurement. These pain areas were addressed by standardizing the top five requested IT services, eliminating/automating 43 steps, and utilizing a single platform for end-to-end process execution. In conclusion, the optimization of IT service request processes in a heterogenic digital environment and widely spread customer base is challenging, yet achievable without compromising the service quality and customers’ added value. Further studies can focus on measuring the value of the eliminated/automated process steps to quantify the enhancement impact. Moreover, a similar approach can be utilized to optimize other IT service requests, with a focus on business criticality.

Keywords: automation, customer value, heterogenic, integration, IT services, optimization, processes

Procedia PDF Downloads 107
25149 Multi-Source Data Fusion for Urban Comprehensive Management

Authors: Bolin Hua

Abstract:

In city governance, various data are involved, including city component data, demographic data, housing data and all kinds of business data. These data reflect different aspects of people, events and activities. Data generated by various systems differ in form and in source because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data from different sources, collected in different ways, raise several issues that need to be resolved. Problems in data fusion include data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and its comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data and space data. Metadata must be available to be referenced and read when an application needs to access, manipulate and display the data. Uniform metadata management ensures the effectiveness and consistency of data in the process of data exchange, data modeling, data cleansing, data loading, data storing, data analysis, data search and data delivery.
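One fusion step mentioned above, joining records from two departmental sources on a shared identifier while handling duplicates, can be sketched as follows; the column names and data are invented for illustration, not taken from the paper:

```python
import pandas as pd

# Two hypothetical sector databases sharing a citizen identifier.
housing = pd.DataFrame({"citizen_id": [1, 2, 3], "district": ["N", "S", "N"]})
business = pd.DataFrame({"citizen_id": [2, 3, 3], "license": ["A", "B", "B"]})

# Drop exact duplicate rows from each source before fusing.
business = business.drop_duplicates()

# An outer merge keeps records present in only one source; the indicator
# column records each row's provenance for later reconciliation.
fused = housing.merge(business, on="citizen_id", how="outer", indicator=True)
print(fused)
```

The `_merge` column then shows which rows came from both sources and which need follow-up, one small piece of the synchronization and duplicate-comparison problems the abstract lists.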

Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data

Procedia PDF Downloads 393
25148 Reviewing Privacy Preserving Distributed Data Mining

Authors: Sajjad Baghernezhad, Saeideh Baghernezhad

Abstract:

Nowadays, given the ever-increasing growth of data, methods such as data mining for extracting knowledge are unavoidable. One issue in data mining is the inherent distribution of the data: the parties creating or receiving such data are usually corporate or private entities that do not give their information freely to others. Yet there is no guarantee that someone can mine specific data without intruding on the owner's privacy. Sending data and then gathering it, whether the data is partitioned vertically or horizontally, depends on the type of privacy preservation employed and is carried out to improve data privacy. In this study, we attempt a comprehensive comparison of privacy-preserving data mining methods; general methods such as data randomization and encoding are examined, along with the strengths and weaknesses of each.
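The "random data" (randomization) method mentioned above can be illustrated with a small sketch: each party perturbs its values with zero-mean noise before sharing, so individual records are masked while aggregate statistics remain approximately recoverable. The figures below are invented for illustration:

```python
import random
random.seed(7)

# Hypothetical private records (e.g. incomes in thousands) held by one party.
true_values = [52, 61, 48, 70, 55, 66, 59, 63] * 125

# Shared, masked version: add zero-mean Gaussian noise to each record.
noisy = [v + random.gauss(0, 10) for v in true_values]

# Individual values are now hidden, but the mean survives approximately.
true_mean = sum(true_values) / len(true_values)
noisy_mean = sum(noisy) / len(noisy)
print(round(true_mean, 2), round(noisy_mean, 2))
```

The trade-off the abstract's comparison turns on is visible here: more noise means stronger privacy but less accurate mining results.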

Keywords: data mining, distributed data mining, privacy protection, privacy preserving

Procedia PDF Downloads 525
25147 Views of the Self in Beast and Beauty K-Dramas: The South Korean Paradigm of Beauty

Authors: Patricia P. M. C. Lourenço

Abstract:

The South Korean entertainment industry has reversed the gender binary through Beast and Beauty Korean dramas that perpetuate unrealistic Korean beauty standards by depicting freckles, acne, pimples, excess weight, frizzy hair, glasses and braces as ugly and unattractive, and therefore in need of correction to fit society's pre-established beauty mould. This pursuit of physical beauty as a happiness goal only detracts from singularity in favour of mundaneness, sustaining the illusion that unsightly women need to undergo a physical transformation to improve their lives while handsome, wealthy men need not do anything more than altruistically accept them for who they really are inside. Five Beast and Beauty dramas were analysed for this paper. The assessment revealed a standardization and typecasting of Beast and Beauty roles in K-dramas, a reflection of South Korea's patriarchal society, where women and men are continuously expected to fulfil their pre-established gender binary roles and stereotypes.

Keywords: K-dramas, beauty, low self-esteem, plastic surgery, South Korean stereotypes

Procedia PDF Downloads 214
25146 The Right to Data Portability and Its Influence on the Development of Digital Services

Authors: Roman Bieda

Abstract:

The General Data Protection Regulation (GDPR) will come into force on 25 May 2018, creating a new legal framework for the protection of personal data in the European Union. Article 20 of the GDPR introduces a right to data portability. This right allows data subjects to receive the personal data which they have provided to a data controller in a structured, commonly used and machine-readable format, and to transmit these data to another data controller. The right to data portability, by facilitating the transfer of personal data between IT environments (e.g., applications), will also facilitate changing the provider of services (e.g., changing a bank or a cloud computing service provider). It will therefore contribute to the development of competition and the digital market. The aim of this paper is to discuss the right to data portability and its influence on the development of new digital services.

Keywords: data portability, digital market, GDPR, personal data

Procedia PDF Downloads 473
25145 Implications on the Training Program for Clinical Psychologists in South Korea

Authors: Chorom Baek, Sungwon Choi

Abstract:

The purpose of this study is to analyze the supervision systems and the training and continuing education of mental health professionals in the USA, the UK, Australia (New Zealand), Japan, and elsewhere, and to deduce the implications for the Korean mental health service system. To accomplish this, the following methodologies were adopted: a review of the related literature, statistical data, relevant manuals, online materials, and previous studies concerning issues in those countries over the past five years. The training program in Korea was compared with the others through this literature analysis. The findings were organized into areas such as training programs, continuing education, educational procedures, and curricula. Based on the analysis, discussion and implications, the conclusions and further suggestions of this study are as follows: First, the Korean Clinical Psychology Association (KCPA) should become a more powerful main training agency for quality control. Second, actual authority as a main training agency should be granted to training centers. Third, quality control of mental health professionals should proceed through the standardization and systemization of promotion and qualification management. Fourth, education and training on the work of supervisors and unified criteria for supervision should be provided. Fifth, the training program for the mental health license should be offered by graduate schools. Sixth, a legitimate system to protect the rights of mental health trainees is needed. Seventh, regular continuing education after licensure should be compulsory to keep the certification. Eighth, the training programs in training centers should meet KCPA requirements; if not, KCPA can cancel the certification of the centers.

Keywords: clinical psychology, Korea, mental health system, training program

Procedia PDF Downloads 226
25144 Cross-Sectional Study Investigating the Prevalence of Uncorrected Refractive Error and Visual Acuity through Mobile Vision Screening in the Homeless in Wales

Authors: Pakinee Pooprasert, Wanxin Wang, Tina Parmar, Dana Ahnood, Tafadzwa Young-Zvandasara, James Morgan

Abstract:

Homelessness has been shown to be correlated with poor health outcomes, including increased visual health morbidity. Despite this, there are relatively few studies regarding visual health in the homeless population, especially in the UK. This research aims to investigate visual disability and access barriers prevalent in the homeless population in Cardiff, South Wales. Data was collected from 100 homeless participants in three different shelters. Visual outcomes included near and distance visual acuity as well as non-cycloplegic refraction. Qualitative data was collected via a questionnaire and included socio-demographic profile, ocular history, subjective visual acuity and level of access to healthcare facilities. Based on the participants' presenting visual acuity, the total prevalence of myopia and hyperopia was 17.0% and 19.0% respectively, based on the spherical equivalent from the eye with the greatest absolute value. The prevalence of astigmatism was 8.0%. The mean absolute spherical equivalent was 0.841D and 0.853D for the right and left eye respectively. The proportion of participants with sight loss (defined as VA of 6/12-6/60 in the better-seeing eye) was 27.0%, in comparison to 0.89% and 1.1% in the general Cardiff and Wales populations respectively (p < 0.05). Additionally, 1.0% of the homeless subjects were registered blind (VA less than 3/60), in comparison to 0.17% for the national census after age standardization. Most participants had good knowledge regarding access to prescription glasses and eye examination services. Despite this, 85.0% had never had their eyes examined by a doctor and 73.0% had their last optometrist appointment more than 5 years ago. These findings suggest a significant disparity in ocular health, including visual acuity and refractive error, between the homeless and the general population. Further, the homeless were less likely to receive the same level of support and continued care in the community due to access barriers. These included a number of socio-economic factors, such as travel expenses and regional availability of services, as well as administrative shortcomings. In conclusion, this research demonstrates unmet visual health needs among the homeless, and inclusive policy changes may need to be implemented for better healthcare outcomes within this marginalized community.

Keywords: homelessness, refractive error, visual disability, Wales

Procedia PDF Downloads 172
25143 Recent Advances in Data Warehouse

Authors: Fahad Hanash Alzahrani

Abstract:

This paper describes some recent advances in the quickly developing area of data storage and processing based on data warehouses and data mining techniques, covering the software, hardware, data mining algorithms and visualisation techniques that share common features across the specific problems and tasks of their implementation.

Keywords: data warehouse, data mining, knowledge discovery in databases, on-line analytical processing

Procedia PDF Downloads 404
25142 How to Use Big Data in Logistics Issues

Authors: Mehmet Akif Aslan, Mehmet Simsek, Eyup Sensoy

Abstract:

Big Data stands for today's cutting-edge technology. As the technology becomes widespread, so does data. Utilizing massive data sets enables companies to gain competitive advantages over their adversaries. Among the many areas of Big Data usage, logistics plays a significant role in both the commercial sector and the military. This paper lays out what big data is and how it is used in both military and commercial logistics.

Keywords: big data, logistics, operational efficiency, risk management

Procedia PDF Downloads 641
25141 Statistical Modeling for Permeabilization of a Novel Yeast Isolate for β-Galactosidase Activity Using Organic Solvents

Authors: Shweta Kumari, Parmjit S. Panesar, Manab B. Bera

Abstract:

The hydrolysis of lactose using β-galactosidase is one of the most promising biotechnological applications, with a wide range of potential uses in the food processing industries. However, due to the intracellular location of the yeast enzyme and expensive extraction methods, the industrial application of enzymatic hydrolysis processes is being hampered. The use of permeabilization techniques can help to overcome the problems associated with enzyme extraction and purification from yeast cells and to develop an economically viable process for the utilization of whole-cell biocatalysts in the food industries. In the present investigation, standardization of the permeabilization process for a novel yeast isolate was carried out using a statistical modeling approach known as Response Surface Methodology (RSM) to achieve maximal β-galactosidase activity. The optimum operating conditions for the permeabilization process obtained by RSM were a 1:1 ratio of toluene (25%, v/v) and ethanol (50%, v/v), a temperature of 25.0 °C and a treatment time of 12 min, which gave an enzyme activity of 1.71 IU/mg DW.
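The core of the RSM approach named above is fitting a second-order (quadratic) model to responses measured at designed factor combinations and locating the predicted optimum. The design points and activities below are invented to show the mechanics; the paper's actual design, factors and responses differ:

```python
import numpy as np

# Hypothetical (temperature °C, time min) design points and measured activity.
X = np.array([[20, 8], [20, 16], [25, 12], [30, 8], [30, 16],
              [25, 8], [25, 16], [20, 12], [30, 12]])
y = np.array([1.1, 1.2, 1.7, 1.0, 1.1, 1.5, 1.5, 1.3, 1.2])

t, m = X[:, 0], X[:, 1]
# Design matrix for y = b0 + b1*t + b2*m + b3*t^2 + b4*m^2 + b5*t*m
A = np.column_stack([np.ones_like(t), t, m, t**2, m**2, t * m])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Evaluate the fitted surface on a grid to locate the predicted optimum.
tt, mm = np.meshgrid(np.linspace(20, 30, 101), np.linspace(8, 16, 81))
pred = (coef[0] + coef[1] * tt + coef[2] * mm
        + coef[3] * tt**2 + coef[4] * mm**2 + coef[5] * tt * mm)
i = np.unravel_index(np.argmax(pred), pred.shape)
print(f"predicted optimum near {tt[i]:.1f} °C, {mm[i]:.1f} min")
```

Dedicated RSM software additionally tests the significance of each coefficient and checks lack of fit before trusting the predicted optimum.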

Keywords: β-galactosidase, optimization, permeabilization, response surface methodology, yeast

Procedia PDF Downloads 254
25140 Implementation of an IoT Sensor Data Collection and Analysis Library

Authors: Jihyun Song, Kyeongjoo Kim, Minsoo Lee

Abstract:

Due to the development of information technology and wireless Internet technology, various data are being generated in many fields. These data are advantageous in that they provide real-time information to users. However, when the data are accumulated and analyzed, much more varied information can be extracted. In addition, the development and dissemination of boards such as the Arduino and Raspberry Pi have made it possible to easily test various sensors, and sensor data can be collected directly using database tools such as MySQL. These directly collected data can be used for various research purposes and are useful as input for data mining. However, there are many difficulties in using such boards to collect data, particularly when the user is not a computer programmer or is using them for the first time. Even if data are collected, a lack of expert knowledge or experience may cause difficulties in data analysis and visualization. In this paper, we aim to construct a library for sensor data collection and analysis to overcome these problems.
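A minimal sketch of the analysis side of such a library, assuming sensor readings are stored as (temperature, humidity) rows, might cluster them with k-means and DBSCAN, two of the algorithms named in the keywords; the synthetic readings below are illustrative:

```python
import numpy as np
from sklearn.cluster import DBSCAN, KMeans

rng = np.random.default_rng(0)
# Synthetic readings from two regimes (e.g. day vs. night conditions).
readings = np.vstack([
    rng.normal([22.0, 40.0], 0.5, size=(50, 2)),
    rng.normal([18.0, 60.0], 0.5, size=(50, 2)),
])

# k-means needs the number of clusters up front; DBSCAN infers it from
# density and labels sparse outliers as -1 (noise).
kmeans_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(readings)
dbscan_labels = DBSCAN(eps=1.5, min_samples=5).fit_predict(readings)

print(len(set(kmeans_labels)), len(set(dbscan_labels) - {-1}))
```

In a collection library, such a routine would sit behind a simple wrapper so that a non-programmer user never touches the algorithm parameters directly.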

Keywords: clustering, data mining, DBSCAN, k-means, k-medoids, sensor data

Procedia PDF Downloads 378
25139 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Organizations, including governments, generate (big) data that are high in volume, velocity, and veracity and come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing across the entire government to deliver (big) data related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services such as data storage and hosting to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We showcase a graphical view of actors, roles, and their relationships in the government (big) data ecosystem. We also discuss our research findings. We did not find many published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles. Therefore, we borrowed ideas for the government (big) data ecosystem from numerous adjacent areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.

Keywords: big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review

Procedia PDF Downloads 162
25138 Classification System for Soft Tissue Injuries of Face: Bringing Objectiveness to Injury Severity

Authors: Garg Ramneesh, Uppal Sanjeev, Mittal Rajinder, Shah Sheerin, Jain Vikas, Singla Bhupinder

Abstract:

Introduction: Despite advances in trauma care, a classification system for soft tissue injuries of the face still needs to be objectively defined. Aim: To develop a classification system for soft tissue injuries of the face that is objective, easy to remember, reproducible, universally applicable, aids in surgical management and helps to build structured data for future use. Material and Methods: This classification system covers patients who need surgical management of facial injuries. Associated underlying bony fractures have been intentionally excluded. Depending upon the severity of the soft tissue injury, injuries are graded from 0 to IV (O: abrasions, I: lacerations, II: avulsion injuries with no skin loss, III: avulsion injuries with skin loss that would need graft or flap cover, and IV: complex injuries). Anatomically, the face has been divided into three zones (Zones 1/2/3), as per aesthetic subunits. Zone 1e stands for injury of the eyebrows; Zones 2a/b/c stand for the nose, upper eyelid and lower eyelid respectively; Zones 3a/b/c stand for the upper lip, lower lip and cheek respectively. Suffixes R and L stand for the right or left involved side, B for the presence of a foreign body like glass or pellets, C for extensive contamination and D for depth, which can be graded as D1/2/3 if the depth reaches fat, muscle or bone respectively. I stands for damage to the facial nerve or parotid duct. Results and conclusions: This classification system is easy to remember, clinically applicable and would help in the standardization of surgical management of soft tissue injuries of the face. Certain inherent limitations of this classification system are its inability to classify sutured wounds, hematomas and injuries along or against Langer's lines.
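Because the scheme is built for structured data capture, its components can be encoded as a small helper that assembles an injury code; the grades, zones and suffixes follow the abstract, but the concatenated code format itself is an illustrative assumption:

```python
# Severity grades from the abstract: O abrasion, I laceration, II avulsion
# without skin loss, III avulsion with skin loss, IV complex injury.
GRADES = {0: "O", 1: "I", 2: "II", 3: "III", 4: "IV"}

def injury_code(grade, zone, side=None, depth=None, flags=()):
    """Compose a hypothetical code such as 'III Zone 3cL D2 C'."""
    parts = [GRADES[grade], f"Zone {zone}"]
    if side:
        parts[-1] += side          # R or L for the involved side
    if depth:
        parts.append(f"D{depth}")  # 1 fat, 2 muscle, 3 bone
    parts.extend(flags)            # B foreign body, C contamination, I nerve/duct
    return " ".join(parts)

# Avulsion with skin loss of the left cheek, muscle-deep, contaminated.
print(injury_code(3, "3c", side="L", depth=2, flags=("C",)))
```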

Keywords: soft tissue injuries, face, avulsion, classification

Procedia PDF Downloads 383
25137 Hierarchical Control Structure to Control the Power Distribution System Components in Building Systems

Authors: Hamed Sarbazy, Zohre Gholipour Haftkhani, Ali Safari, Pejman Hosseiniun

Abstract:

Scientific and industrial progress in the past two decades has resulted in energy distribution systems based on power electronics, which can be considered an enabling technology in various industries and in building management systems. Grading and standardizing power electronics modules and using them in a distributed control system is a strategy for overcoming the limitations of such systems. The purpose of this paper is to investigate strategies for the scheduling and control structure of standardized power electronics modules. The paper introduces classical control methods and discusses their disadvantages, then explains hierarchical control as a mechanism for the distributed control structure of the classified modules. The different levels of control and the communication between these levels are fully introduced. The standardization of software for the distribution system control structure is also discussed. Finally, as an example, the control structure is presented for a DC distribution system.

Keywords: application management, hardware management, power electronics, building blocks

Procedia PDF Downloads 521
25136 Government Big Data Ecosystem: A Systematic Literature Review

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Data that are high in volume, velocity, and veracity and come from a variety of sources are generated in all sectors, including the government sector. Globally, public administrations are pursuing (big) data as a new technology and trying to adopt a data-centric architecture for hosting and sharing data. Properly executed, big data and data analytics in the government (big) data ecosystem can lead to data-driven government and have a direct impact on the way policymakers work and citizens interact with governments. In this research paper, we conduct a systematic literature review. The main aims of this paper are to highlight essential aspects of the government (big) data ecosystem and to explore the most critical socio-technical factors that contribute to its successful implementation. The essential aspects of the government (big) data ecosystem include its definition, data types, data lifecycle models, and actors and their roles. We also discuss the potential impact of (big) data in public administration and gaps in the government data ecosystems literature. As this is a new topic, we did not find specific articles on the government (big) data ecosystem and therefore focused our research on various relevant areas such as humanitarian data, open government data, scientific research data, and industry data.

Keywords: applications of big data, big data, big data types, big data ecosystem, critical success factors, data-driven government, e-government, gaps in data ecosystems, government (big) data, literature review, public administration, systematic review

Procedia PDF Downloads 228
25135 A Machine Learning Decision Support Framework for Industrial Engineering Purposes

Authors: Anli Du Preez, James Bekker

Abstract:

Data is currently one of the most critical and influential emerging technologies. However, the true potential of data is yet to be exploited, since currently only about 1% of generated data is ever actually analyzed for value creation. There is a data gap: data goes unexplored due to the lack of data analytics infrastructure and of the required data analytics skills. This study developed a decision support framework for data analytics by following Jabareen’s framework development methodology. The study focused on machine learning algorithms, a subset of data analytics. The developed framework is designed to assist data analysts with little experience in choosing the appropriate machine learning algorithm given the purpose of their application.
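A framework of this kind ultimately maps a described analysis purpose to a candidate algorithm family. The toy rule below illustrates that idea; the purposes, categories, and suggested families are illustrative assumptions, not the rules of the framework developed in the study.

```python
# A toy decision rule in the spirit of a machine learning decision support
# framework: map a stated analysis purpose to a candidate algorithm family.
# The categories and suggestions are illustrative assumptions only.

def suggest_algorithm(labeled: bool, target: str) -> str:
    """Return a candidate algorithm family for a problem description.

    labeled: whether the training data carries known outcomes
    target:  'category' (discrete outcome) or 'quantity' (numeric outcome)
    """
    if not labeled:
        # Without labels, only unsupervised structure-finding applies.
        return "clustering (e.g. k-means)"
    if target == "category":
        return "classification (e.g. decision trees, logistic regression)"
    if target == "quantity":
        return "regression (e.g. linear regression, gradient boosting)"
    raise ValueError(f"unknown target type: {target!r}")

print(suggest_algorithm(labeled=True, target="quantity"))
```

A real framework would of course weigh many more factors (data volume, interpretability needs, analyst skill), but the lookup structure stays the same.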

Keywords: data analytics, industrial engineering, machine learning, value creation

Procedia PDF Downloads 168
25134 Status of Hazardous Waste Generation and Its Impacts on Environment and Human Health: A Study in West Bengal

Authors: Sk Ajim Ali

Abstract:

The present study is an attempt to give an overview of the major environmental and health impacts of hazardous waste generation and poor waste management. In the present scenario, not only hazardous waste but ‘waste’ as a general term is one of the most pressing environmental issues. An ever-increasing population, industrialization, and the standardization of human lifestyles pile up extra waste generation, which is directly or indirectly related to hazardous waste generation. Urbanization and population growth are largely responsible for the establishment of industrial sectors that generate various hazardous wastes (HW), and the concomitant poor management practices have adverse effects on the environment and human health. Compared to other Indian states, West Bengal is far from the lowest in HW generation: it ranks 7th, after Maharashtra, Gujarat, Tamil Nadu, U.P., Punjab, and Andhra Pradesh. During the last 30 years, the industrial sector in West Bengal has quadrupled in size. In 1995 there were only 440 HW-generating units in West Bengal, producing 129,826 MTA of hazardous waste, but by 2011 this had risen to 609 units producing about 259,777 MTA. Notably, over this interval the number of waste-generating units increased by only 169, while HW generation increased by about 129,951 MTA. Major chemical industries are the main sources of HW and the main causes of adverse effects on the environment and human health. HW from industrial sectors contains heavy metals, cyanides, pesticides, complex aromatic compounds (e.g., PCBs), and other chemicals that are toxic, flammable, reactive, corrosive, or explosive, and that strongly affect the surrounding environment and human health in and around the disposal sites. The main objective of the present study is to highlight the sources and components of hazardous waste in West Bengal and the impacts of improper HW management on health and the environment.
This study is carried out based on secondary sources of data and a qualitative method of research. The secondary data have been collected from the annual reports of the WBPCB, WHO reports, research papers, articles, books, and so on. It has been found that excessive HW generation from various sources and communities poses serious health hazards that lead to the spread of infectious disease and environmental change.

Keywords: environmental impacts, existing HW generation and management practice, hazardous waste (HW), health impacts, recommendation and planning

Procedia PDF Downloads 284
25133 Stage-Gate Framework Application for Innovation Assessment among Small and Medium-Sized Enterprises

Authors: Indre Brazauskaite, Vilte Auruskeviciene

Abstract:

The paper explores the application of the Stage-Gate framework to innovation maturity among small and medium-sized enterprises (SMEs). Innovation management is becoming an essential business survival process for organizations of all sizes, one that can be evaluated and audited systematically. This research systematically defines and assesses the innovation process from the perspective of the company’s top management. Empirical research explores attitudes towards, and existing practices of, innovation management in SMEs in the Baltic countries. It structurally investigates current innovation management practices, the level of standardization, and potential challenges in the area. The findings allow existing practices to be structured according to an institutionalized model and contribute to a more advanced understanding of the innovation process among SMEs. Practically, the findings support more advanced decision-making and business planning in the process.

Keywords: innovation measure, innovation process, SMEs, stage-gate framework

Procedia PDF Downloads 98
25132 Providing Security to Private Cloud Using Advanced Encryption Standard Algorithm

Authors: Annapureddy Srikant Reddy, Atthanti Mahendra, Samala Chinni Krishna, N. Neelima

Abstract:

In our present world we are generating a lot of data, and we need specific devices to store it all. Generally, we store data on pen drives, hard drives, etc., and sometimes we may lose the data due to the corruption of these devices. To overcome these issues, we implemented a cloud space for storing the data, which provides more security for it; the data can be accessed from anywhere in the world using just the internet. We implemented all of this in Java using the NetBeans IDE. Once a user uploads the data, he does not have any rights to change it. Users’ uploaded files are stored in the cloud with the file name set to the system time, and the directory is created with some random words. The cloud accepts the data only if the size of the file is less than 2 MB.
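The upload rules the abstract describes (a system-time file name, a random-word directory, and a 2 MB cap) can be sketched as follows. The word list, path layout, and helper name are illustrative assumptions; the AES encryption step itself, and the Java/FTP plumbing, are omitted.

```python
# Sketch of the storage naming scheme described in the abstract: the stored
# file name is taken from the system time, the directory name from random
# words, and uploads above 2 MB are rejected. The word list and layout are
# illustrative assumptions, not the authors' implementation.
import secrets
import time

MAX_SIZE = 2 * 1024 * 1024  # 2 MB upload limit
WORDS = ["amber", "basalt", "cedar", "dune", "ember", "flint"]

def storage_path(data: bytes) -> str:
    """Return the cloud-side path for an upload, or raise if it is too large."""
    if len(data) > MAX_SIZE:
        raise ValueError("file larger than 2 MB rejected")
    directory = "-".join(secrets.choice(WORDS) for _ in range(3))
    filename = str(int(time.time() * 1000))  # system time as the file name
    return f"{directory}/{filename}"

path = storage_path(b"hello cloud")
```

Using `secrets` rather than `random` for the directory name matters here: the randomized path is part of the access-control story, so it should be unpredictable.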

Keywords: cloud space, AES, FTP, NetBeans IDE

Procedia PDF Downloads 206
25131 Circular Bio-economy of Copper and Gold from Electronic Wastes

Authors: Sadia Ilyas, Hyunjung Kim, Rajiv R. Srivastava

Abstract:

The current work attempts to establish the linkages between the circular bio-economy and the recycling of copper and gold from the urban mine by applying microbial activities instead of smelting and chemical technologies. Based on the potential of microbial approaches and the research hypothesis, the structural model has been tested at a significance level of 99%, supported by the corresponding standardized coefficient values. A prediction model applied to determine the impact of recycling on the circular bio-economy indicates that 51,833 tons of copper and 58 tons of gold could be re-circulated by 2030 for the production of virgin metals/raw materials, assuming the recycling rate of the accumulated e-waste remains 20%. This restoration of copper and gold through microbial activities corresponds to mitigating 174 million kg of CO₂ emissions and 24 million m³ of water consumption compared with primary production activities. The study potentially opens a new window for the environmentally friendly biotechnological recycling of the e-waste urban mine under the umbrella concept of the circular bio-economy.

Keywords: urban mining, bioleaching, circular bio-economy, environmental impact

Procedia PDF Downloads 157
25130 Standardization of Miniature Neutron Research Reactor and Occupational Safety Analysis

Authors: Raymond Limen Njinga

Abstract:

The comparator factors (Fc) for miniature research reactors are of great importance in the field of nuclear physics, as they provide an accurate basis for the evaluation of elements in all forms of samples via the k0-NAA technique. The Fc was initially simulated theoretically; thereafter, a series of experiments was performed to validate the results. The experimental values were obtained using an Au(0.1%)–Al alloy monitor foil and a neutron flux setting of 5.00E+11 cm⁻²·s⁻¹. For the inner irradiation position, an average experimental value of 7.120E+05 was reported against the theoretical value of 7.330E+05, a deviation of 2.86% from the theoretical value. For the outer irradiation position, an experimental value of 1.170E+06 was recorded against the theoretical value of 1.210E+06, a deviation of 3.31% from the theoretical value. The equivalent dose rates at 5 m from a neutron flux of 5.00E+11 cm⁻²·s⁻¹, for neutron energies of 1 keV, 10 keV, 100 keV, 500 keV, 1 MeV, 5 MeV and 10 MeV, were calculated to be 0.01 Sv/h, 0.01 Sv/h, 0.03 Sv/h, 0.15 Sv/h, 0.21 Sv/h and 0.25 Sv/h respectively, giving a total dose over a period of one hour of 0.66 Sv.
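The reported deviations of the experimental Fc values from the theoretical ones can be reproduced directly from the figures quoted in the abstract:

```python
# Reproduce the reported percentage deviations of the experimental comparator
# factors (Fc) from their theoretical values, using the figures in the text.

def percent_deviation(theoretical: float, experimental: float) -> float:
    """Deviation of the experimental value from the theoretical one, in %."""
    return 100.0 * (theoretical - experimental) / theoretical

inner = percent_deviation(7.330e5, 7.120e5)   # inner irradiation position
outer = percent_deviation(1.210e6, 1.170e6)   # outer irradiation position
print(round(inner, 2), round(outer, 2))       # matches the quoted 2.86 / 3.31
```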

Keywords: neutron flux, comparator factor, NAA techniques, neutron energy, equivalent dose

Procedia PDF Downloads 182
25129 Business Intelligence for Profiling of Telecommunication Customer

Authors: Rokhmatul Insani, Hira Laksmiwati Soemitro

Abstract:

Business intelligence is a methodology that exploits data to produce information and knowledge systematically; it can thereby support the decision-making process. Among the methods in business intelligence are the data warehouse and data mining. A data warehouse can store historical data derived from transactional data. For data modelling in the data warehouse, we apply Kimball’s dimensional modelling. Data mining is used to extract patterns from the data and gain insight from it. Data mining has many techniques, one of which is segmentation. For the profiling of telecommunication customers, we segment customers according to their usage of services, their invoices, and their payments. Customers can thus be grouped according to their characteristics, and the profitable customers can be identified. We apply the K-Means clustering algorithm for segmentation, with the RFM (Recency, Frequency and Monetary) model as its input variables. For the whole data mining process, we use IBM SPSS Modeler.
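The segmentation step can be sketched as k-means over per-customer RFM-style vectors. The sample customers, the feature choice, the value of k, and the plain-Python implementation are illustrative assumptions; the study itself performs this step in IBM SPSS Modeler.

```python
# A minimal k-means (Lloyd's algorithm) segmentation over RFM-style vectors,
# in the spirit of the pipeline the abstract describes. The customers and
# features below are toy data, not the study's data set.
import math
import random

def kmeans(points, k, iters=100, seed=0):
    """Return a cluster label for each point using Lloyd's algorithm."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # initialize from the data
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest center by Euclidean distance.
        labels = [min(range(k), key=lambda j: math.dist(p, centers[j]))
                  for p in points]
        # Update step: move each center to the mean of its members.
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = tuple(sum(x) / len(members)
                                   for x in zip(*members))
    return labels

# Toy RFM-style vectors: (days since last use, calls per month, monthly bill).
customers = [(5, 120, 90), (7, 100, 80), (60, 10, 15),
             (55, 8, 12), (30, 40, 40), (28, 45, 38)]
labels = kmeans(customers, k=3)
```

In practice the RFM features would be scaled before clustering, since k-means is sensitive to the units of each dimension.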

Keywords: business intelligence, customer segmentation, data warehouse, data mining

Procedia PDF Downloads 483
25128 Implementation of a Web-Based Clinical Outcomes Monitoring and Reporting Platform across the Fortis Network

Authors: Narottam Puri, Bishnu Panigrahi, Narayan Pendse

Abstract:

Background: Clinical outcomes are the globally agreed-upon, evidence-based, measurable changes in health or quality of life resulting from patient care. Reporting of outcomes and their continuous monitoring provides an opportunity for both assessing and improving the quality of patient care. In 2012, the International Consortium for Health Outcomes Measurement (ICHOM) was founded, which has defined global Standard Sets for measuring the outcomes of various treatments. Method: Monitoring of clinical outcomes was identified as a pillar of Fortis’ core value of Patient Centricity. The project was started as an in-house Clinical Outcomes Reporting Portal developed by the Fortis Medical IT team, using the standard sets of outcome measurement developed by ICHOM. A pilot was run at Fortis Escorts Heart Institute from Aug ’13 to Dec ’13. Starting Jan ’14, it was implemented across 11 hospitals of the group. The scope was hospital-wide and covered the major clinical specialties: cardiac sciences, and orthopedics and joint replacement. The internally developed portal had its limitations in report generation, and the capture of patient-related outcomes was restricted. A year later, the company provisioned an ICHOM-certified software product that could provide a platform for data capture and reporting and ensure compliance with all ICHOM requirements. A year after the launch of the software, Fortis Healthcare became the first healthcare provider in Asia to publish clinical outcomes data for the Coronary Artery Disease Standard Set (comprising Coronary Artery Bypass Grafts and Percutaneous Coronary Interventions) in the public domain (Jan 2016). Results: This project has helped firmly establish a culture of monitoring and reporting clinical outcomes across Fortis hospitals.
Given the diverse nature of the healthcare delivery model at the Fortis network, which comprises hospitals of varying size and specialty mix and practically covers the entire span of the country, the standardization of the data collection and reporting methodology is a huge achievement in itself. 95% case reporting was achieved, with more than 90% data completion, at the end of Phase 1 (March 2016). Post implementation, the group now has one year of data from its own hospitals. This has helped identify gaps, plan ways to bridge them, and establish internal benchmarks for continual improvement. Beyond this, the value created for the group includes: 1. The entire Fortis community has been sensitized to the importance of clinical outcomes monitoring for patient-centric care; initial skepticism and cynicism have been countered by effective stakeholder engagement and automation of processes. 2. Measuring quality is the first step in improving quality; data analysis has helped compare clinical results with best-in-class hospitals and identify improvement opportunities. 3. The clinical fraternity is extremely pleased to be part of this initiative and has taken ownership of the project. Conclusion: Fortis Healthcare is a pioneer in the monitoring of clinical outcomes. Implementation of the ICHOM standards has helped the Fortis Clinical Excellence Program improve patient engagement and strengthen its commitment to its core value of Patient Centricity. Validation and certification of the clinical outcomes data by an ICHOM-certified supplier adds confidence to its claim of being a leader in this space.

Keywords: clinical outcomes, healthcare delivery, patient centricity, ICHOM

Procedia PDF Downloads 236