Search results for: data quality management
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11210

10610 Intellectual Capital and Competitive Advantage: An Analysis of the Biotechnology Industry

Authors: Campisi Domenico, Costa Roberta

Abstract:

Intellectual Capital measurement is a central aspect of knowledge management. The measurement and evaluation of intangible assets play a key role in enabling effective management of these assets as sources of competitiveness. For these reasons, managers and practitioners need conceptual and analytical tools that take into account the unique characteristics and economic significance of Intellectual Capital. Following this lead, we propose an efficiency and productivity analysis of Intellectual Capital as a determinant of company competitive advantage. The analysis is carried out by means of Data Envelopment Analysis (DEA) and the Malmquist Productivity Index (MPI). These techniques identify best-practice companies that have achieved competitive advantage by implementing successful Intellectual Capital management strategies, and offer inefficient companies development paths through benchmarking. The proposed methodology is applied to the biotechnology industry over the period 2007-2010.
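
The abstract does not spell out the DEA formulation used; as an illustration only, the sketch below solves a standard input-oriented CCR envelopment model with scipy, on hypothetical intellectual-capital inputs and outputs for five firms. The Malmquist index step, which compares such efficiency scores across periods, is omitted.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical intellectual-capital inputs (e.g., R&D spend, staff) and
# outputs (e.g., revenue, patents) for 5 biotech firms (rows = firms).
X = np.array([[120, 40], [300, 90], [80, 25], [150, 60], [220, 70]], float)  # inputs
Y = np.array([[55, 10], [140, 30], [30, 6], [70, 18], [95, 22]], float)      # outputs

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit `o` (envelopment form)."""
    n, m = X.shape          # n units, m inputs
    s = Y.shape[1]          # s outputs
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]                 # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):      # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.r_[-X[o, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(s):      # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(len(X)):
    print(f"firm {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```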

Keywords: Data Envelopment Analysis, Innovation, Intangible assets, Intellectual Capital, Malmquist Productivity Index.

10609 Analysis on the Feasibility of Landsat 8 Imagery for Water Quality Parameters Assessment in an Oligotrophic Mediterranean Lake

Authors: V. Markogianni, D. Kalivas, G. Petropoulos, E. Dimitriou

Abstract:

Lake water quality monitoring in combination with the use of earth observation products constitutes a major component of many water quality monitoring programs. Landsat 8 images of Trichonis Lake (Greece) acquired on 30/10/2013 and 30/08/2014 were used to explore the ability of Landsat 8 to estimate water quality parameters, particularly CDOM absorption at specific wavelengths, chlorophyll-a and nutrient concentrations, in this oligotrophic freshwater body, which is characterized by negligible quantitative, temporal and spatial variability. Water samples were collected at 22 different stations in late August 2014, and the satellite image of the same date was used to statistically correlate the in-situ measurements with various combinations of Landsat 8 bands in order to develop algorithms that best describe those relationships and accurately calculate the aforementioned water quality components. The optimal models were applied to the image of late October 2013, and the results were validated by comparison with the respective available in-situ data of 2013. Initial results indicated the limited ability of the Landsat 8 sensor to accurately estimate water quality components in an oligotrophic waterbody. The validation process showed that ammonium concentration was the most accurately estimated component (R = 0.7), followed by chl-a concentration (R = 0.5) and CDOM absorption at 420 nm (R = 0.3). In-situ nitrate, nitrite, phosphate and total nitrogen concentrations of 2014 were below the detection limit of the instrument used, so no statistical elaboration was conducted for them. Multiple linear regression between reflectance measures and total phosphorus concentrations resulted in low and statistically insignificant correlations. Our results are concurrent with other studies in the international literature, indicating that estimations for eutrophic and mesotrophic lakes are more accurate than for oligotrophic ones, owing to the lack of suspended particles detectable by satellite sensors. Nevertheless, although the predictive models developed and applied to the oligotrophic Trichonis Lake are less accurate, they may still be useful indicators of its water quality deterioration.
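
As a rough illustration of the band-combination screening described above, the sketch below fits a linear model between a hypothetical green/NIR band ratio and synthetic in-situ chlorophyll-a values; the reflectance data, band choice and noise level are all assumptions, not values from the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical Landsat 8 surface reflectance at the 22 stations (bands B2-B5).
rng = np.random.default_rng(0)
refl = rng.uniform(0.01, 0.15, size=(22, 4))                      # columns: B2, B3, B4, B5
chl_a = 2.0 + 30 * (refl[:, 1] / refl[:, 3]) + rng.normal(0, 0.5, 22)  # synthetic in-situ chl-a

# Candidate predictor: a simple band ratio (green / NIR); other combinations
# (differences, log-ratios) would be screened the same way.
ratio = (refl[:, 1] / refl[:, 3]).reshape(-1, 1)
model = LinearRegression().fit(ratio, chl_a)
r = np.corrcoef(model.predict(ratio), chl_a)[0, 1]
print(f"slope={model.coef_[0]:.2f}, intercept={model.intercept_:.2f}, R={r:.2f}")
```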

Keywords: Landsat 8, oligotrophic lake, remote sensing, water quality.

10608 Ballast Water Management Triad: Administration, Ship Owner and the Seafarer

Authors: Rajoo Balaji, Omar Yaakob

Abstract:

The Ballast Water Convention requires less than 5% more of the world tonnage for its ratification to enter into force. Consequently, ships will have to comply with its requirements, and compliance evaluation and enforcement will become mandatory. Ship owners will have to invest in treatment systems, and shipboard personnel will have to operate them and ensure compliance. Monitoring and enforcement will be the responsibilities of the Administrations. Herein, the current status of ballast water management and the issues faced by these three parties are reviewed. The issues range from the efficacy and economics of treatment systems to sampling and testing. Health hazards of chemical systems and the paucity of data for decision support are further concerns. It is emphasized that the management of ballast water must be extended ashore and that sustainable solutions must be researched. An exemplar treatment system based on the ship's waste heat is also suggested.

Keywords: Ballast water management, Compliance evaluation, Compliance enforcement, Sustainability.

10607 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection

Authors: Hamidullah Binol, Abdullah Bal

Abstract:

Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for assessing the safety of goods. Hyperspectral Imaging (HSI) is an attractive tool for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection of apples, quality analysis and grading of citrus fruits, bruise detection of strawberries, visualization of sugar distribution in melons, measuring the ripening of tomatoes, defect detection of pickling cucumbers, and classification of wheat kernels. HSI can be used to concurrently collect large amounts of spatial and spectral data on the objects being observed. The technique yields exceptional detection capabilities that cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for detecting fat content in ground meat using HSI. The KFKT, the nonlinear version of the FKT, is one of the most effective techniques for solving two-pattern problems. The conventional FKT method has been improved with kernel machines to increase its nonlinear discrimination ability and to capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of ground meat by treating fat as the target class to be separated from the remaining classes (the clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine the fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for the fat ratio in ground meat.
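
The paper applies the kernelised FKT; the sketch below illustrates only the linear Fukunaga-Koontz transform on synthetic two-class "spectra", to show how the shared eigenvectors split target-dominant and clutter-dominant directions. The data, band count and the 0.5 eigenvalue threshold are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "spectra": rows are pixels, columns are 20 spectral bands.
# The target (fat) class varies strongly in the first 5 bands only.
target = rng.normal(0.0, 0.2, size=(200, 20))
target[:, :5] += rng.normal(0.0, 2.0, size=(200, 5))
clutter = rng.normal(0.0, 1.0, size=(200, 20))

def fkt(a, b, eps=1e-9):
    """Linear Fukunaga-Koontz transform of classes a (target) and b (clutter).
    Returns the transform W and eigenvalues lam in [0, 1]: directions with
    lam close to 1 carry target energy, lam close to 0 carry clutter energy."""
    S1, S2 = np.cov(a, rowvar=False), np.cov(b, rowvar=False)
    vals, vecs = np.linalg.eigh(S1 + S2)              # whiten the summed scatter
    P = vecs @ np.diag(1.0 / np.sqrt(vals + eps))
    lam, U = np.linalg.eigh(P.T @ S1 @ P)             # shared eigenvectors
    return P @ U, lam

W, lam = fkt(target, clutter)
target_subspace = W[:, lam > 0.5]                     # target-dominant directions
score = lambda x: np.sum((x @ target_subspace) ** 2, axis=1)
print("mean target score :", score(target).mean())
print("mean clutter score:", score(clutter).mean())
```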

Keywords: Food (Ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods.

10606 Comparison of Compression Ability Using DCT and Fractal Technique on Different Imaging Modalities

Authors: Sumathi Poobal, G. Ravindran

Abstract:

Image compression is one of the most important applications of digital image processing. Advanced medical imaging requires the storage of large quantities of digitized clinical data. Due to constrained bandwidth and storage capacity, however, a medical image must be compressed before transmission and storage. There are two types of compression methods, lossless and lossy. In lossless compression the original image is retrieved without any distortion; in lossy compression the reconstructed image contains some distortion. The Discrete Cosine Transform (DCT) and Fractal Image Compression (FIC) are lossy compression methods. This work shows that lossy compression methods can be chosen for medical image compression without significant degradation of image quality. DCT and fractal compression using Partitioned Iterated Function Systems (PIFS) are applied to different imaging modalities: CT scan, ultrasound, angiogram, X-ray and mammogram. Approximately 20 images are considered in each modality, and the average values of compression ratio and Peak Signal to Noise Ratio (PSNR) are computed and studied. The quality of the reconstructed image is assessed by the PSNR values. Based on the results, it can be concluded that DCT gives higher PSNR values while FIC gives higher compression ratios. Hence, in medical image compression, DCT can be used wherever picture quality is preferred, and FIC wherever compression for storage and transmission is the priority, without losing diagnostic picture quality.
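
For readers unfamiliar with the two quantities compared in the study, the sketch below computes PSNR and runs a crude 8x8 block-DCT quantisation round trip on a synthetic image; the quantisation step and the image are placeholders, and the fractal (PIFS) coder is not reproduced here.

```python
import numpy as np
from scipy.fft import dctn, idctn

def psnr(original, reconstructed, peak=255.0):
    """Peak Signal to Noise Ratio in dB between two images of equal size."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

def block_dct_roundtrip(img, q=25.0, block=8):
    """Crude lossy DCT compression: transform 8x8 blocks, uniformly quantise,
    then reconstruct. Real codecs use perceptual quantisation tables."""
    h, w = (np.array(img.shape) // block) * block
    out = np.empty((h, w))
    for i in range(0, h, block):
        for j in range(0, w, block):
            coeffs = dctn(img[i:i+block, j:j+block].astype(float), norm="ortho")
            coeffs = np.round(coeffs / q) * q               # quantisation = the lossy step
            out[i:i+block, j:j+block] = idctn(coeffs, norm="ortho")
    return np.clip(out, 0, 255)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64)).astype(float)     # stand-in for a CT slice
rec = block_dct_roundtrip(img)                               # size is a block multiple
print(f"PSNR = {psnr(img, rec):.1f} dB")
```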

Keywords: DCT, FIC, PIFS, PSNR.

10605 Real Time Approach for Data Placement in Wireless Sensor Networks

Authors: Sanjeev Gupta, Mayank Dave

Abstract:

The issue of real-time and reliable report delivery is extremely important for taking effective decisions in real-world, mission-critical Wireless Sensor Network (WSN) based applications. Sensor data behaves differently in many ways from data in traditional databases, and WSNs need a mechanism to register, process queries, and disseminate data. In this paper we propose an architectural framework for data placement and management. We propose a reliable and real-time approach for data placement and for achieving data integrity using self-organized sensor clusters. Instead of storing information in individual cluster heads, as suggested in some protocols, our architecture stores the information of all clusters within a cell in the corresponding base station. For data dissemination and action in the wireless sensor network we propose the use of Action and Relay Stations (ARS). To reduce the average energy dissipation of sensor nodes, data is sent to the nearest ARS rather than to the base station. We have designed our architecture in such a way as to achieve greater energy savings, enhanced availability and reliability.
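
The abstract does not give an energy model; purely as an illustration of why routing reports to the nearest ARS can save energy, the sketch below uses a generic first-order radio model with assumed constants, positions and report size.

```python
import math

# First-order radio model (illustrative constants, not from the paper):
E_ELEC = 50e-9      # J/bit for transmitter electronics
E_AMP  = 100e-12    # J/bit/m^2 amplifier energy (free-space loss)

def tx_energy(bits, distance):
    """Energy to transmit `bits` over `distance` metres under the assumed model."""
    return bits * (E_ELEC + E_AMP * distance ** 2)

def nearest_station(node, stations):
    return min(stations, key=lambda s: math.dist(node, s["pos"]))

base_station = {"name": "BS",   "pos": (0.0, 0.0)}
ars_nodes    = [{"name": "ARS1", "pos": (30.0, 40.0)},
                {"name": "ARS2", "pos": (80.0, 10.0)}]

node, report_bits = (35.0, 45.0), 4000
target = nearest_station(node, ars_nodes + [base_station])
d_ars  = math.dist(node, target["pos"])
d_bs   = math.dist(node, base_station["pos"])
print(f"send to {target['name']}: {tx_energy(report_bits, d_ars):.2e} J "
      f"vs base station: {tx_energy(report_bits, d_bs):.2e} J")
```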

Keywords: Cluster head, data reliability, real time communication, wireless sensor networks.

10604 Machine Scoring Model Using Data Mining Techniques

Authors: Wimalin S. Laosiritaworn, Pongsak Holimchayachotikul

Abstract:

This article proposes a methodology for computer numerical control (CNC) machine scoring. The case study company is a manufacturer of hard disk drive parts in Thailand. In this company, samples of parts manufactured on CNC machines are taken randomly for quality inspection. These inspection data are used to decide whether to shut down a machine if it shows a tendency to produce parts that are out of specification. A large amount of data is produced in this process, and data mining can be a very useful technique for analyzing it. In this research, data mining techniques were used to construct a machine scoring model called the 'machine priority assessment model (MPAM)'. The model helps to ensure that machines with a higher risk of producing defective parts are inspected before those with a lower risk. If defect-prone machines are identified sooner, defective parts and rework can be reduced, improving overall productivity. The results showed that the proposed method can be successfully implemented, and approximately 351,000 baht of opportunity cost could have been saved in the case study company.
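
The internals of MPAM are not described in the abstract; the sketch below shows one plausible scoring step, ranking machines by a smoothed defect rate computed from inspection samples. The smoothing choice and the sample data are assumptions.

```python
import pandas as pd

# Hypothetical inspection records: one row per sampled part.
records = pd.DataFrame({
    "machine":     ["CNC-01", "CNC-01", "CNC-02", "CNC-02", "CNC-03", "CNC-03", "CNC-03"],
    "out_of_spec": [0, 1, 0, 0, 1, 1, 0],
})

# Score = smoothed defect rate (Laplace smoothing guards against tiny samples);
# a higher score means the machine should be inspected first.
scores = (records.groupby("machine")["out_of_spec"]
          .agg(defects="sum", samples="count"))
scores["priority_score"] = (scores["defects"] + 1) / (scores["samples"] + 2)
print(scores.sort_values("priority_score", ascending=False))
```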

Keywords: Computer Numerical Control, Data Mining, Hard Disk Drive.

10603 Why do Clawback Provisions Affect Financial Reporting Quality? - An Analysis of Trigger Effects

Authors: Yu-Chun Lin

Abstract:

We identify clawback triggers from firms' proxy statements (Form DEF 14A) and use the likelihood of restatements to proxy for financial reporting quality. Based on a sample of 578 U.S. firms that voluntarily adopted clawback provisions during 2003-2009, we decompose restatement-based triggers into two types, fraud and unintentional error, and observe evidence that the use of fraud triggers is associated with high financial reporting quality. The findings support the view that fraud triggers can enhance the deterrent effect of clawback provisions by establishing a viable disincentive against fraud, misconduct, and otherwise harmful acts. These results are robust to controlling for compensation components, to different sample specifications and to a number of sensitivity tests.

Keywords: Accruals quality, Clawback provisions, Compensation, Restatements.

10602 Survey of Key Management Algorithms in WiMAX

Authors: R. Chithra, B. Kalavathi, J. Christy Lavanya

Abstract:

WiMAX is a telecommunications technology specified by the Institute of Electrical and Electronics Engineers (IEEE) as the IEEE 802.16 standard. The goal of this technology is to provide wireless data transmission over long distances in a variety of ways. IEEE 802.16 is a recent standard for mobile communication. In this paper, we provide an overview of various key management algorithms that provide security for WiMAX.

Keywords: Broadcast, Rekeying, Scalability, Secrecy, Unicast, WiMAX.

10601 Facilitating Factors for the Success of Mobile Service Providers in Bangkok Metropolitan

Authors: Yananda Siraphatthada

Abstract:

The objectives of this research were to study the level of the influencing factors (leadership, supply chain management, innovation, and competitive advantage), business success, and the factors affecting the business success of mobile phone system service providers in Bangkok Metropolitan. The research combined quantitative and qualitative approaches. In the quantitative part, questionnaires were used to collect data from 331 mobile service shop managers franchised by AIS, Dtac and TrueMove; the shop managers were stratified and proportionally allocated into subgroups according to the number of providers in each network. In the qualitative part, in-depth interviews were conducted with six mobile service providers/managers of Telewiz, Dtac and TrueMove shops, and points of agreement or disagreement were examined using the content analysis method. Descriptive statistics, including frequency, percentage, mean and standard deviation, were employed, and the Structural Equation Model (SEM) was used as a tool for data analysis. The content analysis method was applied to identify key patterns emerging from the interview responses, and the two data sets were brought together for comparison and contrast, providing triangulation to enrich the interpretation of the results. The analysis revealed that the influencing factors (leadership, innovation management, supply chain management, and business competitiveness) had an impact at a great level, while the financial and non-financial business success of the mobile phone system service providers in Bangkok Metropolitan was at the highest level. Moreover, the business influencing factors and competitive advantages of the mobile system service providers (leadership, supply chain management, innovation management, business advantages, and business success) were statistically significant at the .01 level, which corresponded to the data from the interviews.

Keywords: Business success, mobile service providers.

10600 Optimization of Air Pollution Control Model for Mining

Authors: Zunaira Asif, Zhi Chen

Abstract:

Sustainable air quality management is recognized as one of the most serious environmental concerns in mining regions. Mining operations emit various types of pollutants which have significant impacts on the environment. This study presents a stochastic control strategy, developing an air pollution control model to achieve a cost-effective solution. The optimization method is formulated to predict the cost of treatment using linear programming with an objective function and multiple constraints. The constraints mainly focus on two factors: metal production should not exceed the available resources, and air quality should meet the standard criteria for each pollutant. The applicability of the model is explored through a case study of an open pit metal mine in Utah, USA. The method uses meteorological data as a dispersion transfer function to reflect practical local conditions, and the probabilistic analysis of the uncertainties in the meteorological conditions is accomplished by Monte Carlo simulation. Reasonable results have been obtained for selecting the optimized treatment technology for PM2.5, PM10, NOx, and SO2. An additional comparison analysis shows that the baghouse is the least-cost option compared to the electrostatic precipitator and wet scrubbers for particulate matter, whereas non-selective catalytic reduction and dry flue gas desulfurization are suitable for NOx and SO2 reduction, respectively. Thus, this model can aid planners in reducing these pollutants at a marginal cost by suggesting pollution control devices while accounting for dynamic meteorological conditions and mining activities.
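
The cost-minimisation core described above can be illustrated with a small linear program; the device costs, capacities and required removal below are hypothetical, and the dispersion transfer function and Monte Carlo treatment of meteorology are not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical costs ($/tonne PM removed) and per-device capacities (tonnes/yr).
devices  = ["baghouse", "electrostatic precipitator", "wet scrubber"]
cost     = np.array([120.0, 180.0, 260.0])
capacity = np.array([900.0, 700.0, 500.0])
required_removal = 1200.0        # tonnes/yr needed to meet the ambient standard

# min  cost . x   subject to  sum(x) >= required_removal,  0 <= x_i <= capacity_i
res = linprog(c=cost,
              A_ub=[-np.ones(len(devices))], b_ub=[-required_removal],
              bounds=list(zip(np.zeros(len(devices)), capacity)),
              method="highs")
for name, tonnes in zip(devices, res.x):
    print(f"{name:28s} {tonnes:7.1f} t/yr")
print("minimum annual cost: $", round(res.fun, 2))
```

With these made-up numbers the cheapest device (the baghouse) is loaded to capacity first, which mirrors the qualitative conclusion reported in the abstract.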

Keywords: Air pollution, linear programming, mining, optimization, treatment technologies.

10599 Finding Pareto Optimal Front for the Multi-Mode Time, Cost Quality Trade-off in Project Scheduling

Authors: H. Iranmanesh, M. R. Skandari, M. Allahverdiloo

Abstract:

Project managers are ultimately responsible for the overall characteristics of a project, i.e. they should deliver the project on time, with minimum cost and with maximum quality. It is vital for any manager to decide on a trade-off between these conflicting objectives, and they will benefit from any scientific decision support tool. Our work determines a set of optimal solutions (rather than a single optimal solution) from which the project manager can select the preferred choice to run the project. In this paper, the project scheduling problem notated as (1,T|cpm,disc,mu|curve:quality,time,cost) is studied. The problem is multi-objective, and the purpose is to find the Pareto optimal front of time, cost and quality of a project (curve:quality,time,cost) whose activities belong to a start-to-finish activity relationship network (cpm) and can be performed in different possible modes (mu) which are non-continuous or discrete (disc), each mode having a different cost, time and quality. The project is constrained by a non-renewable resource, i.e. money (1,T). Because the problem is NP-hard, a meta-heuristic is developed to solve it, based on a version of the genetic algorithm specially adapted to multi-objective problems, namely FastPGA. A sample project with 30 activities is generated and then solved by the proposed method.
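
FastPGA itself is not reproduced here; the sketch below shows only the Pareto-filtering step that any such multi-objective method relies on, applied to hypothetical (time, cost, -quality) triples where all three objectives are minimised.

```python
from typing import List, Tuple

Schedule = Tuple[float, float, float]   # (time, cost, -quality): all minimised

def dominates(a: Schedule, b: Schedule) -> bool:
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population: List[Schedule]) -> List[Schedule]:
    """Keep only the non-dominated schedules of the candidate population."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# Hypothetical candidate schedules produced by the genetic algorithm.
candidates = [(120, 9_500, -0.80), (100, 12_000, -0.75),
              (150, 8_000, -0.90), (120, 9_000, -0.70), (160, 8_200, -0.85)]
print(pareto_front(candidates))
```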

Keywords: FastPGA, Multi-Execution Activity Mode, Pareto Optimality, Project Scheduling, Time-Cost-Quality Trade-Off.

10598 Data Transmission Reliability in Short Message Integrated Distributed Monitoring Systems

Authors: Sui Xin, Li Chunsheng, Tian Di

Abstract:

Short message integrated distributed monitoring systems (SM-DMS) are growing rapidly in wireless communication applications in various areas, such as electromagnetic field (EMF) management, wastewater monitoring, and air pollution supervision. However, delays in short messages often make the data embedded in SM-DMS transmit unreliably, and there are few provisions dealing with this problem in SMS transmission protocols. In this study, based on an analysis of the command and data requirements in SM-DMS, we developed a processing model for the control center to solve the delay problem in data transmission. The three components of the model, the data transmission protocol, the receiving buffer pool method, and the timer mechanism, are described in detail. Adjusting the threshold parameter in the timer mechanism is discussed to achieve adaptive performance during SM-DMS runtime. The model optimizes data transmission reliability in SM-DMS and provides a supplement to data transmission reliability protocols at the application level.
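
The exact protocol is not given in the abstract; the sketch below is one plausible reading of the receiving buffer pool plus timer mechanism, reassembling out-of-order SMS fragments and expiring records that exceed a threshold. The record identifiers, fragment fields and threshold value are assumptions.

```python
import time
from dataclasses import dataclass, field

@dataclass
class PendingRecord:
    expected: int                       # number of SMS fragments expected
    fragments: dict = field(default_factory=dict)
    first_seen: float = field(default_factory=time.monotonic)

class ReceivingBuffer:
    """Buffer pool with a timer threshold (seconds): fragments arriving late or
    out of order are held until the record is complete or the timer expires."""
    def __init__(self, threshold=60.0):
        self.threshold = threshold
        self.pool = {}

    def on_fragment(self, record_id, seq, total, payload):
        rec = self.pool.setdefault(record_id, PendingRecord(expected=total))
        rec.fragments[seq] = payload
        if len(rec.fragments) == rec.expected:          # complete: hand upstream
            data = "".join(rec.fragments[i] for i in sorted(rec.fragments))
            del self.pool[record_id]
            return data
        return None

    def expire(self):
        """Called by the timer: drop (or re-request) records stuck past the threshold."""
        now = time.monotonic()
        stale = [rid for rid, r in self.pool.items()
                 if now - r.first_seen > self.threshold]
        for rid in stale:
            del self.pool[rid]
        return stale

buf = ReceivingBuffer(threshold=30.0)
buf.on_fragment("station-7", seq=2, total=2, payload="...CO2=412ppm")
print(buf.on_fragment("station-7", seq=1, total=2, payload="2024-05-01T10:00;"))
```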

Keywords: Delay, SMS, reliability, distributed monitoring system (DMS), wireless communication.

10597 ERP Implementation in Iran: (A Successful Experience in DGC)

Authors: Mohammad Reza Ostad Ali Naghi Kashani

Abstract:

Nowadays, the number of companies adopting Enterprise Resource Planning (ERP) applications is increasing. Although ERP projects are expensive, time consuming, and complex, there are some successful experiences. Developing countries are striving to implement ERP projects successfully; however, there are many obstacles, and such projects often fail or partially fail. This paper concerns a successful ERP implementation, IFS, at Dana Geophysics Company (DGC) in Iran. After a short review of ERP and the ERP market in Iran, we propose a three-phase deployment methodology (phase 1: preparation and Business Process Management (BPM); phase 2: implementation; and phase 3: testing, go-live-1 (pilot) and go-live-2 (final)). Then, we present five guidelines (Project Management, Change Management, Business Process Management (BPM), Training and Knowledge Management, and Technical Management), which were chosen as work streams. In this case study we present lessons learned in project management and business process management.

Keywords: Business Process Management, Critical Success Factors, ERP, Project Management.

10596 Resource Leveling Optimization in Construction Projects of High Voltage Substations Using Nature-Inspired Intelligent Evolutionary Algorithms

Authors: Dimitrios Ntardas, Alexandros Tzanetos, Georgios Dounias

Abstract:

High Voltage Substations (HVS) are the intermediate step between the production of power and successfully transmitting it to clients, making them one of the most important checkpoints in power grids. Nowadays, as renewable resources and consequently distributed generation are growing fast, the construction of HVS is of high importance in terms of both quality and completion time, so that new energy producers can quickly and safely integrate into power grids. The resources needed, such as machines and workers, should be carefully allocated so that the construction of an HVS is completed on time, at the lowest possible cost (e.g. without incurring additional, unplanned costs caused by project delays) and at the highest quality. In addition, there are milestones and several checkpoints to be precisely achieved during construction to ensure cost and timeline control and to ensure that the percentage of governmental funding will be granted. The management of such a demanding project is an NP-hard problem that consists of prerequisite constraints and resource limits for each task of the project. In this work, a hybrid meta-heuristic method is implemented to solve this problem. Meta-heuristics have proven to be quite useful when dealing with high-dimensional constrained optimization problems, and hybridizing them boosts their performance.

Keywords: High voltage substations, nature-inspired algorithms, project management, meta-heuristics.

10595 Dataset Analysis Using Membership-Deviation Graph

Authors: Itgel Bayarsaikhan, Jimin Lee, Sejong Oh

Abstract:

Classification is one of the primary themes in computational biology. The accuracy of classification strongly depends on the quality of a dataset, and we need a method to evaluate this quality. In this paper, we propose a new graphical analysis method using the 'Membership-Deviation Graph (MDG)' for analyzing the quality of a dataset. The MDG represents the degree of membership and the deviations for instances of a class in the dataset. The result of the MDG analysis is used for understanding specific features and for selecting the best features for classification.

Keywords: feature, classification, machine learning algorithm.

10594 Location Management in Cellular Networks

Authors: Bhavneet Sidhu, Hardeep Singh

Abstract:

Cellular networks provide voice and data services to the users with mobility. To deliver services to the mobile users, the cellular network is capable of tracking the locations of the users, and allowing user movement during the conversations. These capabilities are achieved by the location management. Location management in mobile communication systems is concerned with those network functions necessary to allow the users to be reached wherever they are in the network coverage area. In a cellular network, a service coverage area is divided into smaller areas of hexagonal shape, referred to as cells. The cellular concept was introduced to reuse the radio frequency. Continued expansion of cellular networks, coupled with an increasingly restricted mobile spectrum, has established the reduction of communication overhead as a highly important issue. Much of this traffic is used in determining the precise location of individual users when relaying calls, with the field of location management aiming to reduce this overhead through prediction of user location. This paper describes and compares various location management schemes in the cellular networks.

Keywords: Cellular Networks, Location Area, Mobility Management, Paging.

10592 A Proposal for a Secure and Interoperable Data Framework for Energy Digitalization

Authors: Hebberly Ahatlan

Abstract:

The process of digitizing energy systems involves transforming traditional energy infrastructure into interconnected, data-driven systems that enhance efficiency, sustainability, and responsiveness. As smart grids become increasingly integral to the efficient distribution and management of electricity from both fossil and renewable energy sources, the energy industry faces strategic challenges associated with digitalization and interoperability — particularly in the context of modern energy business models, such as virtual power plants (VPPs). The critical challenge in modern smart grids is to seamlessly integrate diverse technologies and systems, including virtualization, grid computing and service-oriented architecture (SOA), across the entire energy ecosystem. Achieving this requires addressing issues like semantic interoperability, Information Technology (IT) and Operational Technology (OT) convergence, and digital asset scalability, all while ensuring security and risk management. This paper proposes a four-layer digitalization framework to tackle these challenges, encompassing persistent data protection, trusted key management, secure messaging, and authentication of IoT resources. Data assets generated through this framework enable AI systems to derive insights for improving smart grid operations, security, and revenue generation. Furthermore, this paper also proposes a Trusted Energy Interoperability Alliance as a universal guiding standard in the development of this digitalization framework to support more dynamic and interoperable energy markets.

Keywords: Digitalization, IT/OT convergence, semantic interoperability, TEIA alliance, VPP.

10591 Model of Community Management for Sustainable Utilization

Authors: Luedech Girdwichai, Witthaya Mekhum

Abstract:

This research intended to develop a model of community management for sustainable utilization by investigating two population groups, family heads and community management teams. The former group consisted of family heads from 511 families in 12 areas, who were asked to complete questionnaires, of which 479 were returned. The latter group consisted of the community management teams of the 12 areas, with one representative from each area interviewed. The questionnaire for the family heads consisted of two main parts: general information, such as occupation, in checklist form, and data on self-reliant community development based on the 4P framework, i.e., People (human resource) development, Place (area) development, Product (economic and income source) development, and Plan (community plan) development, in rating-scale form. Data from the first part were analyzed for frequency and percentage, while those from the second part were analyzed for arithmetic mean and SD. Data from the second population group, the community management teams, were derived from focus groups to identify factors influencing successful management, together with in-depth interviews analyzed by descriptive statistics. The results showed that the 479 family heads rated the implementation of community plans for self-reliant community activities, based on the Sufficiency Economy Philosophy and the 4P framework, at an average of 3.28, a moderate level. In detail, the highest-rated aspect was area development, with a mean of 3.71 (high level), followed by human resource development with a mean of 3.44 (moderate level), then economic and income source development with a mean of 3.09 (moderate level); the lowest-rated aspect was community plan development, with a mean of 2.89. The results from the small group discussions revealed the following factors and guidelines for successful community management: 1) for People (human resource) development, there was a project to support and develop community leaders; 2) for Place (area) development, there was development of conservation tourism areas; 3) for Product (economic and income source) development, community leaders promoted the setting up of occupational groups, saving groups, and product processing groups; and 4) for Plan (community plan) development, there was prioritization through public hearings.

Keywords: Model of community management, sustainable utilization.

10590 An Effort at Improving Reliability of Laboratory Data in Titrimetric Analysis for Zinc Sulphate Tablets Using Validated Spreadsheet Calculators

Authors: M. A. Okezue, K. L. Clase, S. R. Byrn

Abstract:

The requirement to maintain data integrity in laboratory operations is critical for regulatory compliance, and automation of procedures reduces the incidence of human error. Quality control laboratories located in low-income economies may face barriers in attempts to automate their processes. Since data from quality control tests on pharmaceutical products are used in making regulatory decisions, it is important that laboratory reports are accurate and reliable. Zinc sulphate (ZnSO4) tablets are used in the treatment of diarrhea in the pediatric population and as an adjunct therapy in COVID-19 regimens. Unfortunately, the zinc content in these formulations is determined titrimetrically, a manual analytical procedure, and the assay for ZnSO4 tablets involves time-consuming steps that contain mathematical formulae prone to calculation errors. To achieve consistency, save costs, and improve data integrity, validated spreadsheets were developed to simplify the two critical steps in the analysis of ZnSO4 tablets: standardization of the 0.1 M sodium edetate (EDTA) solution, and the complexometric titration assay procedure. The assay method in the United States Pharmacopoeia was used to create a process flow for ZnSO4 tablets, and for each step in the process, formulae were entered into two spreadsheets to automate the calculations. Further checks were created within the automated system to ensure the validity of replicate analyses in the titrimetric procedures. Validation was conducted using five data sets of manually computed assay results, and the acceptance criteria set for the protocol were met. Significant p-values (p < 0.05, α = 0.05, at the 95% confidence interval) were obtained from Student's t-test evaluation of the mean values for the manually calculated and spreadsheet results at all levels of the analysis flow. Right-first-time analysis and the principles of data integrity were enhanced by the use of the validated spreadsheet calculators in titrimetric evaluations of ZnSO4 tablets. Human errors in calculations were minimized when procedures were automated in the quality control laboratory, and the assay procedure for the formulation was performed in a time-efficient manner with a greater level of accuracy. This project is expected to promote cost savings for laboratory business models.
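
The monograph formulas are not reproduced in the abstract; the sketch below shows the kind of calculation such a spreadsheet automates, using the 1:1 Zn-EDTA stoichiometry, an assumed zinc sulfate monohydrate form, hypothetical masses and titres, and an arbitrary 2% RSD gate on replicates.

```python
import statistics

MW_ZN = 65.38            # g/mol, zinc metal used to standardise the EDTA
MW_ZNSO4_H2O = 179.47    # g/mol, zinc sulfate monohydrate (assumed salt form)

def edta_molarity(zn_mass_g, titre_ml):
    """Standardisation: Zn2+ and EDTA react 1:1."""
    return (zn_mass_g / MW_ZN) / (titre_ml / 1000.0)

def zn_sulfate_per_tablet_mg(titre_ml, m_edta, aliquot_fraction, tablets_in_sample):
    """Assay: mg of ZnSO4.H2O per tablet, back-calculated from the moles of Zn2+."""
    moles = m_edta * titre_ml / 1000.0 / aliquot_fraction
    return moles * MW_ZNSO4_H2O * 1000.0 / tablets_in_sample

def replicates_ok(values, max_rsd_pct=2.0):
    """Simple data-integrity gate: relative standard deviation of replicates."""
    rsd = 100.0 * statistics.stdev(values) / statistics.mean(values)
    return rsd <= max_rsd_pct, rsd

m_edta = edta_molarity(zn_mass_g=0.1650, titre_ml=25.30)
assays = [zn_sulfate_per_tablet_mg(t, m_edta, aliquot_fraction=0.25, tablets_in_sample=20)
          for t in (19.85, 19.90, 19.78)]
ok, rsd = replicates_ok(assays)
print(f"EDTA = {m_edta:.4f} M; assay = {statistics.mean(assays):.1f} mg/tablet; "
      f"RSD = {rsd:.2f}% ({'pass' if ok else 'fail'})")
```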

Keywords: Data integrity, spreadsheets, titrimetry, validation, zinc sulphate tablets.

10589 Customer Satisfaction and Effective HRM Policies: Customer and Employee Satisfaction

Authors: S. Anastasiou, C. Nathanailides

Abstract:

The purpose of this study is to examine the possible link between employee and customer satisfaction. The service provided by employees helps to build a good relationship with customers and can increase their loyalty. Published data for job satisfaction and customer service indicators of banks were gathered from relevant published works covering five different countries. The customer and employee satisfaction scores from the different published works were transformed and normalized to a scale of 1 to 100. The data were analyzed, and a regression analysis of the two parameters was used to describe the link between employee satisfaction and customer satisfaction. Assuming that employee satisfaction has a significant influence on customer service and the resulting customer satisfaction, the reviewed data indicate that employee satisfaction contributes significantly to the level of customer satisfaction in the banking sector. There was a significant correlation between the two parameters (Pearson correlation, R2 = 0.52, p < 0.05). The reviewed data thus support the hypothesis that practical evidence links these two parameters. During the recent global economic crisis, the financial services sector was affected severely, and job security, remuneration and recruitment of bank personnel were significantly reduced in many countries, including Greece. Nevertheless, modern organizations should always consider their personnel as capital, the driving force for future success. Appropriate human resource management policies can increase the level of job satisfaction of the personnel, with positive consequences for the level of customer satisfaction.

Keywords: Job satisfaction, job performance, customer service, banks, human resources management.

10588 Long Term Changes of Water Quality in Latvia

Authors: Maris Klavins, Valery Rodinov

Abstract:

The aim of this study was to analyze long-term changes in surface water quality in Latvia, the spatial variability of water chemical composition, and the possible impacts of different pollution sources, as well as measures to protect national water resources through river basin management. Within this study, the concentrations of major water ingredients and microelements in the major rivers and lakes of Latvia were determined, and metal concentrations in river and lake waters were compared with water chemical composition. The mean concentrations of trace metals in the inland waters of Latvia are appreciably lower than estimated world averages for river waters and close to or lower than background values, except where regional impacts are determined by local geochemistry. This may be explained by a comparatively low level of anthropogenic load. At the same time, direct anthropogenic impacts are evident in several places, reflecting the influence of both point sources and transboundary transport. Different processes related to the pollution of surface waters in Latvia have also been analyzed: first, the changes and composition of pollutant emissions in Latvia were analyzed, and the results were compared with the actual composition of atmospheric precipitation and its changes over time.

Keywords: Water quality, trend analysis, pollution, human impact.

10587 Baking Quality of Hulled Wheat Species in Organic Farming

Authors: P. Konvalina, I. Capouchová, Z. Stehno

Abstract:

Organic farmers use a wider range of crop varieties than conventional farming does. Bread wheat is the most popular and most common food crop, but organically grown bread wheat is usually of worse technological quality. The hulled wheat species (einkorn, emmer wheat and spelt) are therefore considered attractive alternatives. Twenty-five hulled wheat varieties and control bread wheat varieties were grown on a certified organic parcel in České Budějovice (Czech Republic) between 2009 and 2012. Their baking quality was measured and evaluated with standard methods and in accordance with ICC standards. The results show that the grain of the hulled wheat varieties contains a high proportion of protein (up to 18 percent), so even organically grown hulled wheat is characterized by good baking quality. Einkorn and emmer wheat have worse technological quality of proteins (low values of gluten index and Zeleny test), which is a disadvantage of these two species. Spelt wheat, on the other hand, has better technological quality and is similar to the control bread wheat varieties. Mixtures containing bread wheat, among others, are considered good alternatives and may contribute to a wider range of uses for the hulled wheat species. This is one possibility for increasing the proportion of protein in bread wheat products while making use of the nutrient-rich hulled wheat grains.

Keywords: Baking quality, organic farming, einkorn, emmer wheat, spelt.

10586 A Survey Proposal towards Holistic Management of Schizophrenia

Authors: Pronab Ganguly, Ahmed A. Moustafa

Abstract:

Holistic management of schizophrenia involves mainstream pharmacological intervention, complementary medicine intervention, therapeutic intervention and other psychosocial factors such as accommodation, education, job training, employment, relationships, friendship, exercise, overall well-being, smoking, substance abuse, suicide prevention, stigmatisation, recreation, entertainment, violent behaviour, arrangement of public trusteeship and guardianship, day-to-day living skills, integration with the community, and management of overweight due to medication and other health complications related to medication, amongst others. Our review shows that there is no integrated survey combining all these factors. An international web-based survey is therefore proposed to evaluate the significance of all these factors and present them in a unified manner; it is believed this investigation will contribute positively towards the holistic management of schizophrenia. There will be two surveys. In the pharmacological intervention survey, five popular drugs for schizophrenia will be chosen and their efficacy as well as harmful side effects will be evaluated on a scale of 0-10; this survey will be completed by psychiatrists. In the second survey, each element of therapeutic intervention and each psychosocial factor will be evaluated according to its significance on a scale of 0-10; this survey will be completed by caregivers, psychologists, case managers and case workers. For the first survey, professional bodies of psychiatrists in English-speaking countries will be contacted and asked to invite their members to participate. For the second survey, professional bodies of clinical psychologists and caregivers in English-speaking countries will be contacted in the same way. Additionally, for both surveys, relevant professionals will be contacted through personal contact networks. For both surveys, the mean, mode, median, standard deviation and net promoter score will be calculated for each factor and presented in a statistically meaningful manner, and each factor will then be ranked according to its statistical significance. Country-specific variation will also be highlighted to identify variation patterns. The results of these surveys will identify the relative significance of each type of pharmacological intervention, each type of therapeutic intervention and each psychosocial factor. The determination of this relative importance will contribute to improving the quality of life of individuals with schizophrenia.
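
As a small illustration of the planned summary statistics, the sketch below computes the mean, median, mode, standard deviation and a net promoter score (promoters rate 9-10, detractors 0-6) for a set of hypothetical 0-10 ratings.

```python
import statistics

def summarise(ratings):
    """Descriptive statistics plus a net promoter score for 0-10 ratings:
    NPS = % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    nps = 100.0 * (promoters - detractors) / len(ratings)
    return {"mean": statistics.mean(ratings),
            "median": statistics.median(ratings),
            "mode": statistics.mode(ratings),
            "sd": statistics.stdev(ratings),
            "nps": nps}

# Hypothetical ratings of one psychosocial factor (e.g. stable accommodation).
print(summarise([9, 8, 10, 7, 9, 6, 8, 9, 10, 5]))
```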

Keywords: Schizophrenia, holistic management, antipsychotics, quality of life.

10585 Exploring Inter-Relationships between Events to Identify Strategic Technological Competencies: A Combined Approach

Authors: Cláudio Santos, Madalena Araújo, Nuno Correia

Abstract:

The inherent complexity of today's business environments is forcing organizations to be attentive to dynamics on several fronts. The management of technological innovation is therefore continually faced with uncertainty about the future. These issues lead to a need for a systemic perspective, able to analyze the consequences of interactions between different factors. The field of technology foresight has proposed methods and tools to deal with this broader perspective. In an attempt to provide a method to analyze the complex interactions between events in several areas, starting from the identification of the most strategic competencies, this paper presents a methodology based on the Delphi method and Quality Function Deployment. The methodology is applied to a sheet metal processing equipment manufacturer as a case study.

Keywords: Competencies, Delphi Method, Quality Function Deployment, Technology Foresight.

10584 A Novel Metric for Performance Evaluation of Image Fusion Algorithms

Authors: Nedeljko Cvejic, Artur Łoza, David Bull, Nishan Canagarajah

Abstract:

In this paper, we present a novel objective non-reference performance assessment algorithm for image fusion. It takes into account local measurements to estimate how well the important information in the source images is represented by the fused image. The metric is based on the Universal Image Quality Index and uses the similarity between blocks of pixels in the input images and the fused image as the weighting factors for the metric. Experimental results confirm that the values of the proposed metric correlate well with the subjective quality of the fused images, giving a significant improvement over standard measures based on mean squared error and mutual information.
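
The weighting scheme of the proposed metric is not detailed in the abstract; the sketch below computes the underlying Universal Image Quality Index per 8x8 block and averages it, using an unweighted mean and synthetic images as simplifying assumptions.

```python
import numpy as np

def uiqi(x, y, eps=1e-12):
    """Universal Image Quality Index of Wang & Bovik between two same-size blocks."""
    x, y = x.astype(float).ravel(), y.astype(float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(ddof=1), y.var(ddof=1)
    cov = np.cov(x, y, ddof=1)[0, 1]
    return 4 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2) + eps)

def blockwise_quality(src, fused, block=8):
    """Mean UIQI between a source image and the fused image over 8x8 blocks.
    (The paper weights the per-block scores; a plain mean is used here.)"""
    h, w = (np.array(src.shape) // block) * block
    scores = [uiqi(src[i:i+block, j:j+block], fused[i:i+block, j:j+block])
              for i in range(0, h, block) for j in range(0, w, block)]
    return float(np.mean(scores))

rng = np.random.default_rng(0)
src = rng.integers(0, 256, (64, 64))
fused = 0.5 * src + 0.5 * rng.integers(0, 256, (64, 64))   # toy "fusion"
print(f"mean UIQI(source, fused) = {blockwise_quality(src, fused):.3f}")
```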

Keywords: Fusion performance measures, image fusion, non-reference quality measures, objective quality measures.

10583 Random Projections for Dimensionality Reduction in ICA

Authors: Sabrina Gaito, Andrea Greppi, Giuliano Grossi

Abstract:

In this paper we present a technique to speed up ICA based on the idea of reducing the dimensionality of the data set while preserving the quality of the results. In particular, we refer to the FastICA algorithm, which uses kurtosis as the statistical property to be maximized. By performing a Johnson-Lindenstrauss-like projection of the data set, we find the minimum dimensionality reduction rate ρ, defined as the ratio between the size k of the reduced space and the original dimension d, which guarantees a narrow confidence interval for the estimator with a high confidence level. The derived dimensionality reduction rate depends on a system control parameter β that is easily computed a priori on the basis of the observations alone. Extensive simulations have been carried out on different sets of real-world signals. They show that the achievable dimensionality reduction is substantial, preserves the quality of the decomposition, and impressively speeds up FastICA. On the other hand, a set of signals for which the estimated reduction rate is greater than 1 exhibits poor decomposition results if reduced, thus validating the reliability of the parameter β. We are confident that our method will lead to a better approach for real-time applications.
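
The paper's reduction rate ρ and control parameter β are not reproduced here; the sketch below only illustrates the pipeline the abstract describes, a Gaussian (Johnson-Lindenstrauss style) projection followed by FastICA, on synthetic mixtures with an arbitrarily chosen reduced dimension k.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.random_projection import GaussianRandomProjection

rng = np.random.default_rng(0)
n_samples, d, k = 2000, 64, 16          # original dimension d, reduced dimension k

# Synthetic mixtures: 4 independent sources mixed into d observed channels.
t = np.linspace(0, 8, n_samples)
sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t)),
                rng.laplace(size=n_samples), np.cos(5 * t)]
X = sources @ rng.normal(size=(4, d))

# Johnson-Lindenstrauss-style Gaussian projection to k dimensions, then FastICA.
X_reduced = GaussianRandomProjection(n_components=k, random_state=0).fit_transform(X)
est = FastICA(n_components=4, random_state=0).fit_transform(X_reduced)
print("recovered component matrix shape:", est.shape)
```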

Keywords: Independent Component Analysis, FastICA algorithm, Higher-order statistics, Johnson-Lindenstrauss lemma.

10582 Urban Waste Water Governance in South Africa: A Case Study of Stellenbosch

Authors: R. Malisa, E. Schwella, K. I. Theletsane

Abstract:

Due to climate change, population growth and rapid urbanization, the demand for water in South Africa is inevitably surpassing supply. To address similar challenges globally, there has been a paradigm shift from a conventional urban waste water management "government" approach to a "governance" paradigm. From the governance paradigm the Integrated Urban Water Management (IUWM) principle emerged. This principle emphasizes efficient urban waste water treatment and the production of high-quality recyclable effluent, thereby mimicking natural water systems in their efficient recycling of water and averting the depletion of natural water resources. The objective of this study was to investigate the drivers of shifting the current urban waste water management approach from a "government" paradigm towards "governance". The study was conducted through the Interactive Management soft systems research methodology, which follows a qualitative research design. A case study methodology was employed, guided by a realism research philosophy. The qualitative data gathered were analyzed through interpretative structural modelling using Concept Star for Professionals Decision-Making tools (CSPDM) version 3.64. The constructed model deduced that the main drivers in shifting Stellenbosch municipal urban waste water management towards IUWM "governance" principles are mainly social elements, characterized by overambitious public expectations of municipal water service delivery, misinterpretation of the constitutional right of access to adequate clean water and sanitation, and the perceptions of different communities about recycled water. Inadequate public participation also emerged as a strong driver. However, disruptive events such as drought may play a positive role in raising awareness of the value of water, resulting in a shift in perceptions of recycled water. Once the social elements are addressed, the alignment of governance and administration elements towards IUWM is achievable. Hence, the point of departure for the desired paradigm shift is changing the perceptions and behaviors of water service authorities and serviced communities towards shifting urban waste water management approaches from the "government" to the "governance" paradigm.

Keywords: Integrated urban water management, urban water system, waste water governance, waste water treatment works.

10581 Municipal Solid Waste Management Using Life Cycle Assessment Approach: Case Study of Maku City, Iran

Authors: L. Heidari, M. Jalili Ghazizade

Abstract:

This paper aims to determine the best environmental and economic scenario for municipal solid waste (MSW) management in Maku city using the Life Cycle Assessment (LCA) approach. The functional elements of the study are the collection, transportation, and disposal of MSW in Maku city. Waste composition and density, two key parameters of MSW, were determined by field sampling, and other important specifications of the MSW, such as chemical formula, thermal energy and water content, were then calculated. These data, together with other information related to the collection and disposal facilities, are used as a reliable source of data to assess the environmental impacts of different waste management options, including landfill, composting, recycling and energy recovery. The environmental impact of the MSW management options was investigated for 15 different scenarios using the Integrated Waste Management (IWM) software. The photochemical smog, greenhouse gases, acid gases, toxic emissions, and energy consumption of each scenario were measured, and the environmental indices of each scenario were then specified by weighting these parameters. The economic costs of the scenarios were also compared with each other based on the literature. As a final result, since organic materials make up more than 80% of the waste, composting is a suitable method. Although a major part of the remaining 20% of the waste could be recycled, the landfill option is suggested due to the high cost of the necessary equipment. Therefore, the scenario with 80% composting and 20% landfilling is selected as the superior environmental and economic scenario. This study shows that, to select a scenario with practical applications, the environmental and economic aspects of the different scenarios must be considered simultaneously.

Keywords: IWM software, life cycle assessment, Maku, municipal solid waste management.
