Search results for: air quality classification
3128 An Efficient Motion Recognition System Based on LMA Technique and a Discrete Hidden Markov Model
Authors: Insaf Ajili, Malik Mallem, Jean-Yves Didier
Abstract:
Interest in human motion recognition has increased considerably in recent years due to its importance in a wide range of applications, such as human-computer interaction, intelligent surveillance, augmented reality, content-based video compression and retrieval, etc. However, it is still regarded as a challenging task, especially in realistic scenarios. It can be seen as a general machine learning problem which requires an effective human motion representation and an efficient learning method. In this work, we introduce a descriptor based on the Laban Movement Analysis (LMA) technique, a formal and universal language for human movement, to capture both quantitative and qualitative aspects of movement. We use a Discrete Hidden Markov Model (DHMM) for training and classifying motions. We improve the classification algorithm by proposing two DHMMs for each motion class, which process the motion sequence in two different directions, forward and backward. This modification helps avoid the misclassification that can happen when recognizing similar motions. Two experiments are conducted. In the first one, we evaluate our method on a public dataset, the Microsoft Research Cambridge-12 Kinect gesture dataset (MSRC-12), which is widely used for evaluating action/gesture recognition methods. In the second experiment, we build a dataset composed of 10 gestures (introduce yourself, wave, dance, move, turn left, turn right, stop, sit down, increase velocity, decrease velocity) performed by 20 persons. The evaluation of the system includes testing the efficiency of our LMA-based descriptor vector with the basic DHMM method and comparing the recognition results of the modified DHMM with the original one. Experimental results demonstrate that our method outperforms most existing methods evaluated on the MSRC-12 dataset and achieves a near-perfect classification rate on our own dataset.
Keywords: Human Motion Recognition, Motion representation, Laban Movement Analysis, Discrete Hidden Markov Model.
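A minimal sketch of the two-direction DHMM scoring idea described above, using a plain NumPy forward algorithm in log space. The model parameters, symbol alphabet and quantized LMA-descriptor sequences are hypothetical placeholders, not the authors' trained models.

```python
import numpy as np

def log_forward(obs, start_p, trans_p, emit_p):
    """Log-likelihood of a discrete observation sequence under an HMM
    (forward algorithm computed in log space)."""
    log_a, log_b = np.log(trans_p), np.log(emit_p)
    alpha = np.log(start_p) + log_b[:, obs[0]]
    for o in obs[1:]:
        # alpha_j(t) = b_j(o_t) + logsumexp_i( alpha_i(t-1) + a_ij )
        alpha = log_b[:, o] + np.logaddexp.reduce(alpha[:, None] + log_a, axis=0)
    return np.logaddexp.reduce(alpha)

def classify(seq, models):
    """Score a quantized descriptor sequence with the forward DHMM and the
    backward (reversed-sequence) DHMM of every class; pick the best sum."""
    scores = {}
    for name, (fwd_params, bwd_params) in models.items():
        scores[name] = (log_forward(seq, *fwd_params)
                        + log_forward(seq[::-1], *bwd_params))
    return max(scores, key=scores.get)

# models = {"waving": ((pi_f, A_f, B_f), (pi_b, A_b, B_b)), ...}  # assumed layout
```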
3127 Effects of Energy Consumption on Indoor Air Quality
Authors: M. Raatikainen, J-P. Skön, M. Johansson, K. Leiviskä, M. Kolehmainen
Abstract:
Continuous measurements and multivariate methods are applied to study the effects of energy consumption on indoor air quality (IAQ) in a Finnish one-family house. The measured data used in this study were collected continuously in a house in Kuopio, Eastern Finland, over a fourteen-month period. The consumption parameters measured were district heat, electricity and water. The indoor parameters gathered were temperature, relative humidity (RH), the concentrations of carbon dioxide (CO2) and carbon monoxide (CO), and differential air pressure. In this study, the self-organizing map (SOM) and Sammon's mapping were applied to resolve the effects of energy consumption on indoor air quality. The SOM proved a suitable method because of its ability to summarize multivariable dependencies into an easily observable two-dimensional map. In addition, Sammon's mapping was used to cluster the pre-processed data and to find similarities among the variables, expressing distances and groups in the data. The methods used were able to distinguish seven different clusters characterizing indoor air quality and energy efficiency in the study house. The results indicate that the cost implications, in euros, of heating and electricity vary according to the differential pressure, carbon dioxide concentration, temperature and season.
Keywords: Indoor air quality, Energy efficiency, Self-organizing map, Sammon's mapping
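A minimal self-organizing map sketch in NumPy, illustrating how hourly consumption and IAQ variables could be mapped onto a two-dimensional grid as described above. The grid size, learning schedule and random data are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical rows: [district heat, electricity, water, temp, RH, CO2, CO, dP]
data = rng.normal(size=(1000, 8))
data = (data - data.mean(0)) / data.std(0)           # z-score normalisation

rows, cols, dim = 10, 10, data.shape[1]
weights = rng.normal(size=(rows, cols, dim))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

for t, x in enumerate(rng.permutation(data)):
    lr = 0.5 * np.exp(-t / 1000)                      # decaying learning rate
    sigma = 3.0 * np.exp(-t / 1000)                   # decaying neighbourhood radius
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(dists.argmin(), dists.shape)        # best-matching unit
    h = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
    weights += lr * h[..., None] * (x - weights)      # pull neighbourhood toward sample

# Each measurement hour is then characterised by its best-matching unit, and
# groups of nearby units summarise the consumption/IAQ dependencies as clusters.
```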
3126 Triangular Geometric Feature for Offline Signature Verification
Authors: Zuraidasahana Zulkarnain, Mohd Shafry Mohd Rahim, Nor Anita Fairos Ismail, Mohd Azhar M. Arsad
Abstract:
The handwritten signature is widely accepted as a biometric characteristic for personal authentication. The use of appropriate features plays an important role in determining the accuracy of signature verification; therefore, this paper presents a feature based on a geometrical concept. To achieve this aim, triangle attributes are exploited to design a new feature, since a triangle possesses orientation, angles and transformation properties that can improve accuracy. The proposed feature uses a triangular geometric set comprising the sides, angles and perimeter of a triangle derived from the center of gravity of the signature image. For classification, a Euclidean classifier together with a voting-based classifier is used to verify whether a signature is a forgery. This classification process is evaluated using the triangular geometric feature and selected global features. In an experiment validated on the Grupo de Senales 960 (GPDS-960) signature database, the proposed triangular geometric feature achieves a lower Average Error Rate (AER) of 34%, compared to 43% for the selected global features. In conclusion, the proposed triangular geometric feature proves to be a more reliable feature for accurate signature verification.
Keywords: Biometrics, Euclidean classifier, feature extraction, offline signature verification, voting-based classifier
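A small sketch of the triangular-feature idea: a triangle anchored at the signature image's center of gravity, its sides, angles and perimeter collected into one vector, and a Euclidean distance used for verification. The choice of the other two triangle vertices, the reference vector and the threshold are hypothetical assumptions.

```python
import numpy as np

def centre_of_gravity(img):
    """Centroid (x, y) of a binary signature image."""
    ys, xs = np.nonzero(img)
    return np.array([xs.mean(), ys.mean()])

def triangle_feature(p1, p2, p3):
    """Sides, interior angles and perimeter of a triangle as one feature vector."""
    a = np.linalg.norm(p2 - p3)
    b = np.linalg.norm(p1 - p3)
    c = np.linalg.norm(p1 - p2)
    angles = [np.arccos(np.clip((b**2 + c**2 - a**2) / (2 * b * c), -1, 1)),
              np.arccos(np.clip((a**2 + c**2 - b**2) / (2 * a * c), -1, 1)),
              np.arccos(np.clip((a**2 + b**2 - c**2) / (2 * a * b), -1, 1))]
    return np.array([a, b, c, *angles, a + b + c])

def verify(test_img, extreme_pts, reference_vec, threshold):
    """Accept the signature if its triangular feature is Euclidean-close to the
    enrolled reference vector (extreme_pts and threshold are assumed inputs)."""
    cog = centre_of_gravity(test_img)
    feat = triangle_feature(cog, *extreme_pts)
    return np.linalg.norm(feat - reference_vec) <= threshold
```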
3125 Video-On-Demand QoE Evaluation across Different Age-Groups and Its Significance for Network Capacity
Authors: Mujtaba Roshan, John A. Schormans
Abstract:
Quality of Experience (QoE) drives churn in the broadband networks industry, and good QoE plays a large part in the retention of customers. QoE is known to be affected by the Quality of Service (QoS) factors packet loss probability (PLP), delay and delay jitter caused by the network. Earlier results have shown that the relationship between these QoS factors and QoE is non-linear and may vary from application to application. We use the network emulator Netem as the basis for experimentation and evaluate how QoE varies as we change the emulated QoS metrics. Focusing on Video-on-Demand, we discovered that the reported QoE may differ widely for users of different age groups, and that the most demanding age group (the youngest) can require an order of magnitude lower PLP than the most widely studied age group of users to achieve the same QoE. We then used a bottleneck TCP model to evaluate the capacity cost of achieving an order of magnitude decrease in PLP, and found that (almost always) a 3-fold increase in link capacity was required.
Keywords: Quality of experience, quality of service, packet loss probability, network capacity.
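A back-of-the-envelope check of the capacity result, under the common bottleneck-TCP assumption (not stated explicitly in the abstract) that achievable throughput scales roughly with 1/sqrt(PLP); holding per-user throughput while dividing PLP by ten then calls for about a sqrt(10) ≈ 3.2-fold capacity increase, consistent with the 3-fold figure reported above.

```python
from math import sqrt

# Assumed Mathis-style relation: throughput ~ MSS / (RTT * sqrt(p)).
# To keep the same per-user throughput after reducing the packet loss
# probability p by a factor k, capacity must grow by roughly sqrt(k).
plp_reduction_factor = 10            # "order of magnitude lower PLP"
capacity_scale = sqrt(plp_reduction_factor)
print(f"required capacity increase ~ {capacity_scale:.1f}x")   # ~ 3.2x
```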
3124 Normal and Peaberry Coffee Beans Classification from Green Coffee Bean Images Using Convolutional Neural Networks and Support Vector Machine
Authors: Hira Lal Gope, Hidekazu Fukai
Abstract:
The aim of this study is to develop a system which can identify and sort peaberries automatically at low cost for coffee producers in developing countries. In this paper, the focus is on the classification of peaberries and normal coffee beans using image processing and machine learning techniques. The peaberry is neither a defective nor a normal bean: it develops as a single, relatively round seed inside a coffee cherry instead of the usual flat-sided pair of beans, and it has a distinct value and flavor. To improve the taste of the coffee, peaberries and normal beans must be separated before the green coffee beans are roasted; otherwise, the flavors of the beans are mixed and the quality suffers. During roasting, all beans should be uniform in shape, size and weight; otherwise, larger beans take more time to roast through. The peaberry has a different size and shape even when it has the same weight as a normal bean, and it roasts more slowly than normal beans; therefore, sorting by size or weight alone does not reliably select peaberries. Defective beans, e.g., sour, broken, black and faded beans, are easy to check and pick out manually by hand. On the other hand, picking out peaberries is very difficult even for trained specialists because the shape and color of the peaberry are similar to those of normal beans. In this study, we use image processing and machine learning techniques to discriminate between normal and peaberry beans as part of the sorting system. As the first step, we applied a deep Convolutional Neural Network (CNN) and a Support Vector Machine (SVM) as machine learning techniques to discriminate between peaberry and normal beans. As a result, better performance was obtained with the CNN than with the SVM for the discrimination of peaberries. The network trained in this work on a high-performance CPU and GPU will then be installed on the inexpensive, computationally limited Raspberry Pi system. We assume that this system will be used in developing countries. The study evaluates and compares the feasibility of the methods in terms of classification accuracy and processing speed.
Keywords: Convolutional neural networks, coffee bean, peaberry, sorting, support vector machine.
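An illustrative comparison of the two classifiers on single-bean image crops, sketched with tf.keras and scikit-learn; the 64x64 input size, network depth and random stand-in data are assumptions, not the authors' architecture or dataset.

```python
import numpy as np
import tensorflow as tf
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Hypothetical data: 64x64 RGB crops of single beans, labels 0=normal, 1=peaberry.
x_train = np.random.rand(200, 64, 64, 3).astype("float32")
y_train = np.random.randint(0, 2, 200)
x_test = np.random.rand(50, 64, 64, 3).astype("float32")
y_test = np.random.randint(0, 2, 50)

# Small CNN (placeholder for the deeper network trained on CPU/GPU).
cnn = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
cnn.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
cnn.fit(x_train, y_train, epochs=5, verbose=0)
cnn_acc = cnn.evaluate(x_test, y_test, verbose=0)[1]

# SVM baseline trained on flattened pixel vectors.
svm = SVC(kernel="rbf").fit(x_train.reshape(len(x_train), -1), y_train)
svm_acc = accuracy_score(y_test, svm.predict(x_test.reshape(len(x_test), -1)))

print(f"CNN accuracy: {cnn_acc:.2f}, SVM accuracy: {svm_acc:.2f}")
```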
3123 A Computer Aided Detection (CAD) System for Microcalcifications in Mammograms - MammoScan μCaD
Authors: Kjersti Engan, Thor Ole Gulsrud, Karl Fredrik Fretheim, Barbro Furebotten Iversen, Liv Eriksen
Abstract:
Clusters of microcalcifications in mammograms are an important sign of breast cancer. This paper presents a complete Computer Aided Detection (CAD) scheme for automatic detection of clustered microcalcifications in digital mammograms. The proposed system, MammoScan μCaD, consists of three main steps. First, all potential microcalcifications are detected using a feature-extraction method, VarMet, and adaptive thresholding. This will also give a number of false detections. The goal of the second step, Classifier level 1, is to remove everything but microcalcifications. The last step, Classifier level 2, uses learned dictionaries and sparse representations as a texture classification technique to distinguish single, benign microcalcifications from clustered microcalcifications, in addition to removing some remaining false detections. The system is trained and tested on real digital data from Stavanger University Hospital, and the results are evaluated by radiologists. The overall results are promising, with a sensitivity > 90% and a low false detection rate (approximately 1 unwanted detection per image, or 0.3 false detections per image).
Keywords: Mammogram, microcalcifications, detection, CAD, MammoScan μCaD, VarMet, dictionary learning, texture, FTCM, classification, adaptive thresholding
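A rough sketch of the adaptive-thresholding step used for candidate detection: a pixel is flagged as a potential microcalcification when it exceeds its local mean by a multiple of the local standard deviation. The window size and factor k are assumed tuning values; the VarMet feature extraction and the two classifier levels are not reproduced here.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_threshold(img, window=31, k=3.0):
    """Flag locally bright pixels: value > local mean + k * local std."""
    img = img.astype(np.float64)
    mean = uniform_filter(img, size=window)
    sq_mean = uniform_filter(img ** 2, size=window)
    std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0.0))
    return img > mean + k * std            # boolean mask of candidate pixels

# Usage sketch (hypothetical mammogram array):
# candidates = adaptive_threshold(mammogram, window=31, k=3.0)
```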
3122 The Role of Cognitive Decision Effort in Electronic Commerce Recommendation System
Authors: Cheng-Che Tsai, Huang-Ming Chuang
Abstract:
The purpose of this paper is to explore the role of cognitive decision effort in recommendation systems, combining the indicators "information quality" and "service quality" from the IS success model to examine users' perception of recommendation system performance. A total of 411 internet users answered a questionnaire assessing their intention to use and satisfaction with the recommendation system of an internet bookstore. The quantitative analysis indicates the following results. First, the information quality of the recommendation system has a clear influence on the consumer's shopping decision-making process and on the attitude toward using the system. Second, during the consumer's shopping decision-making process, the recommendation system has no significant effect in lowering the cognitive decision effort consumers expend. Third, it is necessary for an e-commerce platform to provide recommendations and information, but the quality of that information with respect to user needs must be considered, or the platform will be replaced by competitors offering homogeneous services.
Keywords: Recommender system, Cognitive decision-making efforts, IS success model, Internet bookstore.
3121 Process Capability Analysis by Using Statistical Process Control of Rice Polished Cylinder Turning Practice
Authors: S. Bangphan, P. Bangphan, T. Boonkang
Abstract:
Quality control helps industries improve product quality and productivity. Statistical Process Control (SPC) is one of the tools used to control product quality; here it is applied to bring a turning process in a department of industrial engineering under control. In this research, process control is applied to turning performed on workshop machines. Varying measurements were recorded for a number of samples of a rice polished cylinder obtained from a number of trials of the turning practice. By adopting the SPC technique, the process is finally brought under control and its process capability is improved.
Keywords: Rice polished cylinder, statistical process control, control charts, process capability.
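A minimal numerical sketch of the X-bar/R control limits and the Cp/Cpk process-capability indices behind such an analysis. The subgroup data and specification limits are illustrative assumptions; the A2 and d2 values are the standard SPC constants for a subgroup size of 5.

```python
import numpy as np

# Hypothetical diameter measurements: 20 subgroups of 5 turned cylinders (mm).
rng = np.random.default_rng(1)
samples = rng.normal(loc=25.00, scale=0.02, size=(20, 5))
USL, LSL = 25.06, 24.94             # assumed specification limits

xbar = samples.mean(axis=1)                       # subgroup means
R = samples.max(axis=1) - samples.min(axis=1)     # subgroup ranges
xbarbar, Rbar = xbar.mean(), R.mean()

A2, d2 = 0.577, 2.326               # standard constants for subgroup size n = 5
UCL_x, LCL_x = xbarbar + A2 * Rbar, xbarbar - A2 * Rbar   # X-bar chart limits

sigma = Rbar / d2                   # within-subgroup standard deviation estimate
Cp = (USL - LSL) / (6 * sigma)
Cpk = min(USL - xbarbar, xbarbar - LSL) / (3 * sigma)
print(f"X-bar limits: [{LCL_x:.3f}, {UCL_x:.3f}]  Cp={Cp:.2f}  Cpk={Cpk:.2f}")
```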
3120 Microbiological Quality and Safety of Meatball Sold in Payakumbuh City, West Sumatra, Indonesia
Authors: Ferawati, H. Purwanto, Y. F. Kurnia, E. Purwati
Abstract:
The aim of this study was to evaluate the microbiological quality and safety of meatballs obtained from five different manufacturers around Payakumbuh City, West Sumatra, Indonesia. Microbiological analysis of the meatball samples gave aerobic plate counts ranging from 7 log CFU/g to 8.623 log CFU/g. Total coliform counts ranged from 1.041 to 3.380 log Most Probable Number (MPN)/g. Chemical analysis of the meatball samples covered borax and formalin content; qualitative tests detected neither borax nor formalin in any of the samples. Nevertheless, it remains essential to emphasize effective hygiene practices as an important safety measure in consumer education programmes.
Keywords: Borax, formalin, meatball, microbiological quality.
3119 A Proposal of Community based Facility Management Performance (CbFM) in the Education System of Batubara District in Indonesia
Authors: Amilia Hasbullah, Wan Zahari Wan Yussof, Maziah Ismail
Abstract:
The primary education system in Indonesia involves the community, recognized as the school committee, in the process of achieving educational quality through school facility performance. The low level of school committee involvement in the education system has become an issue in the development of education and is reflected in the quality of education. This paper discusses the conceptual framework and methodology for assessing the performance of school committees within the management of school facilities in the Batubara district of Indonesia. The concepts of Community-based Facility Management (CbFM) and Logometrix are used as a basis to measure school committee performance in order to address the needs of quality school management. The data will be taken from questionnaires distributed to those who work in and use school facilities across seven sub-districts of Batubara, Indonesia. The results of this study are expected to provide a guide for evaluating the performance of existing school committees in improving the quality of education in Indonesia.
Keywords: Community-based facility management, school facility management, school committee performance.
3118 Psyllium (Plantago) Gum as an Effective Edible Coating to Improve Quality and Shelf Life of Fresh-cut Papaya (Carica papaya)
Authors: Basharat Yousuf, Abhaya K. Srivastava
Abstract:
Psyllium gum, alone and in combination with sunflower oil, was investigated as a possible alternative edible coating for improving the quality and shelf life of fresh-cut papaya. Psyllium gum concentrations of 0.5, 1 and 1.5 percent were used to coat fresh-cut papaya. In some samples, refined sunflower oil was used as a lipid component to increase the effectiveness of the coating in terms of water barrier properties. Soya lecithin was used as an emulsifier in the coatings containing oil. A pretreatment with 1% calcium chloride was applied to maintain the firmness of the fresh-cut papaya cubes. The 1% psyllium gum coating was found to yield better results. Further, the addition of oil helped maintain quality and acted as a barrier to water vapour, thereby minimizing weight loss.
Keywords: Coating, fresh-cut, gum, papaya, psyllium.
3117 On the Computation of a Common n-finger Robotic Grasp for a Set of Objects
Authors: Avishai Sintov, Roland Menassa, Amir Shapiro
Abstract:
Industrial robotic arms utilize multiple end-effectors, each for a specific part and a specific task. We propose a novel algorithm which defines a single end-effector configuration able to grasp a given set of objects with different geometries. The algorithm will be of great benefit in production lines, allowing a single robot to grasp various parts and hence reducing the number of end-effectors needed. Moreover, the algorithm will reduce end-effector design and manufacturing time and final product cost. The algorithm searches for a common grasp over the set of objects. The search algorithm maps all possible grasps for each object which satisfy a quality criterion and takes into account possible external wrenches (forces and torques) applied to the object. The mapped grasps are represented by high-dimensional feature vectors which describe the shape of the gripper. We generate a database of all possible grasps for each object in the feature space. Then we use a search and classification algorithm to intersect all possible grasps over all parts and find a single common grasp suitable for all objects. We present simulations of planar and spatial objects to validate the feasibility of the approach.
Keywords: Common Grasping, Search Algorithm, Robotic End-Effector.
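A schematic sketch of the intersection step: with each object's feasible grasps already mapped to feature vectors, a candidate counts as "common" if every other object has a grasp within some tolerance of it in feature space. The tolerance, feature dimension and random feature sets are placeholders for the quality-filtered grasp databases described above, not the authors' algorithm.

```python
import numpy as np
from scipy.spatial import cKDTree

def common_grasps(grasp_sets, tol=0.5):
    """Return grasps from the first object's set that lie within `tol`
    (Euclidean, in gripper-shape feature space) of a grasp of every other object."""
    trees = [cKDTree(g) for g in grasp_sets[1:]]
    keep = []
    for grasp in grasp_sets[0]:
        if all(tree.query(grasp)[0] <= tol for tree in trees):
            keep.append(grasp)
    return np.array(keep)

# Hypothetical example: three objects, 500 feasible grasps each, 6-D feature vectors.
rng = np.random.default_rng(0)
sets = [rng.random((500, 6)) for _ in range(3)]
print(len(common_grasps(sets, tol=0.5)), "candidate common grasps")
```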
3116 Effect of Marginal Quality Groundwater on Yield of Cotton Crop and Soil Salinity Status
Authors: Qureshi, A. L., Mahessar A. A., Dashti, R. K., Yasin S. M.
Abstract:
In this paper, the effect of marginal quality groundwater on cotton crop yield and soil salinity was studied. For this purpose, three irrigation treatments, each with four replications, were applied. These treatments were i) use of canal water (T1), ii) use of marginal quality groundwater from a tubewell (T2), and iii) conjunctive use, mixing canal water and marginal quality tubewell water at a 1:1 ratio (T3). Water was applied to the crop cultivated in the Kharif season of 2011; its quantity was measured using a cut-throat flume. A total of 11 waterings, each of 50 mm depth, were applied from 20 April to 20 July 2011; further irrigation was stopped up to crop harvesting because of monsoon rainfall. The maximum crop yield (seed cotton) was observed under T1 at 1,517 kg/ha, followed by T3 (mixed canal and tubewell water) at 1,009 kg/ha and T2 (marginal quality groundwater) at 709 kg/ha. Thus, crop yield in T2 and T3, compared with T1, was reduced by about 53% and 30%, respectively. The cotton yield was below its potential limit for all three treatments because unexpected rainfall during the full flowering season adversely affected the yield. However, salt deposition in the soil profiles was not observed, owing to the leaching effect of the heavy rainfall that occurred during the monsoon season.
Keywords: Conjunctive Use, Cotton Crop, Groundwater, Soil Salinity Status, Water Use Efficiency (WUE).
3115 Face Authentication for Access Control based on SVM using Class Characteristics
Authors: SeHun Lim, Sanghoon Kim, Sun-Tae Chung, Seongwon Cho
Abstract:
Face authentication for access control is a form of face membership authentication which admits the person presenting the incoming face if he or she turns out to be an enrolled person, based on face recognition, and rejects the person otherwise. Face membership authentication is a two-class classification problem to which the SVM (Support Vector Machine) has been successfully applied, showing better performance than conventional threshold-based classification. However, most previous SVMs have been trained using image feature vectors extracted from face images of each class member (enrolled class/unenrolled class), so they are not robust to variations in illumination, pose and facial expression and are much affected by changes in the member configuration of the enrolled class. In this paper, we propose an effective face membership authentication method based on SVM using class-discriminating features which represent an incoming face image's associability with each class distinctively. These class-discriminating features are only weakly related to image features, so they are less affected by variations in illumination, pose and facial expression. Through experiments, it is shown that the proposed face membership authentication method performs better than the threshold rule-based or the conventional SVM-based authentication methods and is relatively less affected by changes in member size and membership.
Keywords: Face authentication, access control, membership authentication, SVM.
3114 Developing and Implementing Successful Key Performance Indicators
Authors: Marie Mikušová, Viktorie Janečková
Abstract:
Measurement and the subsequent evaluation of performance represent an important part of management. The paper focuses on indicators as the basic elements of a performance measurement system. It emphasizes the need to identify requirements for quality indicators so that they can become part of a useful system. It introduces standpoints for systematically classifying indicators so that they provide the highest possible informative value as background sources for searching, analyzing, designing and using indicators. It draws attention to requirements for indicator quality and at the same time deals with some dangers that decrease an indicator's informative value. It submits a draft set of questions that should be answered when constructing an indicator. It is obvious that particular indicators need to be defined exactly to stimulate the desired behavior in order to attain expected results. In the enclosure, a concrete example of an indicator defined under the specific conditions of a small firm is given. The authors draw attention to the fact that a quality indicator makes it possible to get to the basic causes of a problem and to include the established facts in the company information system. At the same time, they emphasize that developing a quality indicator is a prerequisite for using the measurement system in management.
Keywords: Performance, measurement, firm, indicator
3113 Quality Based Approach for Efficient Biologics Manufacturing
Authors: Takashi Kaminagayoshi, Shigeyuki Haruyama
Abstract:
To improve the manufacturing efficiency of biologics, such as antibody drugs, a quality engineering framework was designed. Within this framework, critical steps and parameters in the manufacturing process were studied. Identification of these critical steps and critical parameters allows a deeper understanding of manufacturing capabilities and lets process control standards based on actual manufacturing capabilities be suggested to the process development department as part of a PDCA (plan-do-check-act) cycle. This cycle can be applied to each manufacturing process so that it can be standardized, reducing the time needed to establish each new process.
Keywords: Antibody drugs, biologics, manufacturing efficiency, PDCA cycle, quality engineering.
3112 Analytical Authentication of Butter Using Fourier Transform Infrared Spectroscopy Coupled with Chemometrics
Authors: M. Bodner, M. Scampicchio
Abstract:
Fourier Transform Infrared (FT-IR) spectroscopy coupled with chemometrics was used to distinguish between butter samples and non-butter samples. Further, quantification of the margarine content in adulterated butter samples was investigated. The fingerprint region (1400-800 cm⁻¹) was used to develop unsupervised pattern recognition (Principal Component Analysis, PCA), supervised modeling (Soft Independent Modelling by Class Analogy, SIMCA), classification (Partial Least Squares Discriminant Analysis, PLS-DA) and regression (Partial Least Squares Regression, PLS-R) models. PCA of the fingerprint region shows a clustering of the two sample types. All samples were classified into their rightful class by the SIMCA approach; however, nine adulterated samples (between 1% and 30% w/w of margarine) were classified as belonging to both the butter class and the non-butter one. In the two-class PLS-DA model (R² = 0.73; RMSEP, Root Mean Square Error of Prediction = 0.26% w/w), sensitivity was 71.4% and Positive Predictive Value (PPV) 100%. Its threshold was calculated at 7% w/w of margarine in adulterated butter samples. Finally, a PLS-R model (R² = 0.84, RMSEP = 16.54%) was developed. PLS-DA was a suitable classification tool and PLS-R a proper quantification approach. The results demonstrate that FT-IR spectroscopy combined with PLS-R can be used as a rapid, simple and safe method to distinguish pure butter samples from adulterated ones and to determine the degree of margarine adulteration in butter samples.
Keywords: Adulterated butter, margarine, PCA, PLS-DA, PLS-R, SIMCA.
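A compact scikit-learn sketch of the chemometric pipeline outlined above: PCA for exploration, PLS regression for quantifying margarine content, and PLS-DA implemented as PLS regression on 0/1 class labels. The simulated spectra stand in for the 1400-800 cm⁻¹ fingerprint region; component counts and thresholds are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Hypothetical FT-IR fingerprint spectra (rows = samples, cols = wavenumbers).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 300))
y_margarine = np.concatenate([np.zeros(30), rng.uniform(1, 30, size=30)])  # % w/w
y_class = (y_margarine > 0).astype(float)          # 1 = adulterated, 0 = pure butter

scores = PCA(n_components=2).fit_transform(X)      # unsupervised clustering view

X_tr, X_te, yq_tr, yq_te, yc_tr, yc_te = train_test_split(
    X, y_margarine, y_class, test_size=0.3, random_state=0)

plsr = PLSRegression(n_components=5).fit(X_tr, yq_tr)        # PLS-R quantification
rmsep = mean_squared_error(yq_te, plsr.predict(X_te)) ** 0.5

plsda = PLSRegression(n_components=5).fit(X_tr, yc_tr)       # PLS-DA classification
pred_class = (plsda.predict(X_te).ravel() > 0.5).astype(float)

print(f"RMSEP: {rmsep:.2f} % w/w, PLS-DA accuracy: {(pred_class == yc_te).mean():.2f}")
```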
3111 Automatic Product Identification Based on Deep-Learning Theory in an Assembly Line
Authors: Fidel Lòpez Saca, Carlos Avilés-Cruz, Miguel Magos-Rivera, José Antonio Lara-Chávez
Abstract:
Automated object recognition and identification systems are widely used throughout the world, particularly in assembly lines, where they perform quality control and automatic part selection tasks. This article presents the design and implementation of an object recognition system in an assembly line. The proposed shape-and-color recognition system is based on deep learning theory in a specially designed convolutional network architecture. The methodology involves stages such as image capture, color filtering, location of object mass centers, determination of horizontal and vertical object boundaries, and object clipping. Once the objects are cut out, they are sent to a convolutional neural network, which automatically identifies the type of figure. The identification system works in real time. The implementation was done on a Raspberry Pi 3 system and on a Jetson Nano device. The proposal is used in an assembly course of a bachelor's degree in industrial engineering. The results presented include a study of recognition efficiency and processing time.
Keywords: Deep learning, image classification, image identification, industrial engineering.
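A hedged OpenCV/tf.keras sketch of the pipeline stages named above (color filtering, mass-center location, boundary detection, clipping, then CNN identification). The HSV color range, crop size, model file and OpenCV 4.x contour API are placeholder assumptions, not the authors' implementation.

```python
import cv2
import numpy as np
import tensorflow as tf

def locate_and_classify(frame_bgr, model, lower_hsv, upper_hsv, size=(64, 64)):
    """Filter by color, find each object's mass center and bounding box,
    clip it out and let a CNN identify the figure type."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)             # color filtering
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    results = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] == 0:
            continue
        cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])  # mass center
        x, y, w, h = cv2.boundingRect(c)                      # object boundaries
        crop = cv2.resize(frame_bgr[y:y + h, x:x + w], size)  # object clipping
        probs = model.predict(crop[np.newaxis] / 255.0, verbose=0)[0]
        results.append(((cx, cy), int(np.argmax(probs))))     # center + class id
    return results

# Usage sketch: model = tf.keras.models.load_model("shapes_cnn.h5")  # assumed file
# detections = locate_and_classify(frame, model, (0, 80, 80), (10, 255, 255))
```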
3110 The Impact of Temporal Impairment on Quality of Experience (QoE) in Video Streaming: A No Reference (NR) Subjective and Objective Study
Authors: Muhammad Arslan Usman, Muhammad Rehan Usman, Soo Young Shin
Abstract:
Live video streaming is one of the most widely used services among end users, yet it is a big challenge for network operators in terms of quality. The only way to provide excellent Quality of Experience (QoE) to end users is continuous monitoring of live video streaming. For this purpose, several objective algorithms are available that monitor the quality of the video in a live stream. Subjective tests play a very important role in fine-tuning the results of objective algorithms. As human perception is considered to be the most reliable source for assessing the quality of a video stream, subjective tests are conducted in order to develop more reliable objective algorithms. Temporal impairments in a live video stream can have a negative impact on end users. In this paper, we have conducted subjective evaluation tests on a set of video sequences containing the temporal impairment known as frame freezing. Frame freezing is considered both a transmission error and a hardware error, which can result in the loss of video frames on the receiving side of a transmission system. Our subjective tests cover videos that contain a single freezing event as well as videos that contain multiple freezing events. We recorded our subjective test results for all the videos in order to compare the available No Reference (NR) objective algorithms. Finally, we show the performance of the NR algorithms used for objective evaluation of the videos and suggest the algorithm that works better. The outcome of this study shows the importance of QoE and its effect on human perception, and the subjective evaluation results can serve to validate objective algorithms.
Keywords: Objective evaluation, subjective evaluation, quality of experience (QoE), video quality assessment (VQA).
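A simple sketch of how the temporal impairment studied here, frame freezing, can be detected without a reference: consecutive frames that are nearly identical are counted as part of a freeze event. The difference threshold and minimum freeze length are assumed tuning values, not those of the evaluated NR algorithms.

```python
import cv2
import numpy as np

def detect_freezes(video_path, diff_thresh=0.5, min_frames=3):
    """Return (start_frame, length) pairs for runs of near-identical frames."""
    cap = cv2.VideoCapture(video_path)
    prev, run_start, run_len, events, idx = None, 0, 0, [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            mean_abs_diff = float(np.mean(cv2.absdiff(gray, prev)))
            if mean_abs_diff < diff_thresh:        # repeated frame -> freezing
                if run_len == 0:
                    run_start = idx
                run_len += 1
            else:
                if run_len >= min_frames:
                    events.append((run_start, run_len))
                run_len = 0
        prev, idx = gray, idx + 1
    if run_len >= min_frames:
        events.append((run_start, run_len))
    cap.release()
    return events
```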
3109 The Recreation Technique Model from the Perspective of Environmental Quality Elements
Authors: G. Gradinaru, S. Olteanu
Abstract:
Improvements in the quality of environmental elements can increase the recreational opportunities in a certain area (destination). The need-for-recreation technique focuses on choosing certain destinations for recreational purposes. The basic exchange taken into consideration is the one between the satisfaction gained after staying in the area and the value expressed in the money and time allocated. The number of tourists in the respective area, the duration of stay and the money spent, including transportation, provide information on how individuals rank the place or certain aspects of the area (such as the quality of the environmental elements). For the statistical analysis of the environmental benefits offered by an area through the need-for-recreation technique, the following stages are suggested: (1) characterization of the reference area based on the statistical variables considered; (2) estimation of the environmental benefit by comparing the reference area with other similar areas (having the same environmental characteristics) from the perspective of the statistical variables considered. The comparison model in the recreation technique faces a series of difficulties concerning the choice of the reference area and the correct conversion of time into money.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 10893108 Growth of Non-Polar a-Plane AlGaN Epilayer with High Crystalline Quality and Smooth Surface Morphology
Authors: Abbas Nasir, Xiong Zhang, Sohail Ahmad, Yiping Cui
Abstract:
Non-polar a-plane AlGaN epilayers of high structural quality have been grown on an r-plane sapphire substrate by metalorganic chemical vapor deposition (MOCVD). A graded non-polar AlGaN buffer layer with variable aluminium concentration was used to improve the structural quality of the non-polar a-plane AlGaN epilayer. The characterisations were carried out by high-resolution X-ray diffraction (HR-XRD), atomic force microscopy (AFM) and Hall effect measurements. The XRD and AFM results demonstrate that the Al-composition-graded non-polar AlGaN buffer layer significantly improved the crystalline quality and the surface morphology of the top layer. A low root mean square roughness of 1.52 nm is obtained from AFM, and a relatively low background carrier concentration down to 3.9× cm-3 is obtained from the Hall effect measurement.
Keywords: Non-polar AlGaN epilayer, Al composition-graded AlGaN layer, root mean square, background carrier concentration.
3107 Kano’s Model for Clinical Laboratory
Authors: Khaled N. El-Hashmi, Omar K. Gnieber
Abstract:
The clinical laboratory has received considerable recognition globally due to the rapid development of advanced technology, economic demands and its role in a patient’s treatment cycle. Although various cross-domain experiments and practices with respect to clinical laboratory projects are in full swing, customer needs are still ambiguous and debatable. The purpose of this study is to apply Kano’s model and the customer satisfaction matrix to categorize service quality attributes in order to see how well these attributes are able to satisfy customer needs. The results reveal that ten of the 26 service quality attributes have the greatest impact on increasing customer satisfaction and should be considered first.
Keywords: Clinical laboratory, Customer satisfaction matrix, Kano’s Model, Quality Attributes, Voice of Customer.
3106 Using Business Intelligence Capabilities to Improve the Quality of Decision-Making: A Case Study of Mellat Bank
Authors: Jalal Haghighat Monfared, Zahra Akbari
Abstract:
Today, business executives need useful information to make better decisions. Banks have also been using information tools so that they can direct the decision-making process in order to achieve their desired goals, by rapidly extracting information from sources with the help of business intelligence. This research investigates whether there is a relationship between the quality of decision making and the business intelligence capabilities of Mellat Bank. Each of the factors studied is divided into several components, and these and their relationships are measured by a questionnaire. The statistical population of this study consists of all managers and experts of Mellat Bank's General Departments (190 people) who use business intelligence reports. The sample size of 123 was determined randomly by statistical method. Relevant statistical inference was used for data analysis and hypothesis testing. In the first stage, the normality of the data was investigated using the Kolmogorov-Smirnov test, and in the next stage the construct validity of both variables and their resulting indexes was verified using confirmatory factor analysis. Finally, the research hypotheses were tested using structural equation modeling and Pearson's correlation coefficient. The results confirmed the existence of a positive relationship between decision quality and business intelligence capabilities in Mellat Bank. Among the various capabilities, including data quality, correlation with other systems, user access, flexibility and risk management support, the flexibility of the business intelligence system was the most strongly correlated with the dependent variable of the present research. This shows that Mellat Bank needs to pay more attention to choosing business intelligence systems with high flexibility in terms of the ability to produce custom-formatted reports. The quality of data in the business intelligence systems showed the next strongest relationship with the quality of decision making. Therefore, improving data quality, including the internal or external source of the data, the quantitative or qualitative type of the data, the credibility of the data and the perceptions of those who use the business intelligence system, improves the quality of decision making in Mellat Bank.
Keywords: Business intelligence, business intelligence capability, decision making, decision quality.
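A small SciPy sketch of the two statistical steps named above, a normality check followed by a correlation test; the questionnaire composite scores are simulated placeholders for the survey data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical composite scores from 123 respondents (Likert-scale averages).
bi_capability = rng.normal(3.8, 0.6, 123)
decision_quality = 0.6 * bi_capability + rng.normal(0, 0.5, 123)

# Kolmogorov-Smirnov test of the standardized scores against a normal distribution.
z = (decision_quality - decision_quality.mean()) / decision_quality.std(ddof=1)
ks_stat, ks_p = stats.kstest(z, "norm")

# Pearson correlation between BI capability and decision quality.
r, p = stats.pearsonr(bi_capability, decision_quality)
print(f"KS p={ks_p:.3f}, Pearson r={r:.2f} (p={p:.3g})")
```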
3105 Knowledge Management in Academic: A Perspective of Academic Research Contribution to Economic Development of a Nation
Authors: Hilary J. Watsilla, Narasimha R. Vajjhala
Abstract:
Information and Communication Technology (ICT) has made information access easier and more affordable. Academic research has also benefited from this, with online journals and academic resources readily available at the click of a button. However, there are limited ways of assessing and controlling the quality of academic research, mostly in public institutions. Nigeria is the most populous country in Africa, with a significant number of universities and a young population. The quality of the knowledge created by academic researchers, however, needs to be evaluated due to the high number of predatory journals in which academics publish. The purpose of this qualitative study is to look at the knowledge creation, acquisition and assimilation process of academic researchers in public universities in Nigeria. Qualitative research will be carried out using in-depth interviews and observations. Academic researchers will be interviewed, and absorptive capacity theory will be used as the theoretical framework to guide the research. The findings from this study should help in understanding the impact of ICT on the knowledge creation process in academic research and how ICT can affect the quality of the knowledge produced by researchers. The findings should also add value to the existing body of knowledge on the quality of academic research, especially in Africa, where the availability of quality academic research is limited. As this study is limited to Nigerian universities, the outcome may not generalize to other developing countries.
Keywords: Knowledge creation, academic research, knowledge management, information and communication technology, research, university.
3104 Ventilation Efficiency in the Subway Environment for the Indoor Air Quality
Authors: Kyung Jin Ryu, Makhsuda Juraeva, Sang-Hyun Jeong and Dong Joo Song
Abstract:
Clean air in subway stations is important to passengers. Platform Screen Doors (PSDs) can improve the indoor air quality in a subway station; however, the air quality in the subway tunnel is degraded. The subway tunnel has high CO2 concentrations and particulate matter (PM) levels, and the Indoor Air Quality (IAQ) in the subway environment degrades as the frequency of train operation and the number of trains increase. The ventilation systems of the subway tunnel need improvement to provide better air quality. Numerical analyses might be effective tools to analyze the performance of subway twin-track tunnel ventilation systems. An existing subway twin-track tunnel in the metropolitan Seoul subway system is chosen for the numerical simulations. The ANSYS CFX software is used for unsteady computations of the airflow inside the twin-track tunnel when a train moves. The airflow inside the tunnel is simulated for one train running and for two trains running at the same time in the tunnel. The piston effect inside the tunnel is analyzed when all shafts function as natural ventilation shafts. The air supplied through the shafts mixes with the polluted air in the tunnel, and the polluted air is exhausted by the mechanical ventilation shafts. The supplied and discharged air are balanced when only one train runs in the twin-track tunnel. The level of polluted air in the tunnel is high when two trains run simultaneously in opposite directions and all shafts function as natural shafts, i.e. when there is no electrical power supplied to the shafts. The polluted air remaining inside the tunnel enters the station platform when the doors are opened.
Keywords: indoor air quality, subway twin-track tunnel, train-induced wind
3103 Effect of Bio-Nitrogen as a Partial Alternative to Mineral-Nitrogen Fertiliser on Growth, Nitrate and Nitrite Contents, and Yield Quality in Brassica oleracea L.
Authors: Saad M. Howladar, Mostafa M. Rady, Ashraf Sh. Osman
Abstract:
The effects of bio-nitrogen fertilizer (bio-N), as a partial alternative to mineral-nitrogen fertilizer (mineral-N), on the growth, yield and yield quality of broccoli plants were investigated. Bio-N was applied at 1, 2 or 3 doses in combination with 65% of the recommended dose of mineral-N (bio-N1, bio-N2 or bio-N3 + ⅔ mineral-N), while 100% of the recommended dose of mineral-N was applied as a control. Significant positive influences of the bio-N3 + ⅔ mineral-N treatment were observed on growth traits, leaf contents of nitrogen, phosphorus, potassium, nitrate and nitrite, and yield quality when compared to the other two combined treatments. In contrast, there were no significant differences in these parameters between the bio-N3 + ⅔ mineral-N and the control treatments, except for the leaf contents of nitrate and nitrite, which were lower in the bio-N3 + ⅔ mineral-N treatment than in the control. Therefore, we recommend using bio-N as a partial alternative to mineral-N for healthy nutrition.
Keywords: Bio-fertilization, broccoli, growth, nitrate, nitrite, yield quality.
3102 Accreditation and Quality Assurance of Nigerian Universities: The Management Imperative
Authors: F. O Anugom
Abstract:
The general functions of the university include, among other things, teaching, research and community service. Universities are recognized as the apex of learning, accumulating and imparting knowledge and skills of all kinds to students to enable them to be productive, earn their living and make optimum contributions to national development. This is equivalent to the production of human capital in the form of the high-level manpower needed to administer the education system, be useful to society and manage the economy. Quality has become a matter of major importance for university education in Nigeria. Accreditation is the systematic review of educational programmes to ensure that acceptable standards of education, scholarship and infrastructure are being maintained. Accreditation ensures that institutions maintain quality. The process is designed to determine whether or not an institution has met or exceeded the published standards for accreditation, and whether it is achieving its mission and stated purposes. Ensuring quality assurance in the accreditation process falls to university management, which justifies the need for this study. This study examined accreditation and quality assurance as a management imperative. Three research questions and three hypotheses guided the study. The design was a correlation survey with a population of 2,893 university administrators, out of which 578 heads of department and deans of faculties were sampled. The instrument for data collection, the Programme Accreditation Exercise scale, showed high reliability. The research questions were answered with Pearson's r statistics, and t-test statistics were used to test the hypotheses. It was found, among other things, that the quality of accredited programmes depends on the level of funding of universities in Nigeria. It was also indicated that the quality of programme accreditation and the physical facilities of universities in Nigeria are highly related, and that programme accreditation is positively related to staffing in Nigerian universities. Based on the findings of the study, the researcher recommends that academic administrators be included in the team of those who ensure quality programmes in the universities, that private sector partnerships be encouraged to fund programmes to ensure programme quality in the universities, and that independent agencies be engaged to monitor the activities of accreditation teams to avoid bias.
Keywords: Accreditation, quality assurance, NUC, physical facilities, staffing.
3101 ECG-Based Heartbeat Classification Using Convolutional Neural Networks
Authors: Jacqueline R. T. Alipo-on, Francesca I. F. Escobar, Myles J. T. Tan, Hezerul Abdul Karim, Nouar AlDahoul
Abstract:
Electrocardiogram (ECG) signal analysis and processing are crucial in the diagnosis of cardiovascular diseases, which are considered one of the leading causes of mortality worldwide. However, the traditional rule-based analysis of large volumes of ECG data is time-consuming, labor-intensive and prone to human error. With the advancement of computational methods, machine learning algorithms have been increasingly used to analyze ECG signals. In this paper, various deep learning algorithms were adapted to classify five classes of heartbeat types. The dataset used in this work is the synthetic MIT-Beth Israel Hospital (MIT-BIH) Arrhythmia dataset produced from generative adversarial networks (GANs). Various deep learning models such as the ResNet-50 convolutional neural network (CNN), a 1-D CNN and long short-term memory (LSTM) were evaluated and compared. ResNet-50 was found to outperform the other models in terms of recall and F1 score, with five-fold average scores of 98.88% and 98.87%, respectively. The 1-D CNN, on the other hand, was found to have the highest average precision, at 98.93%.
Keywords: Heartbeat classification, convolutional neural network, electrocardiogram signals, ECG signals, generative adversarial networks, long short-term memory, LSTM, ResNet-50.
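A minimal tf.keras 1-D CNN sketch for five-class heartbeat classification on fixed-length beat segments. The 187-sample segment length follows a common MIT-BIH beat-segment format, and the layer sizes and random stand-in data are illustrative assumptions, not the architectures evaluated in the paper.

```python
import numpy as np
import tensorflow as tf

# Hypothetical beat segments: 1000 beats x 187 samples, labels in {0..4}.
x = np.random.rand(1000, 187, 1).astype("float32")
y = np.random.randint(0, 5, 1000)

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(32, 5, activation="relu", input_shape=(187, 1)),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(64, 5, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(5, activation="softmax"),   # five heartbeat classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=3, validation_split=0.2, verbose=0)
```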
3100 Masquerade and “What Comes Behind Six Is More Than Seven”: Thoughts on Art History and Visual Culture Research Methods
Authors: Osa D Egonwa
Abstract:
In the 21st century, the disciplinary boundaries of past centuries, which we often create through mainstream art-historical classification, techniques and sources, may have been eroded by visual culture, which seems to provide a more inclusive umbrella for the new ways artists go about the creative process and its resultant commodities. Over the past four decades, artists in Africa have resorted to new materials, techniques and themes, which has affected how we research these artists and their art. Frontline artists such as El Anatsui, Yinka Shonibare and Erasmus Onyishi are demonstrating that any material is suitable for artistic expression. Most of the time, these materials come with their own techniques/effects and visual syntax: a combination of materials compounds techniques, formal aesthetic indexes, halo effects and iconography. This tends to challenge the categories we lean on to view, think and talk about them, rendering our mainstream art-historical research methods inadequate and suggesting new discursive concepts, terms and theories. This paper proposes Africanist eclectic methods derived from the dual framework of Masquerade Theory and What Comes Behind Six is More Than Seven. It shares thoughts and research on art-historical methods, terminological re-alignments on classification/source data, presentational format and interpretation arising from the emergent trends in our subject. The outcome provides useful tools to mediate new thoughts and experiences in recent African art and visual culture.
Keywords: Art Historical Methods, Classifications, Concepts, Re-alignment.
3099 A Comparative Analysis of Machine Learning Techniques for PM10 Forecasting in Vilnius
Authors: M. A. S. Fahim, J. Sužiedelytė Visockienė
Abstract:
Concern over air pollution (AP) has gained more prominence than ever before. Public awareness has increased, and those with the relevant knowledge have a duty to disseminate it; this realization often comes with an understanding of how poor air quality indices (AQI) damage human health. The study focuses on assessing air pollution prediction models specifically for Lithuania, addressing a substantial need for empirical research within the region. Concentrating on Vilnius, it specifically examines concentrations of particulate matter 10 micrometers or less in diameter (PM10). Utilizing Gaussian Process Regression (GPR), Regression Tree Ensemble and Regression Tree methodologies, predictive forecasting models are validated and tested using hourly data from January 2020 to December 2022. The study explores the classification of AP data into anthropogenic and natural sources, the impact of AP on human health, and its connection to cardiovascular diseases. The study revealed varying levels of accuracy among the models, with GPR achieving the highest accuracy, indicated by an RMSE of 4.14 in validation and 3.89 in testing.
Keywords: Air pollution, anthropogenic and natural sources, machine learning, Gaussian process regression, tree ensemble, forecasting models, particulate matter.
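A scikit-learn sketch of a Gaussian Process Regression forecaster of the kind compared in the study, trained on lagged hourly PM10 values and scored with RMSE. The lag features, kernel, subsampling and synthetic series are assumptions; the paper's own models were built with different data and tooling.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import mean_squared_error

# Hypothetical hourly PM10 series; real data would come from Vilnius monitoring stations.
rng = np.random.default_rng(0)
t = np.arange(2000)
pm10 = 20 + 8 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 3, t.size)

# Use the previous 24 hourly values to predict the next hour.
lags = 24
X = np.stack([pm10[i:i + lags] for i in range(len(pm10) - lags)])
y = pm10[lags:]
split = int(0.8 * len(X))
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X_tr[:500], y_tr[:500])      # subsample: GPR cost grows cubically with samples
rmse = mean_squared_error(y_te, gpr.predict(X_te)) ** 0.5
print(f"test RMSE: {rmse:.2f}")
```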