Search results for: image process
15902 Automatic Target Recognition in SAR Images Based on Sparse Representation Technique
Authors: Ahmet Karagoz, Irfan Karagoz
Abstract:
Synthetic Aperture Radar (SAR) is a radar imaging technique that can be integrated into manned and unmanned aerial vehicles to create high-resolution images in all weather conditions, day or night. In this study, SAR images of military vehicles with different azimuth and depression angles are pre-processed in the first stage. The main purpose here is to reduce the high speckle noise found in SAR images. For this, the Wiener adaptive filter, the mean filter, and the median filter are used to reduce the amount of speckle noise in the images without causing loss of data. During the image segmentation phase, pixel values are ordered so that the target vehicle region is separated from other regions containing unnecessary information. The target image is binarized by setting the brightest 20% of pixels to 255 and all other pixels to 0. In addition, a segmentation comparison is performed using appropriate parameters of the statistical region merging algorithm. In the feature extraction step, the feature vectors belonging to the vehicles are obtained using Gabor filters with different orientation, frequency, and angle values. A bank of Gabor filters is created by varying these parameters to extract the important, distinctive features of the images. Finally, images are classified by the sparse representation method; in this study, the l₁-norm formulation of sparse representation is used. A joint database is obtained by placing the feature vectors generated from the target images of the military vehicle types side by side, and this database is arranged in matrix form. To classify the vehicles, the test image of each vehicle is converted to vector form and the l₁-norm sparse representation analysis is applied against the existing database matrix.
As a result, correct recognition is performed by matching the target images of military vehicles with the test images by means of the sparse representation method. A classification success of 97% is obtained for SAR images of the different military vehicle types.
Keywords: automatic target recognition, sparse representation, image classification, SAR images
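The l₁-norm classification step can be sketched as follows. This is a minimal illustration, not the authors' implementation: a simple iterative soft-thresholding (ISTA) solver stands in for the l₁-norm analysis, and the dictionary, labels, and synthetic "vehicle" feature vectors are hypothetical.

```python
import numpy as np

def ista_l1(A, y, lam=0.01, n_iter=500):
    """Iterative soft-thresholding for min 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - y)
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x

def src_classify(A, labels, y):
    """Sparse-representation classification: pick the class with lowest residual."""
    x = ista_l1(A, y)
    best, best_res = None, np.inf
    for c in set(labels):
        xc = np.where(np.array(labels) == c, x, 0.0)   # keep only class-c coefficients
        res = np.linalg.norm(y - A @ xc)
        if res < best_res:
            best, best_res = c, res
    return best

rng = np.random.default_rng(0)
# two synthetic "vehicle classes": noisy templates around distinct base vectors
base0, base1 = rng.normal(size=20), rng.normal(size=20)
A = np.column_stack([base0 + 0.05 * rng.normal(size=20) for _ in range(5)] +
                    [base1 + 0.05 * rng.normal(size=20) for _ in range(5)])
A /= np.linalg.norm(A, axis=0)             # unit-norm dictionary atoms
labels = [0] * 5 + [1] * 5
y = base1 / np.linalg.norm(base1)          # test sample drawn from class 1
predicted = src_classify(A, labels, y)
```

The class whose atoms reconstruct the test vector with the smallest residual wins, which is the matching step the abstract describes.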
Procedia PDF Downloads 366
15901 Handling Patient's Supply during Inpatient Stay: Using Lean Six Sigma Techniques to Implement a Comprehensive Medication Handling Program
Authors: Erika Duggan
Abstract:
A major hospital had identified that there was no standard process for handling the medication that patients brought with them to the hospital. It was also identified that each floor was handling patients' medication differently and storing it in multiple locations. Because of this disconnect, many patients were leaving the hospital without their medication. The project team was tasked with creating a cohesive process to send a patient's unneeded medication home on admission, store any of the patient's medication that could not be sent home, store any of the patient's medication intended for inpatient administration, and send all of the patient's medication home on discharge. The project team consisted of pharmacists, RNs, LPNs, members from nursing informatics, and a project engineer, and followed a DMAIC framework. Working together, observations were performed to identify what was and was not working on the different floors, which resulted in process maps. Using the multidisciplinary team, brainstorming, affinity diagramming, and other Lean Six Sigma techniques, the best process for receiving, storing, and returning the medication was created. It was highlighted that being able to track the medication throughout the patient's stay would be beneficial and would help ensure the medication left with the patient on discharge. An automated medication dispensing system would help store and track patients' medications. In addition, a specific order appearing on the discharge instructions would assist front-line staff in retrieving the medication from a set location and sending it home with the patient. This new process will effectively streamline the admission and discharge process for patients who bring their medication with them, effectively track the medication during the patient's stay, and increase patient safety as it relates to medication administration.
Keywords: lean six sigma, medication dispensing, process improvement, process mapping
Procedia PDF Downloads 254
15900 Logistic Model Tree and Expectation-Maximization for Pollen Recognition and Grouping
Authors: Endrick Barnacin, Jean-Luc Henry, Jack Molinié, Jimmy Nagau, Hélène Delatte, Gérard Lebreton
Abstract:
Palynology is a field of interest for many disciplines. It has multiple applications such as chronological dating, climatology, allergy treatment, and even honey characterization. Unfortunately, the analysis of a pollen slide is a complicated and time-consuming task that requires the intervention of experts in the field, who are becoming increasingly rare due to economic and social conditions. The automation of this task is therefore a necessity. Pollen slide analysis is mainly a visual process, as it is carried out with the naked eye, which is why a primary route to automating palynology is digital image processing. This method has the lowest cost and relatively good accuracy in pollen retrieval. In this work, we propose a system combining recognition and grouping of pollen. It uses a Logistic Model Tree to classify pollen species already known to the system while detecting any unknown species; the unknown pollen grains are then grouped using a cluster-based approach. Good success rates were achieved for the recognition of known species, and the automated clustering appears to be a promising approach.
Keywords: pollen recognition, logistic model tree, expectation-maximization, local binary pattern
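The grouping step for unknown species can be illustrated with a bare-bones expectation-maximization routine for a one-dimensional Gaussian mixture. This is a sketch only, not the paper's implementation; the feature values and the two-cluster setup are hypothetical.

```python
import numpy as np

def em_gmm_1d(x, k=2, n_iter=100):
    """Minimal EM for a 1-D Gaussian mixture; returns hard cluster assignments."""
    mu = np.array([x.min(), x.max()], dtype=float)[:k]   # deterministic init
    sigma = np.full(k, x.std() + 1e-6)
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
    return dens.argmax(axis=1)

rng = np.random.default_rng(0)
# two well-separated groups of "unknown pollen" feature values
x = np.concatenate([rng.normal(0.0, 0.3, 30), rng.normal(5.0, 0.3, 30)])
groups = em_gmm_1d(x)
```

With well-separated groups, the hard assignments recover the two underlying species clusters.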
Procedia PDF Downloads 182
15899 An Automatic Generating Unified Modelling Language Use Case Diagram and Test Cases Based on Classification Tree Method
Authors: Wassana Naiyapo, Atichat Sangtong
Abstract:
Software development by Object Oriented methodology involves many stages that take time and incur high cost. An undetected error in the system analysis process will propagate to the design and implementation processes, and unexpected output forces a revision of the previous process. The more each process is rolled back, the greater the expense and delay. With a good test process from the early phases, however, the implemented software is efficient, reliable, and meets the user's requirements. Unified Modelling Language (UML) is a tool that uses symbols to describe the work process in Object Oriented Analysis (OOA). This paper presents an approach for automatically generating a UML use case diagram and test cases. The use case diagram is generated from the event table, and test cases are generated from use case specifications and Graphic User Interfaces (GUI). Test cases are derived from the Classification Tree Method (CTM), which classifies data into nodes arranged in a hierarchical structure. Moreover, this paper describes the program that generates the use case diagram and test cases. As a result, it can reduce working time and increase work efficiency.
Keywords: classification tree method, test case, UML use case diagram, use case specification
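The classification-tree idea behind the test-case generation can be sketched as follows: each classification contributes a set of equivalence classes (leaves), and minimal test cases combine one leaf from every classification. The "login" classifications below are hypothetical, not taken from the paper.

```python
from itertools import product

# hypothetical classification tree for a "login" use case:
# each classification lists its equivalence classes (leaves)
tree = {
    "username": ["valid", "unknown", "empty"],
    "password": ["correct", "wrong"],
    "remember_me": ["on", "off"],
}

# a test case combines one leaf from every classification
test_cases = [dict(zip(tree, combo)) for combo in product(*tree.values())]
print(len(test_cases))  # 3 * 2 * 2 = 12 combinations
```

In practice, CTM tools prune this Cartesian product with coverage criteria; the full product shown here is the exhaustive upper bound.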
Procedia PDF Downloads 162
15898 Tax Expenditures: A Review and Analysis
Authors: Khalid Javed
Abstract:
This study examines a feature of the budget process called the tax expenditure budget. The tax expenditure concept relies heavily on a normative notion that shielding certain taxpayer income from taxation deprives government of its rightful revenues. This view is inconsistent with the proposition that income belongs to the taxpayers and that tax liability is determined through the democratic process, not through arbitrary, bureaucratic assumptions. Furthermore, the methodology of the tax expenditure budget is problematic, as its expansive tax base treats the multiple taxation of saving as the norm. By using an expansive view of income as the underlying assumption of the tax expenditure concept, this viewpoint institutionalizes a particular bias into the decision-making process.
Keywords: revenue, expenditure, tax budget, proposition
Procedia PDF Downloads 295
15897 Conservativeness of Functional Proteins in Bovine Milk by Pulsed Electric Field Technology
Authors: Sulhee Lee, Geon Kim, Young-Seo Park
Abstract:
Unlike the traditional milk sterilization methods (LTLT, HTST, or UHT), pulsed electric field (PEF) technology is a non-thermal pasteurization process. This technology minimizes energy required for heat treatment in food processing, changes in sensory properties, and physical losses. In this study, structural changes of bovine milk proteins, the amount of immunoproteins such as IgG, and their storability by PEF treatment were examined. When the changes of protein content in PEF-treated milk were examined using HPLC, the amounts of α-casein and β-lactoglobulin were reduced over 40% each, whereas those of κ-casein and β-casein did not change. The amount of α-casein in HTST milk was reduced to 50%. When PEF was applied to milk at the energy level of 250 kJ, the amounts of IgG, IgA, β-lactoglobulin (β-LG), lactoferrin, and α-lactalbumin (α-LA) decreased by 43, 41, 35, 63, and 45%, respectively. When milk was sterilized by LTLT process followed by PEF process at the level of 150 kJ, the concentrations of IgG, IgA, β-LG, lactoferrin, and α-LA were 56.6, 10.6, 554, 2.8 and 660.1 μg/mL, respectively. When the bovine milk was sterilized by LTLT process followed by PEF process at the energy level of 180 kJ, storability of immunoproteins of milk was the highest and the concentrations of IgG, IgA, and β-LG decreased by 79.5, 6.5, and 134.5 μg/mL, respectively, when compared with the initial concentrations of those proteins. When bovine milk was stored at 4℃ after sterilization through HTST sterilizer followed by PEF process at the energy level of 200 kJ, the amount of lactoferrin decreased 7.3% after 36 days of storage, whereas that of lactoferrin of raw milk decreased 16.4%. Our results showed that PEF treatment did not change the protein structure nor induce protein denaturation in milk significantly when compared with LTLT or HTST sterilization. 
Moreover, the LTLT or HTST process in combination with PEF was more effective than the LTLT-only or HTST-only process in conserving the immunoproteins in bovine milk.
Keywords: pulsed electric field, bovine milk, immunoproteins, sterilization
Procedia PDF Downloads 436
15896 Identification of Risks Associated with Process Automation Systems
Authors: J. K. Visser, H. T. Malan
Abstract:
A need exists to identify the sources of risk associated with the process automation systems within petrochemical companies and similar energy-related industries. These companies use many different process automation technologies in their value chains. A crucial part of the process automation system is the information technology component in the supervisory control layer. The ever-changing technology within the process automation layers and the rate at which it advances pose a risk to safe and predictable automation system performance. The age of the automation equipment also presents challenges to the operations and maintenance managers of the plant due to obsolescence and unavailability of spare parts. The main objective of this research was to determine the risk sources associated with the equipment that is part of the process automation systems. A secondary objective was to establish whether technology managers and technicians were aware of the risks and shared the same viewpoint on the importance of the risks associated with automation systems. A conceptual model for risk sources of automation systems was formulated from models and frameworks in the literature. This model comprised six categories of risk, which form the basis for identifying specific risks. It was used to develop a questionnaire that was sent to 172 instrument technicians and technology managers in the company to obtain primary data, and 75 completed and useful responses were received. These responses were analyzed statistically to determine the highest risk sources and to determine whether there was a difference in opinion between technology managers and technicians.
The most important risks that were revealed in this study are: 1) the lack of skilled technicians, 2) integration capability of third-party system software, 3) reliability of the process automation hardware, 4) excessive costs pertaining to performing maintenance and migrations on process automation systems, and 5) requirements of having third-party communication interfacing compatibility as well as real-time communication networks.
Keywords: distributed control system, identification of risks, information technology, process automation system
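A difference-in-opinion comparison between the two respondent groups could be made with a rank-based statistic; the sketch below uses a Mann-Whitney U computed from scratch. The ratings are hypothetical, and the abstract does not state which statistical test the authors actually used.

```python
import numpy as np

def mann_whitney_u(a, b):
    """Rank-based U statistic for comparing two independent groups of ratings."""
    combined = np.concatenate([a, b])
    order = combined.argsort(kind="mergesort")
    ranks = np.empty(len(combined))
    ranks[order] = np.arange(1, len(combined) + 1)
    # tied observations share the average of their ranks
    for v in np.unique(combined):
        mask = combined == v
        ranks[mask] = ranks[mask].mean()
    r1 = ranks[: len(a)].sum()
    return r1 - len(a) * (len(a) + 1) / 2

# hypothetical 1-5 risk-importance ratings from technicians and managers
tech = np.array([4, 5, 4, 3, 5, 4, 4])
mgr = np.array([3, 3, 2, 4, 3, 2, 3])
u = mann_whitney_u(tech, mgr)
```

A U near the maximum (here 7 × 7 = 49) indicates the technicians consistently rated the risks higher than the managers.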
Procedia PDF Downloads 139
15895 Olefin and Paraffin Separation Using Simulations on Extractive Distillation
Authors: Muhammad Naeem, Abdulrahman A. Al-Rabiah
Abstract:
The components of a technical C4 mixture containing 1-butene and n-butane are very close to each other with respect to their boiling points, i.e., -6.3°C for 1-butene and -1°C for n-butane. An extractive distillation process is therefore used for the separation of 1-butene from the C4 mixture. The solvent is essential to extractive distillation, and an appropriate solvent plays an important role in the process economy of extractive distillation. Aspen Plus was used as the simulator for the separation of these hydrocarbons, with the NRTL activity coefficient model applied in the simulation. This model indicated that the material balances in the separation process were accurate for several solvent flow rates. A mixture of acetonitrile and water was used as the solvent, and 99% pure 1-butene was separated. The simulation proposed a feed-to-solvent ratio of 1:7.9 and 15 plates for the solvent recovery column; previously, the feed-to-solvent ratio was higher and the proposed number of plates was 30, so the new design can economize the separation process.
Keywords: extractive distillation, 1-butene, Aspen Plus, ACN solvent
Procedia PDF Downloads 448
15894 IOT Based Process Model for Heart Monitoring Process
Authors: Dalyah Y. Al-Jamal, Maryam H. Eshtaiwi, Liyakathunisa Syed
Abstract:
Connecting health services with technology is in huge demand as people's health situations worsen day by day. Engaging new technologies such as the Internet of Things (IoT) in medical services can enhance patient care. Specifically, patients suffering from chronic diseases, such as cardiac patients, need special care and monitoring. Some efforts have previously been made to automate and improve patient monitoring systems; however, these efforts have limitations and lack the real-time capability needed for chronic diseases. In this paper, an improved process model for a patient monitoring system specialized for cardiac patients is presented. A survey was distributed and interviews were conducted to gather the requirements needed to improve the cardiac patient monitoring system. Business Process Model and Notation (BPMN) was used to model the proposed process. The proposed system uses IoT technology to assist doctors in remotely monitoring and following up with their heart patients in real time. To validate the effectiveness of the proposed solution, simulation analysis was performed using the Bizagi Modeler tool. Analysis results show performance improvements in the heart monitoring process. For future work, the authors suggest enhancing the proposed system to cover all chronic diseases.
Keywords: IoT, process model, remote patient monitoring system, smart watch
Procedia PDF Downloads 332
15893 Parametric Optimization of Wire Electric Discharge Machining (WEDM) for Aluminium Metal Matrix Composites
Authors: G. Rajyalakhmi, C. Karthik, Gerson Desouza, Rimmie Duraisamy
Abstract:
In the present work, metal matrix composites combining aluminium with SiC/Al₂O₃ reinforcements were fabricated using the stir casting technique. The objective is to optimize the Wire Electric Discharge Machining (WEDM) process parameters for these composites. Pulse ON time, pulse OFF time, wire feed, and sensitivity are considered as input process parameters, with Material Removal Rate (MRR) and Surface Roughness (SR) as the responses for optimization of the WEDM process. A Taguchi L18 Orthogonal Array (OA) is used for experimentation. Grey Relational Analysis (GRA) is coupled with the Taguchi technique for multi-response optimization of the process parameters. ANOVA (Analysis of Variance) is used to find the impact of each process parameter individually. Finally, confirmation experiments were carried out to validate the predicted results.
Keywords: parametric optimization, particulate reinforced metal matrix composites, Taguchi-grey relational analysis, WEDM
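The Grey Relational Analysis step can be sketched as follows: responses are normalized (larger-the-better for MRR, smaller-the-better for SR), deviations from the ideal are converted into grey relational coefficients with a distinguishing coefficient of 0.5, and the mean coefficient per run is the grey relational grade. The four runs and their response values below are hypothetical, not the paper's L18 data.

```python
import numpy as np

def grey_relational_grade(responses, larger_better, zeta=0.5):
    """Normalize each response column, then compute grey relational grades."""
    x = np.array(responses, dtype=float)
    norm = np.empty_like(x)
    for j in range(x.shape[1]):
        col = x[:, j]
        if larger_better[j]:                       # e.g. MRR: larger is better
            norm[:, j] = (col - col.min()) / (col.max() - col.min())
        else:                                      # e.g. SR: smaller is better
            norm[:, j] = (col.max() - col) / (col.max() - col.min())
    delta = 1.0 - norm                             # deviation from the ideal (=1)
    grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return grc.mean(axis=1)                        # grade = mean coefficient per run

# hypothetical (MRR, SR) responses for four WEDM runs
grades = grey_relational_grade([[2.1, 3.2], [2.8, 2.9], [3.5, 3.8], [3.0, 2.5]],
                               larger_better=[True, False])
best_run = int(grades.argmax())
```

The run with the highest grade is the best multi-response compromise between material removal rate and surface roughness.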
Procedia PDF Downloads 581
15892 Setting Control Limits For Inaccurate Measurements
Authors: Ran Etgar
Abstract:
The process of rounding off measurements of continuous variables is commonly encountered. Although it usually has minor effects, it can sometimes lead to poor outcomes in statistical process control using the X̄-chart. The traditional control limits can cause incorrect conclusions if applied carelessly. This study looks into the limitations of classical control limits, particularly the impact of asymmetry. An approach to determining the distribution function of the measured parameter (Ȳ) is presented, resulting in a more precise method to establish the upper and lower control limits. The proposed method, while slightly more complex than Shewhart's original idea, is still user-friendly and accurate and only requires the use of two straightforward tables.
Keywords: quality control, process control, round-off, measurement, rounding error
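For contrast, the classical Shewhart X̄ limits that the paper starts from can be sketched as below; the proposed corrected limits require the paper's tables and are not reproduced here. The subgroup data are simulated, and a simple pooled standard deviation stands in for the usual R̄/d₂ estimate.

```python
import numpy as np

def xbar_limits(samples):
    """Classical 3-sigma X-bar limits from subgroup data (simplified estimate)."""
    means = samples.mean(axis=1)
    grand_mean = means.mean()
    n = samples.shape[1]
    sigma_hat = samples.std(ddof=1)        # pooled estimate, simplified
    margin = 3 * sigma_hat / np.sqrt(n)
    return grand_mean - margin, grand_mean + margin

rng = np.random.default_rng(0)
samples = rng.normal(10.0, 1.0, size=(25, 5))   # 25 subgroups of size 5
lcl, ucl = xbar_limits(samples)

rounded = samples.round(0)                      # coarse round-off of the data
lcl_r, ucl_r = xbar_limits(rounded)             # limits shift once data are rounded
```

Comparing the two sets of limits illustrates the round-off effect the paper addresses: the rounded data yield different (and, per the paper, potentially asymmetric) behaviour than the symmetric classical limits assume.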
Procedia PDF Downloads 99
15891 The Dimensions of Culture in the Productive Internationalization Process: An Overview about Brazilian Companies in Bolivia
Authors: Renato Dias Baptista
Abstract:
The purpose of this paper is to analyze the elements of the cultural dimension in the internationalization process of Brazilian companies in Bolivia. The paper is based on research on two major Brazilian transnational companies that have plants in Bolivia. To achieve the objectives, the interconnective characteristics of culture in the process of productive internationalization were analyzed, aiming to highlight culture as a guiding element in light of the premises of Brazilian leadership in the integration and development of the continent. The analysis seeks to give relevance to the culture of a country and its relation to internationalization.
Keywords: culture, transnational, internationalization, Bolivia, Brazil
Procedia PDF Downloads 421
15890 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms
Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao
Abstract:
Earth's environment and its evolution can be seen through satellite images in near real time. Through satellite imagery, remote sensing data provide crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and monitoring climate change. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format, pre-processing it using standard data pre-processing techniques, feeding the processed data into the proposed algorithm, and analyzing the obtained result. Algorithms used in satellite imagery classification include U-Net, Random Forest, DeepLabv3, CNN, ANN, and ResNet. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification, with the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates in cascade or in parallel to determine the scale of segments.
Keywords: area calculation, atrous convolution, deep globe land cover classification, deepLabv3, land cover classification, resnet 50
Procedia PDF Downloads 140
15889 Optical-Based Lane-Assist System for Rowing Boats
Authors: Stephen Tullis, M. David DiDonato, Hong Sung Park
Abstract:
Rowing boats (shells) are often steered by a small rudder operated by one of the backward-facing rowers; the attention required of that athlete then slightly decreases the power that that athlete can provide. Reducing the steering distraction would then increase the overall boat speed. Races are straight 2000 m courses with each boat in a 13.5 m wide lane marked by small (~15 cm) widely-spaced (~10 m) buoys, and the boat trajectory is affected by both cross-currents and winds. An optical buoy recognition and tracking system has been developed that provides the boat’s location and orientation with respect to the lane edges. This information is provided to the steering athlete as either: a simple overlay on a video display, or fed to a simplified autopilot system giving steering directions to the athlete or directly controlling the rudder. The system is then effectively a “lane-assist” device but with small, widely-spaced lane markers viewed from a very shallow angle due to constraints on camera height. The image is captured with a lightweight 1080p webcam, and most of the image analysis is done in OpenCV. The colour RGB-image is converted to a grayscale using the difference of the red and blue channels, which provides good contrast between the red/yellow buoys and the water, sky, land background and white reflections and noise. Buoy detection is done with thresholding within a tight mask applied to the image. Robust linear regression using Tukey’s biweight estimator of the previously detected buoy locations is used to develop the mask; this avoids the false detection of noise such as waves (reflections) and, in particular, buoys in other lanes. The robust regression also provides the current lane edges in the camera frame that are used to calculate the displacement of the boat from the lane centre (lane location), and its yaw angle. 
The intersection of the detected lane edges provides a lane vanishing point, and the yaw angle can be calculated simply from the displacement of this vanishing point from the camera axis and the image-plane distance. Lane location is based on the lateral displacement of the vanishing point from any horizontal cut through the lane edges. The boat's lane position and yaw are currently fed to what is essentially a stripped-down marine autopilot system. At present, only the lane location is used, in a PID controller of a rudder actuator with integrator anti-windup to deal with saturation of the rudder angle. Low Kp and Kd values avoid unnecessarily fast returns to the lane centreline and excessive response to noise, and limiters can be used to avoid lane departure and disqualification. Yaw is not used as a control input, as cross-winds and currents can produce a straight course with considerable yaw or crab angle. Mapping of the controller to the rudder angle's overall effectiveness has not been finalized: very large rudder angles stall and have decreased turning moments, while at less extreme angles the increased rudder drag slows the boat and upsets its balance. The full system has many features similar to automotive lane-assist systems, but the lane markers, camera positioning, control response, and noise add constraints that increase the challenge.
Keywords: auto-pilot, lane-assist, marine, optical, rowing
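The vanishing-point geometry described above can be sketched numerically. The lane-edge slopes and intercepts, principal point, and focal length below are hypothetical values, not the system's calibration:

```python
import numpy as np

def line_intersection(m1, b1, m2, b2):
    """Intersection of two image-plane lines y = m*x + b."""
    x = (b2 - b1) / (m1 - m2)
    return x, m1 * x + b1

# hypothetical lane-edge lines fitted by robust regression (pixel units)
m_left, b_left = 0.25, 100.0
m_right, b_right = -0.25, 420.0
vx, vy = line_intersection(m_left, b_left, m_right, b_right)   # vanishing point

cx = 320.0          # principal point (image centre), pixels
f = 800.0           # image-plane distance (focal length), pixels
yaw = np.degrees(np.arctan((vx - cx) / f))   # boat yaw relative to the lane
```

A vanishing point to the right of the camera axis means the boat is yawed left of the lane direction, and the arctangent of the pixel offset over the focal length converts that offset into the yaw angle.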
Procedia PDF Downloads 132
15888 Segmentation of Liver Using Random Forest Classifier
Authors: Gajendra Kumar Mourya, Dinesh Bhatia, Akash Handique, Sunita Warjri, Syed Achaab Amir
Abstract:
Nowadays, medical imaging has become an integral part of modern healthcare. Abdominal CT images are an invaluable means for abdominal organ investigation and have been widely studied in recent years. Diagnosis of liver pathologies is one of the major areas of current interest in the field of medical image processing and is still an open problem. To study and diagnose the liver in depth, segmentation is performed to identify which part of the liver is most affected. Manual segmentation of the liver in CT images is time-consuming and suffers from inter- and intra-observer differences. However, automatic or semi-automatic computer-aided segmentation of the liver is a challenging task due to inter-patient variability in liver shape and size. In this paper, we present a technique for automatically segmenting the liver from CT images using a Random Forest classifier. Random forests, or random decision forests, are an ensemble learning method for classification that operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes of the individual trees. After comparison with various other techniques, it was found that the Random Forest classifier provides better segmentation results with respect to accuracy and speed. We have validated our results using various techniques, and the method shows above 89% accuracy in all cases.
Keywords: CT images, image validation, random forest, segmentation
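The per-pixel classification idea can be sketched in a few lines with scikit-learn. This is an illustration only, not the paper's pipeline: the two per-pixel features and their distributions are synthetic, and no CT data are used.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# hypothetical per-pixel features (intensity, local mean) for liver vs background
liver = rng.normal([0.7, 0.65], 0.05, size=(200, 2))
background = rng.normal([0.3, 0.35], 0.05, size=(200, 2))
X = np.vstack([liver, background])
y = np.array([1] * 200 + [0] * 200)        # 1 = liver pixel, 0 = background

# ensemble of decision trees; the predicted class is the mode over the trees
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
acc = clf.score(X, y)
```

In a real pipeline, each CT voxel would contribute such a feature vector, and the per-voxel predictions would be assembled back into a liver mask.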
Procedia PDF Downloads 313
15887 Introduction of Integrated Image Deep Learning Solution and How It Brought Laboratorial Level Heart Rate and Blood Oxygen Results to Everyone
Authors: Zhuang Hou, Xiaolei Cao
Abstract:
The general public and medical professionals recognized the importance of accurately measuring and storing blood oxygen levels and heart rate during the COVID-19 pandemic. The demand for accurate contactless devices was motivated by the need to reduce cross-infection and by the shortage of conventional oximeters, partially due to the global supply chain issue. This paper evaluated the heart rate (HR) and oxygen saturation (SpO2) measurements of the contactless mini program HealthyPai, compared with other wearable devices. In the HR study of 185 samples (81 in the laboratory environment, 104 in the real-life environment), the mean absolute error (MAE) ± standard deviation was 1.4827 ± 1.7452 in the lab and 6.9231 ± 5.6426 in the real-life setting. In the SpO2 study of 24 samples, the MAE ± standard deviation of the measurement was 1.0375 ± 0.7745. Our results validated that HealthyPai, utilizing the Integrated Image Deep Learning Solution (IIDLS) framework, can accurately measure HR and SpO2, providing test quality at least comparable to other FDA-approved wearable devices in the market and surpassing the consumer-grade and research-grade wearable standards.
Keywords: remote photoplethysmography, heart rate, oxygen saturation, contactless measurement, mini program
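The MAE ± standard deviation metric reported above is computed from paired readings; a minimal sketch with hypothetical values:

```python
import numpy as np

def mae_and_std(device, reference):
    """Mean absolute error and its standard deviation against a reference device."""
    err = np.abs(np.array(device, dtype=float) - np.array(reference, dtype=float))
    return err.mean(), err.std(ddof=1)

# hypothetical paired heart-rate readings (bpm): app vs reference oximeter
app = [72, 75, 80, 66, 90, 71]
ref = [70, 76, 78, 68, 88, 72]
mae, sd = mae_and_std(app, ref)
```

The same computation applies to SpO2 pairs; the figures in the abstract (e.g. 1.4827 ± 1.7452 bpm) are exactly this statistic over their 185 HR samples.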
Procedia PDF Downloads 135
15886 Recovery of Value-Added Whey Proteins from Dairy Effluent Using Aqueous Two-Phase System
Authors: Perumalsamy Muthiah, Murugesan Thanapalan
Abstract:
The effluent of cheese production contains nutritional value-added proteins, viz. α-lactalbumin and β-lactoglobulin, in the whey, which represents 80-90% of the total volume of milk entering the process. Although several possibilities for cheese-whey exploitation have been assayed, approximately half of world cheese-whey production is not treated but is discarded as effluent. It is necessary to develop an effective and environmentally benign extraction process for the recovery of value-added cheese whey proteins. Recently, aqueous two-phase systems (ATPS) have emerged as a potential separation process, particularly in the field of biotechnology, due to the mild conditions of the process, short processing time, and ease of scale-up. In order to design an ATPS process for the recovery of cheese whey proteins, the development of the phase diagram and the effects of system parameters, such as pH, the types and concentrations of the phase-forming components, and temperature, on the partitioning of the proteins were addressed in order to maximize protein recovery. Some of the practical problems encountered in the application of aqueous two-phase systems for the recovery of cheese whey proteins are also discussed.
Keywords: aqueous two-phase system, phase diagram, extraction, cheese whey
Procedia PDF Downloads 410
15885 Feasibility Study of Particle Image Velocimetry in the Muzzle Flow Fields during the Intermediate Ballistic Phase
Authors: Moumen Abdelhafidh, Stribu Bogdan, Laboureur Delphine, Gallant Johan, Hendrick Patrick
Abstract:
This study is part of an ongoing effort to improve the understanding of phenomena occurring during the intermediate ballistic phase, such as muzzle flows. A thorough comprehension of muzzle flow fields is essential for optimizing muzzle device and projectile design. This flow characterization has heretofore been almost entirely limited to local and intrusive measurement techniques such as pressure measurements using pencil probes. Consequently, the body of quantitative experimental data is limited, as is the number of numerical codes validated in this field. The objective of the work presented here is to demonstrate the applicability of the Particle Image Velocimetry (PIV) technique in the challenging environment of the propellant flow of a .300 Blackout weapon to provide accurate velocity measurements. The key points of a successful PIV measurement are the selection of the particle tracer, the seeding technique, and the tracking characteristics. We have experimentally investigated the aforementioned points by evaluating the resistance, gas dispersion, laser light reflection, and the response to a step change across the Mach disk for five different solid tracers using two seeding methods. To this end, an experimental setup was built, consisting of a PIV system, a combustion chamber pressure measurement, classical high-speed schlieren visualization, and an aerosol spectrometer; the latter is used to determine the particle size distribution in the muzzle flow. The experimental results demonstrated the ability of PIV to accurately resolve the salient features of the propellant flow, such as the underexpanded jet and vortex rings, as well as the instantaneous velocity field, with maximum centreline velocities of more than 1000 m/s. Naturally present unburned particles in the gas, and solid ZrO₂ particles with a nominal size of 100 nm when coated on the propellant powder, are suitable as tracers. However, the TiO₂ particles intended as tracers surprisingly not only melted but also acted as a combustion accelerator and decreased the number of particles in the propellant gas.
Keywords: intermediate ballistic, muzzle flow fields, particle image velocimetry, propellant gas, particle size distribution, underexpanded jet, solid particle tracers
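The core PIV evaluation step, estimating the displacement of the particle pattern between two interrogation windows by locating the cross-correlation peak, can be sketched as follows. The window size, time step, and pixel scale below are hypothetical:

```python
import numpy as np

def piv_displacement(win1, win2):
    """Pixel shift between two interrogation windows via FFT cross-correlation."""
    f1 = np.fft.fft2(win1 - win1.mean())
    f2 = np.fft.fft2(win2 - win2.mean())
    corr = np.fft.ifft2(f1.conj() * f2).real
    peak = np.unravel_index(corr.argmax(), corr.shape)
    # wrap shifts larger than half the window into negative values
    shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shift)

rng = np.random.default_rng(0)
frame1 = rng.random((32, 32))                     # random "particle" pattern
frame2 = np.roll(frame1, (3, 5), axis=(0, 1))     # pattern moved 3 px down, 5 px right
dy, dx = piv_displacement(frame1, frame2)

dt, pixel_size = 1e-5, 2e-4                       # s and m/px (hypothetical)
v = np.hypot(dy, dx) * pixel_size / dt            # flow speed in m/s
```

Dividing the recovered pixel shift by the inter-frame time and scaling by the magnification is what turns tracer motion into the velocity fields reported above.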
Procedia PDF Downloads 161
15884 Best-Performing Color Space for Land-Sea Segmentation Using Wavelet Transform Color-Texture Features and Fusion of over Segmentation
Authors: Seynabou Toure, Oumar Diop, Kidiyo Kpalma, Amadou S. Maiga
Abstract:
Color and texture are the two most determinant elements for the perception and recognition of objects in an image. For this reason, color and texture analysis finds a large field of application, for example in image classification and segmentation. However, the pioneering work in texture analysis was conducted on grayscale images, thus discarding color information. Many grey-level texture descriptors have been proposed and successfully used in numerous domains for image classification: face recognition, industrial inspection, food science, and medical imaging, among others. Taking color into account in the definition of these descriptors makes it possible to better characterize images. Color texture is thus the subject of recent work, and the analysis of color texture images is increasingly attracting interest in the scientific community. In optical remote sensing systems, sensors separately measure different parts of the electromagnetic spectrum: the visible bands and even those invisible to the human eye. The amounts of light reflected by the earth in these spectral bands are transformed into grayscale images, and the primary natural colors Red (R), Green (G), and Blue (B) are used in mixtures of different spectral bands to produce RGB images. Thus, good color texture discrimination can be achieved using RGB under controlled illumination conditions. Some previous works investigate the effect of using different color spaces for color texture classification. However, the selection of the best-performing color space for land-sea segmentation is an open question. Its resolution may bring considerable improvements in certain applications, such as coastline detection, where the detection result is strongly dependent on the performance of the land-sea segmentation. The aim of this paper is to present the results of a study conducted on different color spaces in order to identify the best-performing color space for land-sea segmentation.
In this sense, an experimental analysis is carried out using five different color spaces (RGB, XYZ, Lab, HSV, YCbCr). For each color space, the Haar wavelet decomposition is used to extract different color texture features. These color texture features are then used for Fusion of Over Segmentation (FOOS) based classification, which allows segmentation of the land part from the sea part. Analyzing the results of this study, the HSV color space is found to give the best classification performance when using color and texture features, which is perfectly coherent with the results presented in the literature.
Keywords: classification, coastline, color, sea-land segmentation
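As a hedged sketch of the feature-extraction step above (color-space conversion followed by Haar wavelet decomposition), the Python fragment below converts an RGB pixel to HSV with the standard-library colorsys module and performs a one-level 2D Haar decomposition of an image channel, using sub-band energies as simple texture features. The function names and the normalized averaging convention are illustrative assumptions; the paper's exact implementation is not given in the abstract.

```python
import colorsys

def rgb_to_hsv_channel(pixel):
    # pixel: (r, g, b) in 0..255 -> (h, s, v) tuple in 0..1
    r, g, b = (c / 255.0 for c in pixel)
    return colorsys.rgb_to_hsv(r, g, b)

def haar_decompose(channel):
    """One-level 2D Haar decomposition of a 2D list with even dimensions.
    Returns the (LL, LH, HL, HH) sub-bands (normalized averaging convention)."""
    # Row transform: averages and differences of horizontal pixel pairs
    rows_lo, rows_hi = [], []
    for row in channel:
        rows_lo.append([(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)])
        rows_hi.append([(row[i] - row[i + 1]) / 2 for i in range(0, len(row), 2)])
    # Column transform on each half
    def cols(mat):
        lo = [[(mat[j][i] + mat[j + 1][i]) / 2 for i in range(len(mat[0]))]
              for j in range(0, len(mat), 2)]
        hi = [[(mat[j][i] - mat[j + 1][i]) / 2 for i in range(len(mat[0]))]
              for j in range(0, len(mat), 2)]
        return lo, hi
    LL, LH = cols(rows_lo)
    HL, HH = cols(rows_hi)
    return LL, LH, HL, HH

def subband_energy(band):
    # Mean squared coefficient of a sub-band, usable as a texture feature
    return sum(v * v for row in band for v in row) / (len(band) * len(band[0]))
```

Applying this per HSV channel and collecting the LH, HL, and HH energies yields one possible texture feature vector per region.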
Procedia PDF Downloads 247
15883 Service Delivery Process in the Luxury Hotel Industry in Dubai: A Hoteliers’ Perspective
Authors: Veronique Gregorec, Prakash Vel, Collins A. Brobbey
Abstract:
Service delivery in the face of ever-changing customer expectations could not be more important in Dubai's glamorous luxury hotel sector. Based on in-depth discussions with Dubai luxury hotel service pioneers, customer expectations, service processes, customer complaining behavior, and service recovery strategies in the luxury hotel industry are evaluated from the perspective of service providers. The findings agree with the statement that in the service industry the customer is not always right, and hotel service providers have acknowledged the need to take extra measures towards delivering an individualized and personal service experience. Ultimately, hoteliers set the highest standards at all stages of the service delivery process in order to achieve positive and high customer ratings in all customer evaluation areas.
Keywords: luxury hotels, Dubai hotels, Dubai hospitality industry, guest service process
Procedia PDF Downloads 499
15882 A Process FMEA in Aero Fuel Pump Manufacturing and Conduct the Corrective Actions
Authors: Zohre Soleymani, Meisam Amirzadeh
Abstract:
Many products are safety-critical, so proactive analysis techniques are vital for them because these techniques try to identify potential failures before the products are produced. Failure Mode and Effects Analysis (FMEA) is an effective tool for identifying probable problems of a product or process, prioritizing them, and planning for their elimination. The paper shows the implementation of the FMEA process to identify and remove potential troubles in the aero fuel pump manufacturing process and improve the reliability of subsystems. The different possible causes of failure and their effects, along with the recommended actions, are discussed. FMEA uses the Risk Priority Number (RPN) to determine the risk level. The RPN value depends on the Severity (S), Occurrence (O), and Detection (D) parameters, so these parameters need to be determined. After calculating the RPN for the identified potential failure modes, corrective actions are defined to reduce the risk level according to the assessment strategy and the determined acceptable risk level. Then the FMEA process is performed again and the revised RPN is calculated. The results are presented in the format of a case study. They show an improvement in the manufacturing process and a considerable reduction in the aero fuel pump production risk level.
Keywords: FMEA, risk priority number, aero pump, corrective action
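The RPN computation described above (S × O × D, each conventionally rated on a 1–10 scale) can be sketched as follows; the threshold value, the example failure modes, and the function names are illustrative assumptions, not taken from the paper.

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: each factor rated on a 1-10 scale."""
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("each rating must be between 1 and 10")
    return severity * occurrence * detection

def prioritize(failure_modes, threshold=100):
    """Rank failure modes by RPN, flagging those above an acceptable risk level.

    failure_modes: {name: (S, O, D)}; returns [(name, rpn, needs_action), ...]
    sorted by decreasing RPN."""
    scored = [(name, rpn(s, o, d)) for name, (s, o, d) in failure_modes.items()]
    scored.sort(key=lambda item: item[1], reverse=True)
    return [(name, score, score > threshold) for name, score in scored]
```

After corrective actions, re-rating S, O, and D and rerunning the same computation yields the revised RPN mentioned in the abstract.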
Procedia PDF Downloads 286
15881 Jointly Optimal Statistical Process Control and Maintenance Policy for Deteriorating Processes
Authors: Lucas Paganin, Viliam Makis
Abstract:
With the advent of globalization, the market competition has become a major issue for most companies. One of the main strategies to overcome this situation is the quality improvement of the product at a lower cost to meet customers’ expectations. In order to achieve the desired quality of products, it is important to control the process to meet the specifications, and to implement the optimal maintenance policy for the machines and the production lines. Thus, the overall objective is to reduce process variation and the production and maintenance costs. In this paper, an integrated model involving Statistical Process Control (SPC) and maintenance is developed to achieve this goal. Therefore, the main focus of this paper is to develop the jointly optimal maintenance and statistical process control policy minimizing the total long run expected average cost per unit time. In our model, the production process can go out of control due to either the deterioration of equipment or other assignable causes. The equipment is also subject to failures in any of the operating states due to deterioration and aging. Hence, the process mean is controlled by an Xbar control chart using equidistant sampling epochs. We assume that the machine inspection epochs are the times when the control chart signals an out-of-control condition, considering both true and false alarms. At these times, the production process will be stopped, and an investigation will be conducted not only to determine whether it is a true or false alarm, but also to identify the causes of the true alarm, whether it was caused by the change in the machine setting, by other assignable causes, or by both. If the system is out of control, the proper actions will be taken to bring it back to the in-control state. At these epochs, a maintenance action can be taken, which can be no action, or preventive replacement of the unit. 
When the equipment is in the failure state, a corrective maintenance action is performed, which can be minimal repair or replacement of the machine, and the process is brought back to the in-control state. A semi-Markov decision process (SMDP) framework is used to formulate and solve the joint control problem. A numerical example is developed to demonstrate the effectiveness of the control policy.
Keywords: maintenance, semi-Markov decision process, statistical process control, Xbar control chart
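The Xbar chart signaling rule used above can be sketched as a minimal Python fragment, assuming a known in-control mean mu0, a known process standard deviation sigma, subgroup size n, and the conventional 3-sigma limits; all names are illustrative, and the SMDP optimization itself is not reproduced here.

```python
def xbar_limits(mu0, sigma, n, k=3.0):
    """Lower and upper control limits for an Xbar chart: mu0 +/- k*sigma/sqrt(n)."""
    half_width = k * sigma / n ** 0.5
    return mu0 - half_width, mu0 + half_width

def out_of_control(sample_means, lcl, ucl):
    """Indices of equidistant sampling epochs where the chart signals,
    i.e., the epochs at which a machine inspection would be triggered."""
    return [i for i, m in enumerate(sample_means) if m < lcl or m > ucl]
```

In the model above, each signaled epoch would trigger an investigation to distinguish true from false alarms before any maintenance action is chosen.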
Procedia PDF Downloads 91
15880 Optimization Technique for the Contractor’s Portfolio in the Bidding Process
Authors: Taha Anjamrooz, Sareh Rajabi, Salwa Bheiry
Abstract:
Selecting between the available projects in bidding processes is one of the essential areas for the contractor to concentrate on. It is important for the contractor to choose the right projects within its portfolio during the tendering stage based on certain criteria. First, it should align the bidding process with its organization strategies and goals, as a screening process to start with the right portfolio pool. Second, it should set the proper framework and use a suitable technique to optimize its selection process, so that effort can be concentrated during the tender stage with the goal of winning. In this research paper, a two-step framework is proposed to increase the efficiency of the contractor's bidding process and the chance of winning new project awards. In this framework, all projects initially pass through a first-stage screening process, in which the portfolio basket is evaluated and adjusted in accordance with the organization's strategies, yielding a reduced version of the portfolio pool that is in line with the organization's activities. In the second stage, the contractor uses linear programming to optimize the portfolio pool based on available resources such as manpower, light equipment, heavy equipment, financial capability, return on investment, and the success rate of winning the bid. This optimization model will assist the contractor in utilizing its internal resources to the maximum and increasing its chance of winning new projects, considering past experience with clients, the relationship built between the two parties, and the complexity of executing the projects. The objective of this research is to increase the contractor's winning chance in the bidding process based on the success rate and expected return on investment.
Keywords: bidding process, internal resources, optimization, contracting portfolio management
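As a hedged illustration of the second-stage selection, the sketch below performs an exhaustive 0-1 project selection that maximizes win probability times expected return under resource capacities. The paper uses linear programming; this brute-force stand-in keeps the example dependency-free and is practical only for small portfolio pools. All field names and figures are hypothetical.

```python
from itertools import combinations

def best_portfolio(projects, capacity):
    """Exhaustive 0-1 selection: maximize sum of win_prob * roi over the
    chosen projects, subject to per-resource capacity limits.

    projects: {name: {"needs": {resource: amount}, "win_prob": p, "roi": r}}
    capacity: {resource: limit}"""
    best_value, best_set = 0.0, ()
    names = list(projects)
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            # Check that total resource usage stays within every capacity
            usage = {k: 0.0 for k in capacity}
            feasible = True
            for p in subset:
                for k in capacity:
                    usage[k] += projects[p]["needs"][k]
                    if usage[k] > capacity[k]:
                        feasible = False
            if not feasible:
                continue
            value = sum(projects[p]["win_prob"] * projects[p]["roi"] for p in subset)
            if value > best_value:
                best_value, best_set = value, subset
    return best_set, best_value
```

A real linear-programming formulation would replace the enumeration with binary decision variables and the same capacity constraints.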
Procedia PDF Downloads 142
15879 Sustainable Dyeing of Cotton and Polyester Blend Fabric without Reduction Clearing
Authors: Mohammad Tofayel Ahmed, Seung Kook An
Abstract:
In the contemporary research world, the focus is increasingly on sustainable products and innovative processes. The global textile industries are putting tremendous effort into achieving a balance between economic development and ecological protection concurrently. The conservation of water sources and the environment has become an immensely significant issue in textile dyeing production. Accordingly, an attempt has been made in this study to develop a process to dye polyester blend cotton without the reduction clearing process and without any extra wash-off chemical, by a simple modification aiming at cost reduction and sustainability. A widely used combination of 60/40 cotton/polyester (c/p) single jersey knitted fabric of 30's, 180 g/m², was considered for the study. Traditionally, c/p blend dyeing consists of pretreatment followed by polyester part dyeing, reduction clearing, and cotton part dyeing. In this study, however, the polyester part is dyed right away, followed by the pretreatment process and cotton part dyeing, skipping the reduction clearing process entirely. The dyed samples of both the traditional and modified processes were scrutinized by various color fastness tests and dyeing parameters, and by the consumption of water, steam, and power, process time, and total batch cost. The modified process in this study showed that reduction clearing is not necessary for polyester blend cotton dyeing. The key factor enabling the elimination of reduction clearing after polyester part dyeing is the multifunctional effect of NaOH and H₂O₂ during the pretreatment of cotton after polyester part dyeing. The results also revealed that the modified process could remarkably reduce the consumption of water, steam, power, time, and cost. The bulk trial of the modified process demonstrated that it can be readily exploited to dye polyester blend cotton substrates while ensuring all fastness and dyeing properties, regardless of dye category, blend ratio, color, and shade percentage, thus making the process sustainable, eco-friendly, and economical. 
Furthermore, the proposed method could be applicable to any cellulosic blend with polyester.
Keywords: cotton, dyeing, economical, polyester
Procedia PDF Downloads 189
15878 Describing Cognitive Decline in Alzheimer's Disease via a Picture Description Writing Task
Authors: Marielle Leijten, Catherine Meulemans, Sven De Maeyer, Luuk Van Waes
Abstract:
For the diagnosis of Alzheimer's disease (AD), a large variety of neuropsychological tests are available. In some of these tests, linguistic processing, both oral and written, is an important factor. Language disturbances might serve as a strong indicator of an underlying neurodegenerative disorder like AD. However, the current diagnostic instruments for language assessment mainly focus on product measures, such as text length or number of errors, ignoring the importance of the process that leads to written or spoken language production. In this study, our aim is to describe and test differences between cognitively healthy and impaired elderly on the basis of a selection of writing process variables (inter- and intrapersonal characteristics). These process variables are mainly related to pause times, because the number, length, and location of pauses have proven to be an important indicator of the cognitive complexity of a process. Method: Participants enrolled in our research were chosen on the basis of a number of basic criteria necessary to collect reliable writing process data. Furthermore, we opted to match the thirteen cognitively impaired patients (8 MCI and 5 AD) with thirteen cognitively healthy elderly. At the start of the experiment, participants were each given a number of tests, such as the Mini-Mental State Examination (MMSE), the Geriatric Depression Scale (GDS), the forward and backward digit span, and the Edinburgh Handedness Inventory (EHI). Also, a questionnaire was used to collect socio-demographic information (age, gender, education) about the subjects as well as more details on their level of computer literacy. The tests and questionnaire were followed by two typing tasks and two picture description tasks. For the typing tasks, participants had to copy (type) characters, words, and sentences from a screen, whereas the picture description tasks each consisted of an image they had to describe in a few sentences. 
Both the typing and the picture description tasks were logged with Inputlog, a keystroke logging tool that allows us to log and time-stamp keystroke activity to reconstruct and describe text production processes. The main rationale behind keystroke logging is that writing fluency and flow reveal traces of the underlying cognitive processes. This explains the analytical focus on pause (length, number, distribution, location, etc.) and revision (number, type, operation, embeddedness, location, etc.) characteristics. As in speech, pause times are seen as indexical of cognitive effort. Results: Preliminary analysis already showed some promising results concerning pause times before, within, and after words. For all variables, mixed effects models were used that included participants as a random effect and MMSE scores, GDS scores, and word categories (such as determiners and nouns) as fixed effects. For pause times before and after words, cognitively impaired patients paused longer than healthy elderly. These variables did not show an interaction effect between the group participants belonged to (cognitively impaired or healthy elderly) and word categories. However, pause times within words did show an interaction effect, which indicates that pause times within certain word categories differ significantly between patients and healthy elderly.
Keywords: Alzheimer's disease, keystroke logging, matching, writing process
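The pause-location analysis described above can be sketched as follows, assuming a simplified keystroke log of (timestamp, character) pairs and the usual convention that a pause before a word follows a space, a pause after a word precedes a space, and a within-word pause falls between two letters. This is an illustrative stand-in, not Inputlog's actual output format.

```python
def pause_times(keystrokes):
    """Classify inter-key pauses from a log of (timestamp_ms, char) pairs
    into before-word, within-word, and after-word locations."""
    pauses = {"before": [], "within": [], "after": []}
    for (t_prev, ch_prev), (t_cur, ch_cur) in zip(keystrokes, keystrokes[1:]):
        gap = t_cur - t_prev
        if ch_prev == " " and ch_cur != " ":
            pauses["before"].append(gap)   # pause preceding a new word
        elif ch_prev != " " and ch_cur == " ":
            pauses["after"].append(gap)    # pause closing a word
        elif ch_prev != " " and ch_cur != " ":
            pauses["within"].append(gap)   # pause inside a word
    return pauses
```

Pause lists of this kind, one per participant and word category, are the sort of data a mixed effects model with participant as a random effect would be fitted to.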
Procedia PDF Downloads 366
15877 Impact of Tablet Based Learning on Continuous Assessment (ESPRIT Smart School Framework)
Authors: Mehdi Attia, Sana Ben Fadhel, Lamjed Bettaieb
Abstract:
Mobile technology has become a part of our daily lives and assists learners (regardless of their level and age) in their learning process through various apparatus and mobile devices (laptops, tablets, etc.). This paper presents a new learning framework based on tablets. This solution has been developed and tested at ESPRIT “Ecole Supérieure Privée d’Ingénierie et de Technologies”, a Tunisian school of engineering. The application is named ESSF: Esprit Smart School Framework. In this work, the main features of the proposed solution are listed, particularly its impact on the learners’ evaluation process. Learner assessment has always been a critical component of the learning process, as it measures students’ knowledge. However, traditional evaluation methods, in which the learner is evaluated once or twice each year, cannot reflect their real level. This is why a continuous assessment (CA) process becomes necessary. In this context, we show that ESSF offers many important features that enhance and facilitate the implementation of the CA process.
Keywords: continuous assessment, mobile learning, tablet based learning, smart school, ESSF
Procedia PDF Downloads 334
15876 Audit Is a Production Performance Tool
Authors: Lattari Samir
Abstract:
The performance of a production process is the result of proper operation, in which management tools appear as the key to success through process management, which consists of managing and implementing a quality policy, organizing and planning the manufacturing, and thus defining an efficient logic for the main areas covered by production management. To carry out this delicate mission, which requires reconciling often contradictory objectives, the auditor is called upon and must be able to express an opinion on the effectiveness of the operation of the "production" function. To do this, the auditor must structure the mission in three phases: the preparation phase, to assimilate the particularities of this function; the implementation phase; and the conclusion phase. The audit is a systematic and independent examination of all the stages of a manufacturing process, intended to determine whether the pre-established arrangements for the combination of production factors are respected, whether their implementation is effective, and whether they are relevant to the goals.
Keywords: audit, performance of process, independent examination, management tools, audit of accounts
Procedia PDF Downloads 75
15875 End To End Process to Automate Batch Application
Authors: Nagmani Lnu
Abstract:
Often, quality engineering refers to testing applications that have either a User Interface (UI) or an Application Programming Interface (API), and mature test practices, standards, and automation exist for UI and API testing. However, another kind of application is present in almost all industries that deal with data in bulk and is often handled through something called a batch application. This is primarily an offline application that companies develop to process large data sets, often involving multiple business rules. The challenge becomes more prominent when we try to automate batch testing. This paper describes the approaches taken to test a batch application from the financial industry for the payment settlement process (a critical use case in all kinds of FinTech companies), resulting in 100% test automation in test creation and test execution. One can follow this approach for other batch use cases to achieve higher efficiency in the testing process.
Keywords: batch testing, batch test automation, batch test strategy, payments testing, payments settlement testing
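One hedged sketch of the batch verification idea: recompute the expected settlement totals from the batch input with an independent test oracle and compare them against the batch application's actual output. The CSV layout and the per-account netting rule below are hypothetical placeholders, not the paper's actual business rules.

```python
import csv
import io

def settle(transactions):
    """Test oracle: net amount per account (hypothetical settlement rule)."""
    totals = {}
    for row in transactions:
        totals[row["account"]] = totals.get(row["account"], 0) + int(row["amount"])
    return totals

def verify_batch(input_csv, output_totals):
    """Recompute expected settlement from the batch input and compare it
    record-for-record with the batch application's output.

    Returns (passed, expected_totals)."""
    rows = list(csv.DictReader(io.StringIO(input_csv)))
    expected = settle(rows)
    missing = [k for k in expected if k not in output_totals]
    mismatched = [k for k in expected
                  if k in output_totals and output_totals[k] != expected[k]]
    return not missing and not mismatched, expected
```

Wrapping such checks in a standard test runner is what makes both test creation and execution fully automatable for a batch pipeline.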
Procedia PDF Downloads 60
15874 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process
Authors: Hong-Ming Chen
Abstract:
This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behavior of interest rate variation as deterministic perturbations depending on the time t. The Brownian motion and jump uncertainties are represented by the integral functions of a piecewise constant function w(t) and a point function θ(t), respectively. We show that the interest rate function and the yield function of the Hull and White interest rate model with jump process can be obtained by solving a nonlinear semi-infinite programming problem. A relaxed cutting plane algorithm is then proposed for solving the resulting optimization problem. The method is calibrated to 3-month U.S. Treasury securities data and is used to analyze several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that our approach generates yield functions with minimal fitting errors and small oscillation.
Keywords: optimization, interest rate model, jump process, deterministic
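For reference, a common way to write the short-rate dynamics of a Hull and White model extended with a jump component is the following; here α(t) is the time-dependent drift level, a the mean-reversion speed, σ the volatility, W(t) a Brownian motion, and J(t) the jump process. The abstract's deterministic scheme replaces the stochastic terms with the functions w(t) and θ(t); the symbol choices here are ours, made to avoid that notational overlap, and are not taken from the paper.

```latex
dr(t) = \bigl(\alpha(t) - a\,r(t)\bigr)\,dt + \sigma\,dW(t) + dJ(t)
```

Substituting deterministic surrogates for dW(t) and dJ(t) turns this stochastic differential equation into an ordinary one, which is what makes the semi-infinite programming formulation tractable.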
Procedia PDF Downloads 161
15873 How Envisioning Process Is Constructed: An Exploratory Research Comparing Three International Public Televisions
Authors: Alexandre Bedard, Johane Brunet, Wendellyn Reid
Abstract:
Public television is constantly trying to maintain and develop its audience, and to achieve those goals it needs a strong and clear vision. Vision, or envisioning, is a multidimensional process; from a business perspective, it is simultaneously a conduit that orients and fixes the future, an idea that comes before the strategy, and a means by which action is accomplished. Also, vision is often studied in a prescriptive and instrumental manner. Based on our understanding of the literature, we were able to explain how envisioning, as a process, is a creative one; it takes place in the mind and uses wisdom and intelligence through a process of evaluation, analysis, and creation. Through an aggregation of the literature, we built a model of the envisioning process, based on past experiences, perceptions, and knowledge, and influenced by the context: the individual, the organization, and the environment. In an exploratory study in which vision was deciphered through discourse, using a qualitative and abductive approach and a grounded theory perspective, we explored three extreme cases, with eighteen interviews with experts, leaders, politicians, actors of the industry, etc., amounting to more than twenty hours of interviews in three different countries. We compared the strategy, the business model, and the political and legal forces. We also looked at the history of each industry from an inertial point of view. Our analysis of the data revealed that a legitimacy effect, driven by the audience and by the innovation and creativity of the institutions, was the cornerstone of what influences the envisioning process. This allowed us to identify how different the process was for the Canadian, French, and UK public broadcasters, although we concluded that all three had a socially constructed vision of their future, based on stakeholder management and an emerging role for managers as idea brokers.
Keywords: envisioning process, international comparison, television, vision
Procedia PDF Downloads 132