Search results for: image process
15962 Effect of Injection Moulding Process Parameters on Tensile Strength Using Taguchi Method
Authors: Gurjeet Singh, M. K. Pradhan, Ajay Verma
Abstract:
The plastic industry plays a very important role in the economy of any country and is generally among its leading sectors. Since metals and their alloys are comparatively scarce, it is beneficial to produce plastic products and components, which find application in many industrial as well as household consumer products. About 50% of plastic products are manufactured by the injection moulding process. To produce better quality products, the quality characteristics and performance of the product have to be controlled. The process parameters play a significant role in the production of plastics; hence the control of process parameters is essential. This paper describes the effect of parameter selection in the injection moulding process, with the aim of defining suitable parameters for producing plastic products. Selecting process parameters by trial and error is neither desirable nor acceptable, as it tends to increase cost and time. Hence, optimization of the processing parameters of the injection moulding process is essential. The experiments were designed with Taguchi's orthogonal array to achieve the result with the least number of experiments. The plastic material studied here is polypropylene. Tensile strength tests of specimens produced on an injection moulding machine were carried out on a universal testing machine. Using the Taguchi technique with the help of MiniTab-14 software, the best values of injection pressure, melt temperature, packing pressure and packing time were obtained. We found that the process parameter packing pressure contributes most to the production of plastic products with good tensile strength.
Keywords: injection moulding, tensile strength, polypropylene, Taguchi
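As an illustration of the analysis step described above, the following minimal Python sketch computes the larger-the-better signal-to-noise (S/N) ratio that Taguchi analysis (e.g., in MiniTab) uses to rank factor levels. The L9 array layout, factor levels, and tensile-strength values below are hypothetical placeholders, not the paper's data.

```python
import numpy as np

# Hypothetical L9 orthogonal array results: each row is one experimental run.
# Columns: injection pressure level, melt temperature level, packing pressure
# level, packing time level, followed by measured tensile strength (MPa).
runs = np.array([
    [1, 1, 1, 1, 30.2],
    [1, 2, 2, 2, 31.5],
    [1, 3, 3, 3, 33.1],
    [2, 1, 2, 3, 32.0],
    [2, 2, 3, 1, 33.8],
    [2, 3, 1, 2, 30.9],
    [3, 1, 3, 2, 34.2],
    [3, 2, 1, 3, 31.1],
    [3, 3, 2, 1, 32.6],
])

def sn_larger_is_better(y):
    """Taguchi larger-the-better S/N ratio: -10*log10(mean(1/y^2))."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

factors = ["injection pressure", "melt temperature",
           "packing pressure", "packing time"]

# Mean S/N ratio per level of each factor; the factor with the largest spread
# (delta = max - min) across its levels contributes most to tensile strength.
for j, name in enumerate(factors):
    level_sn = [sn_larger_is_better(runs[runs[:, j] == lvl, -1])
                for lvl in (1, 2, 3)]
    print(f"{name}: S/N per level = {np.round(level_sn, 2)}, "
          f"delta = {max(level_sn) - min(level_sn):.2f}")
```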
Procedia PDF Downloads 288
15961 Remote Sensing Reversion of Water Depths and Water Management for Waterbird Habitats: A Case Study on the Stopover Site of Siberian Cranes at Momoge, China
Authors: Chunyue Liu, Hongxing Jiang
Abstract:
Traditional water depth surveys of wetland habitats used by waterbirds require intensive labor, time and money. Optical remote sensing, which relies on passive multispectral scanner data, has been widely employed to estimate water depth. This paper presents an innovative method for developing a water depth model based on the characteristics of the visible and thermal infrared spectra of a Landsat ETM+ image, combined with 441 field water depth measurements at the Etoupao shallow wetland. The wetland is located in the Momoge National Nature Reserve of Northeast China, the largest stopover habitat along the eastern flyway of the globally, critically endangered Siberian Crane. The cranes mainly feed on the tubers of emergent aquatic plants such as Scirpus planiculmis and S. nipponicus. Effective water control is a critical step for maintaining the production of tubers and food availability for this crane. The model, which employs a multi-band approach, can effectively simulate water depth for this shallow wetland. The model parameters NDVI and GREEN indicate that vegetation growth and coverage, which affect the reflectance from the water column, are unevenly distributed. Combined with the field-observed water level on the date of image acquisition, a digital elevation model (DEM) of the underwater terrain was generated. The wetland area and water volume at different water levels were then calculated from the DEM using the Area and Volume Statistics function of the 3D Analyst in ArcGIS 10.0. The findings provide good references for effectively monitoring changes in water level and water demand, developing a practical plan for water level regulation and water management, and creating the best foraging habitats for the cranes. The methods presented here can be adopted for bottom topography simulation and water management in waterbird habitats, especially in shallow wetlands.
Keywords: remote sensing, water depth reversion, shallow wetland habitat management, Siberian crane
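A minimal sketch of the multi-band depth-retrieval idea described above is shown below, assuming a simple linear model fitted to field depth points with the GREEN band, NDVI and a thermal band as predictors. The band values, coefficients and "measured" depths are synthetic placeholders, not the values calibrated in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 441  # number of field water-depth samples, as in the study

# Hypothetical per-sample band values extracted from a Landsat ETM+ scene.
green = rng.uniform(0.02, 0.15, n)      # visible GREEN band reflectance
red = rng.uniform(0.02, 0.12, n)
nir = rng.uniform(0.01, 0.10, n)
thermal = rng.uniform(290.0, 300.0, n)  # brightness temperature (K)
ndvi = (nir - red) / (nir + red + 1e-9)

# Synthetic "measured" depths so the example runs end to end.
depth = 0.8 - 3.0 * green + 0.4 * ndvi + 0.01 * (thermal - 295) \
        + rng.normal(0, 0.05, n)

# Multi-band linear model: depth ≈ b0 + b1*GREEN + b2*NDVI + b3*thermal.
X = np.column_stack([np.ones(n), green, ndvi, thermal])
coef, *_ = np.linalg.lstsq(X, depth, rcond=None)
pred = X @ coef
rmse = np.sqrt(np.mean((pred - depth) ** 2))
print("fitted coefficients:", np.round(coef, 3), "RMSE (m):", round(rmse, 3))
```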
Procedia PDF Downloads 252
15960 Research on Tight Sandstone Oil Accumulation Process of the Third Member of Shahejie Formation in Dongpu Depression, China
Authors: Hui Li, Xiongqi Pang
Abstract:
In recent years, tight oil has become a hot spot of unconventional oil and gas exploration and development around the world. The Dongpu Depression is a typical hydrocarbon-rich basin in the southwest of the Bohai Bay Basin, in which tight sandstone oil and gas have been discovered in deep reservoirs, most of which are buried deeper than 3500 m. The distribution and development characteristics of these deep tight sandstone reservoirs need to be studied. The main source rocks in the study area are the dark mudstone and shale of the middle and lower third sub-member of the Shahejie Formation. The total organic carbon (TOC) content of the source rock is between 0.08% and 11.54%, generally higher than 0.6%, and the value of S1+S2 is between 0.04 and 72.93 mg/g, generally higher than 2 mg/g; overall, the source rock can be evaluated as fair to good. The kerogen type of the organic matter is predominantly type Ⅱ1 and Ⅱ2. Vitrinite reflectance (Ro) is mostly greater than 0.6%, indicating that the source rock has entered the hydrocarbon generation threshold. The physical properties of the reservoir are poor; most reservoirs have a porosity lower than 12% and a permeability of less than 1×10⁻³ μm². The rocks in this area show great heterogeneity, and some areas have developed sweet spots with high porosity and permeability. According to SEM, thin section images, inclusion tests and other analyses, the reservoir was affected by compaction and cementation during the early diagenesis stage (44-31 Ma). This diagenesis produced tight reservoirs in the Huzhuangji, Pucheng and Weicheng Areas, while the porosity in the Machang, Qiaokou and Wenliu Areas was still over 12%. During stage A of the middle diagenesis phase (31-17 Ma), the reservoir porosity in the Machang, Pucheng and Huzhuangji Areas increased due to dissolution; after that, the source rock reached the oil generation window and the first phase of hydrocarbon charging (31-23 Ma) occurred, forming conventional oil accumulations in the Machang, Qiaokou, Wenliu and Huzhuangji Areas and unconventional tight reservoirs in the Pucheng and Weicheng Areas. During stage B of the middle diagenesis phase (17-7 Ma), reservoir porosity continued to decrease after the dissolution, so that the reservoirs became generally compacted. Since then, the second phase of hydrocarbon charging has been proceeding (from 7 Ma onward), and most of the pools charged and formed in this process are tight sandstone oil reservoirs. In conclusion, tight sandstone oil was formed in two patterns in the Dongpu Depression, which can be summarized as the 'densification first, then accumulation' pattern and the 'accumulation first, then densification' pattern.
Keywords: accumulation process, diagenesis, Dongpu Depression, tight sandstone oil
Procedia PDF Downloads 116
15959 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation
Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk
Abstract:
The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition and security are among the possible fields of utilization. In all these fields, the amount of collected data is increasing quickly, but with the increase of the data, computation speed becomes the critical factor. Data reduction is one of the solutions to this problem. Removing the redundancy in rough sets can be achieved with the reduct. Many algorithms for generating the reduct have been developed, but most of them are only software implementations and therefore have many limitations. A microprocessor uses a fixed word length and consumes a lot of time for both fetching and processing instructions and data; consequently, software-based implementations are relatively slow. Hardware systems don't have these limitations and can process the data faster than software. A reduct is a subset of the condition attributes that provides the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes; none of its elements can be removed without affecting the classification power of all condition attributes. Moreover, every reduct contains all the attributes of the core. In this paper, a hardware implementation of a two-stage greedy algorithm to find one reduct is presented. The decision table is used as the input. The output of the algorithm is the superreduct, which is a reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table. The algorithm described above has two disadvantages: i) it generates a superreduct instead of a reduct, and ii) the additional first stage may be unnecessary if the core is empty. But for systems focused on fast computation of the reduct, the first disadvantage is not the key problem. The core calculation can be achieved with a combinational logic block and thus adds relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit for controlling the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC. The execution times of the reduct calculation in hardware and in software were compared. Results show an increase in the speed of data processing.
Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set
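The two-stage idea (core from the discernibility matrix, then greedy enrichment by attribute frequency) can be sketched in plain Python as below. The toy decision table is hypothetical, and the code mirrors a software reference version of the approach rather than the FPGA design; the frequency heuristic counts attribute occurrences in the discernibility entries as a stand-in for the paper's "most common attribute" rule.

```python
from itertools import combinations

# Toy decision table: each row = (condition attribute values, decision).
table = [
    ((1, 0, 1), 0),
    ((1, 1, 0), 1),
    ((0, 1, 1), 1),
    ((1, 0, 0), 0),
]
n_attr = 3

def discernibility(table):
    """For every pair of objects with different decisions, return the set
    of condition attributes on which they differ."""
    entries = []
    for (x, dx), (y, dy) in combinations(table, 2):
        if dx != dy:
            entries.append({a for a in range(n_attr) if x[a] != y[a]})
    return entries

def classifies(attrs, table):
    """True if the attribute subset still discerns all decision classes."""
    seen = {}
    for x, d in table:
        key = tuple(x[a] for a in sorted(attrs))
        if seen.setdefault(key, d) != d:
            return False
    return True

entries = discernibility(table)
# Stage 1: core = attributes that appear as singletons in the matrix.
core = {next(iter(e)) for e in entries if len(e) == 1}
# Stage 2: greedily add the most frequently occurring remaining attribute
# until the subset discerns the table (this yields a superreduct).
superreduct = set(core)
while not classifies(superreduct, table):
    freq = {a: sum(a in e for e in entries)
            for a in range(n_attr) if a not in superreduct}
    superreduct.add(max(freq, key=freq.get))
print("core:", core, "superreduct:", superreduct)
```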
Procedia PDF Downloads 219
15958 Optical Characterization of Anisotropic Thiophene-Phenylene Co-Oligomer Micro Crystals by Spectroscopic Imaging Ellipsometry
Authors: Christian Röling, Elena Y. Poimanova, Vladimir V. Bruevich
Abstract:
Here we demonstrate a non-destructive optical technique to localize and characterize single crystals of semiconductive organic materials: spectroscopic imaging ellipsometry. With a combination of microscopy and ellipsometry, it is possible to characterize even micro-sized thin-film crystals on a plane surface regarding anisotropy, optical properties, crystalline domains and thickness. Crystals of the semiconducting thiophene-phenylene co-oligomer 1,4-bis(5'-hexyl-[2,2'-bithiophen]-5-yl)benzene (dHex-TTPTT) were grown by a solvent-based self-assembly technique on a silicon substrate with 300 nm of thermally grown silicon dioxide. The ellipsometric measurements were performed with an Ep4-SE (Accurion). In an ellipsometric high-contrast image of the complete sample, we localized high-quality single crystals. After demonstrating the uniaxial anisotropy of the crystal by using Müller-matrix imaging ellipsometry, we determined the optical axes by rotating the sample and performed spectroscopic measurements (λ = 400-700 nm) in 5 nm intervals. The optical properties were described by using a Lorentz term in the Ep4 model. After determining the dispersion of the crystals, we converted a recorded Delta and Psi map into a 2D thickness image. Based on a quantitative analysis of the resulting thickness map, we calculated the height of a molecular layer (3.49 nm).
Keywords: anisotropy, ellipsometry, SCFET, thin film
Procedia PDF Downloads 251
15957 The Philippine Collegian and the Catalyst's Journalistic Presentation of the UP and PUP: A Content Analysis
Authors: Diana Mariz Catangay, Irish-Ann Montano, Frances Janine Suyat
Abstract:
As an active platform for students' engagement with issues happening both inside the school and out, be they political, societal, international, or other current events, a school paper should at least meet the standard of providing a representation of the school's morals and values and help the institution uplift its image. The researchers seek to ascertain how the two student publications from the Philippines' two prime state universities, the University of the Philippines' Philippine Collegian and the Polytechnic University of the Philippines' the Catalyst, present their schools through balanced journalism and objective documentation. The objectives include determining the number of school-related articles published versus those concerned with matters outside the school's jurisdiction, analyzing the insight they provide on the image of the university, assessing the similarities and/or differences between the two publications, and, finally, coming up with a conclusion on how the two newspapers use their medium to present their respective schools. The study used a quantitative research method in order to further analyze the articles that serve as bases for coming up with the right conclusion based on the objectives of the study. Coding sheets and coding guides are utilized for the chosen research method. The gathered findings are then interpreted in line with the goal of the research.
Keywords: content analysis, journalistic presentation, student publications, state universities
Procedia PDF Downloads 181
15956 Landslide Hazard Zonation Using Satellite Remote Sensing and GIS Technology
Authors: Ankit Tyagi, Reet Kamal Tiwari, Naveen James
Abstract:
Landslides are a major geo-environmental problem of the Himalaya because of its high ridges, steep slopes, deep valleys, and complex system of streams. They are mainly triggered by rainfall and earthquakes and cause severe damage to life and property. In Uttarakhand, the Tehri reservoir rim area, which is situated in the lesser Himalaya of the Garhwal hills, was selected for landslide hazard zonation (LHZ). The study utilized different types of data, including geological maps, topographic maps from the Survey of India, Landsat 8 imagery, and Cartosat DEM data. This paper presents the use of a weighted overlay method for LHZ using fourteen causative factors. The data layers generated and co-registered were slope, aspect, relative relief, soil cover, intensity of rainfall, seismic ground shaking, seismic amplification at surface level, lithology, land use/land cover (LULC), normalized difference vegetation index (NDVI), topographic wetness index (TWI), stream power index (SPI), drainage buffer and reservoir buffer. Seismic analysis is performed using peak horizontal acceleration (PHA) intensity and amplification factors in the evaluation of the landslide hazard index (LHI). Several digital image processing techniques, such as topographic correction, NDVI, and supervised classification, were widely used in the process of terrain factor extraction. Lithological features, LULC, drainage pattern, lineaments, and structural features were extracted using digital image processing techniques. Colour, tone, topography, and stream drainage pattern from the imagery were used to analyse geological features. The slope map, aspect map, and relative relief were created using Cartosat DEM data, which were also used for the detailed drainage analysis, including TWI, SPI, drainage buffer, and reservoir buffer. In the weighted overlay method, the comparative importance of the causative factors is obtained from experience. In this method, each layer is reclassified after multiplying the influence factor by the corresponding rating of a particular class, and the LHZ map is prepared. Further, based on the land-use map developed from the remote sensing images, a landslide vulnerability study for the study area is carried out and presented in this paper.
Keywords: weighted overlay method, GIS, landslide hazard zonation, remote sensing
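The weighted overlay computation itself reduces to a weighted sum of reclassified raster layers. The sketch below shows the idea with three hypothetical factor rasters, weights and class breaks (the actual study uses fourteen layers with expert-assigned weights).

```python
import numpy as np

# Hypothetical reclassified factor rasters (ratings 1-5 per cell) standing in
# for layers such as slope, rainfall intensity and NDVI; 4x4 cells for brevity.
rng = np.random.default_rng(1)
slope = rng.integers(1, 6, (4, 4))
rainfall = rng.integers(1, 6, (4, 4))
ndvi_class = rng.integers(1, 6, (4, 4))

layers = [slope, rainfall, ndvi_class]
weights = [0.5, 0.3, 0.2]  # assumed relative importance, summing to 1

# Landslide hazard index (LHI): weighted sum of the rating rasters.
lhi = sum(w * layer for w, layer in zip(weights, layers))

# Reclassify LHI into hazard zones (low / moderate / high) by fixed breaks.
zones = np.digitize(lhi, bins=[2.5, 3.5])  # 0 = low, 1 = moderate, 2 = high
print("LHI:\n", np.round(lhi, 2))
print("hazard zones:\n", zones)
```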
Procedia PDF Downloads 133
15955 Using Nonhomogeneous Poisson Process with Compound Distribution to Price Catastrophe Options
Authors: Rong-Tsorng Wang
Abstract:
In this paper, we derive a pricing formula for catastrophe equity put options (CatEPut) with non-homogeneous losses and approximated compound distributions. We assume that the loss claims arrival process is a nonhomogeneous Poisson process (NHPP) representing the clustering of loss claim occurrences, that the sizes of the loss claims are a sequence of independent and identically distributed random variables, and that the accumulated loss forms a compound distribution approximated by a heavy-tailed distribution. A numerical example is given to calibrate the parameters, and we discuss how the value of the CatEPut is affected by changes in the parameters of the pricing model provided.
Keywords: catastrophe equity put options, compound distributions, nonhomogeneous Poisson process, pricing model
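To make the pricing ingredients concrete, the following Monte Carlo sketch simulates the aggregate loss of a nonhomogeneous Poisson process with i.i.d. claim sizes and prices an illustrative catastrophe equity put whose payoff depends on an equity price depressed by accumulated losses. The intensity function, severity distribution, loss-impact parameter and payoff form are simplified assumptions for illustration only, not the closed-form model derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def nhpp_times(rate_fn, rate_max, T):
    """Simulate NHPP arrival times on [0, T] by thinning a rate_max Poisson process."""
    t, times = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > T:
            return np.array(times)
        if rng.uniform() < rate_fn(t) / rate_max:
            times.append(t)

# Assumed seasonal claim-arrival intensity (clustering of catastrophes).
rate_fn = lambda t: 2.0 + 1.5 * np.sin(2 * np.pi * t)
T, rate_max = 1.0, 3.5
S0, K, r, sigma, alpha = 100.0, 95.0, 0.02, 0.25, 0.02  # alpha: loss impact on equity

n_paths, payoffs = 20_000, []
for _ in range(n_paths):
    n_claims = nhpp_times(rate_fn, rate_max, T).size
    losses = rng.lognormal(mean=1.0, sigma=0.8, size=n_claims)
    L = losses.sum()                       # compound aggregate loss over [0, T]
    z = rng.normal()
    S_T = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z - alpha * L)
    payoffs.append(max(K - S_T, 0.0))      # illustrative CatEPut-style payoff

price = np.exp(-r * T) * np.mean(payoffs)
print("Monte Carlo CatEPut estimate:", round(price, 3))
```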
Procedia PDF Downloads 167
15954 Analysis on Yogyakarta Istimewa Citygates on Urban Area Arterial Roads
Authors: Nizar Caraka Trihanasia, Suparwoko
Abstract:
The purpose of this paper is to analyze the design model of city gates on arterial roads as Yogyakarta’s “Istimewa” (special) identity. City marketing has become a trend among cities in the past few years; cities have begun to compete with each other in promoting their identities to the world. One of the easiest ways to recognize that identity is through the image of the city, which can be seen in architectural buildings or urban elements. The idea is to examine how the image of the city can represent Yogyakarta’s identity, limited here to the contribution of the city gates’ distinctiveness in the Yogyakarta urban area. This study concentrates on the city gates as a built environment that provides a diversity, configuration and scale of development that promotes a sense of place and community. A visual analysis is conducted to interpret the existing Yogyakarta city gates (as built environment), focusing on the variables of 1) character and pattern, 2) circulation system establishment, and 3) open space utilisation. A literature review and site survey are also conducted to understand the relationship between the built environment and the sense of place in the community. This study suggests that, visually, the Yogyakarta city gate model has strong visual character and pattern, drawing on the sense of place embedded in Yogyakarta community values.
Keywords: visual analysis, model, Yogyakarta “Istimewa”, citygates
Procedia PDF Downloads 258
15953 The Role of Media Relations in the Brand Image: Case Study in Three Brands of the Automobile Industry
Authors: Rosa Sobreira, Paula Arriscado
Abstract:
Marketers are aware that media relations is an important, and also cheaper, touch point for bringing their products and brands to the consumer. They recognize the role of journalists as moderators and transformers of public opinion, and they realize their influence on brand image. They also know that readers, listeners, viewers and internet users "believe" what they read, hear and see in the news more than in an advertisement. The study focuses on the automotive industry and analyses the news published about three brands that share industrial facilities and components. We wanted to understand the role of the information created by the brands' media teams in the journalists' work, and its impact on the management, activation and differentiation of the brands and their products' attributes and benefits. Based on a qualitative methodology, the analysis focused on press news, comparing media coverage and its "narratives" about the three cars from the different brands. The results point to the fact that journalists readily integrate the brands' discourse on their products. In the case of this study, we found that, apart from describing the many similarities between the three cars, the media discourse also "struggled" to reveal the attributes that differentiate them. This interpretation of the results helps us to understand the "marriage" between branding and media. We also believe this paper helps us understand how journalists, through the news, adopt the discourse of the brands.
Keywords: brand management, media relations, differentiation, positioning
Procedia PDF Downloads 225
15952 The Effect of Online Learning During the COVID-19 Pandemic on Student Mental
Authors: Adelia Desi Agnesita
Abstract:
The advent of a new disease called COVID-19 brought many major changes to the world, one of which is in the process of learning and teaching. Learning that was formerly offline is now done online, which requires students to adapt to the new learning process. The COVID-19 pandemic, occurring almost worldwide, means that activities involving many people have to be avoided, one of which is teaching and learning. In Indonesia, since March 2020, college learning has been shifting to online/long-distance learning to prevent the spread of COVID-19. Online learning presents students with obstacles such as poor signal, heavy task loads, lack of focus, difficulty sleeping, and resulting stress.
Keywords: learning, online, covid-19, pandemic
Procedia PDF Downloads 214
15951 Experimental Investigations on the Mechanism of Stratified Liquid Mixing in a Cylinder
Authors: Chai Mingming, Li Lei, Lu Xiaoxia
Abstract:
In this paper, the mechanism of stratified liquids' mixing in a cylinder is investigated, focusing on the effects of Rayleigh-Taylor Instability (RTI) and rotation of the cylinder on liquid interface mixing. For miscible liquids, the Planar Laser Induced Fluorescence (PLIF) technique is applied to record the concentration field of one liquid, and the Intensity of Segregation (IOS) is used to describe the mixing status. For immiscible liquids, a high-speed camera is adopted to record the development of the interface. The RTI experiment indicates that it plays a great role in the mixing process: large-scale mixing is triggered, and subsequently the span of the stripes decreases, showing that mesoscale mixing is coming into being. The rotation experiments show that the spin-down process plays a great role in liquid mixing, during which the upper liquid falls down rapidly along the wall and crashes into the lower liquid. During this process, many interface instabilities are excited, and the liquids mix rapidly. It can be concluded that, no matter which means are adopted to speed up liquid mixing, the fundamental mechanism is the interface instabilities, which increase the interfacial area between the liquids and the relative velocity of the two liquids.
Keywords: interface instability, liquid mixing, Rayleigh-Taylor Instability, spin-down process, spin-up process
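The intensity of segregation (IOS) used above to quantify mixing can be computed directly from a PLIF concentration field. The sketch below assumes Danckwerts' definition (concentration variance normalized by its fully segregated value) and uses a synthetic striped field in place of real image data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic normalized concentration field (0 = pure liquid A, 1 = pure liquid B)
# standing in for a PLIF image: a partly mixed stripe pattern plus noise.
x = np.linspace(0, 1, 256)
field = 0.5 + 0.4 * np.sign(np.sin(8 * np.pi * x))[None, :] * np.ones((256, 1))
field = np.clip(field + rng.normal(0, 0.02, field.shape), 0, 1)

def intensity_of_segregation(c):
    """Danckwerts IOS: concentration variance over the variance of a fully
    segregated field with the same mean; 1 = unmixed, 0 = fully mixed."""
    c_mean = c.mean()
    return c.var() / (c_mean * (1.0 - c_mean))

print("IOS of the synthetic field:", round(intensity_of_segregation(field), 3))
```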
Procedia PDF Downloads 301
15950 Hyperspectral Data Classification Algorithm Based on the Deep Belief and Self-Organizing Neural Network
Authors: Li Qingjian, Li Ke, He Chun, Huang Yong
Abstract:
In this paper, a method combining Pohl Seidman's deep belief network with the self-organizing neural network is proposed to classify targets. The method is mainly aimed at the high nonlinearity of hyperspectral images, the high sample dimension and the difficulty of designing a classifier. The main features of the original data are extracted by the deep belief network. In the feature extraction process, known labeled samples are added to fine-tune the network, enriching the main characteristics. Then, the extracted feature vectors are classified by the self-organizing neural network. This method can effectively reduce the dimensionality of the data in the spectral dimension while preserving a large amount of the raw data information; it addresses the problems of traditional clustering and of the long training time of deep learning algorithms when labeled samples are scarce, and it improves classification accuracy and robustness. Simulation results show that the proposed network structure can achieve higher classification precision with a small number of labeled samples.
Keywords: DBN, SOM, pattern classification, hyperspectral, data compression
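As a rough stand-in for the pipeline above, the sketch below extracts features with a single restricted Boltzmann machine (one building block of a deep belief network, here scikit-learn's BernoulliRBM) and clusters them with a small self-organizing map written in NumPy. The data are random placeholders for hyperspectral pixels; the real method would stack several RBM layers and fine-tune with the labeled samples.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.preprocessing import minmax_scale

rng = np.random.default_rng(0)
pixels = minmax_scale(rng.random((500, 120)))  # 500 pixels x 120 spectral bands (synthetic)

# Feature extraction with one RBM layer (a DBN would stack several of these).
rbm = BernoulliRBM(n_components=20, learning_rate=0.05, n_iter=20, random_state=0)
features = rbm.fit_transform(pixels)

# Minimal self-organizing map: a 4x4 grid of prototype vectors.
grid_h, grid_w, dim = 4, 4, features.shape[1]
weights = rng.random((grid_h * grid_w, dim))
coords = np.array([(i, j) for i in range(grid_h) for j in range(grid_w)], dtype=float)

for epoch in range(30):
    lr = 0.5 * (1 - epoch / 30)                  # decaying learning rate
    radius = 2.0 * (1 - epoch / 30) + 0.5        # shrinking neighborhood radius
    for f in features:
        bmu = np.argmin(np.linalg.norm(weights - f, axis=1))  # best matching unit
        # Gaussian neighborhood on the map grid around the best matching unit.
        d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
        h = np.exp(-d2 / (2 * radius ** 2))
        weights += lr * h[:, None] * (f - weights)

labels = np.array([np.argmin(np.linalg.norm(weights - f, axis=1)) for f in features])
print("cluster sizes:", np.bincount(labels, minlength=grid_h * grid_w))
```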
Procedia PDF Downloads 341
15949 Aerogel Fabrication Via Modified Rapid Supercritical Extraction (RSCE) Process - Needle Valve Pressure Release
Authors: Haibo Zhao, Thomas Andre, Katherine Avery, Alper Kiziltas, Deborah Mielewski
Abstract:
Silica aerogels were fabricated through a modified rapid supercritical extraction (RSCE) process. The silica aerogels were made using a tetramethyl orthosilicate precursor, then placed in a hot press and brought to the supercritical point of the solvent, ethanol. In order to control the pressure release without a pressure controller, a needle valve was used. The resulting aerogels were then characterized for their physical and chemical properties and compared to silica aerogels created using similar methods. The aerogels fabricated using this modified RSCE method were found to have properties similar to those reported in other papers using the unmodified RSCE method. A silica aerogel-infused glass blanket composite and a graphene-reinforced silica aerogel composite were also successfully fabricated by this new method. The modified RSCE process and system is a prototype for better gas outflow control with a lower cost of equipment setup. Potentially, this process could evolve into a continuous, low-cost, high-volume production process that meets automotive requirements.
Keywords: aerogel, automotive, rapid supercritical extraction process, low cost production
Procedia PDF Downloads 184
15948 A Survey of 2nd Year Students' Frequent Writing Error and the Effects of Participatory Error Correction Process
Authors: Chaiwat Tantarangsee
Abstract:
The purposes of this study are 1) to study the effects of a participatory error correction process and 2) to find out the students' satisfaction with such an error correction process. This study is quasi-experimental research with a single group, in which data were collected five times before and after four experimental rounds of the participatory error correction process, which included providing coded indirect corrective feedback on the students' texts together with error treatment activities. The sample includes 28 second-year English major students of the Faculty of Humanities and Social Sciences, Suan Sunandha Rajabhat University. The tool for the experimental study is the lesson plan of the course Reading and Writing English for Academic Purposes II, and the tools for data collection include five writing tests of short texts and a questionnaire. Based on formative evaluation of the students' writing ability prior to and after each of the four experiments, the findings show higher student scores, with a statistically significant difference at the 0.05 level. Moreover, in terms of the effect size of the process, the comparison of the students' mean scores before and after the four experiments gives d values of 1.0046, 1.1374, 1.297, and 1.0065, respectively. It can be concluded that the participatory error correction process enables all of the students to learn equally well and improves their ability to write short texts. Finally, the students' overall satisfaction with the participatory error correction process is at a high level (Mean = 4.32, S.D. = 0.92).
Keywords: coded indirect corrective feedback, participatory error correction process, error treatment, humanities and social sciences
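The effect sizes reported above correspond to a Cohen's d style comparison of pre- and post-test means. A minimal sketch of that calculation on hypothetical score vectors (not the study's data) is shown below.

```python
import numpy as np

def cohens_d(pre, post):
    """Effect size d = (mean difference) / pooled standard deviation."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2.0)
    return (post.mean() - pre.mean()) / pooled_sd

# Hypothetical writing-test scores of 28 students before and after one round.
rng = np.random.default_rng(7)
pre = rng.normal(12.0, 2.0, 28)
post = pre + rng.normal(2.0, 1.5, 28)  # improvement after error treatment
print("effect size d:", round(cohens_d(pre, post), 4))
```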
Procedia PDF Downloads 523
15947 Performance Evaluation of Production Schedules Based on Process Mining
Authors: Kwan Hee Han
Abstract:
The external environment of enterprises is rapidly changing, driven mainly by global competition, cost reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it must efficiently utilize resource capacity while carefully considering many interacting constraints. At present, many computerized software solutions are utilized in enterprises to generate a realistic production schedule and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable the useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of a production scheduling software system using process mining techniques, since every software system generates event logs for further use such as security investigation, auditing and debugging. An application of the process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems. By using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the workload balance of each machine over time are measured, and finally, the goodness of the production schedule is evaluated. By using the proposed process mining approach for evaluating the performance of generated production schedules, the quality of the production schedules of manufacturing enterprises can be improved.
Keywords: data mining, event log, process mining, production scheduling
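A minimal pandas sketch of the kind of event-log analysis described above is given below: it computes workstation utilization and the average waiting time before each workstation (a simple bottleneck indicator) from start/complete events. The log columns and values are hypothetical, not taken from an actual scheduling system.

```python
import pandas as pd

# Hypothetical event log exported by a scheduling system: one row per event.
log = pd.DataFrame({
    "case":        ["J1", "J1", "J1", "J1", "J2", "J2", "J2", "J2"],
    "workstation": ["WS1", "WS1", "WS2", "WS2", "WS1", "WS1", "WS2", "WS2"],
    "event":       ["start", "complete"] * 4,
    "timestamp": pd.to_datetime([
        "2024-01-01 08:00", "2024-01-01 09:00",   # J1 on WS1
        "2024-01-01 09:10", "2024-01-01 10:30",   # J1 on WS2
        "2024-01-01 09:00", "2024-01-01 10:00",   # J2 on WS1
        "2024-01-01 10:40", "2024-01-01 11:20",   # J2 on WS2
    ]),
})

# Pivot to one row per (case, workstation) with start and complete times.
tasks = log.pivot_table(index=["case", "workstation"], columns="event",
                        values="timestamp", aggfunc="first").reset_index()
tasks["busy_h"] = (tasks["complete"] - tasks["start"]).dt.total_seconds() / 3600

# Workstation utilization over the makespan of the schedule.
makespan_h = (log["timestamp"].max() - log["timestamp"].min()).total_seconds() / 3600
utilization = tasks.groupby("workstation")["busy_h"].sum() / makespan_h

# Waiting time before each task (gap since the previous task of the same case).
tasks = tasks.sort_values(["case", "start"])
tasks["wait_h"] = (tasks["start"] - tasks.groupby("case")["complete"].shift()
                   ).dt.total_seconds() / 3600
print("utilization:\n", utilization.round(2))
print("mean wait before workstation:\n",
      tasks.groupby("workstation")["wait_h"].mean().round(2))
```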
Procedia PDF Downloads 279
15946 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data
Authors: Wanhyun Cho, Soonja Kang, Sanggoon Kim, Soonyoung Park
Abstract:
We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. Here, we consider an efficient computational method that can be used to obtain the approximate posteriors of the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate the process of inducing a posterior distribution for the various parameters and the latent function by using variational Bayesian approximations and the importance sampling method, and next we derive the predictive distribution of the latent function needed to classify new samples. The proposed model is applied to classify a synthetic multivariate dataset in order to verify its performance. Experimental results show that our model is more accurate than the other approximation methods.
Keywords: multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence
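For readers who want a runnable baseline, the sketch below fits an off-the-shelf multiclass Gaussian process classifier (scikit-learn's Laplace-approximation GPC with a one-vs-rest scheme) to a synthetic multivariate dataset. This is only a stand-in for comparison purposes and does not implement the variational multinomial Dirichlet model derived in the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import train_test_split

# Synthetic multidimensional, multiclass data, analogous to the paper's test set.
X, y = make_classification(n_samples=300, n_features=8, n_informative=5,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# GP classifier with an RBF kernel prior; 'one_vs_rest' handles the multiclass
# case (unlike the joint multinomial treatment in the paper).
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0),
                                multi_class="one_vs_rest", random_state=0)
gpc.fit(X_tr, y_tr)
print("test accuracy:", round(gpc.score(X_te, y_te), 3))
```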
Procedia PDF Downloads 444
15945 Image Multi-Feature Analysis by Principal Component Analysis for Visual Surface Roughness Measurement
Authors: Wei Zhang, Yan He, Yan Wang, Yufeng Li, Chuanpeng Hao
Abstract:
Surface roughness is an important index for evaluating surface quality and needs to be accurately measured to ensure the performance of the workpiece. Roughness measurement based on machine vision involves various image features, some of which are redundant. These redundant features affect the accuracy and speed of the visual approach. Previous research used correlation analysis methods to select appropriate features. However, such feature analysis treats features independently and cannot fully utilize the information in the data. Besides, blindly reducing features loses a lot of useful information, resulting in unreliable results. Therefore, the focus of this paper is on providing a redundant-feature removal approach for visual roughness measurement. In this paper, statistical methods and the gray-level co-occurrence matrix (GLCM) are employed to effectively extract the texture features of machined images. Then, principal component analysis (PCA) is used to fuse all extracted features into a new one, which reduces the feature dimension and maintains the integrity of the original information. Finally, the relationship between the new features and roughness is established by the support vector machine (SVM). The experimental results show that the approach can effectively resolve the multi-feature information redundancy of machined surface images and provides a new idea for the visual evaluation of surface roughness.
Keywords: feature analysis, machine vision, PCA, surface roughness, SVM
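A compact sketch of the described pipeline (GLCM texture features → PCA fusion → support vector regression against roughness) is given below. It uses synthetic image patches and roughness values, and assumes scikit-image's graycomatrix/graycoprops for the GLCM step (spelled greycomatrix in older releases); the regression variant of the SVM is used since roughness is continuous.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # greycomatrix in older scikit-image
from sklearn.decomposition import PCA
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def texture_features(img):
    """Statistical + GLCM features for one grayscale machined-surface image."""
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    glcm_feats = [graycoprops(glcm, p).mean()
                  for p in ("contrast", "homogeneity", "energy", "correlation")]
    return np.array([img.mean(), img.std(), *glcm_feats])

# Synthetic 64x64 patches whose noise level loosely mimics roughness Ra.
ra = rng.uniform(0.4, 3.2, 40)
images = [np.clip(128 + rng.normal(0, 20 * r, (64, 64)), 0, 255).astype(np.uint8)
          for r in ra]
X = np.vstack([texture_features(im) for im in images])

# Fuse the redundant features into a few principal components, then regress.
X_pca = PCA(n_components=3).fit_transform(X)
model = SVR(kernel="rbf", C=10.0).fit(X_pca, ra)
print("training R^2 on synthetic data:", round(model.score(X_pca, ra), 3))
```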
Procedia PDF Downloads 212
15944 Simulation Study of Asphaltene Deposition and Solubility of CO2 in the Brine during Cyclic CO2 Injection Process in Unconventional Tight Reservoirs
Authors: Rashid S. Mohammad, Shicheng Zhang, Sun Lu, Syed Jamal-Ud-Din, Xinzhe Zhao
Abstract:
A compositional reservoir simulation model (CMG-GEM) was used for the cyclic CO2 injection process in an unconventional tight reservoir. Cyclic CO2 injection is an enhanced oil recovery process consisting of injection, shut-in, and production. The behavior of cyclic CO2 injection and hydrocarbon recovery in ultra-low permeability reservoirs is mainly a function of rock, fluid, and operational parameters. CMG-GEM was used to study several design parameters of the cyclic CO2 injection process in order to distinguish the parameters with the maximum effect on oil recovery and to comprehend the behavior of cyclic CO2 injection in tight reservoirs. On the other hand, permeability reduction induced by asphaltene precipitation is one of the major issues in the oil industry due to plugging of the porous media, which reduces oil productivity. In addition to asphaltene deposition, solubility of CO2 in the aquifer is one of the safest and most permanent trapping mechanisms when considering CO2 storage in geological formations. However, the effects of the above uncertain parameters on the process of CO2 enhanced oil recovery have not been understood systematically. Hence, it is absolutely necessary to study the most significant parameters which dominate the process. The main objective of this study is to improve techniques for designing the cyclic CO2 injection process while considering the effects of asphaltene deposition and solubility of CO2 in the brine, in order to prevent asphaltene precipitation, minimize CO2 emission, optimize cyclic CO2 injection, and maximize oil production.
Keywords: tight reservoirs, cyclic CO₂ injection, asphaltene, solubility, reservoir simulation
Procedia PDF Downloads 386
15943 Defect Management Life Cycle Process for Software Quality Improvement
Authors: Aedah Abd Rahman, Nurdatillah Hasim
Abstract:
Software quality issues require special attention, especially in view of the demand for quality software products to meet customer satisfaction. Software development projects in most organisations need a proper defect management process in order to produce high-quality software products and reduce the number of defects. The research question of this study is how to produce high-quality software and reduce the number of defects. Therefore, the objective of this paper is to provide a framework for managing software defects by following defined life cycle processes. The methodology starts by reviewing defects, defect models, best practices and standards. A framework for the defect management life cycle is proposed. The major contribution of this study is to define a defect management road map for software development. The adoption of an effective defect management process helps to achieve the ultimate goal of producing high-quality software products and contributes towards continuous software process improvement.
Keywords: defects, defect management, life cycle process, software quality
Procedia PDF Downloads 306
15942 Treadmill Negotiation: The Stagnation of the Israeli – Palestinian Peace Process
Authors: Itai Kohavi, Wojciech Nowiak
Abstract:
This article explores the stagnation of the Israeli-Palestinian peace negotiation process and the reasons behind the failure of more than 12 international initiatives to resolve the conflict. Twenty-seven top members of the Israeli national security elite (INSE) were interviewed, including heads of the negotiation teams, the National Security Council, the Mossad, and other intelligence and planning arms. The interviewees provided their insights on the Israeli challenges in reaching a sustainable and stable peace agreement and in dealing with the international pressure on Israel to negotiate a peace agreement while preventing anti-Israeli UN decisions and sanctions. The findings revealed a decision tree, with red herring deception strategies implemented to postpone the negotiation process and to delay major decisions during it. Beyond the possible applications to the Israeli-Palestinian conflict, the findings shed more light on the phenomenon of rational deception of allies in a negotiation process, a subject researched less frequently than deception of rivals.
Keywords: deception, Israeli-Palestinian conflict, negotiation, red herring, terrorist state, treadmill negotiation
Procedia PDF Downloads 303
15941 Ischemic Stroke Detection in Computed Tomography Examinations
Authors: Allan F. F. Alves, Fernando A. Bacchim Neto, Guilherme Giacomini, Marcela de Oliveira, Ana L. M. Pavan, Maria E. D. Rosa, Diana R. Pina
Abstract:
Stroke is a worldwide concern; in Brazil alone, it accounts for 10% of all registered deaths. There are two stroke types, ischemic (87%) and hemorrhagic (13%). Early diagnosis is essential to avoid irreversible cerebral damage. Non-enhanced computed tomography (NECT) is one of the main diagnostic techniques used due to its wide availability and rapid diagnosis. Detection depends on the size and severity of the lesions and the time elapsed between the first symptoms and the examination. The Alberta Stroke Program Early CT Score (ASPECTS) is a subjective method that increases the detection rate. The aim of this work was to implement an image segmentation system to enhance ischemic stroke lesions and to quantify the area of ischemic and hemorrhagic stroke lesions in CT scans. We evaluated 10 patients with NECT examinations diagnosed with ischemic stroke. Analyses were performed on two axial slices, one at the level of the thalamus and basal ganglia and one adjacent to the top edge of the ganglionic structures, with a window width between 80 and 100 Hounsfield units. We used different image processing techniques such as morphological filters, the discrete wavelet transform and Fuzzy C-means clustering. Subjective analyses were performed by a neuroradiologist according to the ASPECTS scale to quantify ischemic areas in the middle cerebral artery region. These subjective results were compared with objective analyses performed by the computational algorithm. Preliminary results indicate that the morphological filters indeed improve the ischemic areas for subjective evaluation. The comparison between the area of the ischemic region contoured by the neuroradiologist and the area defined by the computational algorithm showed no deviations greater than 12% in any of the 10 examinations, although there is a tendency for the areas contoured by the neuroradiologist to be smaller than those obtained by the algorithm. These results show the importance of computer-aided diagnosis software to assist neuroradiological decisions, especially in critical situations such as the choice of treatment for ischemic stroke.
Keywords: ischemic stroke, image processing, CT scans, Fuzzy C-means
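The Fuzzy C-means step mentioned above can be sketched in a few lines of NumPy: the code below clusters the intensity values of a windowed CT slice so that hypodense (ischemic) tissue falls into its own cluster. The slice here is a synthetic array, and the cluster count and fuzziness exponent are assumed values, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuzzy_cmeans(x, n_clusters=2, m=2.0, n_iter=50):
    """Minimal Fuzzy C-means on 1-D samples x; returns centers and memberships."""
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)            # random initial memberships
    for _ in range(n_iter):
        um = u ** m
        centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
        d = np.abs(x[:, None] - centers[None, :]) + 1e-9
        inv = 1.0 / d ** (2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True) # standard FCM membership update
    return centers, u

# Synthetic NECT slice displayed with a narrow window (width 80-100 HU):
# normal parenchyma plus a slightly hypodense ischemic region.
slice_hu = rng.normal(34, 2, (64, 64))                 # normal brain (~34 HU)
slice_hu[20:35, 20:40] = rng.normal(28, 2, (15, 20))   # ischemic (lower HU)

centers, u = fuzzy_cmeans(slice_hu.ravel())
ischemic_mask = (u.argmax(axis=1) == centers.argmin()).reshape(slice_hu.shape)
print("cluster centers (HU):", np.round(np.sort(centers), 1))
print("estimated ischemic area (pixels):", int(ischemic_mask.sum()))
```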
Procedia PDF Downloads 366
15940 Distribution-Free Exponentially Weighted Moving Average Control Charts for Monitoring Process Variability
Authors: Chen-Fang Tsai, Shin-Li Lu
Abstract:
Distribution-free control charts have been an emerging area of statistical process control in recent years. Several researchers have developed various nonparametric control charts and investigated their detection capability. The major advantage of nonparametric control charts is that the underlying process is not required to satisfy the assumption of normality or any parametric distribution. In this paper, two nonparametric exponentially weighted moving average (EWMA) control charts based on nonparametric tests, namely the NE-S and NE-M control charts, are proposed for monitoring process variability. They are further extended to generally weighted moving average (GWMA) control charts by utilizing design and adjustment parameters for monitoring changes in process variability, namely the NG-S and NG-M control charts. The statistical performance of the NG-S and NG-M control charts with run rules is also investigated. Moreover, a sensitivity analysis is performed to show the effects of the design parameters on the nonparametric NG-S and NG-M control charts.
Keywords: distribution-free control chart, EWMA control charts, GWMA control charts
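For readers unfamiliar with the charting mechanics, the sketch below applies a plain EWMA statistic to a per-subgroup dispersion measure and flags samples outside the control limits. The smoothing constant, limit multiplier and phase-I estimation are illustrative defaults for a parametric-style chart, not the nonparametric NE-S/NE-M or NG-S/NG-M designs proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Subgrouped data: 30 in-control subgroups of size 5, then 20 with inflated variance.
subgroups = np.vstack([rng.normal(0, 1.0, (30, 5)), rng.normal(0, 1.8, (20, 5))])
s = subgroups.std(axis=1, ddof=1)          # per-subgroup dispersion statistic

def ewma(stat, lam=0.2, L=3.0, n_phase1=30):
    """EWMA z_t = lam*stat_t + (1-lam)*z_{t-1}, with asymptotic +/- L-sigma
    limits estimated from the first n_phase1 in-control subgroups."""
    mu0, sigma0 = stat[:n_phase1].mean(), stat[:n_phase1].std(ddof=1)
    z, z_prev = np.empty_like(stat), mu0
    for t, st in enumerate(stat):
        z_prev = lam * st + (1 - lam) * z_prev
        z[t] = z_prev
    half_width = L * sigma0 * np.sqrt(lam / (2 - lam))
    return z, mu0 - half_width, mu0 + half_width

z, lcl, ucl = ewma(s)
signals = np.where((z > ucl) | (z < lcl))[0]
print("first out-of-control signal at subgroup:",
      int(signals[0]) if signals.size else None)
```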
Procedia PDF Downloads 272
15939 A System for Visual Management of Research Resources Focusing on Accumulation of Polish Processes
Authors: H. Anzai, H. Nakayama, H. Kaminaga, Y. Morimoto, Y. Miyadera, S. Nakamura
Abstract:
Various research resources, such as papers and presentation slides, are handled in the course of research activities. It is extremely important for the smooth progress of research to skillfully manage those research resources and utilize them for further investigations. However, the number of research resources keeps increasing, and there are differences in the usage and accumulation styles of each kind of research resource. So, it is actually difficult to satisfactorily manage and use the accumulated research resources, and a lack of tidiness of the resources causes problems such as overlooking issues that should be polished. Although there have been research projects on supporting the management of research resources and the sharing of know-how, almost all existing systems have not been effective enough, since they have not sufficiently considered the polish process. This paper mainly describes a system that enables the strategic management of research resources together with their polish processes and their practical use.
Keywords: research resource, polish process, information sharing, knowledge management, information visualization
Procedia PDF Downloads 389
15938 Role of Process Parameters on Pocket Milling with Abrasive Water Jet Machining Technique
Authors: T. V. K. Gupta, J. Ramkumar, Puneet Tandon, N. S. Vyas
Abstract:
Abrasive Water Jet Machining (AWJM) is an unconventional machining process well known for machining hard-to-cut materials. The primary research focus on the process has been through-cutting, and very limited literature is available on pocket milling using AWJM. The present work is an attempt to use this process for milling applications considering a set of various process parameters. Four input parameters, which were considered by researchers for part separation, are selected for the above application, i.e., abrasive size, flow rate, standoff distance, and traverse speed. Pockets of definite size are machined to investigate surface roughness, material removal rate, and pocket depth. Based on the data available through experiments on SS304 material, it is observed that higher traverse speeds give a better finish because of the reduction in particle energy density, and a lower depth is also observed. An increase in standoff distance and abrasive flow rate reduces the material removal rate, as the jet loses its focus and collisions occur among the particles. ANOVA for each individual output parameter has been carried out to identify the significant process parameters.
Keywords: abrasive flow rate, surface finish, abrasive size, standoff distance, traverse speed
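The ANOVA step mentioned at the end can be reproduced in a few lines. The sketch below runs a one-way ANOVA of surface roughness against traverse-speed level using SciPy, with hypothetical roughness readings in place of the SS304 measurements.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Hypothetical Ra readings (um) at three traverse-speed levels; higher speed is
# assumed here to give a slightly better (lower) roughness, as reported.
ra_low_speed = rng.normal(6.5, 0.4, 6)
ra_mid_speed = rng.normal(6.0, 0.4, 6)
ra_high_speed = rng.normal(5.4, 0.4, 6)

f_stat, p_value = f_oneway(ra_low_speed, ra_mid_speed, ra_high_speed)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) indicates traverse speed significantly affects
# surface roughness for this (synthetic) data set.
```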
Procedia PDF Downloads 304
15937 Influence of Ligature Tightening on Bone Fracture Risk in Interspinous Process Surgery
Authors: Dae Kyung Choi, Won Man Park, Kyungsoo Kim, Yoon Hyuk Kim
Abstract:
Interspinous process devices have recently been used due to their advantages, such as minimal invasiveness and less subsidence of the implant into osteoporotic bone. In this paper, we analyzed the influence of ligature tightening for several interspinous process devices using finite element analysis. Four types of interspinous process implants were inserted into the L3-4 spinal motion segment based on their surgical protocols. The inferior plane of the L4 vertebra was fixed, and an extension moment of 7.5 Nm was applied on the superior plane of the L3 vertebra with a 400 N compressive load along the follower load direction and pretension of the ligature. The stability of the spinal unit was higher than that of the intact model. Higher values of pretension in the ligature led to a decrease in the dynamic stabilization effect for the Wallis™, Diam™, Viking, and Spear® devices. The results of the present study could be used to evaluate surgical options and validate the biomechanical characteristics of the spinal implants.
Keywords: interspinous process device, bone fracture risk, lumbar spine, finite element analysis
Procedia PDF Downloads 400
15936 Investigation of Cavitation in a Centrifugal Pump Using Synchronized Pump Head Measurements, Vibration Measurements and High-Speed Image Recording
Authors: Simon Caba, Raja Abou Ackl, Svend Rasmussen, Nicholas E. Pedersen
Abstract:
It is a challenge to directly monitor cavitation in a pump application during operation because of a lack of visual access to validate the presence of cavitation and its form of appearance. In this work, experimental investigations are carried out in an inline single-stage centrifugal pump with optical access. Hence, this gives the opportunity to enhance the value of CFD tools and standard cavitation measurements. Experiments are conducted using two impellers running in the same volute at 3000 rpm and the same flow rate. One of the impellers used is optimized for lower NPSH₃% by its blade design, whereas the other one is manufactured using a standard casting method. The cavitation is detected by pump performance measurements, vibration measurements and high-speed image recordings. The head drop and the pump casing vibration caused by cavitation are correlated with the visual appearance of the cavitation. The vibration data are recorded in the axial direction of the impeller using accelerometers recording at a sample rate of 131 kHz. The vibration frequency-domain data (up to 20 kHz) and the time-domain data are analyzed, as well as the root mean square values. The high-speed recordings, focusing on the impeller suction side, are taken at 10,240 fps to provide insight into the flow patterns and the cavitation behavior in the rotating impeller. The videos are synchronized with the vibration time signals by a trigger signal. A clear correlation between cloud collapses and abrupt peaks in the vibration signal can be observed. The vibration peaks clearly indicate cavitation, especially at higher NPSHA values where the hydraulic performance is not affected. It is also observed that below a certain NPSHA value, the cavitation starts in the inlet bend of the pump; above this value, cavitation occurs exclusively on the impeller blades. The impeller optimized for NPSH₃% does show a lower NPSH₃% than the standard impeller, but the head drop starts at a higher NPSHA value and is more gradual. Instabilities in the head drop curve of the optimized impeller were observed, in addition to a higher vibration level. Furthermore, the cavitation clouds on the suction side appear more unsteady when using the optimized impeller. The shape and location of the cavitation are compared to 3D fluid flow simulations, and the simulation results are in good agreement with the experimental investigations. In conclusion, these investigations attempt to give a more holistic view of the appearance of cavitation by comparing the head drop, vibration spectral data, vibration time signals, image recordings and simulation results. The data indicate that a criterion for cavitation detection could be derived from the vibration time-domain measurements, which requires further investigation. Usually, spectral data are used to analyze cavitation, but these investigations indicate that the time domain could be more appropriate for some applications.
Keywords: cavitation, centrifugal pump, head drop, high-speed image recordings, pump vibration
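To illustrate how the vibration data described above are typically reduced, the sketch below computes the RMS level and the amplitude spectrum of a synthetic accelerometer signal sampled at 131 kHz, with short bursts standing in for cavitation-cloud collapses; the signal itself is simulated, not measured data.

```python
import numpy as np

fs = 131_000                      # sample rate of the axial accelerometer (Hz)
t = np.arange(0, 0.5, 1 / fs)     # 0.5 s record

rng = np.random.default_rng(0)
signal = 0.2 * np.sin(2 * np.pi * 50 * t)            # rotation-related component (50 Hz)
signal += 0.05 * rng.normal(size=t.size)             # broadband background
for t0 in rng.uniform(0, 0.5, 8):                     # 8 burst-like "cloud collapses"
    signal += 2.0 * np.exp(-((t - t0) * 5000) ** 2)

rms = np.sqrt(np.mean(signal ** 2))                   # time-domain severity indicator

# Amplitude spectrum up to 20 kHz, the band analyzed in the study.
spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)
mask = freqs <= 20_000
peak_freq = freqs[mask][np.argmax(spectrum[mask][1:]) + 1]  # skip the DC bin
print(f"RMS = {rms:.3f}, dominant frequency below 20 kHz = {peak_freq:.1f} Hz")
```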
Procedia PDF Downloads 180
15935 Communication Policies of Turkey Related to European Union
Authors: Muhammet Erbay
Abstract:
The phenomenon of communication, which has been studied by different disciplines, has social, political and economic aspects. The scope of communication has extended from traditional content to the modern world, which is under the control of mass media. Nowadays, thanks to globalization and technological facilities, many companies and public or international institutions take advantage of new communication technologies and overhaul their policies. The European Union (EU) is one of the effective institutions in this sphere. It aims to harmonize the communication infrastructure and policies of member countries which have gone through the process of political unification. It is a significant problem for the unification of the EU to have legal restrictions or critical differences in communication facilities among countries while technology stands at the center of economic and social life. Therefore, EU institutions place particular importance on their communication policies. Besides, communication processes are of vital importance in creating a European public opinion in the process of political integration. Based on the evaluation above, the aim of this paper is to analyze the cohesion process of Turkey, which tries to take an active role in EU communication policies and has ongoing negotiations. This article does not confine itself to the technical details of communication policies but also aims to evaluate the socio-political dimension of the process. Therefore, a corporate review is featured in the study, and Turkey's compliance process with European Union communication policies is evaluated by means of the deduction method. Some problematic areas have been identified in the compliance process on communication policies, such as human rights and minority rights, whereas the compliance process on communication infrastructure and technology proceeds effectively.
Keywords: communication policies, European Union, integration, Turkey
Procedia PDF Downloads 411
15934 Fast and Non-Invasive Patient-Specific Optimization of Left Ventricle Assist Device Implantation
Authors: Huidan Yu, Anurag Deb, Rou Chen, I-Wen Wang
Abstract:
The use of left ventricle assist devices (LVADs) has been a proven and effective therapy for patients with severe end-stage heart failure. Due to the limited availability of suitable donor hearts, LVADs will probably become the alternative solution for patients with heart failure in the near future. While the LVAD is being continuously improved toward enhanced performance, increased device durability and reduced size, a better understanding of implantation management becomes critical in order to achieve better long-term blood supply and fewer post-surgical complications such as thrombus generation. Important issues related to LVAD implantation include the location of the outflow grafting (OG), the angle of the OG, the combination between LVAD and native heart pumping, and uniform or pulsatile flow at the OG. We have hypothesized that the optimal implantation of an LVAD is patient specific. To test this hypothesis, we employ a novel in-house computational modeling technique, named InVascular, to conduct a systematic evaluation of cardiac output at the aortic arch together with other pertinent hemodynamic quantities for each patient under various implantation scenarios, aiming to obtain an optimal implantation strategy. InVascular is a powerful computational modeling technique that integrates unified mesoscale modeling for both image segmentation and fluid dynamics with cutting-edge GPU parallel computing. It first segments the aorta from the patient's CT image, then seamlessly feeds the extracted morphology, together with the velocity waveform from an echocardiographic ultrasound image of the same patient, into the computational model to quantify the 4-D (time + space) velocity and pressure fields. Using one NVIDIA Tesla K40 GPU card, InVascular completes a computation from CT image to 4-D hemodynamics within 30 minutes; thus it has great potential for massive numerical simulation and analysis. The systematic evaluation for one patient includes three OG anastomoses (ascending aorta, descending thoracic aorta, and subclavian artery), three combinations of LVAD and native heart pumping (1:1, 1:2, and 1:3), three angles of OG anastomosis (inclined upward, perpendicular, and inclined downward), and two LVAD inflow conditions (uniform and pulsatile). The optimal LVAD implantation is suggested through a comprehensive analysis of the cardiac output and related hemodynamics from the simulations over the fifty-four scenarios. To confirm the hypothesis, 5 random patient cases will be evaluated.
Keywords: graphic processing unit (GPU) parallel computing, left ventricle assist device (LVAD), lumped-parameter model, patient-specific computational hemodynamics
Procedia PDF Downloads 133
15933 Utilizing Reflection as a Tool for Experiential Learning through a Simulated Activity
Authors: Nadira Zaidi
Abstract:
The aim of this study is to gain direct feedback from interviewees in a simulated interview process. Reflection based on qualitative data analysis was carried out through the Gibbs Reflective Cycle, with 30 undergraduate students as respondents. The respondents reflected on the positive and negative aspects of this active learning process in order to improve their performance in actual job interviews. Results indicate that students engaged in the process successfully imbibed the feedback they received from the interviewers and also identified the areas that needed improvement.
Keywords: experiential learning, positive and negative impact, reflection, simulated
Procedia PDF Downloads 143