Search results for: process performance index.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10914

924 A Perceptually Optimized Wavelet Embedded Zero Tree Image Coder

Authors: A. Bajit, M. Nahid, A. Tamtaoui, E. H. Bouyakhf

Abstract:

In this paper, we propose a Perceptually Optimized Embedded ZeroTree Image Coder (POEZIC) that applies perceptual weighting to the wavelet transform coefficients prior to SPIHT encoding, in order to reach a targeted bit rate with improved perceptual quality relative to the quality obtained with the SPIHT algorithm alone. The paper also introduces a new objective quality metric based on a psychovisual model that integrates the properties of the human visual system (HVS); this metric plays an important role in our POEZIC quality assessment. Our POEZIC coder is based on a vision model that incorporates various masking effects of HVS perception. Thus, the coder weights the wavelet coefficients according to that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) luminance masking and contrast masking, 2) the contrast sensitivity function (CSF), which provides the perceptual decomposition weighting, and 3) the Wavelet Error Sensitivity (WES), used to reduce the perceptual quantization errors. The new perceptually optimized codec has the same complexity as the original SPIHT technique. Nevertheless, experimental results show that our coder achieves very good performance in terms of quality measurement.
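
As an illustration of the subband weighting step described above, here is a minimal Python sketch using PyWavelets: each DWT subband is scaled by a perceptual weight before being handed to an embedded coder such as SPIHT. The weight values and the bior4.4 wavelet choice are illustrative assumptions, not the paper's calibrated HVS weights.

```python
# A minimal sketch of CSF-style perceptual weighting of wavelet subbands.
# The weights below are hypothetical; the paper combines luminance masking,
# contrast masking, the CSF, and the WES to derive them.
import numpy as np
import pywt

def perceptually_weight(image, wavelet="bior4.4", levels=3, csf_weights=None):
    """Scale each DWT subband by a perceptual weight before SPIHT coding."""
    # bior4.4 is PyWavelets' biorthogonal 9/7-like linear-phase filter
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    if csf_weights is None:
        # Hypothetical weights: detail subbands from coarse to fine receive
        # decreasing weights, mimicking lower HVS sensitivity at high frequency
        csf_weights = [1.0] + [0.9 ** l for l in range(1, levels + 1)]
    weighted = [coeffs[0] * csf_weights[0]]           # approximation subband
    for l, (cH, cV, cD) in enumerate(coeffs[1:], 1):  # coarsest -> finest details
        w = csf_weights[l]
        weighted.append((cH * w, cV * w, cD * w))
    return weighted  # these weighted coefficients feed the SPIHT encoder

weighted = perceptually_weight(np.random.rand(256, 256))
```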

Keywords: DWT, linear-phase 9/7 filter, 9/7 Wavelets Error Sensitivity WES, CSF implementation approaches, JND Just Noticeable Difference, Luminance masking, Contrast masking, standard SPIHT, Objective Quality Measure, Probability Score PS.

923 Effect of Cavities on the Behaviour of Strip Footing Subjected to Inclined Load

Authors: Ali A. Al-Jazaairry, Tahsin T. Sabbagh

Abstract:

One of the important concerns within the field of geotechnical engineering is the presence of cavities in soils. The present work attempts to understand the behaviour of a strip footing subjected to inclined load and constructed on cavitied soil. The failure mechanism of strip footings located above such soils was studied analytically. The capability of the analytical model to correctly predict the system behaviour was assessed by carrying out verification analyses against available studies. The study was performed with finite element software (PLAXIS) using an elastic-perfectly plastic soil model. The results indicate that the load carrying capacity of a foundation constructed above a cavity can be analysed well with this approach. The research covered many foundation cases, and in each case there is a critical depth below which the presence of cavities has minimal impact on the foundation performance. When cavities lie above this critical depth, the load carrying capacity of the foundation varies with several factors, such as the location and size of the cavity and the footing depth. Figures relating the load carrying capacity to the factors studied are presented; these offer information useful for the design of strip footings resting above underground cavities. Moreover, the results may be used to design shallow foundations constructed on cavitied soil, while the obtained failure mechanisms may be employed to improve numerical solutions for this class of problems.

Keywords: Axial load, cavity, inclined load, strip footing.

922 Simulation Based VLSI Implementation of Fast Efficient Lossless Image Compression System Using Adjusted Binary Code & Golomb Rice Code

Authors: N. Muthukumaran, R. Ravi

Abstract:

A simulation-based VLSI implementation of the FELICS (Fast Efficient Lossless Image Compression System) algorithm is proposed to provide lossless image compression, implemented in a simulation-oriented VLSI (Very Large Scale Integration) design. The aim is to analyse lossless image compression performance, reducing the image data without losing image quality, in a VLSI-based FELICS implementation. The FELICS algorithm uses a simplified adjusted binary code and a Golomb-Rice code for image compression; the compressed image is handled pixel by pixel and then implemented in the VLSI domain. These codes are chosen to achieve high processing speed and to minimize area and power. The simplified adjusted binary code reduces the number of arithmetic operations and achieves high processing speed. A color-difference preprocessing step is also proposed to improve coding efficiency with simple arithmetic operations. The VLSI-based FELICS design provides an effective hardware architecture with a regular, pipelined data flow in four stages. With two-level parallelism, consecutive pixels are classified into even and odd samples, and an individual hardware engine is dedicated to each. This method can be further enhanced by multilevel parallelism.
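
The two entropy codes named in the abstract are concrete enough to sketch in software. Below is a minimal Python sketch assuming the standard textbook forms of the truncated ("adjusted") binary code and the Golomb-Rice code; FELICS additionally rotates the adjusted binary code so the most probable middle values receive the shorter codewords, a refinement omitted here. The VLSI data path in the paper pipelines these same operations in hardware.

```python
def adjusted_binary(x, r):
    """Truncated binary code for a value x in [0, r)."""
    k = r.bit_length() - 1          # floor(log2 r)
    u = (1 << (k + 1)) - r          # count of shorter, k-bit codewords
    if x < u:
        return format(x, f"0{k}b") if k > 0 else ""
    return format(x + u, f"0{k + 1}b")

def golomb_rice(x, k):
    """Unary-coded quotient followed by k remainder bits."""
    q, rem = x >> k, x & ((1 << k) - 1)
    return "1" * q + "0" + format(rem, f"0{k}b")

print(adjusted_binary(3, 5))  # '110': 5 symbols -> three 2-bit, two 3-bit codes
print(golomb_rice(9, 2))      # '11001': quotient 2 in unary, remainder '01'
```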

Keywords: Image compression, Pixel, Compression Ratio, Adjusted Binary code, Golomb Rice code, High Definition display, VLSI Implementation.

921 Numerical Analysis of the Effect of Geocell Reinforcement above Buried Pipes on Surface Settlement and Vertical Pressure

Authors: Waqed H. Almohammed, Mohammed Y. Fattah, Sajjad E. Rasheed

Abstract:

Dynamic traffic loads cause deformation of underground pipes, resulting in vehicle discomfort. This makes it necessary to reinforce the layers of soil above underground pipes; in this study, the subbase layer was reinforced. Finite element software (PLAXIS 3D) was used in the simulation, which included geocell reinforcement, vehicle loading, soil layers, and a Glass Fiber Reinforced Plastic (GRP) pipe. Geocell reinforcement was modeled using a geogrid element, defined as a slender structural element able to withstand axial stress but not bending; geogrids cannot carry compression, but they can carry tensile forces. Comparisons were made between the numerical models and experimental works, and good agreement was obtained. Using the numerical model, the performance of three pipe diameters (600 mm, 800 mm, and 1000 mm) and three vehicle speeds (20 km/h, 40 km/h, and 60 km/h) was examined to determine their impact on surface settlement and on vertical pressure at the pipe crown, for two cases: with and without geocell reinforcement. The results showed that, for a pipe diameter of 600 mm with geocell reinforcement, surface settlement decreases by 94% at a vehicle speed of 20 km/h and by 98% at 60 km/h. Vertical pressure decreases by 81% for a 600 mm pipe, while the reduction falls to 58% for a 1000 mm pipe. The results show that geocell reinforcement produces a significant reduction in surface settlement and in vertical stress above the pipe crown, leading to an increase in pipe safety.

Keywords: Dynamic loading, geocell reinforcement, GRP pipe, PLAXIS 3D, surface settlement.

920 A Study to Assess the Energy Saving Potential and Economic Analysis of an Agro Based Industry in Karnataka, India

Authors: Sangamesh G. Sakri, Akash N. Patil, Sadashivappa M. Kotli

Abstract:

Agro-based industries in India are considered micro, small and medium enterprises (MSME). In India, MSMEs contribute approximately 8 percent of the country’s GDP, 42 percent of the manufacturing output and 40 percent of exports. Toor dal (scientific name Cajanus cajan, commonly known as yellow gram or pigeon pea) is the second largest pulse crop in India, accounting for about 20% of total pulse production. The toor dal milling industry is one of the major agro-processing industries in the country. Most dal mills are concentrated in pulse-producing areas, which are spread all over the country. In Karnataka state, Gulbarga is a district where toor dal is the main crop and is grown extensively, with more than 500 dal mills in and around the district. However, the majority of these milling units use traditional methods of processing which are energy and capital intensive, so there exists a huge energy saving potential in these mills. An energy audit was conducted on a dal mill in Gulbarga to understand the energy consumption pattern and assess the energy saving potential, and an economic analysis was conducted to identify energy conservation opportunities.

Keywords: Conservation, demand side management, load curve, toor dal.

919 Template-Based Object Detection through Partial Shape Matching and Boundary Verification

Authors: Feng Ge, Tiecheng Liu, Song Wang, Joachim Stahl

Abstract:

This paper presents a novel template-based method to detect objects of interest in real images by shape matching. To locate a target object that has a similar shape to a given template boundary, the proposed method integrates three components: contour grouping, partial shape matching, and boundary verification. In the first component, low-level image features, including edges and corners, are grouped into a set of perceptually salient closed contours using an extended ratio-contour algorithm. In the second component, we develop a partial shape matching algorithm to identify the fractions of detected contours that partly match given template boundaries. Specifically, we represent template boundaries and detected contours using landmarks, and apply a greedy algorithm to search for matched landmark subsequences. For each matched fraction between a template and a detected contour, we estimate an affine transform that maps the whole template onto a hypothesized boundary. In the third component, we provide an efficient algorithm based on oriented edge lists to determine the target boundary from the hypothesized boundaries by checking each of them against image edges. We evaluate the proposed method on recognizing and localizing 12 leaf templates in a data set of real images with cluttered backgrounds, illumination variations, occlusions, and image noise. The experiments demonstrate the high performance of our proposed method.
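
One step of the pipeline lends itself to a compact illustration: estimating the affine transform that maps matched template landmarks onto contour landmarks. The sketch below assumes the greedy matching has already produced the correspondences and solves for the transform by least squares; the names and sample points are illustrative.

```python
# A minimal sketch of least-squares affine estimation from matched landmarks,
# the step used to turn a partial match into a hypothesized full boundary.
import numpy as np

def estimate_affine(src, dst):
    """Least-squares 2x3 affine transform A such that dst ~ A @ [src; 1]."""
    n = len(src)
    X = np.hstack([src, np.ones((n, 1))])     # n x 3 homogeneous source points
    P, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return P.T                                # 2 x 3

src = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], float)
dst = src @ np.array([[2, 0], [0, 1.5]]) + np.array([3, -1])  # known transform
A = estimate_affine(src, dst)
# Apply A to all template landmarks to obtain the hypothesized boundary
hyp = np.hstack([src, np.ones((len(src), 1))]) @ A.T
```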

Keywords: Object detection, shape matching, contour grouping.

918 The Wider Benefits of Negotiations: Austrian Perspective on Educational Leadership as a ‘Power Game’ for Trade Unions

Authors: Rudolf Egger

Abstract:

This paper explores the relationships between the basic learning processes of leading trade union workers and their methods for coping with the changes in the life-courses of societies today. It discusses the fragile discourse on lifelong learning in trade unions and the “production of self-techniques” used to come to terms with new economic forms. On the basis of an empirical project, different processes of socialization of leading trade union workers are analysed to discover the consequences of the lifelong learning discourse. The results show what competences they need to develop for the “wider benefits of negotiations”. The main challenge remains to make visible how deeply intertwined trade union learning and education are with developments in an ongoing, dynamic economic process, rather than a quick-fix injection of skills and information. A complex relationship exists between the three ‘partners’: work, learning and society. The author suggests that contemporary trade unions could be trendsetters who set their own learning agendas by drawing less on formal education and more on informal and non-formal learning contexts. This runs in parallel with a growing political and scientific consciousness of the need to arrive at new educational/vocational policies and practices.

Keywords: Lifelong learning, Trade unions, Non-formal learning, Educational/vocational policies.

917 Towards an Enhanced Stochastic Simulation Model for Risk Analysis in Highway Construction

Authors: Anshu Manik, William G. Buttlar, Kasthurirangan Gopalakrishnan

Abstract:

Over the years, there has been a growing trend towards quality-based specifications in highway construction. In many Quality Control/Quality Assurance (QC/QA) specifications, the contractor is primarily responsible for quality control of the process, whereas the highway agency is responsible for acceptance testing of the product. A cooperative investigation was conducted in Illinois over several years to develop a prototype End-Result Specification (ERS) for asphalt pavement construction. The final characteristics of the product are stipulated in the ERS, and the contractor is given considerable freedom in achieving those characteristics. The risk for the contractor or agency depends on how the acceptance limits and processes are specified. Stochastic simulation models are very useful in estimating and analyzing payment risk in ERS systems, and they form an integral part of Illinois's prototype ERS system. This paper describes the development of an innovative methodology to estimate the variability components in in-situ density, air voids, and asphalt content data from ERS projects. The information gained is crucial for simulating these ERS projects to estimate and analyze the payment risks associated with asphalt pavement construction. However, these methods require at least two parties to conduct tests on all the split samples obtained according to the sampling scheme prescribed in the present ERS implemented in Illinois.
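
To make the simulation idea concrete, here is a minimal Monte Carlo sketch of payment-risk estimation under an ERS-style acceptance scheme. The variance components, acceptance limits, and pay schedule below are hypothetical placeholders, not the Illinois specification values.

```python
# A minimal Monte Carlo sketch: lot-to-lot process variability plus testing
# variability produce measured air voids, which drive a hypothetical pay factor.
import numpy as np

rng = np.random.default_rng(0)
target_voids, sigma_process, sigma_test = 4.0, 0.6, 0.3   # percent air voids
lower, upper = 3.0, 5.0                                   # acceptance limits
n_lots, n_samples = 100_000, 5                            # samples per lot

true_lot = rng.normal(target_voids, sigma_process, (n_lots, 1))
measured = true_lot + rng.normal(0.0, sigma_test, (n_lots, n_samples))
within = ((measured > lower) & (measured < upper)).mean(axis=1)

# Hypothetical pay schedule: full pay if all samples pass, scaled otherwise
pay_factor = np.where(within == 1.0, 1.0, 0.7 + 0.3 * within)
print(f"expected pay factor: {pay_factor.mean():.3f}")
print(f"P(reduced pay):      {(pay_factor < 1.0).mean():.3f}")
```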

Keywords: Asphalt Pavement, Risk Analysis, Stochastic Simulation, QC/QA.

916 On Algebraic Structure of Improved Gauss-Seidel Iteration

Authors: O. M. Bamigbola, A. A. Ibrahim

Abstract:

Analysis of real-life problems often results in linear systems of equations for which solutions are sought. The method to employ depends, to some extent, on the properties of the coefficient matrix. It is not always feasible to solve linear systems of equations by direct methods; as such, the need to use an iterative method becomes imperative. Before an iterative method can be employed to solve a linear system of equations, there must be a guarantee that the process of solution will converge. This guarantee, which must be determined a priori, involves the use of some criterion expressible in terms of the entries of the coefficient matrix. It is, therefore, logical that the convergence criterion should depend implicitly on the algebraic structure of such a method. In deference to this view, however, the common practice is to conduct convergence analysis for Gauss-Seidel iteration using a criterion formulated from the algebraic structure of Jacobi iteration. To remedy this anomaly, the Gauss-Seidel iteration was studied for its algebraic structure and, contrary to the usual assumption, it was discovered that the iteration matrix of the Gauss-Seidel method is diagonally dominant only in its first row, while the other rows do not satisfy diagonal dominance. With the aid of this structure, we fashion an improved version of Gauss-Seidel iteration with the prospect of enhancing the convergence and robustness of the method. A numerical section is included to demonstrate the validity of the theoretical results obtained for the improved Gauss-Seidel method.
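
For reference, a minimal Python implementation of the standard Gauss-Seidel iteration that the paper takes as its starting point is sketched below; the authors' improved splitting is not reproduced here.

```python
# A minimal sketch of classical Gauss-Seidel: each component update reuses
# the components already updated in the current sweep.
import numpy as np

def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=500):
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # already-updated components x[:i], previous sweep for the rest
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

A = np.array([[4.0, 1, 1], [1, 5, 2], [1, 2, 6]])  # diagonally dominant
b = np.array([6.0, 8, 9])
print(gauss_seidel(A, b))   # agrees with np.linalg.solve(A, b)
```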

Keywords: Linear system of equations, Gauss-Seidel iteration, algebraic structure, convergence.

915 Quality Management in Spice Paprika Production as a Synergy of Internal and External Quality Measures

Authors: É. Kónya, E. Szabó, I. Bata-Vidács, T. Deák, M. Ottucsák, N. Adányi, A. Székács

Abstract:

Spice paprika is a major spice commodity in the European Union (EU), produced locally and imported from non-EU countries, and it has been reported not only for chemical and microbiological contamination but also for fraud. The effective interaction between producers’ quality management practices and government and EU activities is described using the example of spice paprika production and control in Hungary, a leading producer and per capita consumer of spice paprika in Europe. To demonstrate the importance of various contamination factors in the Hungarian production and EU trade of spice paprika, several food safety aspects of this commodity are presented. Alerts in the Rapid Alert System for Food and Feed (RASFF) of the EU between 2005 and 2013, as well as Hungarian state inspection results on spice paprika in 2004, are discussed, and quality non-compliance claims regarding spice paprika among EU member states are summarized by means of network analysis. Quality assurance measures established along the spice paprika production technology chain at the leading Hungarian spice paprika manufacturer, Kalocsai Fűszerpaprika Zrt., are surveyed, with the main critical control points identified. The structure and operation of the Hungarian state food safety inspection system are described. The concerted performance of the latter two quality management systems illustrates the effective interaction between internal (manufacturer) and external (state) quality control measures.

Keywords: Spice paprika, quality control, reporting mechanisms, RASFF, vulnerable points, HACCP, BRC Global Standard.

914 A Large Ion Collider Experiment (ALICE) Diffractive Detector Control System for RUN-II at the Large Hadron Collider

Authors: J. C. Cabanillas-Noris, M. I. Martínez-Hernández, I. León-Monzón

Abstract:

The selection of diffractive events in the ALICE experiment during the first data-taking period (RUN-I) of the Large Hadron Collider (LHC) was limited by the range over which rapidity gaps occur. Better measurements would be possible by expanding the range in which the production of particles can be detected. For this purpose, the ALICE Diffractive (AD0) detector has been installed and commissioned for the second phase (RUN-II). Any new detector should be able to take data synchronously with all other detectors and be operated through the ALICE central systems. One of the key elements that must be developed for the AD0 detector is the Detector Control System (DCS). The DCS must be designed to operate the detector safely and correctly. Furthermore, the DCS must provide optimum operating conditions for the acquisition and storage of physics data and ensure these are of the highest quality. The operation of AD0 implies the configuration of about 200 parameters, from electronics settings and power supply levels to the archiving of operating-condition data and the generation of safety alerts. It also includes the automation of procedures to get the AD0 detector ready for taking data under the appropriate conditions for the different run types in ALICE. The performance of the AD0 detector depends on a number of parameters, such as the nominal voltage of each photomultiplier tube (PMT), the threshold levels for accepting or rejecting incoming pulses, the definition of triggers, etc. All these parameters define the efficiency of AD0, and they have to be monitored and controlled through the AD0 DCS. Finally, the AD0 DCS provides the operator with multiple interfaces to execute these tasks, realized as operating panels and scripts running in the background. These features are implemented on a SCADA software platform as a distributed control system integrated into the global control system of the ALICE experiment.

Keywords: AD0, ALICE, DCS, LHC.

913 Investigating the Effect of Uncertainty on an LP Model of a Petrochemical Complex: Stability Analysis Approach

Authors: Abdallah Al-Shammari

Abstract:

This study discusses the effect of uncertainty on the production levels of a petrochemical complex. Uncertainty, i.e., variations in some model parameters such as prices and the supply and demand of materials, can affect the optimality or efficiency of any chemical process. For a petrochemical complex with many plants, there are many sources of uncertainty and frequent variations that require attention. Many optimization approaches have been proposed in the literature to incorporate uncertainty within the model in order to obtain a robust solution. In this work, a stability analysis approach is applied to a deterministic LP model of a petrochemical complex consisting of ten plants to investigate the effect of such variations on the obtained optimal production levels. The proposed approach can determine the allowable variation ranges of some parameters, mainly objective or RHS coefficients, before the system loses its optimality. Parameters with relatively narrow ranges of variation, i.e., tight stability limits, are classified as sensitive parameters or constraints that need accurate estimates or intensive monitoring. These stability limits offer easy-to-use information to the decision maker and help in understanding the interaction between model parameters and in deciding when the system needs to be re-optimized. The study shows that the maximum production of ethylene and the prices of intermediate products are the most sensitive factors affecting the stability of the optimal solution.
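
A minimal sketch of the ranging idea on a toy two-product LP follows, using SciPy's linprog: one objective coefficient (a price) is perturbed until the optimal production mix changes, which locates its stability limit. The model data are hypothetical placeholders, not the ten-plant complex of the paper.

```python
# A minimal sketch of objective-coefficient ranging by perturbation.
import numpy as np
from scipy.optimize import linprog

A_ub = np.array([[2.0, 1.0], [1.0, 3.0]])   # resource usage per product
b_ub = np.array([100.0, 90.0])              # resource availability
prices = np.array([5.0, 4.0])               # product prices (objective)

def optimal_mix(prices):
    # linprog minimizes, so negate prices to maximize revenue
    res = linprog(-prices, A_ub=A_ub, b_ub=b_ub, method="highs")
    return res.x

base = optimal_mix(prices)
for delta in np.arange(0.0, 5.0, 0.25):     # perturb the price of product 1
    p = prices.copy()
    p[0] -= delta
    if not np.allclose(optimal_mix(p), base, atol=1e-6):
        print(f"optimal mix changes once price 1 drops by ~{delta}")
        break
```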

Keywords: Linear programming, petrochemicals, stability analysis, uncertainty.

912 The Effect of Motor Learning Based Computer-Assisted Practice for Children with Handwriting Deficit – Comparing with the Effect of Traditional Sensorimotor Approach

Authors: Shao-Hsia Chang, Nan-Ying Yu

Abstract:

The objective of this study was to test how advanced digital technology enables more effective training of handwriting for children with handwriting deficit. This study implemented graphomotor apparatus in a computer-assisted instruction system, and experiments verifying the intervention effect were conducted as a randomized controlled trial. Forty-two children with handwriting deficit were assigned to a computer-assisted instruction group, a sensorimotor training group, or a control (no intervention) group. Handwriting performance was measured using the elementary reading/writing test and a computerized handwriting evaluation before and after 6 weeks of intervention. Analysis of variance on change scores was conducted to test for statistically significant differences across the three groups. Significant differences were found, with the computer group differing significantly from the other two groups on the near-point copy, far-point copy, dictation, and writing-from-phonetic-symbols tests. Writing speed and mean stroke velocity in near-point, far-point, and short-paragraph copying also differed significantly among the three groups, with the computer group showing significant improvement over the others. For clinicians and school teachers, the results of this study provide a motor-control-based insight into the improvement of handwriting difficulties.

Keywords: Dysgraphia, computerized handwriting evaluation, sensorimotor program, computer assisted program.

911 A New Approach for Image Segmentation using Pillar-Kmeans Algorithm

Authors: Ali Ridho Barakbah, Yasushi Kiyoki

Abstract:

This paper presents a new approach to image segmentation by applying the Pillar K-means algorithm. The segmentation process includes a new mechanism for clustering the elements of high-resolution images in order to improve precision and reduce computation time. The system applies K-means clustering to image segmentation after optimization by the Pillar algorithm. The Pillar algorithm considers the placement of pillars, which should be located as far apart as possible to withstand the pressure distribution of a roof, as analogous to the placement of centroids within the data distribution. This algorithm is able to optimize K-means clustering for image segmentation with respect to precision and computation time. It designates the initial centroid positions by calculating the accumulated distance metric between each data point and all previous centroids, and then selects the data point with the maximum distance as a new initial centroid, so that all initial centroids are distributed according to the maximum accumulated distance metric. This paper evaluates the proposed approach to image segmentation by comparing it with the K-means and Gaussian Mixture Model algorithms across the RGB, HSV, HSL and CIELAB color spaces. The experimental results confirm the effectiveness of our approach in improving segmentation quality in terms of precision and computational time.
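
The centroid-seeding rule described above can be sketched compactly. The following Python sketch implements only the accumulated-distance seeding; the outlier handling of the full Pillar algorithm is omitted, and the data shown are placeholder pixel values.

```python
# A minimal sketch of Pillar-style seeding: each new initial centroid is the
# point with the maximum accumulated distance to all previously chosen ones.
import numpy as np

def pillar_seeds(X, k):
    # start from the point farthest from the grand mean
    acc = np.linalg.norm(X - X.mean(axis=0), axis=1)
    seeds = [X[np.argmax(acc)]]
    acc = np.zeros(len(X))
    for _ in range(k - 1):
        acc += np.linalg.norm(X - seeds[-1], axis=1)  # accumulate distances
        seeds.append(X[np.argmax(acc)])
    return np.array(seeds)

X = np.random.default_rng(1).random((500, 3))   # e.g. RGB pixels in [0, 1]
init = pillar_seeds(X, k=4)                     # pass to K-means as the init
```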

Keywords: Image segmentation, K-means clustering, Pillar algorithm, color spaces.

910 System for Monitoring Marine Turtles Using Unstructured Supplementary Service Data

Authors: Luís Pina

Abstract:

The conservation of marine biodiversity keeps ecosystems in balance and ensures the sustainable use of resources. In this context, technological resources have been used for monitoring marine species to allow biologists to obtain data in real time. Different mobile applications have been developed for data collection for monitoring purposes, but these systems are designed to be used only on third-generation (3G) phones or smartphones with Internet access, and in rural parts of developing countries Internet services and smartphones are scarce. Thus, the objective of this work is to develop a system to monitor marine turtles using Unstructured Supplementary Service Data (USSD), which users can access through basic mobile phones. The system aims to improve the data collection mechanism and enhance the effectiveness of current systems in monitoring sea turtles using any type of mobile device without Internet access. The system will be able to report information related to the biological activities of marine turtles. It will also serve as a platform to help marine conservation entities receive reports of illegal sales of sea turtles. The system can further be used as an educational tool for communities, providing knowledge and allowing the inclusion of communities in the process of monitoring marine turtles. Therefore, this work may contribute information to decision-making and to the implementation of contingency plans for marine conservation programs.

Keywords: GSM, marine biology, marine turtles, USSD.

909 Brief Review of the Self-Tightening, Left-Handed Thread

Authors: Robert S. Giachetti, Emanuele Grossi

Abstract:

Loosening of bolted joints in rotating machines can adversely affect their performance, cause mechanical damage, and lead to injuries. In this paper, two potential loosening phenomena in rotating applications are discussed. The first, ‘precession,’ is governed by thread/nut contact forces, while the second is based on inertial effects of the fastened assembly. These mechanisms are reviewed within the context of the historical usage of left-handed fasteners in rotating machines, which appears absent from the literature and common machine design texts. Historically, to prevent loosening of wheel nuts, vehicle manufacturers used right-handed and left-handed threads on different sides of the vehicle, but most modern vehicles have abandoned this custom and use only right-handed, tapered lug nuts on all sides. Other classical machines such as the bicycle continue to use differently handed threads on each side, while machines such as bench grinders, circular saws and brush cutters still use left-handed threads to fasten rotating components. Despite the continued use of left-handed fasteners, the rationale and analysis of left-handed threads to mitigate self-loosening of fasteners in rotating applications is rarely, if at all, discussed in the literature or in design textbooks. Without scientific literature to support these design selections, these implementations may be the result of experimental findings or aged institutional knowledge. Based on a review of rotating applications, historical documents and mechanical design references, a formal study of the paradoxical nature of left-handed threads in various applications is merited.

Keywords: Rotating machinery, self-loosening fasteners, wheel fastening, vibration loosening.

908 Aging Evaluation of Ammonium Perchlorate/Hydroxyl Terminated Polybutadiene-Based Solid Rocket Engine by Reactive Molecular Dynamics Simulation and Thermal Analysis

Authors: R. F. B. Gonçalves, E. N. Iwama, J. A. F. F. Rocco, K. Iha

Abstract:

Propellants based on Hydroxyl Terminated Polybutadiene/Ammonium Perchlorate (HTPB/AP) are the most commonly used in the rocket engines of the Brazilian Armed Forces. This work addressed the possibility of extending their useful life (currently 10 years) by performing chemical-kinetic analyses of the energetic material via Differential Scanning Calorimetry (DSC) and by computer simulation of the aging process using the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) software. Thermal analysis via DSC was performed in triplicate at three heating rates (5 ºC/min, 10 ºC/min, and 15 ºC/min) on propellant from a rocket motor with an 11-year shelf life, and the activation energy was obtained from the Arrhenius equation using the Ozawa and Kissinger kinetic methods, allowing comparison with data from the manufacturing period (standard motor). In addition, the kinetic parameters of the internal pressure of the combustion chamber in eight rocket engines with 11 years of shelf life were acquired, for comparison with the engine start-up data.
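
The Kissinger evaluation reduces to a linear regression: ln(β/Tp²) is plotted against 1/Tp, and the slope equals -Ea/R. A minimal sketch with hypothetical DSC peak temperatures (not the paper's measured values) follows.

```python
# A minimal sketch of the Kissinger method: regress ln(beta/Tp^2) on 1/Tp;
# the slope gives -Ea/R. Peak temperatures below are hypothetical.
import numpy as np

R = 8.314                                   # J/(mol K)
beta = np.array([5.0, 10.0, 15.0]) / 60.0   # heating rates, K/s
Tp = np.array([600.0, 610.0, 617.0])        # DSC peak temperatures, K

slope, _ = np.polyfit(1.0 / Tp, np.log(beta / Tp**2), 1)
Ea = -slope * R
print(f"Kissinger activation energy: {Ea / 1000:.1f} kJ/mol")
```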

Keywords: Shelf-life, thermal analysis, Ozawa method, Kissinger method, LAMMPS software, thrust.

907 RV-YOLOX: Object Detection on Inland Waterways Based on Optimized YOLOX through Fusion of Vision and 3+1D Millimeter Wave Radar

Authors: Zixian Zhang, Shanliang Yao, Zile Huang, Zhaodong Wu, Xiaohui Zhu, Yong Yue, Jieming Ma

Abstract:

Unmanned Surface Vehicles (USVs) hold significant value for their capacity to undertake hazardous and labor-intensive operations over aquatic environments, and object detection is a key task in these applications. Nonetheless, the efficacy of USVs in object detection is impeded by several intrinsic challenges, including the intricate dispersal of obstacles, reflections from coastal structures, and fog over water surfaces, among others. To address these problems, this paper provides a fusion method for USVs to effectively detect objects in inland surface environments, utilizing vision sensors and 3+1D millimeter-wave (MMW) radar. The MMW radar is a complementary tool to vision sensors, offering reliable environmental data. The approach converts the radar’s 3D point cloud into a 2D radar pseudo-image, thereby standardizing the format of radar and vision data by leveraging a point transformer. Furthermore, this paper proposes a multi-source object detection network, named RV-YOLOX, which leverages radar-vision integration specifically tailored to inland waterway environments. The performance is evaluated on our self-recorded waterways dataset. Compared with the YOLOX network, our fusion network significantly improves detection accuracy, especially for objects under poor lighting conditions.
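
The point-cloud-to-pseudo-image conversion can be sketched as a pinhole projection with per-pixel radar channels. The camera intrinsics, channel layout, and frame convention below are illustrative assumptions; the point-transformer encoding used in RV-YOLOX is not reproduced.

```python
# A minimal sketch: project radar points (x, y, z, Doppler velocity, RCS)
# into an image plane, filling depth, velocity, and intensity channels.
import numpy as np

H, W = 480, 640
K = np.array([[500.0, 0, W / 2],            # hypothetical camera intrinsics
              [0, 500.0, H / 2],
              [0, 0, 1]])

def radar_to_pseudo_image(points):
    """points: (N, 5) array of x, y, z (camera frame, z forward), velocity, RCS."""
    img = np.zeros((H, W, 3), dtype=np.float32)   # channels: depth, velocity, RCS
    xyz, feat = points[:, :3], points[:, 3:]
    uvw = (K @ xyz.T).T
    u = (uvw[:, 0] / uvw[:, 2]).astype(int)
    v = (uvw[:, 1] / uvw[:, 2]).astype(int)
    ok = (xyz[:, 2] > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    img[v[ok], u[ok], 0] = xyz[ok, 2]             # depth channel
    img[v[ok], u[ok], 1:] = feat[ok]              # velocity and RCS channels
    return img

rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-5, 5, 100),   # x
                       rng.uniform(-1, 1, 100),   # y
                       rng.uniform(2, 40, 100),   # z (forward depth)
                       rng.uniform(-3, 3, 100),   # Doppler velocity
                       rng.uniform(0, 1, 100)])   # RCS intensity
pseudo = radar_to_pseudo_image(pts)
```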

Keywords: Inland waterways, object detection, YOLO, sensor fusion, self-attention, deep learning.

906 Comparison between Separable and Irreducible Goppa Code in McEliece Cryptosystem

Authors: Thuraya M. Qaradaghi, Newroz N. Abdulrazaq

Abstract:

The McEliece cryptosystem is an asymmetric type of cryptography based on error-correcting codes. The classical McEliece scheme uses an irreducible binary Goppa code, which is considered unbreakable to date, especially with parameters [1024, 524, 101]; however, it suffers from a large public key matrix, which makes it difficult to use in practice. In this work, irreducible and separable Goppa codes are introduced, with flexible parameters and dynamic error vectors, and a comparison between separable and irreducible Goppa codes in the McEliece cryptosystem is carried out. For the encryption stage, to obtain a fair comparison, two types of test were chosen: in the first, the random message is held constant while the Goppa code parameters are changed; in the second, the Goppa code parameters are held constant (m=8 and t=10) while the random message is varied. The results show that the time needed to calculate the parity check matrix is higher for the separable than for the irreducible McEliece cryptosystem, which is expected, since an extra parity check matrix for g²(z) must be calculated in the decryption process for the separable type; on the other hand, the time needed to execute the error locator in the decryption stage is better for the separable type than for the irreducible type. The proposed implementation was done in C# using Visual Studio.
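
For orientation, McEliece encryption itself is a one-liner over GF(2): c = mG' + e with a weight-t error vector. The sketch below uses a random binary matrix as a stand-in for the scrambled Goppa generator matrix, so it illustrates the data flow only, not a secure instance, and decryption (which needs the Goppa decoder) is not shown.

```python
# A minimal sketch of McEliece encryption over GF(2). The random G_pub is a
# placeholder for the real public matrix G' = SGP derived from a Goppa code
# with parameters such as [1024, 524] and t = 50; these toy sizes are insecure.
import numpy as np

rng = np.random.default_rng(0)
n, k, t = 32, 16, 2                     # toy parameters, for illustration only
G_pub = rng.integers(0, 2, (k, n))      # stand-in public generator matrix

def encrypt(message_bits, G_pub, t):
    e = np.zeros(G_pub.shape[1], dtype=int)
    e[rng.choice(G_pub.shape[1], size=t, replace=False)] = 1  # weight-t error
    return (message_bits @ G_pub + e) % 2

m = rng.integers(0, 2, k)
c = encrypt(m, G_pub, t)                # ciphertext = codeword + random errors
```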

Keywords: McEliece cryptosystem, Goppa code, separable, irreducible.

905 An Efficient Backward Semi-Lagrangian Scheme for Nonlinear Advection-Diffusion Equation

Authors: Soyoon Bak, Sunyoung Bu, Philsu Kim

Abstract:

In this paper, a backward semi-Lagrangian scheme combined with the second-order backward difference formula (BDF2) is designed to compute numerical solutions of nonlinear advection-diffusion equations. The primary aims of this paper are to remove any iteration process and to obtain an efficient algorithm with second-order convergence in time. To achieve these objectives, we use the second-order central finite difference to approximate the diffusion term, and B-spline approximations of degree 2 and 3 for the spatial discretization. For the temporal discretization, the second-order backward difference formula is applied. To compute the starting points of the characteristic curves, we use the error-correction methodology recently developed by the authors. The proposed algorithm turns out to be completely iteration-free, which resolves the main weakness of the conventional backward semi-Lagrangian method. The adaptability of the proposed method is demonstrated by numerical simulations of Burgers’ equations. Throughout these simulations, the numerical results are in good agreement with the analytic solutions, and the present scheme offers better accuracy than other existing numerical schemes.
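
The backward semi-Lagrangian idea, tracing characteristics back to departure points and interpolating there, can be shown on the simplest possible case. The sketch below treats 1D linear advection with cubic-spline interpolation; the paper's BDF2 treatment of the nonlinear Burgers case and its iteration-free error-correction start-up are not reproduced.

```python
# A minimal sketch of one backward semi-Lagrangian step for u_t + a u_x = 0:
# trace each grid point back along its characteristic and interpolate there.
import numpy as np
from scipy.interpolate import CubicSpline

a, dt = 1.0, 0.02
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
u = np.sin(x)

for _ in range(100):
    x_dep = (x - a * dt) % (2 * np.pi)        # departure points of characteristics
    spline = CubicSpline(np.append(x, 2 * np.pi),   # periodic extension
                         np.append(u, u[0]),
                         bc_type="periodic")
    u = spline(x_dep)                         # interpolate at departure points
```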

Keywords: Semi-Lagrangian method, Iteration free method, Nonlinear advection-diffusion equation.

904 Maya Semantic Technique: A Mathematical Technique Used to Determine Partial Semantics for Declarative Sentences

Authors: Marcia T. Mitchell

Abstract:

This research uses computational linguistics, an area of study that employs a computer to process natural language, and aims at discerning the patterns that exist in declarative sentences used in technical texts. The approach is mathematical, and the focus is on instructional texts found on web pages. The technique developed by the author, named the MAYA Semantic Technique, is organized into four stages. In the first stage, the parts of speech in each sentence are identified. In the second stage, the subject of the sentence is determined. In the third stage, MAYA performs a frequency analysis on the remaining words to determine the verb and its object. In the fourth stage, MAYA performs a statistical analysis to determine the content of the web page. The advantage of the MAYA Semantic Technique lies in its use of mathematical principles to represent grammatical operations, which aids processing and accuracy when performed on unambiguous text. The MAYA Semantic Technique is part of a proposed architecture for an entire web-based intelligent tutoring system. On a sample set of sentences, partial semantics derived using the MAYA Semantic Technique were approximately 80% accurate. The system currently processes technical text in one domain, namely Cµ programming, in which all the keywords and programming concepts are known and understood.

Keywords: Natural language understanding, computational linguistics, knowledge representation, linguistic theories.

903 Methane Production from Biomedical Waste (Blood)

Authors: Fatima M. Kabbashi, Abdalla M. Abdalla, Hussam K. Hamad, Elias S. Hassan

Abstract:

This study investigates the production of renewable energy (biogas) from hazardous biomedical waste (blood) and its eco-friendly disposal. Biogas is produced by the bacterial anaerobic digestion of the biomaterial (blood). During the digestion process, bacterial feeding breaks down the chemical bonds of the biomaterial and changes its features; by the end of digestion (biogas production), the remains become manure. This enables the economical and eco-friendly disposal of hazardous biomedical waste (blood). The samples (whole blood, red blood cells ‘RBCs’, blood platelets and fresh frozen plasma ‘FFP’) were collected, measured in terms of carbon-to-nitrogen (C/N) ratio and total solids, and then fed into connected flasks (three flasks) using the water displacement method. The trial results showed that the platelet and FFP samples failed to produce flammable gas, although a gas analyzer showed the presence of the following gases: CO, HC, CO₂, and NOX. The blood and RBC samples, by contrast, produced flammable gases: methane-nitrous CH₃NO (99.45%), which burns with a blue flame, and carbon dioxide CO₂ (0.55%), which burns with a red/yellow flame. Methane-nitrous is sometimes used as a fuel for rockets, some aircraft, and racing cars.

Keywords: Renewable energy, biogas, biomedical waste, blood, anaerobic digestion, eco-friendly disposal.

902 Validation of SWAT Model for Prediction of Water Yield and Water Balance: Case Study of Upstream Catchment of Jebba Dam in Nigeria

Authors: Adeniyi G. Adeogun, Bolaji F. Sule, Adebayo W. Salami, Michael O. Daramola

Abstract:

Estimation of water yield and water balance in a river catchment is critical to the sustainable management of water resources at the watershed level in any country. Therefore, in the present study, the Soil and Water Assessment Tool (SWAT) interfaced with a Geographical Information System (GIS) was applied to predict the water balance and water yield of a catchment area in Nigeria. The catchment area, which covers 12,992 km², is located upstream of the Jebba hydropower dam in the north-central part of Nigeria. In this study, observed flow data were collected and compared with flows simulated by SWAT. The agreement between the two data sets was evaluated using statistical measures such as the Nash-Sutcliffe Efficiency (NSE) and the coefficient of determination (R²). The model output shows good agreement between observed and simulated flow, as indicated by NSE and R² values greater than 0.7 for both the calibration and validation periods. The calibrated model predicted a total of 42,733 mm of water as the water yield potential of the basin for the simulation period from 1985 to 2010. This performance suggests that SWAT could be a promising tool for predicting water balance and water yield in the sustainable management of water resources, and that it could be applied to other basins in Nigeria as a decision support tool for sustainable water management.
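
The two agreement measures are simple to state in code. A minimal sketch with hypothetical flow values follows; the study used values above 0.7 for both NSE and R² to judge calibration and validation.

```python
# A minimal sketch of the goodness-of-fit measures used to compare observed
# and SWAT-simulated flows. The flow values below are hypothetical.
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is perfect, below 0 is worse than the mean."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    """Coefficient of determination as the squared Pearson correlation."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

obs = np.array([120.0, 95.0, 180.0, 210.0, 160.0])   # observed flows
sim = np.array([115.0, 100.0, 170.0, 220.0, 150.0])  # simulated flows
print(nse(obs, sim), r_squared(obs, sim))
```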

Keywords: GIS, Modeling, Sensitivity Analysis, SWAT, Water Yield, Watershed level.

901 Statistical Feature Extraction Method for Wood Species Recognition System

Authors: Mohd Iz'aan Paiz Bin Zamri, Anis Salwa Mohd Khairuddin, Norrima Mokhtar, Rubiyah Yusof

Abstract:

Effective statistical feature extraction and classification are important in image-based automatic inspection and analysis. An automatic wood species recognition system is designed to perform wood inspection at customs checkpoints to avoid the mislabeling of timber, which results in loss of income to the timber industry. The system focuses on analyzing the statistical pore properties of wood images. This paper proposes a fuzzy-based feature extractor which mimics experts’ knowledge of wood texture to extract the pore distribution properties from the wood surface texture. The proposed feature extractor consists of two steps, namely pore extraction and fuzzy pore management. The total number of statistical features extracted from each wood image is 38. A backpropagation neural network is then used to classify the wood species based on these statistical features. A comprehensive set of experiments on a database composed of 5200 macroscopic images from 52 tropical wood species was used to evaluate the performance of the proposed feature extractor. The advantage of the proposed feature extraction technique is that it mimics the experts’ interpretation of wood texture, which allows human involvement when analyzing the wood texture. Experimental results show the efficiency of the proposed method.

Keywords: Classification, fuzzy, inspection system, image analysis.

900 Optimal Efficiency Control of Pulse Width Modulation Inverter-Fed Motor Pump Drive Using Neural Network

Authors: O. S. Ebrahim, M. A. Badr, A. S. Elgendy, K. O. Shawky, P. K. Jain

Abstract:

This paper demonstrates an improved Loss Model Control (LMC) for a three-phase induction motor (IM) driving a pump load. Compared with other power loss reduction algorithms for IMs, the presented one has the advantages of fast and smooth flux adaptation, high accuracy, and versatile implementation. The performance of LMC depends mainly on the accuracy of modeling the motor drive and its losses. A loss model for the IM drive has been developed that accounts for the surplus power loss caused by inverter voltage harmonics using closed-form equations and also includes magnetic saturation. Further, an Artificial Neural Network (ANN) controller is synthesized and trained offline to determine the optimal flux level that achieves maximum drive efficiency. The drive’s voltage and speed control loops are connected via the stator frequency to avoid the possibility of excessive magnetization. Besides, the resistance change due to temperature is considered by a first-order thermal model, and the obtained thermal information enhances motor protection and control. Together, these features have the potential to make the proposed algorithm reliable. Simulation and experimental studies were performed on a 5.5 kW test motor using the proposed control method. The test results are provided and compared with fixed-flux operation to validate the effectiveness.

Keywords: Artificial neural network, ANN, efficiency optimization, induction motor, IM, Pulse Width Modulated, PWM, harmonic losses.

899 How Does Psychoanalysis Help in Reconstructing Political Thought? An Exercise of Interpretation

Authors: Subramaniam Chandran

Abstract:

The significance of psychology in studying politics is embedded in philosophical issues as well as behavioural pursuits. The former is often associated with Sigmund Freud and his followers; the latter is inspired by the writings of Harold Lasswell. Political psychology, or psychopolitics, has made its own impression on political thought ever since it began deciphering the concepts of human nature and political propaganda. More importantly, psychoanalysis views political thought as textual content from which the latent must be explored beneath the manifest content. In other words, it reads the text symptomatically and interprets the hidden truth. This paper explains the paradigm of dream interpretation applied by Freud. The dream work is a process comprising four successive activities: condensation, displacement, representation and secondary revision. Texts dealing with political thought can also be interpreted on these principles. Freud’s method of dream interpretation draws on the hermeneutic model of philological research. It provides a theoretical perspective and technical rules for the interpretation of symbolic structures. The task of interpretation remains a discovery of the equivalence of symbols and actions through perpetual analogies. Psychoanalysis can help in studying political thought in two ways: to study text distortion, Freud’s dream interpretation is used as a paradigm exploring the latent text beneath its manifest text; and Freud’s psychoanalytic concepts and theories, ranging from the individual mind to civilization, religion, war and politics, can be applied.

Keywords: Psychoanalysis, political thought, dream interpretation, latent content, manifest content.

898 Participation in IAEA Proficiency Test to Analyse Cobalt, Strontium and Caesium in Seawater Using Direct Counting and Radiochemical Techniques

Authors: S. Visetpotjanakit, C. Khrautongkieo

Abstract:

Radiation monitoring in the environment and foodstuffs is one of the main responsibilities of the Office of Atoms for Peace (OAP), the nuclear regulatory body of Thailand. The main goal of the OAP is to assure the safety of the Thai people and environment from any radiological incident. Various radioanalytical methods have been developed to monitor radiation and radionuclides in environmental and foodstuff samples. To validate our analytical performance, several proficiency test exercises from the International Atomic Energy Agency (IAEA) have been performed. Here, the results of a proficiency test exercise referred to as the Proficiency Test for Tritium, Cobalt, Strontium and Caesium Isotopes in Seawater 2017 (IAEA-RML-2017-01) are presented. All radionuclides except ³H were analysed using various radioanalytical methods: direct gamma-ray counting for determining ⁶⁰Co, ¹³⁴Cs and ¹³⁷Cs, and developed radiochemical techniques for analysing ¹³⁴Cs and ¹³⁷Cs using an AMP pre-concentration technique and ⁹⁰Sr using a di-(2-ethylhexyl) phosphoric acid (HDEHP) liquid extraction technique. The analysis results were submitted to the IAEA. All results passed the IAEA criteria for accuracy, precision and trueness and obtained ‘Accepted’ status, confirming the quality of data from the OAP environmental radiation laboratory for monitoring radiation in the environment.

Keywords: International atomic energy agency, proficiency test, radiation monitoring, seawater.

897 Experimental Study on the Influence of Tool Materials on the Drilling of Thick Stacked Plate of 2219 Aluminum Alloy

Authors: G. H. Li, M. Liu, H. J. Qi, Q. Zhu, W. Z. He

Abstract:

The drilling and riveting processes are widely used in the assembly of carrier rockets, which makes the efficiency and quality of drilling important factors affecting the assembly process. To address the problems encountered in the drilling of thick stacked plates (thickness larger than 10 mm) of carrier rockets, such as drill breakage, loud noise, and burrs, an experimental study of the influence of tool material on drilling was carried out. The cutting force was measured by a piezoelectric dynamometer, the hole diameter was measured with a profile projector, and the burrs were observed and measured with a digital stereo microscope. From these measurements, the effects of tool material on drilling were analyzed in terms of drilling force, hole diameter, and burr formation. The results show that, compared with a carbide drill and a coated carbide drill, the drilling force of high speed steel is larger. However, high speed steel also has advantages: a larger number of holes can be drilled, the burr height is small, the exit is smooth with fewer slim burrs, and the tool wears rather than fractures. Therefore, high speed steel tools are suitable for the drilling of thick stacked plates of 2219 aluminum alloy.

Keywords: 2219 aluminum alloy, thick stacked plate, drilling, tool material.

896 Roundabout Optimal Entry and Circulating Flow Induced by Road Hump

Authors: Amir Hossein Pakshir, A. Hossein Pour, N. Jahandar, Ali Paydar

Abstract:

Roundabouts work on the principle of circulating and entry flows, where the maximum entry flow rate depends largely on the circulating flow, bearing in mind that entry flows must give way to circulating flows. Where an existing roundabout has a road hump installed on the entry arm, it can be hypothesized that the kinematics of vehicles may prevent the entry arm from achieving optimum performance. Road humps are traffic calming devices placed across the road width solely as a speed reduction mechanism. They are the preferred traffic calming option in Malaysia and are often used on single and dual carriageway local routes. The speed limit on local routes is 30 mph (50 km/h). Road humps in their various forms achieved the largest mean speed reduction (based on a mean speed before traffic calming of 30 mph) of up to 10 mph (16 km/h), according to the UK Department of Transport. The underlying aim of reduced speed should be to achieve a ‘safe’ distribution of speeds which reflects the function of the road and the impacts on the local community. Constraining the safe distribution of speeds may lead to poor driver timing and delayed reflex reactions that can cause accidents. Previous studies of road hump impact have focused mainly on speed reduction, traffic volume, noise and vibration, and discomfort and delay from the use of road humps. This paper is aimed at the optimal entry and circulating flow induced by road humps. Results show that roundabout entry and circulating flow perform better where there is no road hump at the entrance.

Keywords: Road hump, Roundabout, Speed Reduction.

895 A Methodology for Automatic Diversification of Document Categories

Authors: Dasom Kim, Chen Liu, Myungsu Lim, Soo-Hyeon Jeon, Byeoung Kug Jeon, Kee-Young Kwahk, Namgyu Kim

Abstract:

Recently, numerous documents including large volumes of unstructured data and text have been created because of the rapid increase in the use of social media and the Internet. Usually, these documents are categorized for the convenience of users. However, the accuracy of manual categorization is not guaranteed, and such categorization requires a large amount of time and incurs huge costs. Many studies on automatic categorization have been conducted to help mitigate the limitations of manual categorization. Unfortunately, most of these methods cannot be applied to categorize complex documents with multiple topics, because they work on the assumption that individual documents can be assigned to single categories only. Therefore, to overcome this limitation, some studies have attempted to categorize each document into multiple categories. However, the learning process employed in these studies involves training on a multi-categorized document set, so these methods cannot be applied to the multi-categorization of most documents unless multi-categorized training sets for traditional multi-categorization algorithms are provided. To overcome this limitation, in this study, we review our novel methodology for extending the category of a single-categorized document to multiple categories, and then introduce a survey-based verification scenario for estimating the accuracy of our automatic categorization methodology.

Keywords: Big Data Analysis, Document Classification, Text Mining, Topic Analysis.
