Search results for: Systems Approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8475

825 A Security Model of Voice Eavesdropping Protection over Digital Networks

Authors: Supachai Tangwongsan, Sathaporn Kassuvan

Abstract:

The purpose of this research is to develop a security model for protecting voice conversations against eavesdropping over digital networks. The proposed model combines an encryption scheme with a personal secret key exchange between the communicating parties, a so-called voice data transformation system, yielding genuinely private conversations. The system operates in two main steps. The first is the exchange of personal secret keys used for data encryption during the conversation. The key owner is free to choose the keys, so it is recommended to exchange a different key with each conversational party and to store each key in the memory provided in the client device. The second step is to set and record a further personal encryption option: encrypting either all frames or only a fraction of them, expressed as a ratio of 1:M. Using different personal secret keys and different 1:M settings for different parties, without any intervention by the service operator, makes it very difficult for an eavesdropper to recover the key used during a conversation, especially within a short period of time, and thus provides effective protection against voice eavesdropping. The results of the implementation indicate that the system performs its function accurately as designed. The proposed system is therefore suitable for voice eavesdropping protection over digital networks without requiring any changes to existing network systems such as mobile phone networks and VoIP.
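
A minimal sketch of the 1:M idea in Python: each party pair shares its own secret key, and only every M-th voice frame is transformed with a keystream derived from that key. The frame size, key derivation, and hash-based XOR keystream are illustrative assumptions, not the authors' actual implementation.

```python
import hashlib
from typing import List

FRAME_SIZE = 160  # bytes per voice frame (assumed, e.g. 20 ms of 8 kHz 8-bit audio)

def keystream(key: bytes, frame_index: int, length: int) -> bytes:
    """Derive a per-frame keystream by hashing the shared key and frame counter."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(
            key + frame_index.to_bytes(8, "big") + counter.to_bytes(4, "big")
        ).digest()
        counter += 1
    return out[:length]

def transform(frames: List[bytes], key: bytes, m: int) -> List[bytes]:
    """Encrypt (or decrypt -- XOR is symmetric) every m-th frame; leave the rest untouched."""
    result = []
    for i, frame in enumerate(frames):
        if i % m == 0:  # the 1:M option: only a fraction of the frames is transformed
            ks = keystream(key, i, len(frame))
            frame = bytes(a ^ b for a, b in zip(frame, ks))
        result.append(frame)
    return result

# Toy usage: a different key and a different 1:M setting per conversational party.
frames = [bytes([i % 256]) * FRAME_SIZE for i in range(10)]
key_for_alice = b"shared-secret-with-alice"
sent = transform(frames, key_for_alice, m=3)
recovered = transform(sent, key_for_alice, m=3)
assert recovered == frames
```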

Keywords: Computer Security, Encryption, Key Exchange, Security Model, Voice Eavesdropping.

824 Detection and Classification of Faults on Parallel Transmission Lines Using Wavelet Transform and Neural Network

Authors: V. S. Kale, S. R. Bhide, P. P. Bedekar, G. V. K. Mohan

Abstract:

The protection of parallel transmission lines has been a challenging task due to mutual coupling between the adjacent circuits of the line. This paper presents a novel scheme for the detection and classification of faults on parallel transmission lines. The proposed approach combines the wavelet transform and a neural network to solve the problem. While the wavelet transform is a powerful mathematical tool that serves as a fast and very effective means of analyzing power system transient signals, an artificial neural network can capture the non-linear relationships between measured signals by recognizing the different patterns they contain. The proposed algorithm consists of a time-frequency analysis of fault-generated transients using the wavelet transform, followed by pattern recognition using an artificial neural network to identify the type of fault. MATLAB/Simulink is used to generate fault signals and verify the correctness of the algorithm. The adaptive discrimination scheme is tested by simulating different types of fault and varying the fault resistance, fault location, and fault inception time on a given power system model. The simulation results show that the proposed scheme for fault diagnosis is able to classify all faults on the parallel transmission line rapidly and correctly.
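
A toy version of the pipeline in Python rather than MATLAB/Simulink: multi-level Haar wavelet detail energies of three-phase currents serve as features, and a nearest-centroid rule stands in for the neural network. The synthetic fault signals and the Haar choice are assumptions for illustration only.

```python
import numpy as np

def haar_dwt(signal: np.ndarray, levels: int = 3):
    """Multi-level Haar DWT; returns the detail coefficients of each level."""
    details = []
    approx = signal.astype(float)
    for _ in range(levels):
        if len(approx) % 2:                       # keep the length even
            approx = approx[:-1]
        pairs = approx.reshape(-1, 2)
        details.append((pairs[:, 0] - pairs[:, 1]) / np.sqrt(2))
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)
    return details

def features(currents: np.ndarray) -> np.ndarray:
    """Detail-energy features per phase: one energy value per decomposition level."""
    return np.array([[np.sum(d ** 2) for d in haar_dwt(phase)] for phase in currents]).ravel()

rng = np.random.default_rng(0)
def simulate(fault_phase):                        # crude stand-in for a simulated fault record
    x = 10 * np.sin(2 * np.pi * 50 * np.linspace(0, 0.04, 256))
    currents = np.tile(x, (3, 1)) + rng.normal(0, 0.2, (3, 256))
    currents[fault_phase, 128:] *= 6              # fault inception at mid-window
    return currents

# Toy training data: fault-type label (faulted phase) -> feature vectors.
train = {label: [features(simulate(label)) for _ in range(20)] for label in (0, 1, 2)}
centroids = {label: np.mean(f, axis=0) for label, f in train.items()}

def classify(currents: np.ndarray) -> int:
    f = features(currents)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))

print(classify(simulate(1)))  # expected: 1 (phase-B fault)
```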

Keywords: Artificial neural network, fault detection and classification, parallel transmission lines, wavelet transform.

823 Surrogate based Evolutionary Algorithm for Design Optimization

Authors: Maumita Bhattacharya

Abstract:

Optimization is often a critical issue in system design problems. Evolutionary algorithms are population-based, stochastic search techniques widely used as efficient global optimizers. However, finding optimal solutions to complex, high-dimensional, multimodal problems often requires highly expensive function evaluations, which can make the search practically prohibitive. The Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model presented in our earlier work [14] reduced computation time through the controlled use of meta-models to partially replace actual function evaluations with approximate ones. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. Situations such as model formation involving variable input dimensions and noisy data certainly cannot be covered by this assumption. In this paper we present an enhanced version of DAFHEA that incorporates a multiple-model based learning approach for the SVM approximator. DAFHEA-II (the enhanced version of the DAFHEA framework) also overcomes the high computational expense involved in the additional clustering required by the original DAFHEA framework. The proposed framework has been tested on several benchmark functions, and the empirical results illustrate the advantages of the proposed technique.
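
A generic surrogate-assisted EA loop (not DAFHEA/DAFHEA-II itself): screen offspring with a cheap approximator trained on previously evaluated points and spend exact, expensive evaluations only on the most promising ones. The inverse-distance surrogate and the test function are illustrative stand-ins for the SVM approximator and a real simulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive(x):                       # stand-in for a costly simulation (sphere function)
    return float(np.sum(x ** 2))

def surrogate_predict(x, X, y):
    """Inverse-distance-weighted prediction from the archive of exact evaluations."""
    d = np.linalg.norm(X - x, axis=1) + 1e-9
    w = 1.0 / d
    return float(np.dot(w, y) / np.sum(w))

dim, pop_size, generations = 5, 30, 40
pop = rng.uniform(-5, 5, (pop_size, dim))
archive_X = pop.copy()
archive_y = np.array([expensive(x) for x in pop])   # initial exact evaluations

for _ in range(generations):
    # variation: Gaussian mutation of the current population
    children = pop + rng.normal(0, 0.3, pop.shape)
    # cheap surrogate screening of all children
    approx = np.array([surrogate_predict(c, archive_X, archive_y) for c in children])
    # only the most promising few get the expensive evaluation (controlled meta-model use)
    promising = np.argsort(approx)[:5]
    exact = np.array([expensive(children[i]) for i in promising])
    archive_X = np.vstack([archive_X, children[promising]])
    archive_y = np.concatenate([archive_y, exact])
    # survivor selection on the mixed (approximate + exact) fitness values
    fitness = approx.copy()
    fitness[promising] = exact
    pop = children[np.argsort(fitness)[:pop_size]]

print("best exact value found:", archive_y.min())
```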

Keywords: Evolutionary algorithm, Fitness function, Optimization, Meta-model, Stochastic method.

822 Ecolabeling and Green Certification for Effective Fisheries Management – An Analysis

Authors: A. Ramachandran

Abstract:

There is growing environmental concern nowadays, and business communities have slowly started to incorporate environmental protection and the sustainable utilization of natural resources into their marketing strategies. This paper discusses the various ecolabeling and certification systems developed worldwide to regulate and introduce fair trade in the ornamental fish industry. Ecolabeling and green certification are part of these strategies, implemented partly under compulsion from national and international regulatory bodies and environmental movements. All the major markets for ornamental fishes, such as the European Union, the USA, and Japan, have started putting restrictions on the trade to impose ecolabeling as a non-tariff barrier, similar to the one imposed on seafood and aquacultured products. A review was carried out of the ecolabeling and green certification schemes available at local, national, and international levels for fisheries, including aquaculture and the ornamental fish trade, to examine the successes and the constraints these schemes faced during their implementation. The primary downside of certification is the multiplicity of ecolabels and the cost incurred by applicants for certification, costs which may in turn be passed on to consumers. The studies reveal serious inadequacies in a number of ecolabels and cast doubt on their overall contribution to effective fisheries management and sustainability. The paper also discusses the initiative taken in India to develop guidelines for the green certification of freshwater ornamental fishes.

Keywords: Ecolabeling in fisheries, Fair trade, Green Certification, Sustainable Ornamental fish trade.

821 A Software Framework for Predicting Oil-Palm Yield from Climate Data

Authors: Mohd. Noor Md. Sap, A. Majid Awan

Abstract:

Intelligent systems based on machine learning techniques, such as classification and clustering, are gaining widespread popularity in real-world applications. This paper presents work on developing a software system for predicting crop yield, for example oil-palm yield, from climate and plantation data. At the core of our system is a method for unsupervised partitioning of data to find spatio-temporal patterns in climate data using kernel methods, which offer the strength to deal with complex data. This work draws on the notion that a non-linear transformation of the data into some high-dimensional feature space increases the possibility of linear separability of the patterns in the transformed space, and therefore simplifies the exploration of the associated structure in the data. Kernel methods implicitly perform a non-linear mapping of the input data into a high-dimensional feature space by replacing the inner products with an appropriate positive definite function. In this paper we present a robust weighted kernel k-means algorithm incorporating spatial constraints for clustering the data. The proposed algorithm can effectively handle noise, outliers, and auto-correlation in spatial data, enabling effective and efficient data analysis by exploring patterns and structures in the data, and can thus be used for predicting oil-palm yield by analyzing the various factors affecting the yield.
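
A minimal numpy sketch of weighted kernel k-means with an RBF kernel: each point is assigned to the cluster whose weighted feature-space centroid is nearest. The paper's spatial-constraint term and its specific weighting scheme are not reproduced here; the uniform per-point weights and toy data are assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

def weighted_kernel_kmeans(K, weights, k=3, iters=50, seed=0):
    """Weighted kernel k-means: assign each point to the nearest cluster centroid
    in feature space, where centroids are weighted means of the cluster members."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, n)
    for _ in range(iters):
        dist = np.zeros((n, k))
        for c in range(k):
            mask = labels == c
            w = weights[mask]
            sw = w.sum()
            if sw == 0:
                dist[:, c] = np.inf
                continue
            # ||phi(x_i) - m_c||^2, dropping the constant K_ii term
            second = K[:, mask] @ w / sw
            third = w @ K[np.ix_(mask, mask)] @ w / sw ** 2
            dist[:, c] = -2 * second + third
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

# Toy data: three noisy blobs; lower weights could be given to suspected outliers.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.4, (40, 2)) for c in ((0, 0), (3, 3), (0, 4))])
weights = np.ones(len(X))
K = rbf_kernel(X)
print(weighted_kernel_kmeans(K, weights, k=3)[:10])
```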

Keywords: Pattern analysis, clustering, kernel methods, spatial data, crop yield

820 The Significance of Awareness about Gender Diversity for the Future of Work: A Multi-Method Study of Organizational Structures and Policies Considering Trans and Gender Diversity

Authors: Robin C. Ladwig

Abstract:

The future of work is becoming less predictable, which requires organizations to become more adaptable to social and work-related changes. Society is transforming with regard to gender identity, in the sense that more people come forward to identify as trans and gender diverse (TGD). Lacking inclusive organizational structures, organizations are ill-equipped to provide a safe and encouraging work environment. This qualitative multi-method research about TGD inclusivity in the workplace explores the enablers of and barriers to TGD individuals engaging satisfactorily in the work environment and organizational culture. Furthermore, these TGD insights are analyzed in terms of organizational implications and awareness from a leadership and management perspective. The semi-structured online interviews with TGD individuals and the photo-elicitation open-ended questionnaire addressed to leadership and management in diversity, career development, and human resources have been analyzed with a critical grounded theory approach. Findings demonstrate the significance of TGD voices, the support of leadership and management, as well as the synergy between voices and leadership. Hence, the study indicates practical implications such as the revision of exclusive language used in policies, data collection, or communication, and the reconsideration of organizational decision-making by leaders to include TGD voices.

Keywords: Future of work, occupational identity, organizational decision-making, trans and gender diverse identity.

819 A Framework for Improving Trade Contractors’ Productivity Tracking Methods

Authors: Sophia Hayes, Kenny L. Liang, Sahil Sharma, Austin Shema, Mahmoud Bader, Mohamed Elbarkouky

Abstract:

Despite being one of the country's most significant economic contributors, Canada's construction industry lags behind other sectors when it comes to labor productivity improvements. The construction industry is highly collaborative, as a general contractor will hire trade contractors to perform most of a project's work; this means low productivity from one contractor can have a domino effect on the shared success of a project. To address this issue and encourage trade contractors to improve their productivity tracking methods, an investigative study was conducted on the productivity views and tracking methods of various trade contractors. Additionally, an in-depth review was done of four standard tracking methods used in the construction industry: cost codes, benchmarking, the job productivity measurement (JPM) standard, and WorkFace Planning (WFP). The four tracking methods were used as a baseline for comparing the trade contractors' responses, determining gaps within their current tracking methods, and making improvement recommendations. Fifteen interviews were conducted with different trades to analyze how contractors value productivity. The results of these analyses indicated that there are gaps within the construction industry in the understanding of the purpose and value of productivity tracking. The trade contractors also shared their current productivity tracking systems, which were then compared to the four standard tracking methods. Gaps were identified in their various tracking methods, and, using a framework, recommendations were made based on the type of trade on how to improve productivity tracking.

Keywords: Trade contractors’ productivity, productivity tracking, cost codes, benchmarking, job productivity measurement (JPM), WorkFace Planning (WFP).

818 A Mobile Multihop Relay Dynamic TDD Scheme for Cellular Networks

Authors: Jong-Moon Chung, Hyung-Weon Cho, Ki-Yong Jin, Min-Hee Cho

Abstract:

In this paper, we present an analytical framework for evaluating the uplink performance of multihop cellular networks based on dynamic time division duplex (TDD). New wireless broadband protocols such as WiMAX, WiBro, and 3G-LTE apply TDD, and mobile communication protocols under standardization (e.g., IEEE 802.16j) are investigating mobile multihop relay (MMR) as a future technology. In this paper a novel MMR TDD scheme is presented, in which the dynamic range of the frame is shared between traffic resources of an asymmetric nature and multihop relaying. The mobile communication channel interference model comprises inner interference and co-channel interference (CCI). The performance analysis focuses on the uplink because the effects of dynamic resource allocation cause significant performance degradation, compared to time division multiple access (TDMA) schemes, only in the uplink due to CCI [1-3], while the downlink performs the same or better. The analysis is based on the signal-to-interference power ratio (SIR) outage probability of dynamic TDD (D-TDD) and TDMA systems, which are the most widespread mobile communication multi-user control techniques. This paper presents the uplink SIR outage probability with multihop results and shows that a dynamic TDD scheme applying MMR can provide a performance improvement over single-hop operation if executed properly.
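
A toy Monte Carlo estimate of the uplink SIR outage probability, the metric used in the analysis: the more co-channel interferers are simultaneously active (as a dynamic TDD frame split allows), the higher the outage. The cell geometry, path-loss exponent, and shadowing parameters below are assumptions, not the paper's analytical model.

```python
import numpy as np

rng = np.random.default_rng(42)

def sir_outage(n_interferers, sir_threshold_db=10.0, trials=100_000,
               path_loss_exp=3.5, shadow_sigma_db=8.0, cell_radius=1.0):
    """Monte Carlo estimate of P(SIR < threshold) for a single uplink receiver."""
    # desired user: uniform distance in (0.05, R], log-normal shadowing
    d_sig = rng.uniform(0.05, cell_radius, trials)
    shadow_sig = 10 ** (rng.normal(0, shadow_sigma_db, trials) / 10)
    p_sig = d_sig ** (-path_loss_exp) * shadow_sig

    # co-channel interferers: placed in surrounding cells, distances in (R, 3R]
    d_int = rng.uniform(cell_radius, 3 * cell_radius, (trials, n_interferers))
    shadow_int = 10 ** (rng.normal(0, shadow_sigma_db, (trials, n_interferers)) / 10)
    p_int = np.sum(d_int ** (-path_loss_exp) * shadow_int, axis=1)

    sir_db = 10 * np.log10(p_sig / p_int)
    return np.mean(sir_db < sir_threshold_db)

# More active co-channel interferers -> higher outage probability.
for k in (1, 3, 6):
    print(f"{k} interferers: outage probability ~ {sir_outage(k):.3f}")
```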

Keywords: Co-Channel Interference, Dynamic TDD, Mobile Multihop Relay, Cellular Network, Time Division Multiple Access.

817 Competency-Based Social Work Practice and Challenges in Child Case Management: Studies in the Districts Social Welfare Services, Malaysia

Authors: S. Brahim, M. S. Mohamad, E. Zakaria, N. Sarnon@Kusenin

Abstract:

This study aimed to explore the practical experience of child welfare caseworkers and professionalism in child case management in Malaysia. The paper discusses the specific social work practice competencies and the challenges faced by child caseworkers in the field. The research was qualitative, with a grounded theory approach. Four focus group discussion (FGD) sessions were conducted, involving a total of 27 caseworkers (child protectors and probation officers) in the Klang Valley. The study identified four basic principles of knowledge in child case management, namely: 1. knowledge in child case management; 2. professional values of caseworkers towards children; 3. skills in managing cases; and 4. culturally competent practice in child case management. In addition, the major challenges faced by child case managers are the capacity and commitment of the family in children's rehabilitation programs, the credibility of caseworkers being challenged, and the challenge of support systems within and between agencies. This study is important for policy makers to take into account the capacity and the needs of child caseworkers in accordance with the national social work competency framework. It is expected that case management services for children will improve systematically in line with national standards.

Keywords: Social work practice, child case management, competency-based knowledge, and professionalism.

816 Safe and Efficient Deep Reinforcement Learning Control Model: A Hydroponics Case Study

Authors: Almutasim Billa A. Alanazi, Hal S. Tharp

Abstract:

Safe performance and efficient energy consumption are essential factors in designing a control system. This paper presents a reinforcement learning (RL) model that can be applied to control applications to improve safety and reduce energy consumption. As hardware constraints and environmental disturbances are imprecise and unpredictable, conventional control methods may not always be effective in optimizing control designs. However, RL has demonstrated its value in several artificial intelligence (AI) applications, especially in the field of control systems. The proposed model intelligently monitors a system's success by observing the rewards from the environment, with positive rewards counting as a success when the controlled reference is within the desired operating zone. Thus, the model can determine whether the system is safe to continue operating based on the designer/user specifications, which can be adjusted as needed. Additionally, the controller keeps track of energy consumption to improve energy efficiency by enabling an idle mode when the controlled reference is within the desired operating zone, thus reducing the system's energy consumption during the control operation. Water temperature control for a hydroponic system is taken as a case study for the RL model, with the variance of the disturbances adjusted to show the model's robustness and efficiency. On average, the model improved safety by up to 15% and energy efficiency by 35%-40% compared to a traditional RL model.
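
A toy Q-learning sketch of the two mechanisms described, reward-based safety monitoring and an idle mode whenever the controlled reference sits inside the desired zone; the water-temperature dynamics, zone limits, action set, and reward values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

TARGET_ZONE = (21.0, 23.0)        # desired water temperature band (assumed, degrees C)
ACTIONS = (-0.5, 0.0, +0.5)       # cool, idle, heat (degrees C per step); index 1 is "idle"

def step(temp, action):
    """Toy water-temperature dynamics with a random ambient disturbance."""
    return temp + ACTIONS[action] + rng.normal(0, 0.2)

def bucket(temp):
    return int(np.clip((temp - 15.0) / 0.5, 0, 29))   # discretize 15-30 C into 30 bins

q = np.zeros((30, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.95, 0.1
energy_used, violation_steps = 0.0, 0

for episode in range(300):
    temp = rng.uniform(18.0, 26.0)
    for _ in range(50):
        s = bucket(temp)
        if TARGET_ZONE[0] <= temp <= TARGET_ZONE[1]:
            a = 1                                     # idle mode: no actuation, no energy
        else:
            a = rng.integers(len(ACTIONS)) if rng.random() < eps else int(q[s].argmax())
            energy_used += abs(ACTIONS[a])
        temp2 = step(temp, a)
        in_zone2 = TARGET_ZONE[0] <= temp2 <= TARGET_ZONE[1]
        reward = 1.0 if in_zone2 else -abs(temp2 - np.mean(TARGET_ZONE))
        if not in_zone2:
            violation_steps += 1                      # safety monitored via negative rewards
        q[s, a] += alpha * (reward + gamma * q[bucket(temp2)].max() - q[s, a])
        temp = temp2

print("actuation energy used:", round(energy_used, 1), "violation steps:", violation_steps)
```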

Keywords: Control system, hydroponics, machine learning, reinforcement learning.

815 Vitamin D Deficiency and Insufficiency in Postmenopausal Women with Obesity

Authors: Vladyslav Povoroznyuk, Anna Musiienko, Nataliia Dzerovych, Roksolana Povoroznyuk, Oksana Ivanyk

Abstract:

Vitamin D deficiency and insufficiency constitute a pandemic of the 21st century. Patients with obesity have lower levels of vitamin D, but the literature data are contradictory. The purpose of this study is to investigate vitamin D deficiency and insufficiency in postmenopausal women with obesity. We examined 1007 women aged 50-89 years. Mean age was 65.74±8.61 years; mean height was 1.61±0.07 m; mean weight was 70.65±13.50 kg; mean body mass index was 27.27±4.86 kg/m2; and mean serum 25(OH)D level was 26.00±12.00 nmol/l. The women were divided into six groups depending on body mass index: group I – 338 women with normal body weight, group II – 16 women with insufficient body weight, group III – 382 women with excessive body weight, group IV – 199 women with obesity of class I, group V – 60 women with obesity of class II, and group VI – 12 women with obesity of class III. The serum level of 25(OH)D was measured by an electrochemiluminescent method on an Elecsys 2010 analyzer (Roche Diagnostics, Germany) with cobas test systems. Of the examined women, 34.4% had vitamin D deficiency and 31.4% had insufficiency. Women with obesity of class I (23.60±10.24 ng/ml) and obesity of class II (22.38±10.34 ng/ml) had significantly lower levels of 25(OH)D than women with normal body weight (28.24±12.99 ng/ml), p=0.00003. In women with obesity, BMI significantly influences the vitamin D level, and this influence does not depend on the season.

Keywords: Obesity, body mass index, vitamin D deficiency/insufficiency, postmenopausal women, age.

814 A Study on the Attractiveness of Heavy Duty Motorcycle

Authors: Kaishuan Shen, Pan Changyu, Yuhsiang Lu, Zongshao Liu, Chishxsin Chuang, Minyuan Ma

Abstract:

The culture of riding heavy motorcycles originates from advanced countries, mainly in Europe, North America, and Japan. Heavy duty motorcycle riders differ from people who view motorcycles as a convenient means of transportation; they regard riding as a kind of enjoyment and a mark of refined taste. Riding heavy duty motorcycles has formed a distinctive landscape in Taiwan. Previous studies exploring motorcycle culture in Taiwan have focused on motorcycles with an engine displacement under 50 cc. This study concerns heavy duty motorcycles with an engine displacement over 550 cc and explores where their attractiveness lies. To find the attractiveness of heavy duty motorcycles, the study adopts the Miryoku Engineering (preference-based design) approach, proceeding in two steps. First, by organizing the material obtained from interviews with experts, the Evaluation Grid Method (EGM) was applied to find the structure of attractiveness; the attractive styles identified are eye-dazzling, leisure, classic, and racing-competitive styles. Second, Quantification Theory Type I analysis was adopted as a tool for analyzing the importance of the attractiveness factors, and the relationship between style and attractive parts was also discussed. The results could contribute to the design and research development of the heavy duty motorcycle industry in Taiwan.
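
Quantification Theory Type I is, in essence, least-squares regression on dummy-coded categorical predictors. The sketch below shows how category scores for the design attributes of an attractive style could be obtained this way; the attribute names, levels, and ratings are invented for illustration and are not the study's data.

```python
import numpy as np

# Toy data: each motorcycle sample described by categorical design attributes.
samples = [
    {"headlight": "round", "exhaust": "chrome", "seat": "single"},
    {"headlight": "angular", "exhaust": "black", "seat": "double"},
    {"headlight": "round", "exhaust": "black", "seat": "single"},
    {"headlight": "angular", "exhaust": "chrome", "seat": "double"},
    {"headlight": "round", "exhaust": "chrome", "seat": "double"},
    {"headlight": "angular", "exhaust": "black", "seat": "single"},
]
attractiveness = np.array([7.5, 5.0, 6.8, 6.1, 7.0, 4.9])  # ratings for one style

# Dummy-code the categorical attributes (drop one reference level per attribute).
levels = {a: sorted({s[a] for s in samples}) for a in samples[0]}
columns = [(a, lvl) for a, lvls in levels.items() for lvl in lvls[1:]]
X = np.array([[1.0] + [1.0 if s[a] == lvl else 0.0 for a, lvl in columns] for s in samples])

# Quantification Theory Type I amounts to least-squares regression on these dummies.
coef, *_ = np.linalg.lstsq(X, attractiveness, rcond=None)
for (attr, lvl), c in zip(columns, coef[1:]):
    print(f"{attr}={lvl}: category score {c:+.2f}")
```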

Keywords: Attractiveness, evaluation, heavy duty motorcycle, Miryoku Engineering.

813 A Model to Study the Effect of Excess Buffers and Na+ Ions on Ca2+ Diffusion in Neuron Cell

Authors: Vikas Tewari, Shivendra Tewari, K. R. Pardasani

Abstract:

Calcium is a vital second messenger used in signal transduction. Calcium controls secretion, cell movement, muscular contraction, cell differentiation, ciliary beating, and so on. Two theories have been used to simplify the system of reaction-diffusion equations for calcium into a single equation. One is the excess buffer approximation (EBA), which assumes that the mobile buffer is present in excess and cannot be saturated. The other is the rapid buffer approximation (RBA), which assumes that calcium binding to buffer is rapid compared to the calcium diffusion rate. In the present work, an attempt has been made to develop a model for calcium diffusion under the excess buffer approximation in neuron cells. This model incorporates the effect of Na+ influx on Ca2+ diffusion, variable calcium and sodium sources, the sodium-calcium exchange protein, the sarcolemmal calcium ATPase pump, and sodium and calcium channels. The proposed mathematical model leads to a system of partial differential equations, which has been solved numerically using the Forward Time Centered Space (FTCS) approach. The numerical results have been used to study the relationships among different parameters such as buffer concentration, association rate, and calcium permeability.
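
A minimal FTCS sketch for a one-dimensional equation of the EBA form: explicit time stepping of diffusion, a lumped linear buffering/extrusion term, and a point calcium source. The coefficients are illustrative, and the exchanger, pump, and channel fluxes of the full model are collapsed into the decay and source terms here.

```python
import numpy as np

# FTCS sketch for   dC/dt = D * d2C/dx2 - k*(C - C_rest) + J(x)
D      = 250.0      # um^2/s, effective Ca2+ diffusion coefficient (assumed)
k      = 50.0       # 1/s, lumped buffering/extrusion rate (assumed)
C_rest = 0.1        # uM, resting calcium concentration
L, nx  = 50.0, 101  # domain length (um) and grid points
dx     = L / (nx - 1)
dt     = 0.4 * dx * dx / (2 * D)       # satisfies the FTCS stability limit D*dt/dx^2 <= 1/2

C = np.full(nx, C_rest)
source = np.zeros(nx)
source[nx // 2] = 500.0                # uM/s point source: a calcium channel at the center

for _ in range(int(0.05 / dt)):        # simulate 50 ms
    lap = (C[2:] - 2 * C[1:-1] + C[:-2]) / dx ** 2
    C[1:-1] += dt * (D * lap - k * (C[1:-1] - C_rest) + source[1:-1])
    C[0] = C[-1] = C_rest              # Dirichlet boundaries held at the resting level

print("peak [Ca2+] near the channel: %.3f uM" % C.max())
```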

Keywords: Excess buffer approximation, Na+ influx, sodium-calcium exchange protein, sarcolemmal calcium ATPase pump, Forward Time Centered Space.

812 A Life Cycle Assessment (LCA) of Aluminum Production Process

Authors: Alaa Al Hawari, Mohammad Khader, Wael El Hasan, Mahmoud Alijla, Ammar Manawi, Abdelbaki Benamour

Abstract:

The production of aluminum alloys and ingots – from the processing of alumina to aluminum through to the final cast product – was studied using a Life Cycle Assessment (LCA) approach. The studied aluminum supply chain consisted of a carbon plant, a reduction plant, a casting plant, and a power plant. In the LCA model, the environmental loads of the different plants for the production of 1 ton of aluminum metal were investigated. The impact of aluminum production was assessed in eight impact categories. The results showed that the power plant had the highest impact in all of the impact categories, except for Human Toxicity Potential (HTP), where the reduction plant had the highest impact, and Marine Aquatic Eco-Toxicity Potential (MAETP), where the carbon plant had the highest impact. Furthermore, the combined impact of the carbon plant and the reduction plant was almost the same as the impact of the power plant in the case of Acidification Potential (AP). The carbon plant had a positive impact on the environment in terms of Eutrophication Potential (EP) due to the production of clean water in the process. The natural gas based power plant used in the case study had 8.4 times less negative impact on the environment than a heavy fuel based power plant and 10.7 times less than a hard coal based power plant.

Keywords: Life cycle assessment, aluminum production, supply chain.

811 A Novel Approach for Coin Identification using Eigenvalues of Covariance Matrix, Hough Transform and Raster Scan Algorithms

Authors: J. Prakash, K. Rajesh

Abstract:

In this paper we present a new method for coin identification. The proposed method adopts a hybrid scheme using the eigenvalues of the covariance matrix, the Circular Hough Transform (CHT), and Bresenham's circle algorithm. The statistical and geometrical properties of the small and large eigenvalues of the covariance matrix of a set of edge pixels over a connected region of support are explored for the purpose of circular object detection. A sparse matrix technique is used to perform the CHT. Since sparse matrices squeeze out zero elements and contain only a small number of non-zero elements, they provide an advantage in matrix storage space and computational time. A neighborhood suppression scheme is used to find the valid Hough peaks. The accurate position of the circumference pixels is identified using a raster scan algorithm that exploits geometrical symmetry. After finding the circular objects, the proposed method uses the texture on the surface of the coins, called texton, a property unique to coins that refers to the fundamental micro-structure in generic natural images. This method has been tested on several real world images including coin and non-coin images. The performance is also evaluated in terms of noise-withstanding capability.
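
One ingredient of the scheme can be illustrated compactly: for the edge pixels of a (near-)circular region, the two eigenvalues of the coordinate covariance matrix are nearly equal, whereas elongated regions produce very different eigenvalues. The sketch below (synthetic contours, not the authors' full CHT/raster-scan pipeline) uses the small-to-large eigenvalue ratio as a circularity cue.

```python
import numpy as np

def eigen_circularity(edge_points: np.ndarray) -> float:
    """Ratio of the small to the large eigenvalue of the covariance matrix of
    edge-pixel coordinates: close to 1 for circular contours, small for elongated ones."""
    cov = np.cov(edge_points.T)          # 2x2 covariance of (x, y) coordinates
    small, large = np.sort(np.linalg.eigvalsh(cov))
    return small / large

theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)

circle  = np.c_[50 + 20 * np.cos(theta), 50 + 20 * np.sin(theta)]
ellipse = np.c_[50 + 40 * np.cos(theta), 50 + 10 * np.sin(theta)]

print("circle  ratio: %.2f" % eigen_circularity(circle))    # ~1.00 -> circular candidate
print("ellipse ratio: %.2f" % eigen_circularity(ellipse))    # ~0.06 -> rejected
```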

Keywords: Circular Hough Transform, Coin detection, Covariance matrix, Eigenvalues, Raster scan Algorithm, Texton.

810 A Framework for University Social Responsibility and Sustainability: The Case of South Valley University, Egypt

Authors: Alaa Tag Eldin Mohamed

Abstract:

Environmental, cultural, social, and technological changes have led higher education institutes to question their traditional roles. Many declarations and frameworks highlight the importance of higher education institutes fulfilling their social responsibility. The study aims at developing a framework of university social responsibility and sustainability (USR&S), with a focus on South Valley University (SVU) as a case study of Egyptian universities. The study used meetings with 12 vice deans of community services and environmental affairs on social responsibility and environmental issues. The proposed framework integrates social responsibility with strategic management through the establishment and maintenance of the vision, mission, values, goals, and management systems; the elaboration of policies; the provision of actions; the evaluation of services; and the development of social collaboration with stakeholders to meet the current and future needs of the community and the environment. The framework links different stakeholders internally and externally using communication and reporting tools. The results show that SVU integrates social responsibility and sustainability in its strategic plans; it has policies and actions, which are however fragmented and lack an appropriate structure and budgeting. The proposed framework could be valuable for researchers and decision makers at Egyptian universities. The study proposes recommendations and highlights the need to build on the results and conduct future research.

Keywords: Corporate social responsibility (CSR), South Valley University, Sustainable University, university social responsibility and sustainability (USR&S).

809 An Approach of Quantum Steganography through Special SSCE Code

Authors: Indradip Banerjee, Souvik Bhattacharyya, Gautam Sanyal

Abstract:

Sending encrypted messages frequently draws the attention of third parties, perhaps causing attempts to break and reveal the original messages. Steganography is introduced to hide the existence of the communication by concealing a secret message in an appropriate carrier such as text, image, audio, or video. In quantum steganography, the sender (Alice) embeds her steganographic information into the cover and sends it to the receiver (Bob) over a communication channel. Alice and Bob share an algorithm and hide quantum information in the cover. An eavesdropper (Eve) without access to the algorithm cannot detect the existence of the quantum message. In this paper, a text quantum steganography technique is proposed based on the use of the indefinite articles (a) or (an) in conjunction with nonspecific or non-particular nouns in the English language and a quantum gate truth table. The authors also introduce a new code representation technique (SSCE – Secret Steganography Code for Embedding) at both ends in order to achieve a high level of security. Before the embedding operation, each character of the secret message is converted to its SSCE value and then embedded into the cover text. Finally, the stego text is formed and transmitted to the receiver side. At the receiver side, the reverse operations are carried out to recover the original information.

Keywords: Quantum Steganography, SSCE (Secret Steganography Code for Embedding), Security, Cover Text, Stego Text.

808 An Edge Detection and Filtering Mechanism of Two Dimensional Digital Objects Based on Fuzzy Inference

Authors: Ayman A. Aly, Abdallah A. Alshnnaway

Abstract:

The general idea behind the filter is to average a pixel using other pixel values from its neighborhood, while simultaneously taking care of important image structures such as edges. The main concern of the proposed filter is to distinguish between variations of the captured digital image due to noise and those due to image structure. Edges give the image its appearance of depth and sharpness; a loss of edges makes the image appear blurred or unfocused. However, noise smoothing and edge enhancement are traditionally conflicting tasks. Since most noise filtering behaves like a low-pass filter, the blurring of edges and loss of detail seem a natural consequence, and techniques to remedy this inherent conflict often generate new noise as a by-product of enhancement. In this work a new fuzzy filter is presented for the noise reduction of images corrupted with additive noise. The filter consists of three stages: (1) define fuzzy sets in the input space to compute a fuzzy derivative for eight different directions; (2) construct a set of IF-THEN rules to perform fuzzy smoothing according to the contributions of neighboring pixel values; and (3) define fuzzy sets in the output space to obtain the filtered image with preserved edges. Experimental results show the feasibility of the proposed approach on two-dimensional objects.
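
A compact sketch of the underlying idea (not the authors' full three-stage rule base): compute the difference to each of the eight neighbors, give each neighbor a fuzzy "small derivative" membership, and average with those memberships as weights, so that edges, which produce large derivatives, are barely smoothed. The membership shape and scale are assumptions.

```python
import numpy as np

# Eight neighborhood directions (dy, dx) used for the directional derivatives.
DIRS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def fuzzy_small(diff, scale=20.0):
    """Membership of |difference| in the fuzzy set 'small' (triangular shape, assumed)."""
    return np.clip(1.0 - np.abs(diff) / scale, 0.0, 1.0)

def fuzzy_smooth(img: np.ndarray, scale=20.0) -> np.ndarray:
    """Average each pixel with its neighbors, weighted by how 'small' the
    directional derivative is -- large derivatives (edges) get little weight."""
    img = img.astype(float)
    padded = np.pad(img, 1, mode="edge")
    num = np.zeros_like(img)
    den = np.zeros_like(img)
    for dy, dx in DIRS:
        neighbor = padded[1 + dy:1 + dy + img.shape[0], 1 + dx:1 + dx + img.shape[1]]
        w = fuzzy_small(neighbor - img, scale)      # IF the derivative is small THEN smooth
        num += w * neighbor
        den += w
    return (num + img) / (den + 1.0)                # include the center pixel itself

# Toy image: a vertical step edge corrupted with additive Gaussian noise.
rng = np.random.default_rng(0)
clean = np.hstack([np.zeros((32, 16)), np.full((32, 16), 100.0)])
noisy = clean + rng.normal(0, 5, clean.shape)
smoothed = fuzzy_smooth(noisy)
print("noise std before: %.2f  after: %.2f" % ((noisy - clean).std(), (smoothed - clean).std()))
```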

Keywords: Additive noise, edge preserving filtering, fuzzy image filtering, noise reduction, two dimensional mechanical images.

807 Virtual Conciliation in Colombia: Evaluation of Maturity Level within the Framework of E-Government

Authors: Jenny Paola Forero Pachón, Sonia Cristina Gamboa Sarmiento, Luis Carlos Gómez Flórez

Abstract:

The Colombian government has defined an e-government strategy to take advantage of Information Technologies (IT) in order to contribute to building a more efficient, transparent, and participative State that provides better services to citizens and businesses. In this regard, the justice sector is one of the government sectors where IT has generated the most expectation, considering that the country has a backlog of judicial processes. This situation has led to the search for alternative forms of access to justice that speed up the process while keeping costs low for citizens. To this end, the Colombian government has authorized the use of Alternative Dispute Resolution (ADR) methods, through which disputes can be resolved more quickly than through judicial processes while facilitating greater communication between the parties, without recourse to judicial authority. One of these methods is conciliation, which includes a special modality that takes advantage of IT, known as virtual conciliation. With this option the conciliation is supported by information systems, applications, or platforms, and communications are provided through them. This paper evaluates the maturity level of the virtual conciliation service within the framework of this strategy. The evaluation is carried out using Shahkooh's five-phase model for e-government. As a result, it is evident that, in the context of conciliation, maturity does not reach the level required by the model for the service to be considered true virtual conciliation; therefore, it is necessary to define strategies to maximize the potential of IT in this context.

Keywords: Alternative dispute resolution, e-government, evaluation of maturity, Shahkooh model, virtual conciliation.

806 Nuclear Medical Image Treatment System Based On FPGA in Real Time

Authors: B. Mahmoud, M.H. Bedoui, R. Raychev, H. Essabbah

Abstract:

We present in this paper an acquisition and treatment system designed for semi-analog Gamma-cameras, taking as an example the SOPHY DS7 of Sopha Medical Vision (SMVi). It consists of a nuclear medical Image Acquisition, Treatment and Display (IATD) chain ensuring the acquisition and treatment of the signals resulting from the Gamma-camera detection head (DH) and the construction of the scintigraphic image in real time. The chain is composed of an analog treatment board and a digital treatment board; an earlier version of the digital board was designed around a DSP [2]. We describe the designed system and the digital treatment algorithms, whose performance and flexibility we have improved. In the new version of the IATD chain presented here, the digital treatment algorithms are implemented in a reprogrammable FPGA (Field Programmable Gate Array).

Keywords: Nuclear medical image, scintigraphic image, digital treatment, linearity, spectrometry, FPGA.

805 Weighted Data Replication Strategy for Data Grid Considering Economic Approach

Authors: N. Mansouri, A. Asadi

Abstract:

A Data Grid is a geographically distributed environment that deals with data-intensive applications in scientific and enterprise computing. Data replication is a common method used to achieve efficient and fault-tolerant data access in Grids. In this paper, a dynamic data replication strategy called Enhanced Latest Access Largest Weight (ELALW) is proposed. This strategy is an enhanced version of the Latest Access Largest Weight strategy. Replication should be used wisely, however, because the storage capacity of each Grid site is limited; it is therefore important to design an effective strategy for the replica replacement task. ELALW replaces replicas based on the expected number of future requests, the size of the replica, and the number of copies of the file. It also improves access latency by selecting the best replica when multiple sites hold replicas. The proposed replica selection chooses the best replica location from among the many replicas based on a response time determined by the data transfer time, the storage access latency, the replica requests waiting in the storage queue, and the distance between nodes. Simulation results using OptorSim show that our replication strategy achieves better overall performance than other strategies in terms of job execution time, effective network usage, and storage resource usage.
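
A minimal sketch of the replica-selection idea: score each candidate site by an estimated response time built from the factors listed above (transfer time, storage latency, queue length, distance) and pick the minimum. The scoring weights and site figures are illustrative assumptions; the abstract does not give ELALW's exact formulas.

```python
from dataclasses import dataclass

@dataclass
class ReplicaSite:
    name: str
    bandwidth_mbps: float     # available bandwidth to the requesting node
    storage_latency_s: float  # storage access latency
    queued_requests: int      # replica requests waiting in the storage queue
    distance_hops: int        # network distance to the requesting node

def estimated_response_time(site: ReplicaSite, file_size_mb: float,
                            per_request_s: float = 0.05, per_hop_s: float = 0.01) -> float:
    """Illustrative response-time estimate combining the listed factors:
    transfer time + storage latency + queueing delay + a distance penalty."""
    transfer = file_size_mb * 8 / site.bandwidth_mbps
    queueing = site.queued_requests * per_request_s
    return transfer + site.storage_latency_s + queueing + site.distance_hops * per_hop_s

def select_best_replica(sites, file_size_mb):
    return min(sites, key=lambda s: estimated_response_time(s, file_size_mb))

sites = [
    ReplicaSite("site-A", bandwidth_mbps=100, storage_latency_s=0.02, queued_requests=12, distance_hops=3),
    ReplicaSite("site-B", bandwidth_mbps=40,  storage_latency_s=0.01, queued_requests=1,  distance_hops=1),
    ReplicaSite("site-C", bandwidth_mbps=200, storage_latency_s=0.05, queued_requests=30, distance_hops=6),
]
print(select_best_replica(sites, file_size_mb=500).name)
```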

Keywords: Data grid, data replication, simulation, replica selection, replica placement.

804 Least Square-SVM Detector for Wireless BPSK in Multi-Environmental Noise

Authors: J. P. Dubois, Omar M. Abdul-Latif

Abstract:

The Support Vector Machine (SVM) is a statistical learning tool developed around the concept of structural risk minimization (SRM). In this paper, SVM is applied to signal detection in communication systems in the presence of channel noise in various environments: Rayleigh fading, additive white Gaussian background noise (AWGN), and interference noise generalized as additive colored Gaussian noise (ACGN). The structure and performance of the SVM, in terms of the bit error rate (BER) metric, are derived and simulated for these advanced stochastic noise models, and the computational complexity of the implementation, in terms of average computational time per bit, is also presented. The performance of the SVM is then compared to a conventional optimal model-based detector for binary signaling with binary phase shift keying (BPSK) modulation. We show that the SVM performance is superior to that of conventional matched filter-, innovation filter-, and Wiener filter-driven detectors, even in the presence of random Doppler carrier deviation, especially in the low SNR (signal-to-noise ratio) range. At large SNR, the performance of the SVM is similar to that of the classical detectors. However, the convergence between SVM and maximum likelihood detection occurs at a higher SNR as the noise environment becomes more hostile.
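
A reduced illustration of the comparison in AWGN only (no fading, Doppler, or colored noise, and a plain least-squares linear detector standing in for the LS-SVM): simulate BPSK, detect with the matched-filter sign rule and with a detector trained on a labelled block, and compare BERs.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ber(snr_db: float, n_bits: int = 200_000):
    """BER of two BPSK detectors in AWGN: the matched-filter (sign) detector and a
    linear detector trained by least squares on a small labelled block."""
    bits = rng.integers(0, 2, n_bits)
    symbols = 2.0 * bits - 1.0                          # BPSK mapping: 0 -> -1, 1 -> +1
    noise_std = 10 ** (-snr_db / 20)
    received = symbols + rng.normal(0, noise_std, n_bits)

    # Matched-filter / ML detector for antipodal signaling in AWGN: sign of the sample.
    ber_mf = np.mean((received > 0).astype(int) != bits)

    # Least-squares linear detector w*r + b fitted on the first 1000 training bits.
    train = 1000
    A = np.c_[received[:train], np.ones(train)]
    w, b = np.linalg.lstsq(A, symbols[:train], rcond=None)[0]
    decisions = (w * received[train:] + b > 0).astype(int)
    ber_ls = np.mean(decisions != bits[train:])
    return ber_mf, ber_ls

for snr in (0, 4, 8):
    mf, ls = simulate_ber(snr)
    print(f"SNR {snr} dB: matched filter BER={mf:.4f}, least-squares detector BER={ls:.4f}")
```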

Keywords: Colour noise, Doppler shift, innovation filter, least square-support vector machine, matched filter, Rayleigh fading, Wiener filter.

803 MHD Chemically Reacting Viscous Fluid Flow towards a Vertical Surface with Slip and Convective Boundary Conditions

Authors: Ibrahim Yakubu Seini, Oluwole Daniel Makinde

Abstract:

MHD chemically reacting viscous fluid flow towards a vertical surface with slip and convective boundary conditions has been investigated. The surface temperature, the chemical species concentration at the surface, and the velocity of the external flow are assumed to vary linearly with the distance from the vertical surface. The governing differential equations are modeled and transformed into systems of ordinary differential equations, which are then solved numerically by a shooting method. The effects of various parameters on the heat and mass transfer characteristics are discussed. Graphical results are presented for the velocity, temperature, and concentration profiles, whilst the skin-friction coefficient and the rates of heat and mass transfer near the surface are presented in tables and discussed. The results reveal that increasing the strength of the magnetic field increases the skin-friction coefficient and the rates of heat and mass transfer toward the surface. The velocity profiles increase towards the surface due to the presence of the Lorentz force, which attracts the fluid particles near the surface. The chemical reaction is seen to reduce the concentration boundary layer near the surface due to the destructive chemical reaction occurring there.
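
The shooting method itself can be illustrated on a classical boundary-layer problem of the same family (a Blasius-type equation, not the paper's coupled MHD system): guess the unknown initial slope, integrate the ODE system, and adjust the guess until the far-field boundary condition is met.

```python
import numpy as np

# Shooting-method sketch for the Blasius-type BVP
#   f''' + 0.5 * f * f'' = 0,  f(0) = 0, f'(0) = 0, f'(inf) = 1.

def rhs(y):
    f, fp, fpp = y
    return np.array([fp, fpp, -0.5 * f * fpp])

def integrate(fpp0, eta_max=10.0, n=2000):
    """RK4 integration from eta = 0 with a guessed f''(0); returns f'(eta_max)."""
    h = eta_max / n
    y = np.array([0.0, 0.0, fpp0])
    for _ in range(n):
        k1 = rhs(y)
        k2 = rhs(y + 0.5 * h * k1)
        k3 = rhs(y + 0.5 * h * k2)
        k4 = rhs(y + h * k3)
        y = y + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return y[1]

# Bisection on the unknown initial slope so that the far-field condition f'(inf) = 1 is met.
lo, hi = 0.1, 1.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if integrate(mid) < 1.0:
        lo = mid
    else:
        hi = mid

print("converged f''(0) = %.4f" % mid)   # classical Blasius-type value near 0.332
```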

Keywords: Boundary layer, surface slip, MHD flow, chemical reaction, heat transfer, mass transfer.

802 Use of Carica papaya as a Bio-Sorbent for Removal of Heavy Metals in Wastewater

Authors: W. E. Igwegbe, B. C. Okoro, J. C. Osuagwu

Abstract:

The study assessed the effectiveness of pawpaw (Carica papaya) wood, acting as a bio-sorbent, in reducing the concentrations of heavy metals in wastewater. The following heavy metals were considered: zinc, cadmium, lead, copper, iron, selenium, nickel, and manganese. The physicochemical properties of the Carica papaya stem were studied. The experimental sample was sourced from the trunk of a felled mature pawpaw tree. Wastewater for experimental use was prepared by dissolving soil samples, collected from a dump site at Owerri, Imo State of Nigeria, in water. The concentration of each metal remaining in solution as residual metal after bio-sorption was determined using an atomic absorption spectrometer. The effects of pH and initial heavy metal concentration were studied in a batch reactor. Spectrometric analysis showed that different functional groups were present in the Carica papaya stem biomass. Metal removal increased as the pH increased for all the metals considered except nickel and manganese. Optimum bio-sorption occurred at pH 5.9 at a bio-sorbent dosage of 5 g/100 ml. The results of the study showed that the treated wastewater is fit for irrigation purposes based on the Canadian wastewater quality guideline for the protection of agricultural uses. This approach thus provides a cost-effective and environmentally friendly option for treating wastewater.

Keywords: Biomass, bio-sorption, Carica papaya, heavy metal, wastewater.

801 Shear Layer Investigation through a High-Load Cascade in Low-Pressure Gas Turbine Conditions

Authors: Mehdi Habibnia Rami, Shidvash Vakilipour, Mohammad H. Sabour, Rouzbeh Riazi, Hossein Hassannia

Abstract:

This paper deals with the steady and unsteady flow behavior of the separation bubble occurring on the rear portion of the suction side of the T106A blade. The first phase was to simulate the steady condition and capture the separation bubble. To accurately predict the separated region, the effects of three different turbulence models and of the computational grids were investigated separately. The results of the Large Eddy Simulation (LES) model on the finest grid structure are in acceptably good agreement with the relevant experimental results. The second phase mainly addresses the effect of wake entrance on bubble disappearance in the unsteady situation. In the current simulations, following what was suggested by an experiment, the key issue is to represent the flow unsteadiness by concentrating on small-scale disturbances rather than simulating a complete oncoming wake. The results from this strategy for applying the effects of the wake were then compared with two experimental works and found to be in good agreement. Of the two experiments, one deals with wake-passing unsteady flow, and the other implements experimentally the same approach as the current Computational Fluid Dynamics (CFD) simulation.

Keywords: T106A turbine cascade, shear-layer separation, steady and unsteady conditions, turbulence models, OpenFOAM.

800 Information Retrieval: A Comparative Study of Textual Indexing Using an Oriented Object Database (db4o) and the Inverted File

Authors: Mohammed Erritali

Abstract:

The growth in the volume of text data such as books and articles in libraries over the centuries has made it necessary to establish effective mechanisms to locate them. Early techniques such as abstracting, indexing, and the use of classification categories marked the birth of a new field of research called Information Retrieval. Information Retrieval (IR) can be defined as the task of defining models and systems whose purpose is to facilitate access to a set of documents in electronic form (a corpus), allowing a user to find those relevant to him, that is to say, the content that matches the user's information needs. Most information retrieval models use a specific data structure to index a corpus, called the "inverted file" or "inverted index". This inverted file collects information on all terms over the corpus documents, specifying the identifiers of the documents that contain the term in question, the frequency of each term in the documents of the corpus, the positions of the occurrences of the word, and so on. In this paper we use an object-oriented database (db4o) instead of the inverted file; that is to say, instead of searching for a term in the inverted file, we search for it in the db4o database. The purpose of this work is to make a comparative study to see whether object-oriented databases can compete with the inverted index in terms of access speed and resource consumption when using a large volume of data.
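
A minimal inverted-file sketch (toy corpus; not the authors' db4o implementation): each term maps to a posting list of document identifiers with term frequencies and positions, exactly the information the abstract lists.

```python
from collections import defaultdict

def build_inverted_file(corpus):
    """term -> {doc_id: [positions]}; frequency is the length of the position list."""
    index = defaultdict(dict)
    for doc_id, text in corpus.items():
        for pos, term in enumerate(text.lower().split()):
            index[term].setdefault(doc_id, []).append(pos)
    return index

def search(index, term):
    postings = index.get(term.lower(), {})
    return [(doc_id, len(positions), positions) for doc_id, positions in postings.items()]

corpus = {
    1: "information retrieval models use an inverted file",
    2: "the inverted file maps each term to the documents that contain it",
    3: "object databases store objects directly",
}
index = build_inverted_file(corpus)
print(search(index, "inverted"))   # [(1, 1, [5]), (2, 1, [1])]
```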

Keywords: Information Retrieval, indexation, oriented object database (db4o), inverted file.

799 Using Genetic Algorithms to Outline Crop Rotations and a Cropping-System Model

Authors: Nicolae Bold, Daniel Nijloveanu

Abstract:

The cropping-system idea is a method used by farmers. It is an environmentally friendly method that protects natural resources (soil, water, air, nutritive substances) while increasing production, taking crop particularities into account. Combining this powerful method with the concepts of genetic algorithms makes it possible to generate sequences of crops that form a rotation. Genetic algorithms have proven efficient in solving optimization problems, and their polynomial complexity allows them to be applied to more difficult and varied problems. In our case, the optimization consists in finding the most profitable rotation of cultures. One of the expected results is to optimize the usage of resources, in order to minimize costs and maximize profit. To achieve these goals, a genetic algorithm was designed. This algorithm finds several optimized cropping-system possibilities that have the highest profit and thus minimize costs. The algorithm uses genetic methods (mutation, crossover) and structures (genes, chromosomes): a cropping-system possibility is treated as a chromosome, and a crop within the rotation is a gene within that chromosome. Results on the efficiency of this method are presented in a dedicated section. The implementation of this method would benefit farmers by giving them hints and helping them to use their resources efficiently.
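
A compact sketch of the encoding described: each chromosome is a candidate rotation, each gene a crop, and fitness rewards profit while penalizing the same crop in consecutive years. The crop set, profit values, and penalty are invented for illustration.

```python
import random

random.seed(0)

CROPS = ["wheat", "maize", "soy", "clover", "potato"]          # illustrative crop set
PROFIT = {"wheat": 4, "maize": 5, "soy": 3, "clover": 1, "potato": 6}
ROTATION_LEN, POP_SIZE, GENERATIONS = 6, 40, 200

def fitness(rotation):
    """Profit of the rotation, penalizing repeated crops in consecutive years
    (a stand-in for agronomic rotation constraints)."""
    score = sum(PROFIT[c] for c in rotation)
    score -= 4 * sum(1 for a, b in zip(rotation, rotation[1:]) if a == b)
    return score

def crossover(a, b):
    cut = random.randrange(1, ROTATION_LEN)
    return a[:cut] + b[cut:]

def mutate(rotation, rate=0.1):
    return [random.choice(CROPS) if random.random() < rate else c for c in rotation]

# Each chromosome is one cropping-system possibility; each gene is a crop in the rotation.
population = [[random.choice(CROPS) for _ in range(ROTATION_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print(best, "profit score:", fitness(best))
```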

Keywords: Genetic algorithm, chromosomes, genes, cropping, agriculture.

798 Application of Statistical Approach for Optimizing CMCase Production by Bacillus tequilensis S28 Strain via Submerged Fermentation Using Wheat Bran as Carbon Source

Authors: A. Sharma, R. Tewari, S. K. Soni

Abstract:

Biofuel production has emerged as a future technology to combat the problem of depleting fossil fuels. Bio-based ethanol production from the enzymatic degradation of lignocellulosic biomass is an efficient method that has caught the eye of the scientific community. The high cost of the enzyme is the major obstacle preventing the commercialization of this process. Thus, the main objective of the present study was to optimize the composition of the medium components for enhancing cellulase production by a newly isolated strain of Bacillus tequilensis. Nineteen factors were screened using a statistical Plackett-Burman design. The significant variables influencing cellulase production were then employed in statistical response surface methodology using a central composite design to maximize cellulase production. The optimum medium composition for cellulase production was: peptone (4.94 g/L), ammonium chloride (4.99 g/L), yeast extract (2.00 g/L), Tween-20 (0.53 g/L), calcium chloride (0.20 g/L), and cobalt chloride (0.60 g/L), with pH 7, agitation speed 150 rpm, and 72 h incubation at 37°C. Analysis of variance (ANOVA) revealed a high coefficient of determination (R2) of 0.99. A maximum cellulase productivity of 11.5 IU/ml was observed, against the model-predicted value of 13 IU/ml. The enzyme was optimally active at 60°C and pH 5.5.
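
As an illustration of the response-surface step, the sketch below fits a second-order model to toy central-composite-style data for two coded factors and locates its stationary point; the factor names, synthetic responses, and noise level are assumptions, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy CCD-style design for two coded factors (e.g. peptone and NH4Cl, -1.414..+1.414).
levels = [-1.414, -1, 0, 1, 1.414]
X = np.array([(a, b) for a in levels for b in levels])
true_surface = lambda a, b: 11 - 2 * (a - 0.3) ** 2 - 1.5 * (b + 0.2) ** 2 + 0.3 * a * b
y = np.array([true_surface(a, b) for a, b in X]) + rng.normal(0, 0.1, len(X))

# Second-order response surface: y = b0 + b1*a + b2*b + b3*a^2 + b4*b^2 + b5*a*b
A = np.c_[np.ones(len(X)), X[:, 0], X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]]
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Stationary point of the fitted quadratic: solve grad = 0.
H = np.array([[2 * coef[3], coef[5]], [coef[5], 2 * coef[4]]])
opt = np.linalg.solve(H, -coef[1:3])
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("optimum (coded units):", np.round(opt, 2), " R2 = %.3f" % r2)
```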

Keywords: Bacillus tequilensis, CMCase, Submerged Fermentation, Optimization, Plackett-Burman Design, Response Surface Methodology.

797 Development of a Software about Calculating the Production Parameters in Knitted Garment Plants

Authors: Ender Bulgun, Arzu Vuruskan

Abstract:

Apparel product development is an important stage in the life cycle of a product, and shortening this stage helps to reduce the costs of a garment. The aim of this study is to examine the production parameters in knitwear apparel companies by defining the unit costs and developing software to calculate the unit costs of garments and make cost estimates. In this study, with the help of a questionnaire, the unit cost estimating and cost calculating systems of different companies were analyzed. Within the scope of the questionnaire, the importance of the cost estimating process for apparel companies and the expectations from a new cost estimating program were investigated. According to the results of the questionnaire, the majority of the participating companies use manual cost calculating methods or simple Microsoft Excel spreadsheets to make cost estimates. Furthermore, many companies have difficulties archiving cost data for future use; as a solution to that problem, prior to making a cost estimate, the sub-units of garment cost – the fabric, accessory, and labor costs – should be analyzed and added to the database of the program beforehand. Another feature of the cost estimating tool prepared in this study is that the program is designed to consist of two main units, one of which handles the product specification and the other the cost calculation. The program is prepared as a web-based application so that the supplier, the manufacturer, and the customer can communicate through the same platform.
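
As a sketch of the two-unit design the abstract describes (a product specification unit feeding a cost calculation unit), the fragment below sums fabric, accessory, and labor sub-costs into a unit cost; all class names, fields, figures, and the overhead rate are illustrative assumptions, not taken from the study's software.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Material:
    name: str
    consumption: float   # per garment (kg of fabric, pieces of accessory, minutes of labor)
    unit_price: float    # price per unit of consumption

    def cost(self) -> float:
        return self.consumption * self.unit_price

@dataclass
class ProductSpecification:
    style: str
    fabrics: List[Material] = field(default_factory=list)
    accessories: List[Material] = field(default_factory=list)
    labor: List[Material] = field(default_factory=list)

def unit_cost(spec: ProductSpecification, overhead_rate: float = 0.15) -> float:
    """Sum the fabric, accessory, and labor sub-costs and apply an overhead factor."""
    direct = sum(m.cost() for m in spec.fabrics + spec.accessories + spec.labor)
    return direct * (1 + overhead_rate)

tshirt = ProductSpecification(
    style="basic knitted t-shirt",
    fabrics=[Material("single jersey", 0.25, 8.0)],
    accessories=[Material("label", 1, 0.05), Material("sewing thread", 1, 0.10)],
    labor=[Material("cutting+sewing", 12, 0.20)],
)
print("estimated unit cost:", round(unit_cost(tshirt), 2))
```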

Keywords: Apparel, cost estimating, design archive.

796 An Approach for Vocal Register Recognition Based on Spectral Analysis of Singing

Authors: Aleksandra Zysk, Pawel Badura

Abstract:

Recognizing and controlling vocal registers during singing is a difficult task for a beginner vocalist. It requires, among other things, identifying which part of the natural resonators is being used when a sound propagates through the body. Thus, an application has been designed that allows sound recording, automatic vocal register recognition (VRR), and a graphical user interface providing real-time visualization of the signal and the recognition results. Six spectral features are determined for each time frame and passed to a support vector machine classifier, yielding a binary decision on the head or chest register assignment of the segment. The classification training and testing data have been recorded by ten professional female singers (sopranos, aged 19-29) performing sounds in both the chest and the head register. The classification accuracy exceeded 93% in each of various validation schemes. Apart from the hard two-class decision, the support vector classifier also returns information on the distance between a particular feature vector and the discrimination hyperplane in feature space. This information reflects the level of certainty of the vocal register classification in a fuzzy way. Thus, the designed recognition and training application is able to assess and visualize the continuous trend in singing in a user-friendly graphical mode, providing an easy way to control the vocal emission.
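
A reduced sketch of the feature-plus-SVM pipeline, assuming scikit-learn is available: generic spectral features (the abstract does not name the six actually used) are computed per frame from synthetic chest-like and head-like tones, an SVM is trained, and the signed distance to the hyperplane is read from decision_function as the certainty cue.

```python
import numpy as np
from sklearn.svm import SVC   # assumes scikit-learn is installed

def spectral_features(frame: np.ndarray, sr: int = 44100) -> np.ndarray:
    """A few generic spectral features per frame (centroid, spread, rolloff, flatness)."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) + 1e-12
    freqs = np.fft.rfftfreq(len(frame), 1 / sr)
    p = spectrum / spectrum.sum()
    centroid = np.sum(freqs * p)
    spread = np.sqrt(np.sum(((freqs - centroid) ** 2) * p))
    rolloff = freqs[np.searchsorted(np.cumsum(p), 0.85)]
    flatness = np.exp(np.mean(np.log(spectrum))) / np.mean(spectrum)
    return np.array([centroid, spread, rolloff, flatness])

# Synthetic stand-ins: chest-like frames have stronger upper harmonics than head-like ones.
rng = np.random.default_rng(0)
def synth(register, f0=440.0, sr=44100, n=2048):
    t = np.arange(n) / sr
    weights = [1.0, 0.8, 0.6, 0.4] if register == "chest" else [1.0, 0.2, 0.05, 0.02]
    x = sum(w * np.sin(2 * np.pi * f0 * (k + 1) * t) for k, w in enumerate(weights))
    return x + rng.normal(0, 0.01, n)

X = np.array([spectral_features(synth(r)) for r in ["chest"] * 30 + ["head"] * 30])
y = np.array([0] * 30 + [1] * 30)

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
test = spectral_features(synth("head"))
print("predicted register:", ["chest", "head"][int(clf.predict([test])[0])],
      "| distance to hyperplane:", float(clf.decision_function([test])[0]))
```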

Keywords: Classification, singing, spectral analysis, vocal emission, vocal register.
