Search results for: binary pixels
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 779


569 Image Steganography Using Predictive Coding for Secure Transmission

Authors: Baljit Singh Khehra, Jagreeti Kaur

Abstract:

In this paper, a steganographic strategy is used to hide a text file inside an image. To increase the storage limit, predictive coding is utilized to embed the information. In the proposed scheme, one can exchange secure information by means of the predictive coding methodology, which produces a high-quality stego-image; the pixels are utilized to embed the secret information. The proposed data-hiding scheme is robust compared with existing approaches, and applying this strategy gives users an efficient way to conceal information. Entropy, standard deviation, mean square error, and peak signal-to-noise ratio are the parameters used to evaluate the proposed methodology. The results of the proposed approach are quite promising.
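The embedding idea can be illustrated with a minimal sketch. The paper does not specify its predictor, so a simple left-neighbour predictor is assumed here, with the secret bit placed in the LSB of each pixel's prediction error; all names and parameters are illustrative, not the authors' implementation.

```python
import numpy as np

def embed_bits(image, bits):
    """Hide a bit stream in the LSB of each pixel's prediction error.
    Assumes a left-neighbour predictor (a hypothetical choice; the
    paper's exact predictive-coding scheme is not given)."""
    img = image.astype(np.int32)
    flat = img.ravel()
    for i, bit in enumerate(bits, start=1):
        pred = flat[i - 1]              # left-neighbour prediction (already stego)
        err = flat[i] - pred            # prediction error
        err = (err & ~1) | bit          # put the secret bit in the error LSB
        flat[i] = np.clip(pred + err, 0, 255)
    return img.astype(np.uint8)

def extract_bits(stego, n_bits):
    """Recover the bits by recomputing the prediction errors."""
    flat = stego.astype(np.int32).ravel()
    return [int((flat[i] - flat[i - 1]) & 1) for i in range(1, n_bits + 1)]
```

Because the predictor only looks at already-processed (stego) pixels, the decoder reproduces the same prediction errors and the bits round-trip exactly; clipping at 0/255 is ignored here for simplicity.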

Keywords: cryptography, steganography, reversible image, predictive coding

Procedia PDF Downloads 390
568 Semiconductor Properties of Natural Phosphate Application to Photodegradation of Basic Dyes in Single and Binary Systems

Authors: Y. Roumila, D. Meziani, R. Bagtache, K. Abdmeziem, M. Trari

Abstract:

Heterogeneous photocatalysis over semiconductors has proved its effectiveness in the treatment of wastewaters since it works under soft conditions. It has emerged as a promising technique, giving rise to less toxic effluents and offering the opportunity of using sunlight as a sustainable and renewable source of energy. Many compounds have been used as photocatalysts. Though synthesized ones are intensively used, they remain expensive, and their synthesis involves special conditions. We thus thought of implementing a natural material, a phosphate ore, due to its low cost and great availability. Our work is devoted to the removal of hazardous organic pollutants, which cause several environmental problems and health risks. Among them, dye pollutants occupy a large place. This work relates to the study of the photodegradation of methyl violet (MV) and rhodamine B (RhB), in single and binary systems, under UV light and sunlight irradiation. Methyl violet is a triarylmethane dye, while RhB is a heteropolyaromatic dye belonging to the Xanthene family. In the first part of this work, the natural compound was characterized using several physicochemical and photo-electrochemical (PEC) techniques: X-ray diffraction, chemical and thermal analyses, scanning electron microscopy, UV-Vis diffuse reflectance measurements, and FTIR spectroscopy. The electrochemical and photoelectrochemical studies were performed with a Voltalab PGZ 301 potentiostat/galvanostat at room temperature. The structure of the phosphate material was well characterized. The photo-electrochemical (PEC) properties are crucial for drawing the energy band diagram, in order to suggest the formation of radicals and the reactions involved in the dyes' photo-oxidation mechanism. The PEC characterization of the natural phosphate was investigated in neutral solution (Na₂SO₄, 0.5 M). The study revealed the semiconducting behaviour of the phosphate rock.
Indeed, the thermal evolution of the electrical conductivity was well fitted by an exponential-type law, and the electrical conductivity increases with rising temperature. The Mott-Schottky plot and current-potential J(V) curves recorded in the dark and under illumination clearly indicate n-type behaviour. From the results of photocatalysis in single solutions, the changes in MV and RhB absorbance as a function of time show that practically all of the MV was removed after 240 min of irradiation. For RhB, complete degradation was achieved after 330 min, due to its complex and resistant structure. In binary systems, it is only after 120 min that RhB begins to be slowly removed, while about 60% of MV is already degraded. Once nearly all of the MV content in the solution has disappeared (after about 250 min), the remaining RhB is degraded rapidly. This behaviour is different from that observed in single solutions, where both dyes are degraded from the first minutes of irradiation.

Keywords: environment, organic pollutant, phosphate ore, photodegradation

Procedia PDF Downloads 104
567 Tool for Maxillary Sinus Quantification in Computed Tomography Exams

Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina

Abstract:

The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, to heat or humidify inspired air, for thermoregulation, to impart resonance to the voice, and others. Thus, the real function of the MS is still uncertain. Furthermore, the MS anatomy is complex and varies from person to person. Many diseases may affect the development process of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, which enables a quantitative analysis. However, this is not always possible in the clinical routine, and if possible, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Nowadays, the available methods for MS segmentation are manual or semi-automatic. Additionally, manual methods present inter- and intraindividual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. This study was developed with ethical approval from the authors' institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) with features such as pixel value, spatial distribution, shape, and others. The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, obtaining the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist. For comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. From the statistical analyses for the comparison between both methods, the linear regression showed a strong association and low dispersion between variables. The Bland-Altman analyses showed no significant differences between the analyzed methods. The Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved to be robust, fast, and efficient when compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice. Thus, it may be useful in the diagnosis and treatment determination of MS diseases.
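The seeded region-growing step can be sketched as follows. This is a simplified 2-D illustration on a single slice, with a plain intensity tolerance standing in for the SVM-derived seed classification; the authors' Matlab® implementation and feature set are not reproduced here.

```python
from collections import deque
import numpy as np

def region_grow(image, seed, tol=10):
    """Grow a region from a seed pixel, accepting 4-neighbours whose
    intensity lies within `tol` of the seed value (a simplified
    stand-in for the paper's SVM-seeded region growing)."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    seed_val = int(image[seed])
    queue = deque([seed])
    mask[seed] = True
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(int(image[ny, nx]) - seed_val) <= tol):
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask
```

Summing the resulting masks over all slices, scaled by the voxel volume, would give a volume estimate in the spirit of the described pipeline.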

Keywords: maxillary sinus, support vector machine, region growing, volume quantification

Procedia PDF Downloads 480
566 MSG Image Encryption Based on AES and RSA Algorithms "MSG Image Security"

Authors: Boukhatem Mohammed Belkaid, Lahdir Mourad

Abstract:

In this paper, we propose a new encryption system to secure meteorological images from Meteosat Second Generation (MSG), which generates 12 images every 15 minutes. The hybrid encryption scheme is based on the AES and RSA algorithms and provides three security services: authentication, integrity, and confidentiality. Confidentiality is ensured by AES, authenticity is ensured by the RSA algorithm, and integrity is assured by a basic function of the correlation between adjacent pixels. Our system generates a unique password every 15 minutes that is used to encrypt each frame of the MSG meteorological stream, strengthening and ensuring its security. Several metrics have been used for the various tests in our analysis. For the integrity test, we observed the effectiveness of our system and how the cryptographic imprint changes at reception if a change affects the image in the transmission channel.
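The adjacent-pixel correlation used as an integrity/randomness indicator can be computed as below; a minimal sketch using horizontal neighbours only. A plain image shows correlation near 1, while a well-encrypted frame should drive it towards 0.

```python
import numpy as np

def adjacent_correlation(image):
    """Pearson correlation between horizontally adjacent pixels.
    Natural images are strongly correlated (near 1); a properly
    encrypted image should give a value near 0."""
    left = image[:, :-1].ravel().astype(float)
    right = image[:, 1:].ravel().astype(float)
    return float(np.corrcoef(left, right)[0, 1])
```

Vertical and diagonal neighbours are usually checked the same way in encryption analyses of this kind.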

Keywords: AES, RSA, integrity, confidentiality, authentication, satellite MSG, encryption, decryption, key, correlation

Procedia PDF Downloads 353
565 Improvement of Data Transfer over Simple Object Access Protocol (SOAP)

Authors: Khaled Ahmed Kadouh, Kamal Ali Albashiri

Abstract:

This paper presents an algorithm designed to improve data transfer over the Simple Object Access Protocol (SOAP). The aim of this work is to establish whether using SOAP to exchange XML messages has any added advantage. The results showed that XML messages without SOAP take longer and consume more memory, especially with binary data.

Keywords: JAX-WS, SMTP, SOAP, web service, XML

Procedia PDF Downloads 460
564 One vs. Rest and Error Correcting Output Codes Principled Rebalancing Schemes for Solving Imbalanced Multiclass Problems

Authors: Alvaro Callejas-Ramos, Lorena Alvarez-Perez, Alexander Benitez-Buenache, Anibal R. Figueiras-Vidal

Abstract:

This contribution presents a promising formulation which allows principled binary rebalancing procedures, also known as neutral re-balancing mechanisms in the sense that they do not alter the likelihood ratio, to be extended to imbalanced multiclass problems via One vs. Rest and Error Correcting Output Codes schemes.

Keywords: Bregman divergences, imbalanced multiclass classification, informed re-balancing, invariant likelihood ratio

Procedia PDF Downloads 178
563 Content-Based Color Image Retrieval Based on the 2-D Histogram and Statistical Moments

Authors: El Asnaoui Khalid, Aksasse Brahim, Ouanan Mohammed

Abstract:

In this paper, we are interested in the problem of finding similar images in a large database. For this purpose, we propose a new algorithm based on a combination of 2-D histogram intersection in the HSV space and statistical moments. The proposed histogram is based on a 3x3 window and not only on the intensity of the pixel. This approach overcomes the drawback of the conventional 1-D histogram, which ignores the spatial distribution of pixels in the image, while the statistical moments are used to escape the effects of the discretisation of the color space that is intrinsic to the use of histograms. We compare the performance of our new algorithm to various state-of-the-art methods and show that it has several advantages: it is fast, consumes little memory, and requires no learning. To validate our results, we apply this algorithm to search for similar images in different image databases.
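The conventional 2-D hue-saturation histogram and the intersection similarity can be sketched as below. This is the standard form only, assuming 8-bit channels; the paper's 3x3-window variant and the statistical-moment features are not reproduced here.

```python
import numpy as np

def hs_histogram(hsv, bins=(8, 8)):
    """2-D hue/saturation histogram, normalised to sum to 1.
    Assumes 8-bit hue and saturation channels in hsv[..., 0] and
    hsv[..., 1] (channel conventions vary between libraries)."""
    h = hsv[..., 0].ravel()
    s = hsv[..., 1].ravel()
    hist, _, _ = np.histogram2d(h, s, bins=bins, range=[[0, 256], [0, 256]])
    return hist / hist.sum()

def intersection(h1, h2):
    """Histogram intersection similarity: 1 for identical normalised
    histograms, 0 for disjoint ones."""
    return float(np.minimum(h1, h2).sum())
```

Ranking database images by `intersection` against the query histogram gives the basic retrieval loop the abstract describes.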

Keywords: 2-D histogram, statistical moments, indexing, similarity distance, histograms intersection

Procedia PDF Downloads 422
562 Experimental Study on Strength Development of Low Cement Concrete Using Mix Design for Both Binary and Ternary Mixes

Authors: Mulubrhan Berihu, Supratic Gupta, Zena Gebriel

Abstract:

Due to its design versatility, availability, and cost efficiency, concrete continues to be the most used construction material on earth. However, the production of Portland cement, the primary component of the concrete mix, has serious environmental and economic impacts. This shows the need to study the use of supplementary cementitious materials (SCMs). The most commonly used supplementary cementitious materials are industrial wastes, and the use of these waste products has technical, economic, and environmental benefits besides the reduction of CO2 emission from cement production. The study aims to document the effect on the strength properties of concrete of using low cement contents by maximizing supplementary cementitious materials like fly ash or marble powder. Based on different mix proportions of pozzolana and marble powder, a range of mix designs was formulated. The first part of the project studies the strength of low cement concrete using fly ash replacement experimentally. The test results showed that using as little as 85 kg/m3 of cement is possible for plain concrete works like hollow block concrete to achieve 9.8 MPa, and the experimental results indicate that strength is a function of the w/b ratio. In the second part, a new set of mix designs has been carried out with fly ash and marble powder to study the strength of both binary and ternary mixes. In this experimental study, three groups of mix designs (c+FA, c+FA+m, and c+m) were taken up, with four sets of mixes for each group. Experimental results show that c+FA maintained the best strength and impermeability, whereas c+m showed lower compressive strength, poorer impermeability, and lower split tensile strength. c+FA shows a big difference in the gain of compressive strength from 7 days to 28 days compared to the others, which clearly reflects the slow rate of hydration of fly ash concrete. As the w/b ratio increases, the strength decreases significantly. At the same time, higher permeability was observed in the specimens tested for three hours than in those tested for one hour.

Keywords: efficiency factor, cement content, compressive strength, mix proportion, w/c ratio, water permeability, SCMs

Procedia PDF Downloads 179
561 Development of Algorithms for the Study of the Image in Digital Form for Satellite Applications: Extraction of a Road Network and Its Nodes

Authors: Zineb Nougrara

Abstract:

In this paper, we propose a novel methodology for extracting a road network and its nodes from satellite images of Algeria. This technique is a progression of our previous research work. It is founded on information theory and mathematical morphology, which are combined to extract and link the road segments that form a road network and its nodes. We therefore have to define objects as sets of pixels and to study the shape of these objects and the relations that exist between them. In this approach, geometric and radiometric features of roads are integrated by a cost function and a set of selected points of crossing roads. Its performance was tested on satellite images of Algeria.

Keywords: satellite image, road network, nodes, image analysis and processing

Procedia PDF Downloads 242
560 A Real-time Classification of Lying Bodies for Care Application of Elderly Patients

Authors: E. Vazquez-Santacruz, M. Gamboa-Zuniga

Abstract:

In this paper, we show a methodology for classifying bodies in a lying state, in real time, using HOG descriptors and pressure sensors positioned in matrix form (14 x 32 sensors) on the surface where bodies lie down. Our system is embedded in a care robot that can assist the elderly patient and the surrounding medical staff to achieve a better quality of life in and out of hospitals. Due to current technology, a limited number of sensors is used, which results in a low-resolution data array that is treated as an image of 14 x 32 pixels. Our work considers the problem of human posture classification with little information (few sensors), applying digital processing to expand the original sensor data and thus obtain more significant data for classification; however, this is done with low-cost algorithms to ensure real-time execution.
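The HOG feature extraction on a low-resolution pressure map can be sketched in miniature. This is a single global orientation histogram only, assuming a grayscale 14 x 32 array; full HOG additionally divides the image into cells and applies block normalisation, which is omitted here.

```python
import numpy as np

def grad_orientation_hist(image, n_bins=9):
    """A minimal HOG-style descriptor: one histogram of unsigned
    gradient orientations, weighted by gradient magnitude.
    (Full HOG adds cells and block normalisation.)"""
    img = image.astype(float)
    gy, gx = np.gradient(img)                       # row, column derivatives
    mag = np.hypot(gx, gy)                          # gradient magnitude
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0    # unsigned orientation
    hist, _ = np.histogram(ang, bins=n_bins, range=(0, 180), weights=mag)
    total = hist.sum()
    return hist / total if total > 0 else hist
```

The resulting 9-element vector (per cell, in the full method) would feed the posture classifier.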

Keywords: real-time classification, sensors, robots, health care, elderly patients, artificial intelligence

Procedia PDF Downloads 828
559 Direct Approach in Modeling Particle Breakage Using Discrete Element Method

Authors: Ebrahim Ghasemi Ardi, Ai Bing Yu, Run Yu Yang

Abstract:

The current study aims to develop an in-house discrete element method (DEM) code and link it with a direct breakage event, making it possible to determine particle breakage, and then the fragment size distribution, simultaneously with the DEM simulation. The method applies particle breakage directly inside the DEM computation algorithm: if any breakage happens, the original particle is replaced with its daughters. In this way, the calculation proceeds on an updated particle list, which is very similar to the real grinding environment. To validate the developed model, a grinding ball impacting an unconfined particle bed was simulated. Since considering an entire ball mill would be too computationally demanding, this method provided a simplified environment to test the model. Accordingly, a representative volume of the ball mill was simulated inside a box, which could emulate media (ball)-powder bed impacts in a ball mill and during particle bed impact tests. Mono, binary, and ternary particle beds were simulated to determine the effects of granular composition on breakage kinetics. The results obtained from the DEM simulations showed a reduction in the specific breakage rate for coarse particles in binary mixtures. The origin of this phenomenon, commonly known as cushioning or decelerated breakage in dry milling processes, was explained by the DEM simulations: fine particles in a particle bed increase mechanical energy loss, and reduce and distribute interparticle forces, thereby inhibiting the breakage of the coarse component. On the other hand, the specific breakage rate of fine particles increased due to contacts associated with coarse particles. Such a phenomenon, known as acceleration, was shown to be less significant, but should be considered in future attempts to accurately quantify non-linear breakage kinetics in the modeling of dry milling processes.
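The particle-replacement step at the heart of the direct breakage approach can be sketched as below. The breakage criterion, daughter count, and placement are deliberately simplified assumptions; a real DEM code would use a fragment-size distribution and resolve daughter overlaps.

```python
def break_particles(particles, max_force, strength, n_daughters=2):
    """Replace any particle whose peak contact force exceeds its
    strength with equal-volume daughters. Particles are (x, y, r)
    tuples; volume (mass) is conserved, while daughter positions
    and the fragment-size distribution are simplified here."""
    out = []
    for (x, y, r), f in zip(particles, max_force):
        if f > strength:
            # n equal daughters: r_d = r / n^(1/3) conserves total volume
            rd = r / n_daughters ** (1.0 / 3.0)
            for k in range(n_daughters):
                out.append((x + 0.1 * r * k, y, rd))  # small offset to separate them
        else:
            out.append((x, y, r))
    return out
```

Calling this once per time step on the current particle list mirrors the "replace parent with daughters, continue on the updated list" loop the abstract describes.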

Keywords: particle bed, breakage models, breakage kinetic, discrete element method

Procedia PDF Downloads 166
558 Smallholder Farmers’ Adaptation Strategies and Socioeconomic Determinants of Climate Variability in Boset District, Oromia, Ethiopia

Authors: Hurgesa Hundera, Samuel Shibeshibikeko, Tarike Daba, Tesfaye Ganamo

Abstract:

The study aimed at examining the ongoing adaptation strategies used by smallholder farmers in response to climate variability in Boset district. It also assessed the socioeconomic factors that influence smallholder farmers' choice of adaptation strategies to climate variability risk. For attaining the objectives of the study, both primary and secondary sources of data were employed. The primary data were obtained through a household questionnaire, key informant interviews, focus group discussions, and observations, while secondary data were acquired through desk review. Questionnaires were distributed to and completed by 328 respondents, identified through a systematic random sampling technique. Descriptive statistics and a binary logistic regression model were applied in this study as the main analytical methods. The findings of the study reveal that the sample households have utilized multiple adaptation strategies in response to climate variability, such as cropping early-maturing crops, planting drought-resistant crops, growing mixed crops on the same farmlands, and others. The results of the binary logistic model revealed that education, sex, age, family size, off-farm income, farm experience, access to climate information, access to farm inputs, and farm size were significant and key factors determining farmers' choice of adaptation strategies to climate variability in the study area. To enable effective adaptation measures, the Ministry of Agriculture and Natural Resources, with its regional bureaus and offices and concerned non-governmental organizations, should consider climate variability in their planning and budgeting at all levels of decision making.
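The binary logistic model of adaptation choice can be illustrated with a minimal sketch: a gradient-descent fit of a logit on a 0/1 outcome (adopt vs. not adopt), with hypothetical covariates. This is not the statistical package presumably used in the study, only the underlying model.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Binary logistic regression by gradient descent.
    Returns weights, intercept first. X: (n, k) covariates
    (e.g. education, age, farm size); y: 0/1 adoption outcome."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))     # predicted adoption probability
        w -= lr * Xb.T @ (p - y) / len(y)     # gradient of the log-loss
    return w

def predict(w, X):
    Xb = np.column_stack([np.ones(len(X)), X])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) >= 0.5).astype(int)
```

In practice, the sign and significance of each fitted coefficient are what identify a covariate as a determinant of adaptation choice.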

Keywords: adaptation strategies, boset district, climate variability, smallholder farmers

Procedia PDF Downloads 64
557 Enthalpies of Formation of Equiatomic Binary Hafnium Transition Metal Compounds HfM (M=Co, Ir, Os, Pt, Rh, Ru)

Authors: Hadda Krarcha, S. Messaasdi

Abstract:

In order to investigate the phase diagrams of hafnium transition metal alloys HfM (M = Co, Ir, Os, Pt, Rh, Ru) in the region of the 50/50% atomic ratio, we performed ab initio Full-Potential Linearized Augmented Plane Wave calculations of the enthalpies of formation of HfM compounds in the B2 (CsCl) structure type. The obtained enthalpies of formation are discussed and compared with some existing models and available experimental data.

Keywords: enthalpy of formation, transition metal, binary compounds, hafnium

Procedia PDF Downloads 451
556 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring

Authors: Zheng Wang, Zhenhong Li, Jon Mills

Abstract:

Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring of various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise from processing high temporal-resolution continuous GBSAR data, including the extreme cost in computational random-access memory (RAM), the delay in producing displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study in order to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and processes continuous GBSAR images unit by unit, where the images within a window form a basic unit. By taking this strategy, the RAM requirement is reduced to only one unit of images and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected, as the chain keeps temporarily-coherent pixels which are present only in certain units rather than throughout the whole observation period. The chain supports real-time processing of the continuous data, and the delay in creating displacement maps can be shortened without waiting for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on a stack of images in a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence.
The temporal-averaged images are then processed by a particular interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and selection of partially-coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the level of a few sub-millimetres are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring of a wide range of scientific and practical applications.

Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring

Procedia PDF Downloads 131
555 The Effect of Pixelation on Face Detection: Evidence from Eye Movements

Authors: Kaewmart Pongakkasira

Abstract:

This study investigated how different levels of pixelation affect face detection in natural scenes. Eye movements and reaction times were recorded while observers searched for faces in natural scenes rendered at different pixel sizes. Detection performance with coarse visual detail at the smaller pixel size (3 x 3) was better than with the very blurred detail carried by the larger pixel size (9 x 9). The result is consistent with the notion that face detection relies on the gross detail of a face-shape template, containing crude shape structure and features; in contrast, detection is impaired when face shape and features are obscured. However, the degradation of scenic information might also contribute to the effect. In the next experiment, to measure the effect of pixelation on face detection more directly, only the embedded face photographs, but not the scene background, will be filtered.
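The pixelation manipulation (rendering at 3 x 3 versus 9 x 9 pixel blocks) can be reproduced with simple block averaging; a sketch, assuming grayscale images cropped to a multiple of the block size.

```python
import numpy as np

def pixelate(image, block):
    """Pixelate by replacing each block x block region with its mean
    intensity, then upsampling back to the original (cropped) size."""
    h, w = image.shape
    h, w = h - h % block, w - w % block          # crop to a block multiple
    img = image[:h, :w].astype(float)
    small = img.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    return np.repeat(np.repeat(small, block, axis=0), block, axis=1)
```

Larger `block` values discard progressively more of the face-shape detail the study manipulates.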

Keywords: eye movements, face detection, face-shape information, pixelation

Procedia PDF Downloads 288
554 Temperature Contour Detection of Salt Ice Using Color Thermal Image Segmentation Method

Authors: Azam Fazelpour, Saeed Reza Dehghani, Vlastimil Masek, Yuri S. Muzychka

Abstract:

The study uses a novel image analysis based on thermal imaging to detect temperature contours created on a salt ice surface during transient phenomena. Thermal cameras detect objects by using their emissivities and IR radiance. The ice surface temperature is not uniform during transient processes: the temperature starts to increase from the boundary of the ice towards its center. Thermal cameras are able to report temperature changes on the ice surface at every individual moment. Various contours, which show different temperature areas, appear in the picture of the ice surface captured by a thermal camera. Identifying the exact boundary of these contours is valuable for facilitating ice surface temperature analysis, and image processing techniques are used to extract each contour area precisely. In this study, several pictures are recorded while the temperature is increasing throughout the ice surface, and some are selected to be processed at a specific time interval. An image segmentation method is applied to the images to determine the contour areas. Color thermal images are used to exploit the main information: the red, green, and blue elements of the color images are investigated to find the best contour boundaries. Image enhancement and noise removal algorithms are applied to obtain high-contrast, clear images. A novel edge detection algorithm based on differences in the color of the pixels is established to determine contour boundaries. In this method, the edges of the contours are obtained according to the properties of the red, blue, and green image elements: the color image elements are assessed considering their information content, useful elements proceed to processing, and useless elements are removed to reduce the computing time. Neighbouring pixels with close intensities are assigned to one contour, and differences in intensities determine boundaries. The results are then verified by conducting experimental tests. An experimental setup is built using ice samples and a thermal camera. To observe the created ice contours with the thermal camera, the samples, which are initially at -20 °C, are brought into contact with a warmer surface. Pictures are captured for 20 seconds, and the method is applied to five images captured at time intervals of 5 seconds. The study shows that the green image element carries no useful information; therefore, the boundary detection method is applied to the red and blue image elements. In this case study, the results indicate that the proposed algorithm shows the boundaries more effectively than other edge detection methods such as Sobel and Canny. A comparison between the contour detection of this method and the temperature analysis, which gives the real boundaries, shows good agreement. This color image edge detection method is applicable to other similar cases according to their image properties.
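The neighbour-difference rule can be sketched as a simple colour edge detector: a pixel is flagged when the colour distance to its right or lower neighbour exceeds a threshold. This uses all three channels and a plain Euclidean distance for illustration; the paper's method restricts itself to the red and blue elements and a more elaborate assessment.

```python
import numpy as np

def color_edges(rgb, threshold=30.0):
    """Mark a pixel as an edge when the Euclidean colour difference to
    its right or lower neighbour exceeds `threshold` (a simplified
    version of the neighbour-intensity rule described above)."""
    img = rgb.astype(float)
    dx = np.zeros(img.shape[:2], dtype=bool)
    dy = np.zeros(img.shape[:2], dtype=bool)
    dx[:, :-1] = np.linalg.norm(img[:, 1:] - img[:, :-1], axis=2) > threshold
    dy[:-1, :] = np.linalg.norm(img[1:] - img[:-1], axis=2) > threshold
    return dx | dy
```

Pixels not flagged as edges fall into the same contour region, matching the "neighbouring pixels with close intensities share a contour" rule.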

Keywords: color image processing, edge detection, ice contour boundary, salt ice, thermal image

Procedia PDF Downloads 276
553 Examination of the Relationship between Managerial Competence and Job Satisfaction and Career Satisfaction in Sports Managers

Authors: Omur F. Karakullukcu, Bilal Okudan, Yusuf Can

Abstract:

The aim of this study is to analyze the correlation between sports managers' managerial competence levels, job satisfaction, and career satisfaction. The study also analyzes whether there is any significant difference in sports managers' managerial competence, job satisfaction, and career satisfaction in terms of gender, age, duty status, year of service, and level of education. 256 sports managers, who work in the central and field organization of the department of sports services at least as a chief in a managerial position, were chosen with the random sampling method and voluntarily participated in the study. The managerial competence scale developed by Cetinkaya (2009), the job satisfaction scale developed by Weiss et al. (1967), and the Career Satisfaction Scale developed by Vatansever (2008) were used as data collection tools. The questionnaire form also includes a personal information form consisting of 5 questions covering gender, age, duty status, years of service, and level of education. Pearson correlation analysis was used to define the correlation between managerial competence, job satisfaction, and career satisfaction levels of sports managers. T-test analysis for two-group comparisons and ANOVA for comparisons across more than two groups were used to test differences in managerial competence, job satisfaction, and career satisfaction in terms of the participants' duty status, year of service, and level of education. According to the research results, there is a positive correlation between sports managers' managerial competence, job satisfaction, and career satisfaction levels. The results also show that there is a significant difference in managerial competence, job satisfaction, and career satisfaction of sports managers in terms of duty status, year of service, and level of education; however, there is no significant difference in terms of age groups and gender.
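The statistical pipeline described above (Pearson correlation for association, a pooled-variance t statistic for two-group comparisons) can be sketched in plain NumPy. The scores below are synthetic illustration data, not the study's dataset; the 0.6 slope is an arbitrary assumption.

```python
import numpy as np

def pearson_r(x, y):
    # Pearson product-moment correlation between two score vectors
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def t_statistic(a, b):
    # Two-sample t statistic (pooled variance) for binary groupings
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return float((a.mean() - b.mean()) / np.sqrt(sp2 * (1 / na + 1 / nb)))

rng = np.random.default_rng(0)
competence = rng.normal(50, 10, 256)               # 256 managers, as in the study
job_sat = 0.6 * competence + rng.normal(0, 8, 256)  # positively related scores
r = pearson_r(competence, job_sat)                  # positive correlation
```

An ANOVA F statistic for more than two groups follows the same pattern (between-group over within-group variance) and is omitted for brevity.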

Keywords: sports manager, managerial competence, job satisfaction, career satisfaction

Procedia PDF Downloads 242
552 Word Spotting in Images of Handwritten Historical Documents

Authors: Issam Ben Jami

Abstract:

Information retrieval in digital libraries is very important because the most famous historical documents hold significant value. Word spotting in historical documents is a difficult problem because the writing in such documents is naturally cursive and exhibits wide variability in scale and word position within the same document. We first present a system for automatic recognition based on the extraction of interest points from word images. The key-point extraction phase represents the image as a compact, multidimensional description of shape. To that end, we use methods that find and describe interest points invariant to scale, rotation, and lighting, which are linked to local configurations of pixels. We test this approach on documents of the 15th century, and our experiments give important results.
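Interest points "linked to local configurations of pixels" are typically found with a structure-tensor detector. The minimal Harris-style sketch below is an illustration of that idea in NumPy, not the authors' actual detector (which would also need scale and lighting invariance, e.g. SIFT-like descriptors):

```python
import numpy as np

def harris_response(img, k=0.05):
    # Structure-tensor (Harris) response: high values mark corner-like
    # interest points tied to local configurations of pixels.
    iy, ix = np.gradient(img.astype(float))

    def box(a):
        # 3x3 box filter to accumulate the tensor over a neighbourhood
        p = np.pad(a, 1, mode="edge")
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0

    ixx, iyy, ixy = box(ix * ix), box(iy * iy), box(ix * iy)
    det = ixx * iyy - ixy * ixy
    trace = ixx + iyy
    return det - k * trace ** 2    # >0 at corners, <0 on edges, ~0 on flats

img = np.zeros((16, 16))
img[4:12, 4:12] = 1.0              # a bright square: its corners are interest points
resp = harris_response(img)
```

On real word images, the responses would be thresholded and the surviving points described and matched between the query word and document pages.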

Keywords: feature matching, historical documents, pattern recognition, word spotting

Procedia PDF Downloads 245
551 High Capacity Reversible Watermarking through Interpolated Error Shifting

Authors: Hae-Yeoun Lee

Abstract:

Reversible watermarking, which not only protects the copyright but also preserves the original quality of the digital content, has been intensively studied, and the demand for it has increased. In this paper, we propose a reversible watermarking scheme based on interpolation-error shifting and error precompensation. The intensity of a pixel is interpolated from the intensities of neighbouring pixels, and the difference histogram between the interpolated and the original intensities is obtained and modified to embed the watermark message. By restoring the difference histogram, the embedded watermark is extracted, and the original image is recovered by compensating for the interpolation error. Overflow and underflow are prevented by error precompensation. To show the performance of the method, the proposed algorithm is compared with other methods using various test images.
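The interpolation-error shifting idea can be sketched as follows. This is a simplified illustration, not the paper's exact algorithm: it assumes a checkerboard layout (embed only in "cross" pixels, predict from their four unmodified neighbours), embeds bits at interpolation error 0 while shifting positive errors by 1, and omits the paper's overflow/underflow precompensation.

```python
import numpy as np

def _pred(img, i, j):
    # Interpolate a pixel from its four unmodified neighbours
    return int(round((int(img[i - 1, j]) + int(img[i + 1, j])
                      + int(img[i, j - 1]) + int(img[i, j + 1])) / 4.0))

def embed(img, bits):
    # Shift the interpolation-error histogram and embed bits at error 0
    out = img.astype(np.int32)
    k = 0
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            if (i + j) % 2:                      # "cross" pixels only
                e = int(img[i, j]) - _pred(img, i, j)
                if e == 0 and k < len(bits):
                    out[i, j] += bits[k]; k += 1  # bit lands in error bin 0/1
                elif e > 0:
                    out[i, j] += 1                # shift to make room
    return out, k                                 # k = bits actually embedded

def extract(marked, n_bits):
    # Recover the bits and restore the original image exactly
    rec = marked.copy()
    bits = []
    for i in range(1, marked.shape[0] - 1):
        for j in range(1, marked.shape[1] - 1):
            if (i + j) % 2:
                e = int(marked[i, j]) - _pred(marked, i, j)
                if e == 0 and len(bits) < n_bits:
                    bits.append(0)
                elif e == 1 and len(bits) < n_bits:
                    bits.append(1); rec[i, j] -= 1
                elif e >= 1:
                    rec[i, j] -= 1                # undo the shift
    return bits, rec
```

Because prediction uses only unmodified pixels, extraction sees the same interpolation values as embedding, so the restoration is exact.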

Keywords: reversible watermarking, high capacity, high quality, interpolated error shifting, error precompensation

Procedia PDF Downloads 291
550 Assessing Prescribed Burn Severity in the Wetlands of the Paraná River, Argentina

Authors: Virginia Venturini, Elisabet Walker, Aylen Carrasco-Millan

Abstract:

Latin America stands at the forefront of climate change impacts, with forecasts projecting accelerated temperature and sea level rises compared to the global average. These changes are set to trigger a cascade of effects, including coastal retreat, intensified droughts in some nations, and heightened flood risks in others. In Argentina, wildfires have historically affected forests, but since 2004, wetland fires have emerged as a pressing concern. By 2021, a high-risk scenario had formed naturally in the wetlands of the Paraná River: very low water levels in the rivers and excessive standing dead plant material (fuel) triggered most of the fires recorded in this vast wetland region during 2020-2021. During 2008, fire events devastated nearly 15% of the Paraná Delta, and by late 2021 new fires had burned more than 300,000 ha of these same wetlands. Therefore, the goal of this work is to explore remote sensing tools to monitor environmental conditions and the severity of prescribed burns in the Paraná River wetlands. Two prescribed burning experiments were carried out in the study area (31°40’ 05’’ S, 60° 34’ 40’’ W) during September 2023. The first was carried out on September 13th in a 0.5 ha plot whose dominant vegetation was Echinochloa sp. and Thalia, while the second was done on September 29th in a 0.7 ha plot next to the first burned parcel, where the dominant species were Echinochloa sp. and Solanum glaucophyllum. Field campaigns were conducted between September 8th and November 8th to assess the severity of the prescribed burns. Flight surveys were conducted utilizing a DJI® Inspire II drone equipped with a Sentera® NDVI camera. Burn severity was then quantified by analyzing images captured by the Sentera camera along with data from the Sentinel-2 satellite mission, by subtracting the NDVI images obtained before and after the burn experiments. The results from both data sources demonstrate a highly heterogeneous impact of fire within the patch. For the first experiment, mean severity values were about 0.16 with drone NDVI images and 0.18 with Sentinel images; for the second, mean values were approximately 0.17 with the drone and 0.16 with Sentinel images. Thus, most of the pixels showed low fire severity and only a few pixels presented moderate burn severity on the wildfire scale. The undisturbed plots maintained consistent mean NDVI values throughout the experiments. Moreover, the severity assessment of each experiment revealed that the vegetation was not completely dry, despite extreme drought conditions.
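The pre-minus-post NDVI differencing used here is a one-liner per pixel. A minimal sketch, assuming calibrated NIR and red reflectance bands (band names and the small epsilon are illustrative, not from the paper):

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index per pixel
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + 1e-9)   # epsilon avoids divide-by-zero

def burn_severity(nir_pre, red_pre, nir_post, red_post):
    # dNDVI = NDVI(before) - NDVI(after): higher values mean a stronger
    # loss of green vegetation, i.e. a more severe burn
    return ndvi(nir_pre, red_pre) - ndvi(nir_post, red_post)
```

With healthy pre-burn vegetation (e.g. NIR 0.5, red 0.1) and scorched post-burn cover (NIR 0.3, red 0.15), the per-pixel severity is about 0.33; the study's patch means of 0.16-0.18 correspond to low severity on this scale.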

Keywords: prescribed-burn, severity, NDVI, wetlands

Procedia PDF Downloads 23
549 A Comparison of Methods for Estimating Dichotomous Treatment Effects: A Simulation Study

Authors: Jacqueline Y. Thompson, Sam Watson, Lee Middleton, Karla Hemming

Abstract:

Introduction: The odds ratio (estimated via logistic regression) is a well-established and common approach for estimating covariate-adjusted binary treatment effects when comparing a treatment and control group with dichotomous outcomes. Its popularity is primarily because of its stability and robustness to model misspecification. However, the situation is different for the relative risk and risk difference, which are arguably easier to interpret and better suited to specific designs such as non-inferiority studies. So far, there is no equivalent, widely acceptable approach to estimate an adjusted relative risk and risk difference when conducting clinical trials. This is partly due to the lack of a comprehensive evaluation of available candidate methods. Methods/Approach: A simulation study is designed to evaluate the performance of relevant candidate methods to estimate relative risks to represent conditional and marginal estimation approaches. We consider the log-binomial, generalised linear models (GLM) with iteratively weighted least-squares (IWLS) and model-based standard errors (SE); log-binomial GLM with convex optimisation and model-based SEs; log-binomial GLM with convex optimisation and permutation tests; modified-Poisson GLM IWLS and robust SEs; log-binomial generalised estimation equations (GEE) and robust SEs; marginal standardisation and delta method SEs; and marginal standardisation and permutation test SEs. Independent and identically distributed datasets are simulated from a randomised controlled trial to evaluate these candidate methods. Simulations are replicated 10000 times for each scenario across all possible combinations of sample sizes (200, 1000, and 5000), outcomes (10%, 50%, and 80%), and covariates (ranging from -0.05 to 0.7) representing weak, moderate or strong relationships. 
Treatment effects (0, -0.5, and 1 on the log scale) will cover null (H0) and alternative (H1) hypotheses to evaluate coverage and power in realistic scenarios. Performance measures (bias, mean square error (MSE), relative efficiency, and convergence rates) are evaluated across scenarios covering a range of sample sizes, event rates, covariate prognostic strength, and model misspecifications. Potential Results, Relevance & Impact: There are several methods for estimating unadjusted and adjusted relative risks. However, it is unclear which methods are the most efficient, preserve the type-I error rate, are robust to model misspecification, or are the most powerful when adjusting for non-prognostic and prognostic covariates. GEE estimates may be biased when the outcome distributions are not from marginal binary data. Also, it seems that marginal standardisation and convex optimisation may perform better than GLM IWLS log-binomial.
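One of the candidate methods, marginal standardisation, can be sketched end-to-end in NumPy: fit a covariate-adjusted logistic model, predict every participant's risk under treatment and under control, and take the ratio of the average risks. The simulated trial below (effect sizes, covariate strength) is an illustration, not one of the paper's actual scenarios.

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    # Logistic regression by Newton-Raphson (IRLS); X includes an intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1 - p)
        H = X.T @ (X * W[:, None])                 # observed information
        beta += np.linalg.solve(H, X.T @ (y - p))  # Newton step
    return beta

def marginal_rr(X, y, treat_col):
    # Marginal standardisation: predict everyone under treatment and
    # under control, then take the ratio of average predicted risks
    beta = fit_logistic(X, y)
    X1, X0 = X.copy(), X.copy()
    X1[:, treat_col], X0[:, treat_col] = 1.0, 0.0
    p1 = 1.0 / (1.0 + np.exp(-X1 @ beta))
    p0 = 1.0 / (1.0 + np.exp(-X0 @ beta))
    return p1.mean() / p0.mean()

rng = np.random.default_rng(1)
n = 5000
treat = rng.integers(0, 2, n).astype(float)
cov = rng.normal(0, 1, n)
logit = -1.0 + 0.8 * treat + 0.3 * cov         # true effect on the log-odds scale
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)
X = np.column_stack([np.ones(n), treat, cov])
rr = marginal_rr(X, y, treat_col=1)            # adjusted marginal relative risk
```

Delta-method or permutation standard errors, as compared in the paper, would be layered on top of this point estimate.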

Keywords: binary outcomes, statistical methods, clinical trials, simulation study

Procedia PDF Downloads 78
548 Strength Evaluation by Finite Element Analysis of Mesoscale Concrete Models Developed from CT Scan Images of Concrete Cube

Authors: Nirjhar Dhang, S. Vinay Kumar

Abstract:

Concrete is a non-homogeneous mix of coarse aggregates, sand, cement, air voids, and the interfacial transition zone (ITZ) around aggregates. Adopting these complex structures and material properties in numerical simulation leads to better understanding and design of concrete. In this work, a mesoscale model of concrete has been prepared from X-ray computerized tomography (CT) images. These images are converted into a computer model and numerically simulated using commercially available finite element software. The mesoscale models are simulated under compressive displacement, and the effects of the shape and distribution of aggregates, continuous and discrete ITZ thickness, voids, and variation of mortar strength are investigated. The CT scan of a concrete cube consists of a series of two-dimensional slices: a total of 49 slices is obtained from a 150 mm cube, giving a slice interval of approximately 3 mm. Because CT scanning is non-destructive, the same cube can later be tested in compression in a universal testing machine (UTM) to find its strength. The image processing and extraction of mortar and aggregates from the CT scan slices are performed by programming in Python. The digital colour image consists of red, green and blue (RGB) pixels. The RGB image is converted to a black and white (BW) image, and the mesoscale constituents are identified by thresholding values between 0 and 255. A pixel matrix is created for modeling mortar, aggregates, and ITZ. Pixels are normalized to a 0-9 scale reflecting relative strength: zero is assigned to voids, 4-6 to mortar, and 7-9 to aggregates, while values between 1-3 identify the boundary between aggregates and mortar. In the next step, triangular and quadrilateral elements for plane stress and plane strain models are generated depending on the option given. Material properties, boundary conditions, and the analysis scheme are specified in this module. Responses such as displacement, stresses, and damage are evaluated by importing the input file into ABAQUS. This simulation evaluates the compressive strengths of the 49 slices of the cube, with the model meshed with more than sixty thousand elements. The effects of the shape and distribution of aggregates, the inclusion of voids, and the variation of ITZ layer thickness on load carrying capacity, stress-strain response, and strain localization of concrete have been studied. The plane strain condition carried more load than the plane stress condition due to confinement. The CT scan technique can be used to obtain slices from concrete cores taken from an actual structure, and digital image processing can be used to find the shape and content of aggregates in concrete. This may further be compared with test results of concrete cores and used as an important tool for strength evaluation of concrete.
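The 0-9 relative-strength labelling step can be sketched in NumPy. The linear grayscale-to-scale mapping below is an assumption for illustration; the paper's segmentation thresholds would come from its own image processing.

```python
import numpy as np

def classify_pixels(gray):
    # Map 8-bit grayscale to the 0-9 relative-strength scale used in the
    # abstract: 0 = void, 1-3 = aggregate/mortar boundary (ITZ),
    # 4-6 = mortar, 7-9 = aggregate. The /255*9 mapping is illustrative.
    scaled = (gray.astype(float) / 255.0 * 9).round().astype(int)
    labels = np.empty(gray.shape, dtype="<U9")
    labels[scaled == 0] = "void"
    labels[(scaled >= 1) & (scaled <= 3)] = "boundary"
    labels[(scaled >= 4) & (scaled <= 6)] = "mortar"
    labels[scaled >= 7] = "aggregate"
    return scaled, labels
```

The resulting pixel matrix is what the meshing step would turn into triangular or quadrilateral elements with per-phase material properties.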

Keywords: concrete, image processing, plane strain, interfacial transition zone

Procedia PDF Downloads 216
547 Real-Time Big-Data Warehouse: A Next-Generation Enterprise Data Warehouse and Analysis Framework

Authors: Abbas Raza Ali

Abstract:

Big Data technology is gradually becoming a dire need of large enterprises, which generate massively large amounts of offline and streaming data in both structured and unstructured formats on a daily basis. It is a challenging task to effectively extract useful insights from such large-scale datasets; sometimes it even becomes a technology constraint to manage a transactional data history of more than a few months. This paper presents a framework to efficiently manage massively large and complex datasets. The framework has been tested on a communication service provider producing massively large, complex streaming data in binary format. The communication industry is bound by regulators to manage the history of their subscribers' call records, where every action of a subscriber generates a record. Managing and analyzing transactional data also allows service providers to better understand their customers' behavior; for example, deep packet inspection requires transactional internet usage data to explain the internet usage behaviour of subscribers. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated at the subscriber level. The framework addresses these challenges by leveraging Big Data technology, which optimally manages and allows deep analysis of complex datasets. The framework has been applied to offload the service provider's existing Intelligent Network Mediation and relational Data Warehouse onto Big Data. The service provider has a 50+ million subscriber base with yearly growth of 7-10%. The end-to-end process takes no more than 10 minutes, involving binary-to-ASCII decoding of call detail records, stitching of all the interrogations against a call (transformations), and aggregation of all the call records of a subscriber.
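The decode-stitch-aggregate pipeline can be sketched with the standard library. The fixed-width record layout below is entirely hypothetical (the operator's real CDR format is proprietary and far richer); it only illustrates the binary-to-ASCII decoding and per-subscriber aggregation steps.

```python
import struct

# Hypothetical fixed-width binary CDR layout: caller (16 bytes), callee
# (16 bytes), start epoch (uint32), duration in seconds (uint32)
CDR_FORMAT = ">16s16sII"
CDR_SIZE = struct.calcsize(CDR_FORMAT)

def decode_cdrs(blob):
    # Binary-to-ASCII decoding: turn raw bytes into dict records
    records = []
    for off in range(0, len(blob), CDR_SIZE):
        caller, callee, start, dur = struct.unpack_from(CDR_FORMAT, blob, off)
        records.append({
            "caller": caller.decode().rstrip("\x00"),
            "callee": callee.decode().rstrip("\x00"),
            "start": start,
            "duration": dur,
        })
    return records

def aggregate(records):
    # "Stitching"/aggregation: total call duration per subscriber
    totals = {}
    for r in records:
        totals[r["caller"]] = totals.get(r["caller"], 0) + r["duration"]
    return totals
```

At the provider's scale, the same two steps would run as distributed map (decode) and reduce (aggregate) jobs rather than a single-process loop.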

Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation

Procedia PDF Downloads 147
546 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network

Authors: Abdulaziz Alsadhan, Naveed Khan

Abstract:

In recent years, intrusions on computer networks have become a major security threat, and it is important to impede them. The hindrance of such intrusions entirely relies on their detection, which is the primary concern of any security tool like an Intrusion Detection System (IDS). Therefore, it is imperative to accurately detect network attacks. Numerous intrusion detection techniques are available, but the main issue is their performance, which can be improved by increasing the accurate detection rate and reducing false positives. Existing intrusion detection techniques have the limitation of using the raw data set for classification: the classifier may get confused due to redundancy, which results in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Local Binary Pattern (LBP) can be applied to transform raw features into a principal feature space and select features based on their sensitivity, where eigenvalues can be used to determine the sensitivity. To further refine the selected features, greedy search, backward elimination, and Particle Swarm Optimization (PSO) can be used to obtain a subset of features with optimal sensitivity and the highest discriminatory power. This optimal feature subset is then used to perform classification. For classification purposes, a Support Vector Machine (SVM) and a Multilayer Perceptron (MLP) are used due to their proven ability in classification. The Knowledge Discovery and Data Mining (KDD'99) cup dataset was considered as a benchmark for evaluating security detection mechanisms. The proposed approach can provide an optimal intrusion detection mechanism that outperforms existing approaches and has the capability to minimize the number of features and maximize the detection rate.
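The PCA step (project raw features into a principal feature space and rank components by eigenvalue sensitivity) can be sketched in NumPy. The toy feature matrix is illustrative, with two deliberately redundant columns standing in for redundant raw IDS features:

```python
import numpy as np

def pca_transform(X, n_components):
    # Project raw features into the principal feature space; component
    # sensitivity is given by the eigenvalues of the covariance matrix
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)     # symmetric, ascending order
    order = np.argsort(eigvals)[::-1]          # most sensitive first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    return Xc @ eigvecs[:, :n_components], eigvals

rng = np.random.default_rng(0)
base = rng.normal(0, 1, (500, 1))
X = np.hstack([base * 3,                               # informative feature
               base * 3 + rng.normal(0, 0.1, (500, 1)),  # redundant copy
               rng.normal(0, 0.5, (500, 2))])            # weak features
Z, eigvals = pca_transform(X, 2)   # two components absorb the redundancy
```

Downstream, the retained components (or an LDA/LBP variant of them) would feed the PSO-refined subset selection and then the SVM/MLP classifiers.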

Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Pattern (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP)

Procedia PDF Downloads 336
545 The Role of Brooding and Reflective as Subtypes of Rumination toward Psychological Distress in University of Indonesia First-Year Undergraduate Students

Authors: Hepinda Fajari Nuharini, Sugiarti A. Musabiq

Abstract:

Background: Various and continuous pressures that exceed individual resources can cause first-year undergraduate students to experience psychological distress. Psychological distress can occur when individuals use rumination as a cognitive coping strategy. Rumination is a cognitive coping strategy in which individuals respond to psychological distress by dwelling on the causes and consequences of events that have occurred. Rumination has two subtypes: brooding and reflective. Therefore, the purpose of this study was to determine the role of brooding and reflective as subtypes of rumination in psychological distress among University of Indonesia first-year undergraduate students. Methods: Participants were 403 University of Indonesia first-year undergraduate students aged between 18 and 21 years old. Psychological distress was measured using the Self-Reporting Questionnaire (SRQ-20), and brooding and reflective as subtypes of rumination were measured using the Ruminative Response Scale - Short Version (RRS - Short Version). Results: Binary logistic regression analyses showed that 22.8% of the variation in psychological distress could be explained by brooding and reflective as subtypes of rumination, while 77.2% could be explained by other factors (Nagelkerke R² = 0.228). The analysis also showed that the rumination subtype brooding is a significant predictor of psychological distress (b = 0.306; p < 0.05), whereas the rumination subtype reflective is not (b = 0.073; p > 0.05). Conclusion: The findings showed a positive relationship between brooding and psychological distress, indicating that a higher level of brooding predicts higher psychological distress. 
Meanwhile, a negative relationship between reflective and psychological distress indicates that a higher level of reflective predicts lower psychological distress in University of Indonesia first-year undergraduate students. Added Value: Psychological distress among first-year undergraduate students affects student academic performance. Therefore, the results of this study can be used as a reference for preventive action to reduce the percentage and impact of psychological distress among first-year undergraduate students.
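The Nagelkerke R² reported above rescales the Cox & Snell R² so its maximum is 1; given the null and fitted log-likelihoods it is a two-line formula. The log-likelihood values below are made-up inputs for illustration, not the study's fitted values.

```python
import math

def nagelkerke_r2(ll_null, ll_model, n):
    # Cox & Snell R^2 = 1 - exp((2/n) * (LL_null - LL_model)),
    # rescaled by its maximum attainable value (Nagelkerke, 1991)
    r2_cs = 1.0 - math.exp(2.0 * (ll_null - ll_model) / n)
    max_r2 = 1.0 - math.exp(2.0 * ll_null / n)
    return r2_cs / max_r2
```

With n = 403 as in this sample, a model whose log-likelihood improves on the null by 50 yields a Nagelkerke R² of roughly 0.31, comparable in magnitude to the 0.228 reported.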

Keywords: brooding as subtypes of rumination, first-year undergraduate students, psychological distress, reflective as subtypes of rumination

Procedia PDF Downloads 86
544 Liver Lesion Extraction with Fuzzy Thresholding in Contrast Enhanced Ultrasound Images

Authors: Abder-Rahman Ali, Adélaïde Albouy-Kissi, Manuel Grand-Brochier, Viviane Ladan-Marcus, Christine Hoeffl, Claude Marcus, Antoine Vacavant, Jean-Yves Boire

Abstract:

In this paper, we present a new segmentation approach for focal liver lesions in contrast enhanced ultrasound imaging. This approach, based on a two-cluster Fuzzy C-Means methodology, considers type-II fuzzy sets to handle uncertainty due to the image modality (presence of speckle noise, low contrast, etc.), and to calculate the optimum inter-cluster threshold. Fine boundaries are detected by a local recursive merging of ambiguous pixels. The method has been tested on a representative database. Compared to both Otsu and type-I Fuzzy C-Means techniques, the proposed method significantly reduces the segmentation errors.
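The core of the approach, a two-cluster Fuzzy C-Means on pixel intensities with a threshold derived from the centroids, can be sketched in NumPy. This uses ordinary type-I FCM with the midpoint of the two centroids as the inter-cluster threshold; the paper's type-II extension and recursive boundary merging are not shown.

```python
import numpy as np

def fcm_threshold(pixels, m=2.0, iters=50):
    # Two-cluster Fuzzy C-Means on intensities; the inter-cluster
    # threshold is taken midway between the two fuzzy centroids
    x = pixels.ravel().astype(float)
    c = np.array([x.min(), x.max()])            # initial centroids
    for _ in range(iters):
        d = np.abs(x[:, None] - c[None, :]) + 1e-9
        u = 1.0 / (d ** (2.0 / (m - 1.0)))      # u_ik proportional to d^(-2/(m-1))
        u /= u.sum(axis=1, keepdims=True)       # memberships sum to 1 per pixel
        c = (u ** m).T @ x / (u ** m).sum(axis=0)  # weighted centroid update
    return float(c.mean()), c                   # threshold, centroids
```

Pixels above the returned threshold would be assigned to the lesion (or background) cluster, with ambiguous pixels near the threshold handled by the local merging step.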

Keywords: defuzzification, fuzzy clustering, image segmentation, type-II fuzzy sets

Procedia PDF Downloads 454
543 Current Starved Ring Oscillator Image Sensor

Authors: Devin Atkin, Orly Yadid-Pecht

Abstract:

The continual demands for increasing resolution and dynamic range in CMOS image sensors have resulted in exponential increases in the amount of data that needs to be read out of an image sensor, and existing readouts cannot keep up with this demand. Interesting approaches such as sparse and burst readouts have been proposed and show promise, but at considerable trade-offs in other specifications. To this end, we have begun designing and evaluating various new readout topologies centered around an attempt to parallelize the sensor readout. In this paper, we have designed, simulated, and started testing a new light-controlled oscillator topology with dual column and row readouts. We expect the parallel readout structure to offer greater speed and alleviate the trade-off typical in this topology, where slow pixels present a major framerate bottleneck.

Keywords: CMOS image sensors, high-speed capture, wide dynamic range, light controlled oscillator

Procedia PDF Downloads 57
542 Development of Transgenic Tomato Immunity to Pepino Mosaic Virus and Tomato Yellow Leaf Curl Virus by Gene Silencing Approach

Authors: D. Leibman, D. Wolf, A. Gal-On

Abstract:

Viral diseases of tomato crops result in heavy yield losses and may even jeopardize production of these crops. Classical tomato breeding for disease resistance against Tomato yellow leaf curl virus (TYLCV) leads only to partial resistance associated with a number of recessive genes, and to the authors' best knowledge, genetic resistance to Pepino mosaic virus (PepMV) is not yet available. The generation of viral resistance by means of genetic engineering has been reported and implemented for many crops, including tomato. Transgenic resistance against viruses is based, in most cases, on Post-Transcriptional Gene Silencing (PTGS), an endogenous mechanism which destroys the virus genome. In this work, we developed immunity against PepMV and TYLCV in tomato based on a PTGS mechanism. Tomato plants were transformed with a hairpin construct expressing transgene-derived double-stranded RNA (tr-dsRNA). In the case of PepMV, the binary construct harbored three consecutive fragments of the replicase gene from three different PepMV strains (Italian, Spanish, and American) to provide resistance against a range of virus strains. In the case of TYLCV, the binary vector included three consecutive fragments of the IR, V2, and C2 viral genes constructed in a hairpin configuration. Selected transgenic lines (T0) showed a high accumulation of transgene siRNA of 21-24 bases, and T1 transgenic lines showed complete immunity to PepMV and TYLCV. Graft inoculation demonstrated immunity of the transgenic scion against PepMV and TYLCV. The study presents the engineering of resistance in tomato against two serious diseases, which will help in the production of high-quality tomato. Unfortunately, however, these resistant plants have not been implemented due to public ignorance of, and opposition to, breeding by genetic engineering.

Keywords: PepMV, PTGS, TYLCV, tr-dsRNA

Procedia PDF Downloads 99
541 Optimal Site Selection for Temporary Housing regarding Disaster Management Case Study: Tehran Municipality (No.6)

Authors: Ghazaleh Monazami Tehrani, Zhamak Monazami Tehrani, Raziyeh Hadavand

Abstract:

Optimal site selection for temporary housing is one of the most important issues in crisis management. In this research, district six of Tehran, a city with a high frequency and wide geographical distribution of earthquakes, has been selected as a case study for positioning temporary housing after a probable earthquake. To achieve this goal, the study identifies and evaluates candidate locations according to criteria such as compatible and incompatible urban land uses, using GIS and the Analytic Hierarchy Process (AHP). The results show that the most susceptible parts of this region lie in the center. According to the maps, the north-eastern part of Kordestan and the Shaheed Gomnam intersection possess the highest pixel values in terms of areal extent; these places are therefore recommended as optimal site locations for the construction of an emergency evacuation base.
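The AHP weighting step combines pairwise criterion comparisons into a single weight vector via the principal eigenvector of the comparison matrix. A minimal sketch in NumPy; the 3-criterion matrix below is a hypothetical example (land-use compatibility judged 3x as important as access and 5x as important as hazard distance), not the study's actual judgments.

```python
import numpy as np

def ahp_weights(pairwise):
    # Criterion weights = principal eigenvector of the pairwise comparison
    # matrix (Saaty's AHP), normalised to sum to 1; the principal
    # eigenvalue indicates consistency (close to n for consistent judgments)
    vals, vecs = np.linalg.eig(np.asarray(pairwise, float))
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum(), vals.real[k]

A = [[1, 3, 5],          # hypothetical judgments on Saaty's 1-9 scale
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
w, lam = ahp_weights(A)  # weights ordered: compatibility > access > hazard
```

The resulting weights would then multiply the corresponding GIS criterion layers to produce the suitability map from which the optimal pixels are selected.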

Keywords: optimal site selection, temporary housing, crisis management, AHP, GIS

Procedia PDF Downloads 229
540 An Improved Circulating Tumor Cells Analysis Method for Identifying Tumorous Blood Cells

Authors: Salvador Garcia Bernal, Chi Zheng, Keqi Zhang, Lei Mao

Abstract:

Circulating Tumor Cell (CTC) analysis is used to detect tumor cell metastases in blood samples of patients with cancer (lung, breast, etc.). Using an immunofluorescent method, a three-channel image (red, green, and blue) is obtained. These images usually exceed 11 x 30 M pixels in size. An aided tool is designed for cell image analysis to segment and identify tumorous cells based on the three marker signals. Our method is cell-based (area and cell shape), considering the information in each channel and deciding whether each candidate is a valid CTC. The system also gives information about the number and size of tumor cells found in the sample. We present results on real-life samples, achieving acceptable performance in identifying CTCs in a short time.
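The per-cell decision from three marker channels can be sketched in NumPy: threshold each channel, combine the masks, and keep only blobs whose area matches a plausible cell size. The marker semantics (positive in red and blue, negative in green) and the area limits are illustrative assumptions, not the paper's calibrated criteria.

```python
import numpy as np

def label_blobs(mask):
    # Naive 4-connected component labelling by stack-based flood fill
    labels = np.zeros(mask.shape, int)
    cur = 0
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                cur += 1
                stack = [(i, j)]
                while stack:
                    a, b = stack.pop()
                    if (0 <= a < mask.shape[0] and 0 <= b < mask.shape[1]
                            and mask[a, b] and labels[a, b] == 0):
                        labels[a, b] = cur
                        stack += [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]
    return labels, cur

def candidate_ctcs(red, green, blue, thr=0.5, min_area=20, max_area=400):
    # A pixel is CTC-like if positive in the red/blue marker channels and
    # negative in green; blobs outside the area range are rejected
    mask = (red > thr) & (blue > thr) & (green < thr)
    labels, n = label_blobs(mask)
    return [int((labels == lab).sum()) for lab in range(1, n + 1)
            if min_area <= (labels == lab).sum() <= max_area]
```

On the full 11 x 30 M-pixel images, the same logic would run tiled, with an additional shape (roundness) check on each surviving blob.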

Keywords: Circulating Tumor Cells (CTC), cell analysis, immunofluorescent, medical image analysis

Procedia PDF Downloads 186