Search results for: image similarity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3334


1534 Digital Forgery Detection by Signal Noise Inconsistency

Authors: Bo Liu, Chi-Man Pun

Abstract:

A novel technique for digital forgery detection based on signal noise inconsistency is proposed in this paper. A forged area spliced in from another picture carries features that may be inconsistent with the rest of the image, and the noise pattern and noise level are possible factors revealing such inconsistency. To detect these noise discrepancies, the test picture is first segmented into small blocks. The noise pattern and level of each segment are then estimated using various filters. The noise features constructed in this step are used in an energy-based graph cut to expose the forged area in the final step. Experimental results show that our method provides a good illustration of regions with noise inconsistency in various scenarios.
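
A minimal sketch of the block-wise noise estimation idea is given below. It uses a robust high-pass/median noise estimator and a simple thresholding rule in place of the paper's filter bank and energy-based graph cut; block size and threshold factor are illustrative assumptions.

```python
# Sketch only: per-block noise-level estimation and a crude inconsistency flag.
# The graph-cut segmentation step described in the abstract is omitted.
import numpy as np
from scipy import ndimage

def block_noise_levels(gray, block=64):
    """Robust noise-level estimate (high-pass response + MAD) for each block."""
    kernel = np.array([[1, -2, 1], [-2, 4, -2], [1, -2, 1]], dtype=float)
    resp = ndimage.convolve(gray.astype(float), kernel)
    h, w = gray.shape
    levels = np.zeros((h // block, w // block))
    for i in range(levels.shape[0]):
        for j in range(levels.shape[1]):
            patch = resp[i*block:(i+1)*block, j*block:(j+1)*block]
            # median absolute deviation of the high-pass response, scaled to sigma
            levels[i, j] = np.median(np.abs(patch)) / 0.6745
    return levels

def inconsistency_mask(levels, factor=1.5):
    """Flag blocks whose noise level deviates strongly from the global median."""
    ref = np.median(levels)
    return (levels > factor * ref) | (levels < ref / factor)
```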

Keywords: forgery detection, splicing forgery, noise estimation, noise

Procedia PDF Downloads 455
1533 Hull Detection from Handwritten Digit Image

Authors: Sriraman Kothuri, Komal Teja Mattupalli

Abstract:

In this paper we propose a novel algorithm for recognizing hulls in handwritten digits. This is an extension of the work on “Digit Recognition Using Freeman Chain Code”. Finding the hulls in a user-given digit requires three steps: pre-processing, boundary extraction, and finally the hull detection system, applied in a way that attains better results. Detecting hull regions is mainly intended to increase the machine-learning capability in the detection of characters or digits. The approach could also be extended to obtain hull regions and their intensities, for instance for black holes in space exploration.
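
The SASK algorithm itself is not reproduced here; the following sketch only illustrates the boundary-extraction and hull-detection steps on a binarized digit image using standard OpenCV calls, with convexity defects serving as a stand-in for the hull regions.

```python
# Illustrative sketch (not the paper's SASK algorithm): extract the outer boundary
# of a binarized digit and compute its convex hull and concavities with OpenCV.
import cv2

def digit_hull(image_path):
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    _, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    contour = max(contours, key=cv2.contourArea)          # boundary of the digit
    hull = cv2.convexHull(contour, returnPoints=False)    # indices of hull points
    defects = cv2.convexityDefects(contour, hull)         # "hull regions" (concavities)
    return contour, hull, defects
```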

Keywords: chain code, machine learning, hull regions, hull recognition system, SASK algorithm

Procedia PDF Downloads 398
1532 Fractal Nature of Granular Mixtures of Different Concretes Formulated with Different Methods of Formulation

Authors: Fatima Achouri, Kaddour Chouicha, Abdelwahab Khatir

Abstract:

Quality concrete must be made with selected materials, chosen in optimum proportions, that leave a minimum of voids in the material produced after placement. The formulation methods in use are mostly based on a granular curve that describes an ‘optimal granularity’. Many authors have engaged in fundamental research on granular arrangements. Comparing mathematical models that reproduce these granular arrangements with experimental measurements of compactness shows that the minimum porosity P follows a power law of the granular extent. The best compactness in a finite medium is thus obtained with power laws, such as those of Furnas, Fuller or Talbot, each preferring a particular exponent between 0.20 and 0.50. These considerations converge on the assumption that the optimal granularity of Caquot can be approximated by a power law. By analogy, it can then be analyzed as a granular structure of fractal type, since the internal similarity that characterizes fractal objects is also expressed by a power law. Optimized mixtures may be described as a series of granular classes, each filling the voids left by the coarser ones in a regular hierarchical distribution, which would give the mix the same structure at different scales through cascading effects. This model is likely appropriate over the entire size range of the components, from correctly deflocculated cement particles (and silica fume) of micrometric dimensions to chippings of sometimes several tens of millimetres. As part of this research, the aim is to illustrate the application of fractal analysis to characterize optimized granular concrete mixtures by a so-called fractal dimension; several concretes were studied, and a fractal structure of their granular mixtures was demonstrated regardless of the method of formulation or the type of concrete.
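
As a small illustration of the power-law (Fuller/Talbot-type) description invoked above, the sketch below fits an exponent to a grading curve by log-log regression; the sieve data are hypothetical and the exponent q is only an illustrative stand-in for the scaling parameter discussed in the abstract.

```python
# Sketch: fit the Fuller/Talbot power law P = 100 * (d / D_max)**q to a grading curve.
import numpy as np

def fit_talbot_exponent(d_mm, passing_pct, d_max_mm):
    x = np.log(np.asarray(d_mm, dtype=float) / d_max_mm)
    y = np.log(np.asarray(passing_pct, dtype=float) / 100.0)
    q, _ = np.polyfit(x, y, 1)        # slope of the log-log line is the exponent
    return q

# Hypothetical sieve data for illustration only
sizes = [0.125, 0.25, 0.5, 1, 2, 4, 8, 16]     # sieve sizes, mm
passing = [8, 12, 18, 26, 37, 52, 72, 100]     # cumulative % passing
print(fit_talbot_exponent(sizes, passing, d_max_mm=16))   # typically 0.20-0.50
```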

Keywords: concrete formulation, fractal character, granular packing, method of formulation

Procedia PDF Downloads 254
1531 The Role of Rhetoric in Public Relations Practice

Authors: Talal Alqahtani

Abstract:

For all institutions, whether public or private, communication is now more important than ever. Its importance has grown over the years, and it has the ability to make or break an organization. With globalization, changing technology, and other emergent issues that affect organizations, the communication given out has had to be better, sharper, and both proactive and reactive. This is why the importance of public relations has been increasing. Institutions realize the value of having a good image and of having public relations experts who can effectively manage communication, especially in times of crisis. Public relations on its own is not, however, always effective, and this has led to the adoption of rhetoric in communication. The use of rhetoric has undergone a long transformation; in the past, it was used only in politics, but rhetoric in communication has since come to be appreciated and adopted by many diverse fields and sectors. This study looks at the role of rhetoric in public relations practice and how it relates to the management of an institution's reputation.

Keywords: communication, reputation, rhetoric, public relations

Procedia PDF Downloads 228
1530 Semigroups of Linear Transformations with Fixed Subspaces: Green’s Relations and Ideals

Authors: Yanisa Chaiya, Jintana Sanwong

Abstract:

Let V be a vector space over a field and W a subspace of V. Let Fix(V,W) denote the set of all linear transformations on V that fix every element of W. In this paper, we show that Fix(V,W) is a semigroup under composition of maps and describe Green’s relations on this semigroup in terms of images, kernels and the dimensions of subspaces of the quotient space V/W, where V/W = {v+W : v is an element of V} with v+W = {v+w : w is an element of W}. Let dim(U) denote the dimension of a vector space U and Vα = {vα : v is an element of V}, where vα is the image of v under a linear transformation α. For any cardinal number a, let a' = min{b : b > a}. We also show that the ideals of Fix(V,W) are precisely the sets Fix(r) = {α ∊ Fix(V,W) : dim(Vα/W) < r}, where 1 ≤ r ≤ a' and a = dim(V/W). Moreover, we prove that if V is a finite-dimensional vector space, then every ideal of Fix(V,W) is principal.
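
For readers who want a concrete picture of the ideal structure, the following finite-dimensional illustration is an assumption-based example spelled out from the statement above, not taken from the paper itself.

```latex
% Illustration: if $\dim(V/W) = n$ is finite, then $a' = n + 1$, and the ideals of
% $\mathrm{Fix}(V,W)$ form the finite chain
\[
  \mathrm{Fix}(1) \subseteq \mathrm{Fix}(2) \subseteq \dots \subseteq \mathrm{Fix}(n+1) = \mathrm{Fix}(V,W),
  \qquad
  \mathrm{Fix}(r) = \{\alpha \in \mathrm{Fix}(V,W) : \dim(V\alpha / W) < r\},
\]
% with $\mathrm{Fix}(1) = \{\alpha : V\alpha = W\}$ the smallest of these ideals.
```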

Keywords: Green’s relations, ideals, linear transformation semigroups, principal ideals

Procedia PDF Downloads 291
1529 Bayesian Estimation of Hierarchical Models for Genotypic Differentiation of Arabidopsis thaliana

Authors: Gautier Viaud, Paul-Henry Cournède

Abstract:

Plant growth models have been used extensively for the prediction of the phenotypic performance of plants. However, they most often remain calibrated for a given genotype and therefore do not take into account genotype-by-environment interactions. One way of achieving such an objective is to consider Bayesian hierarchical models. Three levels can be identified in such models: the first level describes how a given growth model describes the phenotype of the plant as a function of individual parameters; the second level describes how these individual parameters are distributed within a plant population; the third level corresponds to the attribution of priors on the population parameters. Thanks to the Bayesian framework, choosing appropriate priors for the population parameters makes it possible to derive analytical expressions for the full conditional distributions of these population parameters. As plant growth models are of a nonlinear nature, individual parameters cannot be sampled explicitly, and a Metropolis step must be performed. This allows for the use of a hybrid Gibbs-Metropolis sampler. A generic approach was devised for the implementation of both general state space models and estimation algorithms within a programming platform. It was designed using the Julia language, which combines an elegant syntax with metaprogramming capabilities and high efficiency. Results were obtained for Arabidopsis thaliana on both simulated and real data. An organ-scale Greenlab model for the latter is thus presented, where the surface areas of each individual leaf can be simulated. It is assumed that the error made on the measurement of leaf areas is proportional to the leaf area itself; multiplicative normal noise for the observations is therefore used. Real data were obtained via image analysis of zenithal images of Arabidopsis thaliana over a period of 21 days using a two-step segmentation and tracking algorithm which notably takes advantage of the Arabidopsis thaliana phyllotaxy. Since the model formulation is rather flexible, there is no need for the data for a single individual to be available at all times, nor for the times at which data are available to be the same for all individuals. This makes it possible to discard data from image analysis when they are not considered reliable enough, thereby providing large quantities of low-bias leaf-area data. The proposed model precisely reproduces the dynamics of Arabidopsis thaliana’s growth while accounting for the variability between genotypes. In addition to the estimation of the population parameters, the level of variability is an interesting indicator of the genotypic stability of model parameters. A promising perspective is to test whether some of the latter should be considered as fixed effects.
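
To make the three-level structure and the Metropolis-within-Gibbs idea concrete, a heavily simplified Python sketch is given below. The authors' implementation is in Julia and uses the organ-scale Greenlab model; the growth function, priors and tuning constants here are placeholder assumptions.

```python
# Sketch: Metropolis-within-Gibbs for a toy hierarchical nonlinear growth model.
# Individual parameters theta_i ~ Normal(mu, tau^2); observations are modeled on the
# log scale to mimic multiplicative noise. Everything below is illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def f(t, theta):
    # placeholder logistic leaf-area curve; the real Greenlab model is organ-scale
    return theta / (1.0 + np.exp(-(t - 10.0)))

def sampler(times, log_y, n_iter=5000, sigma=0.1, prop_sd=0.05):
    n = len(log_y)                      # number of individuals
    theta = np.ones(n)                  # individual parameters
    mu, tau = 1.0, 1.0                  # population parameters
    for _ in range(n_iter):
        # Metropolis step for each individual parameter (nonlinear model)
        for i in range(n):
            prop = theta[i] + prop_sd * rng.standard_normal()
            if prop <= 0:
                continue
            def logpost(th):
                resid = log_y[i] - np.log(f(times[i], th))
                return (-0.5 * np.sum(resid**2) / sigma**2
                        - 0.5 * (th - mu)**2 / tau**2)
            if np.log(rng.random()) < logpost(prop) - logpost(theta[i]):
                theta[i] = prop
        # Gibbs step for the population mean (conjugate normal, flat prior)
        mu = rng.normal(theta.mean(), tau / np.sqrt(n))
        # Gibbs step for tau^2 (inverse-gamma with a weak prior)
        a, b = 1.0 + n / 2, 1.0 + 0.5 * np.sum((theta - mu)**2)
        tau = np.sqrt(1.0 / rng.gamma(a, 1.0 / b))
    return theta, mu, tau
```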

Keywords: Bayesian, genotypic differentiation, hierarchical models, plant growth models

Procedia PDF Downloads 297
1528 On-Road Text Detection Platform for Driver Assistance Systems

Authors: Guezouli Larbi, Belkacem Soundes

Abstract:

Automating the text detection process can assist the driver in the driving task. It can help drivers obtain more information about their environment by facilitating the reading of road signs such as directional signs, events, stores, etc. In this paper, a system consisting of two stages is proposed. In the first stage, pseudo-Zernike moments are used to pinpoint areas of the image that may contain text; this part is based on three main steps: region of interest (ROI) detection, text localization, and non-text region filtering. In the second stage, we present a convolutional neural network architecture (On-Road Text Detection Network, ORTDN) which serves as the classification phase. The results show that the proposed framework achieved ≈ 35 fps and an mAP of ≈ 90%, i.e., a low computational time with competitive accuracy.
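
A rough illustration of the two-stage idea follows. It substitutes ordinary Zernike moments (from the mahotas library) for the paper's pseudo-Zernike moments and a generic scikit-learn classifier for the ORTDN network, so every function and parameter below is an assumption rather than the authors' pipeline.

```python
# Illustrative stand-in only: describe each candidate region with moment features,
# then classify it as text / non-text with a small neural network.
import mahotas
import numpy as np
from sklearn.neural_network import MLPClassifier

def region_features(gray_patch, radius=32, degree=8):
    """Rotation-invariant moment descriptor of one candidate region."""
    return mahotas.features.zernike_moments(gray_patch, radius, degree=degree)

# Hypothetical training data: patches cropped around candidate regions,
# labels 1 = text, 0 = non-text.
def train_text_filter(patches, labels):
    X = np.array([region_features(p) for p in patches])
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)
    clf.fit(X, labels)
    return clf
```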

Keywords: text detection, CNN, PZM, deep learning

Procedia PDF Downloads 80
1527 EEG Diagnosis Based on Phase Space with Wavelet Transforms for Epilepsy Detection

Authors: Mohmmad A. Obeidat, Amjed Al Fahoum, Ayman M. Mansour

Abstract:

Recognizing abnormal activity in brain function is a vital issue. To determine the type of abnormal activity, either a brain image or a brain signal is usually considered. Imaging localizes the defect within the brain and relates this area to certain body functionalities. However, some functions may be disturbed without visibly affecting the brain, as in epilepsy; in this case, imaging may not reveal the symptoms of the problem. A cheaper yet efficient approach that can be used to detect abnormal activity is the measurement and analysis of electroencephalogram (EEG) signals. The main goal of this work is to develop a new method to facilitate the classification of abnormal and disordered activities within the brain directly from EEG signal processing, which makes it applicable in an on-line monitoring system.
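
A minimal sketch of the kind of wavelet and phase-space preprocessing implied above is shown below, assuming the pywt package and a single-channel EEG epoch; the wavelet choice, decomposition level and embedding parameters are illustrative assumptions, and the classifier is omitted.

```python
# Sketch: wavelet sub-band energies and a delay-embedded phase-space representation
# of one EEG epoch, as inputs for a downstream epilepsy classifier (not shown).
import numpy as np
import pywt

def wavelet_band_energies(eeg_epoch, wavelet="db4", level=5):
    coeffs = pywt.wavedec(eeg_epoch, wavelet, level=level)   # [cA5, cD5, ..., cD1]
    return np.array([np.sum(c**2) for c in coeffs])

def phase_space(signal, delay=5, dim=2):
    """Delay embedding: each row is a point (x(t), x(t+delay), ...)."""
    n = len(signal) - (dim - 1) * delay
    return np.column_stack([signal[i*delay : i*delay + n] for i in range(dim)])
```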

Keywords: EEG, wavelet, epilepsy, detection

Procedia PDF Downloads 535
1526 Production of Organic Solvent Tolerant Hydrolytic Enzymes (Amylase and Protease) by Bacteria Isolated from Soil of a Dairy Farm

Authors: Alok Kumar, Hari Ram, Lebin Thomas, Ved Pal Singh

Abstract:

Organic solvent tolerant amylases and proteases of microbial origin are in great demand for their application in the transglycosylation of water-insoluble flavonoids and in peptide-synthesizing reactions in organic media. Most amylases and proteases are unstable in the presence of organic solvents. In the present work, two different bacterial strains, M-11 and VP-07, were isolated from a soil sample of a dairy farm in Delhi, India, for the efficient production of extracellular amylase and protease through their screening on starch agar (SA) and skimmed milk agar (SMA) plates, respectively. Both strains (M-11 and VP-07) were identified based on morphological, biochemical and 16S rRNA gene sequencing methods. After analysis with the Ez-Taxon software, the strains M-11 and VP-07 were found to have maximum pairwise similarities of 98.63% and 100% with Bacillus subtilis subsp. inaquosorum BGSC 3A28 and Bacillus anthracis ATCC 14578, and were therefore identified as Bacillus sp. UKS1 and Bacillus sp. UKS2, respectively. A time-course study of enzyme activity and bacterial growth showed that both strains exhibited typical sigmoid growth behavior, and maximum production of amylase (180 U/ml) and protease (78 U/ml) by these strains (UKS1 and UKS2) occurred during the stationary phase of growth, at 24 and 20 h, respectively. Thereafter, both amylase and protease were tested for their tolerance towards organic solvents and were found to be active as well as stable in p-xylene (130% and 115%), chloroform (110% and 112%), isooctane (119% and 107%), benzene (121% and 104%), n-hexane (116% and 103%) and toluene (112% and 101%, respectively). Owing to such properties, these enzymes can be exploited for their potential application in industries for organic synthesis.

Keywords: amylase, enzyme activity, industrial applications, organic solvent tolerant, protease

Procedia PDF Downloads 338
1525 Urdu Text Extraction Method from Images

Authors: Samabia Tehsin, Sumaira Kausar

Abstract:

Due to the vast increase in multimedia data in recent years, efficient and robust retrieval techniques are needed to retrieve and index images and videos. Text embedded in images can serve as a strong retrieval cue, which is why text extraction is an area of research receiving increasing attention. English text extraction has been the focus of many researchers, but much less work has been done on other languages such as Urdu. This paper focuses on Urdu text extraction from video frames and presents a text detection feature set that can deal with most of the problems connected with the text extraction process. To test the validity of the method, it is evaluated on an Urdu news dataset, which gives promising results.

Keywords: caption text, content-based image retrieval, document analysis, text extraction

Procedia PDF Downloads 511
1524 Subjectivity in Miracle Aesthetic Clinic Ambient Media Advertisement

Authors: Wegig Muwonugroho

Abstract:

Subjectivity in advertisement is a ‘power’ possessed by advertisements to construct trends, concepts, truth, and ideology through the subconscious mind. Advertisements, in performing their function as message conveyors, use visual representation to project what is ideal to people. Ambient media is an advertising medium that makes the best use of the environment in which the advertisement is located. Miracle Aesthetic Clinic (Miracle) popularizes the visual representation of its ambient media advertisement through the omission of the face-image of the two female mannequins that serve as its ambient media models. Usually, the face of a model in an advertisement is an image commodity with selling value; however, the faces of the ambient media models in the Miracle advertisement campaign are pressed against the table and wall. This face-concealing aspect creates not only a paradox of subjectivity but also a plurality of meaning. This research applies the critical discourse analysis method to analyze subjectivity and obtain insight into the ambient media’s meaning. First, in the stage of textual analysis, the attributes placed on the female mannequins imply that the models are denoted as representations of modern women, identical with the identities of their social milieus. The communication signs being constructed are women who lose their subjectivity and ‘feel embarrassed’ to show their faces to the public because of pimples on their faces. Second, the stage of analysis of discourse practice points out that ambient media, as a communication medium, has been comprehensively responded to by the targeted audiences. Ambient media plays the role of an actor because of its eye-catching setting, taking up space in the area where the public wander around. Indeed, when the public realize that the ambient media models are motionless, unlike humans, a stronger relation appears, marked by several responses from the targeted audiences. Third, in the stage of analysis of social practice, soap operas and celebrity gossip shows on television become a dominant discourse influencing the advertisement’s meaning. The subjectivity of the Miracle advertisement corners women through the absence of women’s participation in public space, the representation of women in isolation, and the portrayal of women as anxious about their social rank when their faces suffer from pimples. The ambient media advertisement campaign of Miracle is quite successful in constructing a new trend discourse of facial beauty that is not limited to benchmarks of common beauty virtues, but in which the idea of beauty can be presented by a ‘when a woman doesn’t look good’ visualization.

Keywords: ambient media, advertisement, subjectivity, power

Procedia PDF Downloads 318
1523 Determination and Distribution of Formation Thickness Using Seismic and Well Data in Baga/Lake Sub-basin, Chad Basin Nigeria

Authors: Gabriel Efomeh Omolaiye, Olatunji Seminu, Jimoh Ajadi, Yusuf Ayoola Jimoh

Abstract:

The Nigerian part of the Chad Basin has to date received little critical study and has few published scholarly works, compared to other basins such as the Niger Delta, Dahomey, etc. This work integrates 3D seismic interpretation with well data analysis for eight wells fairly distributed over block A of the Baga/Lake sub-basin in the Borno Basin, with the aim of determining the thicknesses of the Chad, Kerri-Kerri, Fika, and Gongila Formations in the sub-basin. The Da-1 well (type well) used in this study was subdivided into stratigraphic units based on the regional stratigraphic subdivision of the Chad Basin and was later correlated with the other wells using the similarity of observed log responses. Combined density and sonic logs were used to generate synthetic seismograms for seismic-to-well ties. Five horizons, representing the tops of the formations, were mapped on the 3D seismic data covering the block; an average velocity function with a maximum error/residual of 0.48% was adopted in the time-to-depth conversion of all the generated maps. There is a general thickening of sediments from west to east, and the estimated thicknesses of the formations in the Baga/Lake sub-basin are: Chad Formation (400-750 m), Kerri-Kerri Formation (300-1200 m), Fika Formation (300-2200 m) and Gongila Formation (100-1300 m). The thickness of the Bima Formation could not be established because the deepest well (Da-1) terminates within it. This is a modification of previous and widely referenced studies, more than four decades old, that based the estimation of formation thickness within the study area on outcrops observed at different locations and on the use of few well data.
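
The time-to-depth step mentioned above can be sketched as follows, assuming a simple linear average-velocity function; the coefficients and horizon picks are placeholders, not the calibrated function used in the study.

```python
# Sketch: convert two-way travel-time picks to depth with an average velocity
# function V(t) = V0 + k*t, so depth = V(t) * t / 2. Values are illustrative only.
import numpy as np

def twt_to_depth(twt_s, v0=1800.0, k=350.0):
    """Depth (m) from two-way time (s) using a linear average-velocity function."""
    v_avg = v0 + k * twt_s              # average velocity at this two-way time (m/s)
    return v_avg * twt_s / 2.0

# Formation thickness = depth of base horizon minus depth of top horizon
top, base = np.array([0.45, 0.50]), np.array([0.80, 0.95])   # hypothetical TWT picks (s)
print(twt_to_depth(base) - twt_to_depth(top))
```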

Keywords: Baga/Lake sub-basin, Chad basin, formation thickness, seismic, velocity

Procedia PDF Downloads 178
1522 Modernizer'ness as Madness: A Comparative Historical Study of Emperor Tewodros II of Ethiopia and Sultan Selim III of Ottoman Turkey's Modernization Reforms

Authors: Seid Ahmed Mohammed, Nedim Yalansiz

Abstract:

Many historians have hardly given due attention to historical comparison as a method of study; they remain staunch supporters of their own, single-case historical research methods. But such methods offer no way to analyze certain worldwide dynamics of events in a comparative perspective. Dynamics such as revolution, modernization, and societal change and transformation need broader analysis, which broadens our historical knowledge by comparing and contrasting the causes, courses and consequences of such historical developments in the world at large. In this paper, our study focuses on ‘the dynamics of modernization’ and the challenge of modernity for the old regimes. Countries like Turkey, Ethiopia, China, Russia, Iran, Afghanistan and Thailand faced almost the same dynamics in confronting the challenge of modernity. In these countries, the old regimes tried to introduce modernization and ‘reform from above’ in order to arrest the gradual decline of empires facing strong challenges from the outside world. Another similarity is that, as the rulers attempted to introduce modernization reforms, the old traditional and religious institutions strongly opposed them, since the reforms eroded the power and prestige of the traditional classes. Similarly, the rulers introduced modernization to preserve their own unique socio-cultural and religious dynamics, not as a borrowing from and acculturation to the West through complete destruction of their own traditions. Therefore, this paper attempts a comparative analysis of two modernizers, Tewodros II (1855-1868) of Ethiopia and Sultan Selim III (1789-1807) of Ottoman Turkey, who tried to modernize their empires and who, unfortunately, paid with their lives as a result of modernization.

Keywords: comparative history, Ethiopia, modernization, Ottoman Turkey

Procedia PDF Downloads 200
1521 Molecular Diagnosis of Influenza Strains Was Carried Out on Patients of the Social Security Clinic in Karaj Using the RT-PCR Technique

Authors: A. Ferasat, S. Rostampour Yasouri

Abstract:

Seasonal flu is a highly contagious infection caused by influenza viruses. These viruses undergo genetic changes that result in new epidemics across the globe. Medical attention is crucial in severe cases, particularly for the elderly, the frail, and those with chronic illnesses, as their immune systems are often weaker. The purpose of this study was to rapidly detect new subtypes of the influenza A virus using a specific RT-PCR method based on the HA (hemagglutinin) gene. In the winter and spring of 2022-2023, 120 samples suspected of seasonal influenza were cultured in embryonated eggs. RNA extraction, followed by cDNA synthesis, was performed. Finally, the PCR technique was applied using a pair of specific primers designed on the basis of the HA gene. The PCR product was identified after purification, and the nucleotide sequences of the purified PCR products were compared with sequences in GenBank. The results showed high similarity between the sequences of the positive samples isolated from the patients and the sequences of strains isolated in recent years. The RT-PCR technique used in this study is entirely specific, enabling the detection and amplification of influenza and its subtypes from clinical samples. The RT-PCR technique based on the HA gene, together with sequencing, is a fast, specific, and sensitive diagnostic method for patients infected with influenza viruses and their new subtypes. Rapid molecular diagnosis of influenza in suspected patients is essential to control and prevent the spread of the disease to others. It also helps prevent the secondary (sometimes fatal) pneumonia that results from influenza and pathogenic bacteria. The critical role of rapid diagnosis of new influenza strains is to allow preparation of a vaccine against the latest viruses, which did not exist in the community the previous year and are entirely new.
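
As a toy illustration of the sequence-comparison step (comparing a sequenced HA amplicon against a reference), the sketch below uses Biopython's pairwise aligner; the sequences are placeholders and the identity measure is a crude proxy, not the BLAST comparison performed in the study.

```python
# Illustrative only: crude percent identity between a sequenced amplicon and a
# reference using Biopython's PairwiseAligner (default scores: match = 1,
# mismatch = 0, gaps = 0, so the score counts matched bases in the best alignment).
from Bio import Align

aligner = Align.PairwiseAligner()     # global alignment by default

def percent_identity(query, reference):
    matches = aligner.score(query, reference)
    return 100.0 * matches / max(len(query), len(reference))

# Placeholder sequences, not real influenza data
print(percent_identity("ATGAAGGCAATACTAGTAGTT", "ATGAAGGCAATACTAGTAGTC"))
```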

Keywords: influenza, molecular diagnosis, patients, RT-PCR technique

Procedia PDF Downloads 68
1520 Rapid Soil Classification Using Computer Vision, Electrical Resistivity and Soil Strength

Authors: Eugene Y. J. Aw, J. W. Koh, S. H. Chew, K. E. Chua, Lionel L. J. Ang, Algernon C. S. Hong, Danette S. E. Tan, Grace H. B. Foo, K. Q. Hong, L. M. Cheng, M. L. Leong

Abstract:

This paper presents a novel rapid soil classification technique that combines computer vision with the four-probe soil electrical resistivity method and the cone penetration test (CPT), to improve the accuracy and productivity of on-site classification of excavated soil. In Singapore, excavated soils from local construction projects are transported to Staging Grounds (SGs) to be reused as fill material for land reclamation. Excavated soils are mainly categorized into two groups (“Good Earth” and “Soft Clay”) based on particle size distribution (PSD) and water content (w) from soil investigation reports and on-site visual survey, such that proper treatment and usage can be exercised. However, this process is time-consuming and labour-intensive. Thus, a rapid classification method is needed at the SGs. Computer vision, four-probe soil electrical resistivity and CPT were combined into an innovative non-destructive and instantaneous classification method for this purpose. The computer vision technique comprises soil image acquisition using an industrial-grade camera; image processing and analysis via calculation of Grey Level Co-occurrence Matrix (GLCM) textural parameters; and decision-making using an Artificial Neural Network (ANN). Complementing the computer vision technique, the apparent electrical resistivity of the soil (ρ) is measured using a set of four probes arranged in Wenner’s array. It was found from a previous study that the ANN model coupled with ρ can classify soils into “Good Earth” and “Soft Clay” in less than a minute, with an accuracy of 85% based on selected representative soil images. To further improve the technique, the soil strength is measured using a modified mini cone penetrometer, and w is measured using a set of time-domain reflectometry (TDR) probes. A laboratory proof of concept was conducted through a series of seven tests with three types of soils: “Good Earth”, “Soft Clay” and an even mix of the two. Validation was performed against the PSD and w of each soil type obtained from conventional laboratory tests. The results show that ρ, w and CPT measurements can be collectively analyzed to classify soils into “Good Earth” or “Soft Clay”. It is also found that these parameters can be integrated with the computer vision technique on-site to complete the rapid soil classification in less than three minutes.
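
A minimal sketch of the GLCM texture step is given below, assuming scikit-image (version 0.19 or later); the distances, angles and chosen properties are illustrative, and the ANN decision stage and the resistivity/CPT/TDR fusion are not shown.

```python
# Sketch: GLCM textural parameters from a greyscale soil image, as one block of
# features that would feed the ANN classifier together with rho, CPT and TDR data.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray_uint8):
    glcm = graycomatrix(gray_uint8, distances=[1], angles=[0, np.pi/2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.array([graycoprops(glcm, p).mean() for p in props])
```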

Keywords: computer vision technique, cone penetration test, electrical resistivity, rapid and non-destructive, soil classification

Procedia PDF Downloads 209
1519 A Study of Preliminary Findings of Behavioral Patterns under Captive Conditions in Chinkara (Gazella bennettii) with Prospects for Future Conservation

Authors: Muhammad Idnan, Arshad Javid, Muhammad Nadeem

Abstract:

The present study was conducted from April 2013 to March 2014 to observe the behavioral parameters of chinkara (Gazella bennettii) under captive conditions by comparing captive-born and wild-caught animals for conservation strategies. Understanding behavioral conformation plays a significant role in captive management. Owing to the human population explosion and mechanized hunting, captive breeding seems to be the best way to supply sport hunting, bush meat, leather, and horns for traditional medicinal use. Captive management has so far been carried out largely on a trial-and-error basis because of the lack of ethological data on this Least Concern species. The behavior of 30 adult chinkara, 20 wild-caught (WC) and 10 captive-bred (CB), was observed at the captive breeding facility for ungulates at Ravi Campus, University of Veterinary and Animal Sciences, in Kasur district, situated to the southeast of Lahore. The average annual rainfall there is about 650 mm, with frequent rain during the monsoon. Focal sampling was used to observe the various behavioral patterns of CB and WC chinkara. The behavioral parameters of WC and CB animals were broadly similar; however, where differences occurred, WC males showed a significantly higher degree of agonistic interaction than CB males. These findings suggest that there is no immediate impact of captivity on the behavior of chinkara, notwithstanding 10 generations in captivity. It is suggested that the chinkara is not yet suitable for domestication, and a further study of chinkara ethology is recommended for successful farming.

Keywords: Chinkara (Gazella bennettii), domestication, deer farming, ex-situ conservation

Procedia PDF Downloads 160
1518 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things

Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker

Abstract:

Traditionally in sensor networks, and recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect changes in the stochastic process as quickly as possible with a tolerable false alarm rate. However, sensors may have different accuracies and sensitivity ranges, and they decay over time. As a result, the big time-series data collected by the sensors will contain uncertainties, and sometimes the data are conflicting. In this study, we present a framework that takes advantage of the capability of evidence theory (a.k.a. Dempster-Shafer and Dezert-Smarandache theories) to represent and manage uncertainty and conflict, in order to achieve fast change detection and deal effectively with complementary hypotheses. Specifically, the Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Mass functions are then calculated, and the related combination rules are applied to combine the mass values among all sensors. Furthermore, we apply the method to estimate the minimum number of sensors needed for combination, so that computational efficiency can be improved. A cumulative sum (CUSUM) test is then applied to the ratio of pignistic probabilities to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
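
Two of the ingredients named above, the KL-divergence distance and the CUSUM decision rule, are easy to sketch in isolation; the snippet below assumes Gaussian pre- and post-change models and omits the evidence-theory mass-function combination entirely.

```python
# Sketch: KL divergence between Gaussian models and a classical CUSUM detector.
import numpy as np

def kl_gauss(mu0, var0, mu1, var1):
    """KL divergence KL( N(mu0, var0) || N(mu1, var1) ) for univariate Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def cusum(stream, mu0, mu1, var, threshold=10.0):
    """CUSUM on the log-likelihood ratio; returns the detection index or -1."""
    g = 0.0
    for n, x in enumerate(stream):
        llr = ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * var)
        g = max(0.0, g + llr)
        if g > threshold:
            return n
    return -1
```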

Keywords: CUSUM, evidence theory, KL divergence, quickest change detection, time series data

Procedia PDF Downloads 328
1517 Tumor Size and Lymph Node Metastasis Detection in Colon Cancer Patients Using MR Images

Authors: Mohammadreza Hedyehzadeh, Mahdi Yousefi

Abstract:

Colon cancer is one of the most common cancers, and its prevalence is predicted to increase because of poor eating habits. Nowadays, due to people’s busy lives, the consumption of fast food is increasing, and therefore the diagnosis and treatment of this disease are of particular importance. To determine the best treatment approach for each colon cancer patient, the oncologist needs to know the stage of the tumor. The most common way to determine the tumor stage is the TNM staging system, in which M indicates the presence of metastasis, N indicates the extent of spread to the lymph nodes, and T indicates the size of the tumor. Clearly, determining all three of these parameters requires an imaging method, and the gold-standard imaging protocols for this purpose are CT and PET/CT. In CT imaging, because of the use of X-rays, the cancer risk and the absorbed dose for the patient are high, while for the PET/CT method access to the device is limited because of its high cost. Therefore, in this study, we aimed to estimate the tumor size and the extent of its spread to the lymph nodes using MR images. More than 1300 MR images were collected from the TCIA portal, and in the first (pre-processing) step, histogram equalization to improve image quality and resizing to a common image size were performed. Two expert radiologists, who have worked on colon cancer cases for more than 21 years, segmented the images and extracted the tumor region. The next step is feature extraction from the segmented images, followed by classification of the data into three classes: T0N0, T3N1 and T3N2. In this article, the VGG-16 convolutional neural network has been used to perform both of the above-mentioned tasks, i.e., feature extraction and classification. This network has 13 convolutional layers for feature extraction and three fully connected layers with a softmax activation function for classification. To validate the proposed method, 10-fold cross-validation was used, in which the data were randomly divided into three parts: training (70% of the data), validation (10% of the data) and the rest for testing. This was repeated 10 times; each time, the accuracy, sensitivity and specificity of the model were calculated, and the average of the ten repetitions is reported as the result. The accuracy, specificity and sensitivity of the proposed method on the test dataset were 89.09%, 95.8% and 96.4%. Compared to previous studies, the use of a safe imaging technique (MRI) and the avoidance of predefined hand-crafted imaging features for determining the stage of colon cancer patients are among the advantages of this study.
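
A schematic of the VGG-16 reuse described above is sketched below with torchvision; freezing the convolutional layers, the optimizer and the learning rate are assumptions for illustration, and the MRI data pipeline and 10-fold cross-validation loop are omitted.

```python
# Sketch: VGG-16 with its final fully connected layer replaced by a 3-way output
# for the classes T0N0, T3N1 and T3N2 (softmax is applied inside the loss).
import torch
import torch.nn as nn
from torchvision import models

def build_vgg16_stager(num_classes=3):
    model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
    for p in model.features.parameters():     # keep the 13 convolutional layers fixed
        p.requires_grad = False
    model.classifier[6] = nn.Linear(4096, num_classes)   # new 3-class output layer
    return model

model = build_vgg16_stager()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-4)
```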

Keywords: colon cancer, VGG-16, magnetic resonance imaging, tumor size, lymph node metastasis

Procedia PDF Downloads 54
1516 Genetic Characterization of Acanthamoeba Isolates from Amoebic Keratitis Patients

Authors: Sumeeta Khurana, Kirti Megha, Amit Gupta, Rakesh Sehgal

Abstract:

Background: Amoebic keratitis is a painful, vision-threatening infection caused by the free-living pathogenic amoeba Acanthamoeba. It can be misdiagnosed and is very difficult to treat if not suspected early. The epidemiology of the Acanthamoeba genotypes causing infection in our geographical area is, to the best of our knowledge, not yet known. Objective: To characterize Acanthamoeba isolates from amoebic keratitis patients. Methods: A total of 19 isolates were obtained from patients with amoebic keratitis presenting to the Advanced Eye Centre at the Postgraduate Institute of Medical Education and Research, a tertiary care centre in North India, over the last 10 years. Corneal scrapings, lens solution and lens cases (for lens wearers) were collected for microscopic examination, culture and molecular diagnosis. All isolates were maintained on non-nutrient agar overlaid with E. coli, and 13 strains were axenised and maintained on modified peptone yeast dextrose agar. Identification of Acanthamoeba genotypes was based on amplification of the diagnostic fragment 3 (DF3) region of the 18S rRNA gene, followed by sequencing. Nucleotide similarity searches were performed by BLAST of the sequenced amplicons against the GenBank database (http://www.ncbi.nlm.nih.gov/blast). Multiple sequence alignments were determined using CLUSTAL X. Results: Nine of the 19 Acanthamoeba isolates were found to belong to genotype T4, followed by 6 isolates of genotype T11, 3 of T5 and 1 of T3. Conclusion: T4 is the predominant Acanthamoeba genotype in our geographical area. Further studies should focus on differences in the pathogenicity of these genotypes and their clinical significance.

Keywords: Acanthamoeba, free living amoeba, keratitis, genotype, ocular

Procedia PDF Downloads 233
1515 Comparative Parametric and Emission Characteristics of Single Cylinder Spark Ignition Engine Using Gasoline, Ethanol, and H₂O as Micro Emulsion Fuels

Authors: Ufaith Qadri, M Marouf Wani

Abstract:

In this paper, the performance and emission characteristics of a single-cylinder spark ignition engine have been investigated. The research is based on the application of microemulsions as fuel in a gasoline engine. We have analyzed several microemulsion compositions in various proportions in order to predict the performance of the spark ignition engine. This fuel-modification technology is emerging rapidly, with much ongoing research on microemulsion fuels in compression ignition engines, but the use of microemulsion fuel in a gasoline engine is rare and remains virtually unexplored. Our main goal is therefore to examine the performance and emission characteristics of microemulsions as fuel in spark ignition engines and to find which composition is most efficient. In this research, various microemulsion fuels with differing compositions were used for all three blends, and their performance and emission characteristics were predicted in the AVL Boost software. Conventional gasoline at 90%, 85% and 80% was blended with ethanol as co-surfactant in different proportions, and water was used as an additive to obtain a crystal-clear, transparent, thermodynamically stable microemulsion fuel. Comparing engine performance, power was similar for the microemulsion fuels and conventional gasoline, while torque and BMEP increased for all the microemulsion fuels. The microemulsion fuels showed higher thermal efficiency and lower specific fuel consumption than gasoline for all compositions. Carbon monoxide and hydrocarbon emissions were also measured; the results show that emissions decreased for all microemulsion compositions, which proved to be efficient fuels in terms of both performance and emission characteristics.
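
The two performance measures compared above can be illustrated with a short calculation; the fuel flow, brake power and lower heating value below are placeholder numbers, not results from the AVL Boost simulations.

```python
# Sketch: brake specific fuel consumption and brake thermal efficiency.
def bsfc_g_per_kwh(fuel_flow_kg_per_h, brake_power_kw):
    return 1000.0 * fuel_flow_kg_per_h / brake_power_kw

def brake_thermal_efficiency(bsfc_g_kwh, lhv_mj_per_kg):
    # efficiency = 3600 [kJ/kWh] / (BSFC [kg/kWh] * LHV [kJ/kg])
    return 3600.0 / ((bsfc_g_kwh / 1000.0) * (lhv_mj_per_kg * 1000.0))

bsfc = bsfc_g_per_kwh(fuel_flow_kg_per_h=1.2, brake_power_kw=4.5)   # ~267 g/kWh
print(bsfc, brake_thermal_efficiency(bsfc, lhv_mj_per_kg=42.5))     # ~0.32
```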

Keywords: AVL Boost, emissions, microemulsions, performance, Spark Ignition (SI) engine

Procedia PDF Downloads 259
1514 Effects of Allowance for Corporate Equity on the Financing Choices of Belgian Small and Medium-Sized Enterprises in a Crisis Context

Authors: O. Colot, M. Croquet, L. Cultrera, Y. Fandja Collince

Abstract:

The objective of our research is to evaluate the impact of the allowance for corporate equity (ACE) on the financial structure of Belgian SMEs, in order to highlight the potential existence of a fiscal leverage effect. To limit the biases linked to capital rationing following the financial crisis, we first compare the dynamic evolution of the financial structure of Belgian firms over the period 2006-2015, focusing on three sub-periods: 2006-2008, 2009-2012 and 2013-2015. We then give this comparison an international dimension by including SMEs from countries adjoining Belgium (France, Germany, the Netherlands and the United Kingdom) in which no ACE exists. This comparison allows a better understanding of the fiscal advantage linked to the ACE for firms operating in the relatively unstable economic environment that followed the 2008 financial crisis. This research is relevant given the economic and political context in which Belgium operates and the very uncertain future of the Belgian ACE. The originality of this research is twofold: the long study period and the consideration of the effects of the financial and economic crisis on the financing structure of Belgian SMEs. The results of this research, although they confirm the existence of a positive fiscal leverage of the tax deduction for venture capital on the financing structure of Belgian SMEs, do not allow the extent of this leverage to be clearly quantified. The comparative evolution of financing structures over the period 2006-2015 of Belgian, French, German, Dutch and English SMEs shows a strong similarity in their overall evolution.

Keywords: allowance for corporate equity, Belgium, financial structure, small and medium sized firms

Procedia PDF Downloads 199
1513 Multilevel Gray Scale Image Encryption through 2D Cellular Automata

Authors: Rupali Bhardwaj

Abstract:

Cryptography is the science of using mathematics to encrypt and decrypt data; the data are converted into some other, unintelligible form before the encrypted data are transmitted. The primary purpose of this paper is to provide two levels of security through a two-step process: rather than transmitting the message bits directly, they are first encrypted using 2D cellular automata and then scrambled with the Arnold cat map transformation. This provides an additional layer of protection and reduces the chance of the transmitted message being detected. A comparative analysis of the effectiveness of the scrambling technique is provided using scrambling-degree measurement parameters, i.e., the gray difference degree (GDD) and the correlation coefficient.
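
A minimal sketch of the scrambling stage and one of the quality measures is given below; the cellular-automata encryption step and the GDD computation are omitted, and the number of iterations is an arbitrary choice.

```python
# Sketch: Arnold cat map iterations on a square gray-scale image, plus the
# adjacent-pixel correlation used to judge scrambling quality (lower is better).
import numpy as np

def arnold_cat(img, iterations=1):
    n = img.shape[0]                     # image must be square (N x N)
    out = img.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

def adjacent_pixel_correlation(img):
    a = img[:, :-1].ravel().astype(float)
    b = img[:, 1:].ravel().astype(float)
    return np.corrcoef(a, b)[0, 1]
```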

Keywords: scrambling, cellular automata, Arnold cat map, game of life, gray difference degree, correlation coefficient

Procedia PDF Downloads 372
1512 Application Difference between Cox and Logistic Regression Models

Authors: Idrissa Kayijuka

Abstract:

The logistic regression and Cox regression (proportional hazards) models are currently employed in the analysis of prospective epidemiologic research into risk factors for chronic diseases, and a theoretical relationship between the two models has been studied. By definition, the Cox regression model, also called the Cox proportional hazards model, is a procedure used to model data on the time leading up to an event when censored cases exist, whereas the logistic regression model is mostly applicable when the independent variables consist of numerical as well as nominal values and the outcome variable is binary (dichotomous). The arguments and findings of many researchers have focused on overviews of the Cox and logistic regression models and their applications in different areas. In this work, the analysis is done on secondary data from an SPSS exercise dataset on breast cancer with a sample size of 1121 women, where the main objective is to show the difference in application between the Cox regression model and the logistic regression model based on factors that cause women to die of breast cancer. Some analysis was done manually, i.e., on lymph node status, and SPSS software was used to analyze the remaining data. This study found that there is a difference in application between the Cox and logistic regression models: the Cox regression model is used if one wishes to analyze data that also include follow-up time, whereas the logistic regression model analyzes data without follow-up time. They also have different measures of association: the hazard ratio for the Cox model and the odds ratio for the logistic model. A similarity between the two models is that both are applicable to predicting the outcome of a categorical variable, i.e., a variable that can take only a restricted number of categories. In conclusion, the Cox regression model differs from logistic regression by assessing a rate instead of a proportion. Both models can be applied in many other studies since they are suitable methods for analyzing data, but the Cox regression model is the more recommended.
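
The practical difference can be made concrete with a short sketch; the data frame below is hypothetical (it is not the SPSS breast-cancer file), and the lifelines and statsmodels packages stand in for the SPSS procedures used in the study.

```python
# Sketch: the Cox model needs follow-up time and an event indicator and yields
# hazard ratios; logistic regression drops the time and yields odds ratios.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
import statsmodels.api as sm

df = pd.DataFrame({                                   # hypothetical records
    "time_months": [12, 30, 45, 7, 60, 24, 18, 50],   # follow-up time (Cox only)
    "died": [1, 0, 1, 0, 1, 1, 0, 0],                 # event indicator / binary outcome
    "positive_nodes": [4, 0, 2, 5, 1, 6, 2, 0],       # risk factor: lymph node status
})

cox = CoxPHFitter().fit(df, duration_col="time_months", event_col="died")
hazard_ratios = np.exp(cox.params_)                   # measure of association for Cox

logit = sm.Logit(df["died"], sm.add_constant(df[["positive_nodes"]])).fit()
odds_ratios = np.exp(logit.params)                    # measure of association for logistic
```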

Keywords: logistic regression model, Cox regression model, survival analysis, hazard ratio

Procedia PDF Downloads 448
1511 Localized and Time-Resolved Velocity Measurements of Pulsatile Flow in a Rectangular Channel

Authors: R. Blythman, N. Jeffers, T. Persoons, D. B. Murray

Abstract:

The exploitation of flow pulsation in micro- and mini-channels is a potentially useful technique for enhancing cooling of high-end photonics and electronics systems. It is thought that pulsation alters the thickness of the hydrodynamic and thermal boundary layers, and hence affects the overall thermal resistance of the heat sink. Although the fluid mechanics and heat transfer are inextricably linked, it can be useful to decouple the parameters to better understand the mechanisms underlying any heat transfer enhancement. Using two-dimensional, two-component particle image velocimetry, the current work intends to characterize the heat transfer mechanisms in pulsating flow with a mean Reynolds number of 48 by experimentally quantifying the hydrodynamics of a generic liquid-cooled channel geometry. Flows circulated through the test section by a gear pump are modulated using a controller to achieve sinusoidal flow pulsations with Womersley numbers of 7.45 and 2.36 and an amplitude ratio of 0.75. It is found that the transient characteristics of the measured velocity profiles are dependent on the speed of oscillation, in accordance with the analytical solution for flow in a rectangular channel. A large velocity overshoot is observed close to the wall at high frequencies, resulting from the interaction of near-wall viscous stresses and inertial effects of the main fluid body. The steep velocity gradients at the wall are indicative of augmented heat transfer, although the local flow reversal may reduce the upstream temperature difference in heat transfer applications. While unsteady effects remain evident at the lower frequency, the annular effect subsides and retreats from the wall. The shear rate at the wall is increased during the accelerating half-cycle and decreased during deceleration compared to steady flow, suggesting that the flow may experience both enhanced and diminished heat transfer during a single period. Hence, the thickness of the hydrodynamic boundary layer is reduced for positively moving flow during one half of the pulsation cycle at the investigated frequencies. It is expected that the size of the thermal boundary layer is similarly reduced during the cycle, leading to intervals of heat transfer enhancement.

Keywords: heat transfer enhancement, particle image velocimetry, localized and time-resolved velocity, photonics and electronics cooling, pulsating flow, Richardson’s annular effect

Procedia PDF Downloads 345
1510 Appearance-Based Discrimination in a Workplace: An Emerging Problem for Labor Law Relationships

Authors: Irmina Miernicka

Abstract:

Nowadays, dress codes and appearance in the broad sense are becoming more important in the workplace. They are often used to standardize the employer's image, to communicate a corporate identity and to ensure that customers can easily identify the organization; they are also a way to build an employer's professionalism. Additionally, in many cases an employer will introduce a dress code for health and safety reasons. Employers increasingly oblige employees to follow certain rules concerning their clothing, grooming, make-up, body art or even weight. An important research problem is to find the limits of the employer's interference with the external appearance of employees. These limits are primarily determined by the two main obligations of the employer, i.e., the obligation to respect the employee's personal rights and the principle of equal treatment and non-discrimination in employment. It should also be remembered that the limits of the employer's interference will be different when certain rules concerning the employee's appearance result directly from provisions of statutes and other acts of universally binding law (workwear, official clothing, uniforms). The analysis of this issue was based on literature and jurisprudence, both domestic and foreign, including U.S. and European case law, and led the author to put forward the thesis that there are four main principles which will protect the employer from an allegation of discrimination. First is the principle of adequacy: requirements regarding the dress code must be appropriate to the position and the type of work performed by the employee. Secondly, in accordance with the purpose limitation principle, an employer may introduce certain requirements regarding the appearance of employees if there is a legitimate, objective justification for doing so (such as work safety or the type of work performed), not one dictated by the employer's subjective feelings and preferences. Thirdly, these requirements must not place an excessive burden on workers or be disproportionate in relation to the employer's objective (principle of proportionality). Fourthly, the employer should ensure that the requirements imposed in the workplace are equally burdensome for, and enforceable against, all groups of employees; otherwise, it may expose itself to allegations of discrimination based on sex or age. At the same time, it is possible to differentiate the situation of some employees if the differences are small, reflect established habits and traditions, and employees are obliged to maintain the same level of professionalism in their positions. Although this subject may seem insignificant, the frequent application of dress codes and the increasing awareness of both employees and employers indicate that its legal aspects need to be thoroughly analyzed. Many legal cases brought before U.S. and European courts show that employees seek legal protection when they consider that their rights have been violated by a dress code introduced in the workplace.

Keywords: labor law, the appearance of an employee, discrimination in the workplace, dress code in a workplace

Procedia PDF Downloads 122
1509 Formal Ontology of Quality Space. Location, Subordination and Determination

Authors: Claudio Calosi, Damiano Costa, Paolo Natali

Abstract:

Determination is the relation that holds between certain kinds of properties, determinables (such as “being colored”) and others, determinates (such as “being red”). Subordination is the relation that holds between genus properties, such as “being an animal”, and species properties, such as “being human”. It is widely held that determination and subordination share important similarities, yet also crucial differences. But what grounds such similarities and differences? This question is hardly ever addressed. The present paper provides a first step towards filling this gap in the literature. It argues that a locational theory of instantiation, roughly the view that to have a property is to occupy a location in quality space, holds the key to such an answer. More precisely, it argues that the principles of both determination and subordination are just instances of more general principles of location. Consider determination. The principle that everything that has a determinate has a determinable boils down to the claim that everything that has a precise location in quality space is in quality space, an eminently reasonable principle. The principle that nothing can have two determinates (at the same level of determination) boils down to the principle that nothing can be “multilocated” in quality space. In effect, the following translation table relates principles of location to principles of determination:

Functionality -> At Most One Determination
Focus -> At Most One Determination & Requisite Determination*
Exactness -> Requisite Determination*
Super-Exactness -> Requisite Determination
Exactitude -> Requisite Determination
Converse-Exactness -> Determinable Inheritance

This grounds the similarity between determination and subordination. What about the differences? The paper argues that the differences boil down to the mereological structure of the regions occupied in quality space, in particular whether they are simple or complex. The key technical detail is that determination and subordination induce a set-theoretic rooted tree structure over the domain of properties. Interestingly, the analysis also provides a possible justification for the Aristotelian claim that being is not a genus property, an argument that the paper develops in some detail.

Keywords: determinables/determinates, genus/species, location, Aristotle on being is not a genus

Procedia PDF Downloads 75
1508 Fabrication of Hybrid Scaffolds Consisting of Cell-laden Electrospun Micro/Nanofibers and PCL Micro-structures for Tissue Regeneration

Authors: MyungGu Yeo, JongHan Ha, Gi-Hoon Yang, JaeYoon Lee, SeungHyun Ahn, Hyeongjin Lee, HoJun Jeon, YongBok Kim, Minseong Kim, GeunHyung Kim

Abstract:

Tissue engineering is a rapidly growing interdisciplinary research area that may provide options for treating damaged tissues and organs. As a promising technique for regenerating various tissues, this technology requires biomedical scaffolds, which serve as an artificial extracellular matrix (ECM) to support neotissue growth. Electrospun micro/nanofibers have been used widely in tissue engineering because of their high surface-area-to-volume ratio and structural similarity to the extracellular matrix. However, low mechanical sustainability, poor 3D shape-ability, and low cell infiltration have been major limitations to their use. In this work, we propose new hybrid scaffolds interlayered with cell-laden electrospun micro/nanofibers and poly(caprolactone) microstructures. We also applied various concentrations of alginate and electric field strengths to determine the optimal conditions for the cell-electrospinning process. The combination of a cell-laden bioink (2 × 10^5 osteoblast-like MG63 cells/mL, 2 wt% alginate, 2 wt% poly(ethylene oxide), and 0.7 wt% lecithin) and a 0.16 kV/mm electric field showed the highest cell viability and fiber formation in this process. Using these conditions and PCL microstructures, we achieved mechanically stable hybrid scaffolds. In addition, the cells embedded in the fibrous structure were viable and proliferated. We suggest that the cell-embedded hybrid scaffolds fabricated using the cell-electrospinning process may be useful for various soft- and hard-tissue regeneration applications.

Keywords: bioink, cell-laden scaffold, micro/nanofibers, poly(caprolactone)

Procedia PDF Downloads 377
1507 Participatory Cartography for Disaster Reduction in Progreso, Yucatan, Mexico

Authors: Gustavo Cruz-Bello

Abstract:

Progreso is a coastal community in Yucatan, Mexico, highly exposed to floods produced by severe storms and tropical cyclones. A participatory cartography approach was conducted to help reduce flood disasters and assess social vulnerability within the community. The first step was to engage local authorities in risk management to facilitate the process. Two workshops were conducted. In the first, a poster-size printed high-spatial-resolution satellite image of the town was used to gather information from the participants: eight women and seven men, among them construction workers, students, government employees and fishermen, aged between 23 and 58 years old. As a first task, participants were asked to locate emblematic places and mark them on the image to familiarize themselves with it. Then they were asked to locate areas that get flooded and the buildings they use as refuges, and to list the actions they usually take to reduce vulnerability, as well as to come up collectively with others that might reduce disasters. The spatial information generated at the workshops was digitized and integrated into a GIS environment. A printed version of the map was reviewed by local risk management experts, who validated the feasibility of the proposed actions. For the second workshop, we brought the information back to the community for feedback. Additionally, a survey was applied to one household per block in the community to obtain socioeconomic, prevention and adaptation data. The information generated in the workshops was contrasted, through t and chi-squared tests, with the survey data in order to test the hypothesis that poorer or less educated people are less prepared to face floods (more vulnerable) and live near or among a higher presence of floods. Results showed that a great majority of people in the community are aware of the hazard and are prepared to face it. However, there was not a consistent relationship between regularly flooded areas and people's average years of education, household services, or house modifications against heavy rains. We could say that the participatory cartography intervention made participants aware of their vulnerability and made them collectively reflect on actions that can reduce disasters produced by floods. They also considered that the final map could be used as a communication and negotiation instrument with NGOs and government authorities. It was not found that poorer and less educated people are located in areas with a higher presence of floods.
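
The statistical comparison mentioned above (t and chi-squared tests between survey variables and flood exposure) can be sketched as follows; all values are hypothetical and only illustrate the shape of the analysis.

```python
# Sketch: compare years of education between flooded and non-flooded blocks (t-test)
# and test an association between flood exposure and preparedness (chi-squared).
from scipy.stats import ttest_ind, chi2_contingency

educ_flooded = [6, 9, 7, 8, 5, 10]        # hypothetical years of education, flooded blocks
educ_not_flooded = [9, 11, 8, 12, 7, 10]  # hypothetical years of education, other blocks
print(ttest_ind(educ_flooded, educ_not_flooded, equal_var=False))

# rows: flooded / not flooded; columns: prepared / not prepared households
table = [[18, 12], [22, 8]]
chi2, p, dof, expected = chi2_contingency(table)
print(chi2, p)
```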

Keywords: climate change, floods, Mexico, participatory mapping, social vulnerability

Procedia PDF Downloads 112
1506 Image Processing and Calculation of NGRDI Embedded System in Raspberry

Authors: Efren Lopez Jimenez, Maria Isabel Cajero, J. Irving-Vasqueza

Abstract:

The use and processing of digital images have opened up new opportunities for solving problems of various kinds, such as the calculation of different vegetation indices that, among other things, differentiate healthy vegetation from humid vegetation. However, obtaining the images from which these indices are calculated is still the subject of active research. In the present work, we propose to obtain these images using a low-cost embedded system (Raspberry Pi) and to process them using the open-source library OpenCV, in order to compute the Normalized Red-Green Difference Index (NGRDI).
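
A minimal sketch of the NGRDI computation with OpenCV is shown below; the file name and threshold are placeholders, and the Raspberry Pi camera capture step is omitted.

```python
# Sketch: NGRDI = (G - R) / (G + R) computed per pixel from an RGB photograph.
import cv2
import numpy as np

def ngrdi(image_path):
    bgr = cv2.imread(image_path).astype(np.float32)
    blue, green, red = cv2.split(bgr)               # OpenCV stores channels as B, G, R
    return (green - red) / (green + red + 1e-6)     # NGRDI in [-1, 1]

index = ngrdi("field_photo.jpg")
vegetation_mask = (index > 0.0).astype(np.uint8) * 255   # simple thresholding example
```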

Keywords: Raspberry Pi, vegetation index, Normalized Red-Green Difference Index (NGRDI), OpenCV

Procedia PDF Downloads 287
1505 Assessment and Analysis of Literary Criticism and Consumer Research

Authors: Mohammad Mirzaei

Abstract:

This article proposes literary criticism as a source of insight into consumer behavior, provides an extensive overview of literary criticism, offers a concrete illustrative analysis, and makes suggestions for further research. To do so, a literary analysis of advertising copy identifies elements that provide additional information to consumer researchers, and the contribution of literary criticism to consumer research is discussed. Important post-war critical schools of thought are reviewed, and relevant theoretical concepts are summarized. Ivory Flakes advertisements are analyzed using a variety of concepts drawn from literary schools, primarily sociocultural and reader-response criticism. Suggestions for further research on content analysis, image analysis, and consumption history are presented.

Keywords: consumer behaviour, consumer research, consumption history, criticism

Procedia PDF Downloads 96