Search results for: similarity measure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3869

3719 GPU Accelerated Fractal Image Compression for Medical Imaging in Parallel Computing Platform

Authors: Md. Enamul Haque, Abdullah Al Kaisan, Mahmudur R. Saniat, Aminur Rahman

Abstract:

In this paper, we have implemented both sequential and parallel versions of fractal image compression algorithms using the CUDA (Compute Unified Device Architecture) programming model, parallelizing the program on the Graphics Processing Unit for medical images, as they are highly self-similar within the image itself. There are also several improvements in the implementation of the algorithm. Fractal image compression is based on the self-similarity of an image, meaning that the image shows similarity across the majority of its regions. We take this opportunity to implement the compression algorithm and monitor its effect using both parallel and sequential implementations. Fractal compression has the properties of a high compression rate and a resolution-independent scheme. The compression scheme for fractal images has two parts, encoding and decoding; encoding is computationally very expensive, whereas decoding is computationally light. Applying fractal compression to medical images would allow much higher compression ratios, while fractal magnification, an inseparable feature of fractal compression, would be very useful for presenting the reconstructed image in a highly readable form. However, like all irreversible methods, fractal compression involves information loss, which is especially troublesome in medical imaging. A very time-consuming encoding process, which can last several hours, is another bothersome drawback of fractal compression.
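
The computational core that the GPU parallelizes is the exhaustive search for, and affine fit of, a domain block for every range block. The sketch below is a minimal NumPy illustration of that per-block search; the CUDA kernel layout, block sizes, and data are assumptions, not the authors' implementation.

```python
# Minimal sketch of the range/domain block matching at the heart of fractal encoding.
import numpy as np

def encode_block(range_block, domain_blocks):
    """Find the domain block and affine map (s, o) that best reproduce a range block."""
    best = None
    r = range_block.flatten().astype(float)
    for idx, d in enumerate(domain_blocks):
        d = d.flatten().astype(float)
        A = np.vstack([d, np.ones_like(d)]).T          # least squares for r ~ s*d + o
        (s, o), *_ = np.linalg.lstsq(A, r, rcond=None)
        err = float(np.sum((s * d + o - r) ** 2))
        if best is None or err < best[0]:
            best = (err, idx, s, o)
    return best                                        # (error, domain index, contrast, brightness)

rng = np.random.default_rng(0)
ranges = rng.integers(0, 256, (4, 8, 8))               # toy range blocks
domains = rng.integers(0, 256, (16, 8, 8))             # toy domain blocks, already downsampled
print(encode_block(ranges[0], domains))
```

On a GPU, each range block would typically be handled by an independent thread or thread block, which is what makes the encoding step a natural candidate for CUDA parallelization.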

Keywords: accelerated GPU, CUDA, parallel computing, fractal image compression

Procedia PDF Downloads 332
3718 Macroeconomic Measure of Projectification: An Empirical Study of Pakistani Economy

Authors: Shafaq Rana, Hina Ansar

Abstract:

Projectification is an emerging phenomenon in Western economies, where projects have become a key driver of economic action. The impact of projectification has been under study for over a decade. A methodology was developed to measure the degree of projectification at the level of an economy; it was later adapted to measure the degree of projectification in Germany, Norway, and Iceland and to compare the differences between these project societies in terms of their industrial structure, organizational size, and share of project work. Using the same methodology, this study aims to provide empirical evidence of project work in the context of Pakistan, a developing nation, taking into consideration macroeconomic measures, qualitative and quantitative measures of projects including GDP, monetary measures, and project success. The research includes a qualitative pre-study to define these macro-measures in the country-specific context and a quantitative study to measure project work with respect to the hours worked on projects in organizations. The outcome of this study provides key data on projectification in a developing economy, which will help industry practitioners and decision-makers examine the consequences of projectification and strategize accordingly. This study also provides a foundation for further research in individual sectors of the country while exploring different macroeconomic questions, including the effect of projectification on project productivity, income effects, and the labor market.

Keywords: developing economy, Pakistan, project work, projectification

Procedia PDF Downloads 114
3717 Prioritization of Mutation Test Generation with Centrality Measure

Authors: Supachai Supmak, Yachai Limpiyakorn

Abstract:

Mutation testing can be applied for the quality assessment of test cases. Prioritization of mutation test generation has been a critical element of industry practice that contributes to the evaluation of test cases. Industry generally delivers products under time-to-market pressure and thus inevitably sacrifices software testing tasks, even though many test cases are required for software verification. This paper presents an approach that applies a social network centrality measure, PageRank, to prioritize mutation test generation. The source code modules with the highest PageRank values are focused on first when developing their test cases, as these modules are vulnerable to defects or anomalies that may cause consequent defects in many other associated modules. Moreover, the approach helps identify reducible test cases in the test suite while still maintaining the same criteria as the original set of test cases.
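
As an illustration of the ranking step, the sketch below builds a small dependency graph (module names and edges are hypothetical) and orders modules by PageRank with NetworkX; the mutation operators and test generation themselves are out of scope here.

```python
# Illustrative sketch: rank modules of a dependency/call graph by PageRank
# so that mutation tests are generated for the highest-ranked modules first.
import networkx as nx

calls = [("billing", "db"), ("api", "billing"), ("api", "auth"), ("auth", "db")]
G = nx.DiGraph(calls)

scores = nx.pagerank(G, alpha=0.85)
priority = sorted(scores, key=scores.get, reverse=True)
print(priority)   # modules with the highest PageRank are mutated and tested first
```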

Keywords: software testing, mutation test, network centrality measure, test case prioritization

Procedia PDF Downloads 111
3716 CompPSA: A Component-Based Pairwise RNA Secondary Structure Alignment Algorithm

Authors: Ghada Badr, Arwa Alturki

Abstract:

The biological function of an RNA molecule depends on its structure. The objective of alignment is to find the homology between two or more RNA secondary structures. Knowing the common functionalities between two RNA structures allows a better understanding of them and the discovery of other relationships between them. Besides, identifying non-coding RNAs, which are not translated into proteins, is a popular application in which RNA structural alignment is the first step. A few methods for RNA structure-to-structure alignment have been developed; most of them perform partial structure-to-structure, sequence-to-structure, or structure-to-sequence alignment. Less attention has been given in the literature to the use of efficient RNA structure representations, and structure-to-structure alignment methods are lacking. In this paper, we introduce an O(N^2) Component-based Pairwise RNA Structure Alignment (CompPSA) algorithm, where structures are given in a component-based representation and N is the maximum number of components in the two structures. The proposed algorithm compares two RNA secondary structures based on their weighted component features rather than on their base-pair details. Extensive experiments on different real and simulated datasets illustrate the efficiency of the CompPSA algorithm compared to other approaches. The CompPSA algorithm provides an accurate similarity measure between components and gives the user the flexibility to align the two RNA structures based on their weighted features (position, full length, and/or stem length). Moreover, the algorithm demonstrates scalability and efficiency in time and memory performance.
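
The sketch below illustrates the general idea of a component-based, feature-weighted comparison. The component features follow the ones named above, but the normalization and scoring are simplified assumptions, not the exact CompPSA scheme.

```python
# A minimal sketch of comparing two structures via weighted component features.
import numpy as np

def component_similarity(c1, c2, weights=(0.4, 0.3, 0.3)):
    """Compare two components described by (position, full_length, stem_length)."""
    c1, c2, w = np.array(c1, float), np.array(c2, float), np.array(weights)
    diff = np.abs(c1 - c2) / np.maximum(np.maximum(c1, c2), 1.0)   # normalized per-feature difference
    return float(np.sum(w * (1.0 - diff)))                         # 1.0 means identical components

# two toy structures given as component lists: (position, full length, stem length)
s1 = [(3, 12, 4), (20, 18, 6)]
s2 = [(4, 12, 4), (22, 16, 5)]
score = sum(max(component_similarity(a, b) for b in s2) for a in s1) / len(s1)
print(round(score, 3))
```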

Keywords: alignment, RNA secondary structure, pairwise, component-based, data mining

Procedia PDF Downloads 456
3715 Stray Light Reduction Methodology by a Sinusoidal Light Modulation and Three-Parameter Sine Curve Fitting Algorithm for a Reflectance Spectrometer

Authors: Hung Chih Hsieh, Cheng Hao Chang, Yun Hsiang Chang, Yu Lin Chang

Abstract:

In spectrometer applications, stray light coming from the environment strongly affects the measurement results. Hence, environment and instrument quality control for stray-light reduction is critical for spectral reflectance measurement. In this paper, a simple and practical method has been developed to correct a spectrometer's response for measurement errors arising from the environment's and the instrument's stray light. A sinusoidally modulated light intensity signal was incident on a tested sample, and the reflected light was collected by the spectrometer. Since the incident light was modulated by a sinusoidal signal, the reflected light was modulated at the same frequency as the incident signal. Using the three-parameter sine curve fitting algorithm, we can extract the primary reflectance signal from the total measured signal, which contains the primary reflectance signal and the stray light from the environment. The spectra extracted by the proposed method under extreme environmental stray light are 99.98% similar to the spectra measured without environmental stray light. This result shows that we can measure reflectance spectra without being affected by the environment's stray light.
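
Since the modulation frequency is known, the three-parameter fit reduces to a linear least-squares problem for the in-phase, quadrature, and offset terms; the unmodulated stray light ends up in the offset. The sketch below demonstrates this on a synthetic signal (frequency and signal values are placeholders, not the paper's measurement conditions).

```python
# Minimal sketch of the three-parameter sine fit with known modulation frequency:
# solve y ~ A*cos(wt) + B*sin(wt) + C by linear least squares. The amplitude
# sqrt(A^2 + B^2) is the modulated (reflectance) component; C absorbs stray light.
import numpy as np

f_mod = 5.0                                              # assumed modulation frequency, Hz
t = np.linspace(0.0, 1.0, 500)
signal = 2.0 * np.sin(2 * np.pi * f_mod * t + 0.3) + 1.5 # 1.5 plays the role of stray light

M = np.column_stack([np.cos(2 * np.pi * f_mod * t),
                     np.sin(2 * np.pi * f_mod * t),
                     np.ones_like(t)])
A, B, C = np.linalg.lstsq(M, signal, rcond=None)[0]
amplitude = np.hypot(A, B)                               # recovered modulated component
print(round(amplitude, 3), round(C, 3))                  # ~2.0 and ~1.5
```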

Keywords: spectrometer, stray light, three-parameter sine curve fitting, spectra extraction

Procedia PDF Downloads 246
3714 Graph Codes - 2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval

Authors: Stefan Wagenpfeil, Felix Engel, Paul McKevitt, Matthias Hemmje

Abstract:

Multimedia indexing and retrieval is generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection. A higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based algorithms for similarity are quite inefficient and computationally intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a graph projection into a 2D space (Graph Code) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph traversals due to a simpler processing model and a high level of parallelization. We also show that the effectiveness of retrieval increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness (especially for multimedia indexing and retrieval) and can be applied to images, videos, audio, and text information.
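
The sketch below conveys the basic idea of such a projection under a toy encoding: a feature graph is written into a fixed-size 2D matrix over a feature vocabulary, and two codes are then compared with a vectorized similarity instead of a graph traversal. The encoding and the metric here are illustrative assumptions, not the paper's exact Graph Code definition.

```python
# Illustrative sketch: project a small feature graph onto a 2D matrix and compare
# two such matrices with a vectorized (cosine-style) similarity.
import numpy as np

vocab = ["person", "dog", "ball", "park"]              # assumed feature vocabulary
index = {f: i for i, f in enumerate(vocab)}

def graph_code(edges):
    M = np.zeros((len(vocab), len(vocab)))
    for u, v in edges:
        M[index[u], index[v]] = 1.0                    # 1 encodes "related to"
    np.fill_diagonal(M, 1.0)                           # diagonal marks detected features
    return M

g1 = graph_code([("person", "dog"), ("dog", "ball")])
g2 = graph_code([("person", "dog"), ("person", "park")])
similarity = np.sum(g1 * g2) / np.sqrt(np.sum(g1**2) * np.sum(g2**2))
print(round(similarity, 3))
```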

Keywords: indexing, retrieval, multimedia, graph algorithm, graph code

Procedia PDF Downloads 158
3713 Impact of Fire on Bird Diversity in Oil Palm Plantation: Case Study in South Sumatra Province

Authors: Yanto Santosa, Windi Sugiharti

Abstract:

Fires occur annually in oil palm plantations. The objective of the study was to identify the impact of fire on bird diversity in oil palm plantations. Bird diversity data were collected using the line transect method from February to March 2017. To estimate species richness, we used the Margalef index; to determine the evenness of species richness between sites, we used an evenness index; and to estimate the similarity of bird communities between different habitats, we used the Sørensen index. The results showed that the number of bird species and the species richness in the post-burned area were higher than those in the unburned area. A different result was found for the evenness index, whose value was higher in the unburned area than in the post-burned area. These results indicate that fire did not decrease bird diversity, contrary to the claims of many parties who state that fires cause species extinction. Fire triggers the emergence of belowground plants and insect populations as sources of food for the bird community. This result is consistent with several research findings in the United States and Australia that used controlled fires as a regional management tool.
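
For reference, the sketch below computes the three indices named above in their commonly used forms (Margalef richness, a Shannon-based evenness, and Sørensen similarity) on toy count data; the species and counts are placeholders, not the study's observations.

```python
# Minimal sketch of the three community indices used in the study.
import numpy as np

def margalef(counts):
    S, N = len(counts), sum(counts)
    return (S - 1) / np.log(N)

def evenness(counts):
    p = np.array(counts, float) / sum(counts)
    H = -np.sum(p * np.log(p))            # Shannon diversity
    return H / np.log(len(counts))        # evenness = H / H_max

def sorensen(species_a, species_b):
    shared = len(set(species_a) & set(species_b))
    return 2 * shared / (len(set(species_a)) + len(set(species_b)))

burned = {"bulbul": 12, "swallow": 7, "munia": 3}     # toy post-burned counts
unburned = {"bulbul": 9, "swallow": 8}                # toy unburned counts
print(round(margalef(list(burned.values())), 3),
      round(evenness(list(unburned.values())), 3),
      round(sorensen(burned, unburned), 3))
```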

Keywords: bird, fire, index of similarity, oil palm, species diversity

Procedia PDF Downloads 246
3712 Genetic Variation among the Wild and Hatchery Raised Populations of Labeo rohita Revealed by RAPD Markers

Authors: Fayyaz Rasool, Shakeela Parveen

Abstract:

Studies on the genetic diversity of Labeo rohita using molecular markers were carried out to investigate the genetic structure with RAPD markers and the levels of polymorphism and similarity amongst different groups of five populations of wild and farmed types. The samples were collected from five different locations as representatives of wild and hatchery-raised populations. RAPD data were analysed with Jaccard's coefficient, following the unweighted pair group method with arithmetic mean (UPGMA) for hierarchical clustering of similar groups on the basis of similarity amongst the genotypes, and the generated dendrogram divided the randomly selected individuals of the five populations into three classes/clusters. The variance decomposition for the optimal classification gave 52.11% for within-class variation and 47.89% for between-class differences. Principal component analysis (PCA) for grouping the different genotypes from the different environmental conditions was done by the Spearman Varimax rotation method to generate a bi-plot of the co-occurrence of genotypes with similar genetic properties and the specificity of different primers; it indicated clearly that the increase in the number of factors or components was correlated with a decrease in eigenvalues. Using the Kaiser criterion of eigenvalues greater than one, the first two main factors accounted for 58.177% of the cumulative variability.
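
The clustering step described above can be reproduced in outline as follows; the 0/1 band data are toy values, and SciPy's average-linkage option implements UPGMA.

```python
# Illustrative sketch: Jaccard distances on presence/absence band data, then UPGMA clustering.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# rows = individuals, columns = presence/absence of RAPD bands (toy data)
bands = np.array([[1, 0, 1, 1, 0],
                  [1, 1, 1, 0, 0],
                  [0, 1, 0, 1, 1],
                  [0, 1, 1, 1, 1]]).astype(bool)

dist = pdist(bands, metric="jaccard")        # 1 - Jaccard similarity coefficient
tree = linkage(dist, method="average")       # average linkage = UPGMA
clusters = fcluster(tree, t=3, criterion="maxclust")
print(clusters)
```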

Keywords: variation, clustering, PCA, wild, hatchery, RAPD, Labeo rohita

Procedia PDF Downloads 447
3711 Is It Important to Measure the Volumetric Mass Density of Nanofluids?

Authors: Z. Haddad, C. Abid, O. Rahli, O. Margeat, W. Dachraoui, A. Mataoui

Abstract:

The present study aims to measure the volumetric mass density of NiPd-heptane nanofluids synthesized using a one-step method known as thermal decomposition of metal-surfactant complexes. The particle concentration is up to 7.55 g/l, and the temperature range of the experiment is from 20°C to 50°C. The measured values were compared with mixture theory, and good agreement between the theoretical equation and the measurements was obtained. Moreover, the volumetric mass density data for nanofluids available in the literature are reviewed.
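
For context, the mixture-theory estimate that such measurements are usually compared against is the volume-fraction-weighted average of the particle and base-fluid densities. A minimal sketch with assumed property values follows; the NiPd particle density used here is an assumption for illustration, not a value taken from the paper.

```python
# Minimal sketch of the mixture rule: rho_nf = phi*rho_p + (1 - phi)*rho_bf,
# with the volume fraction phi estimated from the mass concentration.
rho_particle = 10_200.0      # kg/m^3, assumed value for NiPd nanoparticles
rho_heptane = 684.0          # kg/m^3, n-heptane near 20 °C (approximate)
concentration = 7.55         # g/l, i.e. kg of particles per m^3 of suspension

phi = concentration / rho_particle                    # particle volume fraction (dilute limit)
rho_nanofluid = phi * rho_particle + (1 - phi) * rho_heptane
print(phi, round(rho_nanofluid, 2))
```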

Keywords: NiPd nanoparticles, nanofluids, volumetric mass density, stability

Procedia PDF Downloads 399
3710 Robust Medical Image Watermarking based on Contourlet and Extraction Using ICA

Authors: S. Saju, G. Thirugnanam

Abstract:

In this paper, a medical image watermarking algorithm based on the contourlet transform is proposed. Medical image watermarking is a special subcategory of image watermarking in the sense that the images have special requirements: watermarked medical images should not differ perceptually from their original counterparts, because the clinical reading of images must not be affected. Watermarking techniques based on the wavelet transform are reported in many studies, but robustness and security are better with the contourlet transform than with the wavelet transform. The main challenge in exploring geometry in images comes from the discrete nature of the data. In this paper, the original image is decomposed to two levels using the contourlet transform, and the watermark is embedded in the resultant sub-bands. Sub-band selection is based on the value of the Peak Signal-to-Noise Ratio (PSNR) calculated between the watermarked and original images. To extract the watermark, Kernel ICA is used; a novel characteristic of this approach is that it does not require the transformation process to extract the watermark. Simulation results show that the proposed scheme is robust against attacks such as salt-and-pepper noise, median filtering, and rotation. Performance measures such as PSNR and a similarity measure are evaluated and compared with the Discrete Wavelet Transform (DWT) to prove the robustness of the scheme. Simulations are carried out using MATLAB software.
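
The PSNR criterion used for sub-band selection is a standard quantity; a short sketch (8-bit images assumed, image arrays are placeholders) is given below.

```python
# Minimal sketch of the PSNR between an original and a watermarked image.
import numpy as np

def psnr(original, watermarked, peak=255.0):
    mse = np.mean((original.astype(float) - watermarked.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

original = np.random.randint(0, 256, (256, 256))
watermarked = np.clip(original + np.random.randint(-2, 3, original.shape), 0, 255)
print(round(psnr(original, watermarked), 2))   # higher PSNR = less perceptible embedding
```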

Keywords: digital watermarking, independent component analysis, wavelet transform, contourlet

Procedia PDF Downloads 527
3709 Study on the Geometric Similarity in Computational Fluid Dynamics Calculation and the Requirement of Surface Mesh Quality

Authors: Qian Yi Ooi

Abstract:

At present, airfoil parameters are still designed and optimized according to the scale of conventional aircraft, and some slight deviations remain in terms of scale differences. However, insufficient parameters or poor surface mesh quality are likely to occur if these small deviations are carried over to a future civil aircraft whose size differs considerably from conventional aircraft, such as a blended-wing-body (BWB) aircraft with future potential, resulting in large deviations in geometric similarity in computational fluid dynamics (CFD) simulations. To avoid this situation, this study investigates the geometric similarity of airfoil parameters and surface mesh quality in CFD calculations in order to determine how well different parameterization methods apply to different airfoil scales. The research objects are three airfoil scales, namely the wing root and wingtip of a conventional civil aircraft and the wing root of the giant hybrid wing, parameterized with three methods to compare the calculation differences between airfoils of different sizes. The constants in this study are the NACA 0012 airfoil, a Reynolds number of 10 million, an angle of attack of zero, a C-grid for meshing, and the k-epsilon (k-ε) turbulence model. The experimental variables are three airfoil parameterization methods: the point cloud method, the B-spline curve method, and the class function/shape function transformation (CST) method. The airfoil dimensions are set to 3.98 meters, 17.67 meters, and 48 meters, respectively. In addition, this study also uses different numbers of edge mesh divisions and the same bias factor in the CFD simulation. The study shows that, as the airfoil scale changes, different parameterization methods, numbers of control points, and numbers of mesh divisions should be used to improve the accuracy of the aerodynamic performance of the wing. When the airfoil scale increases, the most basic point cloud parameterization method requires more and larger data to support the accuracy of the airfoil's aerodynamic performance, which faces the severe test of insufficient computer capacity. On the other hand, when using the B-spline curve method, an appropriate number of control points and mesh divisions should be set to obtain higher accuracy; however, this quantitative balance cannot be defined directly and must be found iteratively by adding and subtracting points. Lastly, when using the CST method, it is found that a limited number of control points is enough to accurately parameterize the larger-sized wing, and a higher degree of accuracy and stability can be obtained even on a lower-performance computer.
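
For reference, the CST method mentioned above expresses an airfoil surface as the product of a class function and a Bernstein-polynomial shape function; a minimal sketch follows, with arbitrary placeholder coefficients rather than values from the study.

```python
# Minimal CST sketch: y(x) = C(x) * S(x), with class function C(x) = x^0.5 * (1 - x)
# for a round-nose airfoil and a Bernstein-polynomial shape function S(x).
import numpy as np
from math import comb

def cst_surface(x, coeffs, n1=0.5, n2=1.0):
    x = np.asarray(x, float)
    n = len(coeffs) - 1
    C = x**n1 * (1.0 - x)**n2                              # class function
    S = sum(a * comb(n, i) * x**i * (1.0 - x)**(n - i)      # Bernstein shape function
            for i, a in enumerate(coeffs))
    return C * S

x = np.linspace(0.0, 1.0, 101)
upper = cst_surface(x, coeffs=[0.17, 0.15, 0.16, 0.14])     # a few control points suffice
print(round(float(upper.max()), 4))
```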

Keywords: airfoil, computational fluid dynamics, geometric similarity, surface mesh quality

Procedia PDF Downloads 220
3708 Implementation of Statistical Parameters to Form an Entropic Mathematical Models

Authors: Gurcharan Singh Buttar

Abstract:

It has been discovered that although statistics and information theory are independent in nature, they can be combined to create applications in multidisciplinary mathematics. This is because, in the field of statistics, statistical parameters (measures) play an essential role with reference to the population (distribution) under investigation, while information measures are crucial in the study of the ambiguity, assortment, and unpredictability present in an array of phenomena. The present communication is a link between the two, and it is demonstrated that well-known conventional statistical measures can be used as measures of information.

Keywords: probability distribution, entropy, concavity, symmetry, variance, central tendency

Procedia PDF Downloads 155
3707 A Study of Topical and Similarity of Sebum Layer Using Interactive Technology in Image Narratives

Authors: Chao Wang

Abstract:

Under the rapid innovation of information technology, the media play a very important role in the dissemination of information, and different generations face it in entirely different ways. The involvement of narrative images, however, provides more possibilities for narrative text. "Images" are manufactured through the aperture, the camera shutter, and developable photosensitive processes, recorded and stamped on paper or displayed on a computer screen, and thus concretely saved. They exist as different forms of files, data, or evidence, as the ultimate look of events. Through the interface of media and network platforms and the particular visual field of the viewer, a bodily space exists and extends outward, as thin as a sebum layer, extremely soft and delicate yet full of real tension. The physical space of the sebum layer confuses the fact that physical objects exist and needs to be established under a perceived consensus; as at the scene, the existing concepts and boundaries of physical perception are blurred. The physical simulation of the sebum layer shapes an immersive "topical similarity", leaving contemporary social-practice communities, groups, and network users with a kind of illusion without presence, i.e., a non-real illusion. From the investigation and discussion of the literature, the variability characteristics of time manufactured and produced in digital movie editing (for example, slices, rupture, set, and reset) are analyzed. The interactive eBook has a unique interaction of "waiting-greeting" and "expectation-response" that functionally opens the operation of the image narrative structure to more interpretations. Works of digital editing and interactive technology are combined, and the concept and results are further analyzed. After the digitization of interventional imaging and interactive technology, real events remain linked, and the relationship with media handling cannot be severed; this is examined through movies, interactive art, and practical case discussion and analysis. Audiences need to think more rationally about the authenticity of the images carried by the text.

Keywords: sebum layer, topical and similarity, interactive technology, image narrative

Procedia PDF Downloads 388
3706 Global Voltage Harmonic Index for Measuring Harmonic Situation of Power Grids: A Focus on Power Transformers

Authors: Alireza Zabihi, Saeed Peyghami, Hossein Mokhtari

Abstract:

With the increasing deployment of renewable power plants, such as solar and wind, it is crucial to measure the harmonic situation of the grid. This paper proposes a global voltage harmonic index to measure the harmonic situation of the power grid, with a focus on power transformers. The power electronics systems used to connect these plants to the network can introduce harmonics, leading to increased losses, reduced efficiency, false operation of protective relays, and equipment damage due to harmonic intensification. The proposed index takes into account the losses caused by harmonics in power transformers, which are of great importance and value to the network, and thereby provides a comprehensive measure of the harmonic situation of the grid. The effectiveness of the proposed index is evaluated on a real-world distribution network, and the results demonstrate its ability to identify the harmonic situation of the network, particularly in relation to power transformers. The proposed index has the potential to support power companies in optimizing their power systems and to guide researchers in developing effective mitigation strategies for harmonics in the power grid.
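
The abstract does not give the index's formula. As a point of reference only, the sketch below computes the ordinary voltage THD alongside an h²-weighted variant that emphasizes transformer harmonic losses (eddy-current losses grow roughly with the square of the harmonic order); the numbers are toy data, and the weighting is an assumption, not the proposed index.

```python
# Illustrative sketch: classic voltage THD and a loss-oriented, order-weighted variant.
import numpy as np

harmonic_orders = np.array([3, 5, 7, 11, 13])
v_h = np.array([0.021, 0.034, 0.018, 0.009, 0.006])    # p.u. of the fundamental (toy data)

thd = np.sqrt(np.sum(v_h**2))                          # classic THD_V
loss_weighted = np.sqrt(np.sum((harmonic_orders * v_h)**2))
print(round(thd, 4), round(loss_weighted, 4))
```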

Keywords: global voltage harmonic index, harmonics, power grid, power quality, power transformers, renewable energy

Procedia PDF Downloads 125
3705 Optimal Design of Redundant Hybrid Manipulator for Minimum Singularity

Authors: Arash Rahmani, Ahmad Ghanbari, Abbas Baghernezhad, Babak Safaei

Abstract:

In the design of parallel manipulators, the mean value of a dexterity measure over the workspace volume is usually considered as the objective function to be used in optimization algorithms. The mentioned indexes in a hybrid parallel manipulator (HPM) are quite complicated to solve due to the infinite solutions for every point within the workspace of redundant manipulators. In this paper, spatial isotropic design axioms are extended as a well-known method for the optimum design of manipulators. An upper limit for the isotropy measure of the HPM is calculated, and instead of computing and minimizing the isotropy measure, minimizing the obtained limit is considered. To this end, two different objective functions are suggested, obtained from the objective functions of the constituent modules. Finally, by using a genetic algorithm (GA), the best geometric parameters are obtained for a specific hybrid parallel robot composed of two modified Gough-Stewart platforms (MGSP).
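
A bare-bones sketch of the GA step is given below; the objective is a placeholder standing in for the isotropy-based bound over the geometric parameters, and the kinematic model itself is not reproduced.

```python
# Bare-bones GA skeleton: selection plus Gaussian mutation over geometric parameters.
import numpy as np

rng = np.random.default_rng(0)

def objective(p):                       # placeholder for the isotropy-based objective
    return float(np.sum((p - np.array([0.4, 1.2, 0.8]))**2))

pop = rng.uniform(0.1, 2.0, size=(40, 3))
for _ in range(100):
    scores = np.array([objective(p) for p in pop])
    parents = pop[np.argsort(scores)[:20]]                                        # selection
    children = parents[rng.integers(0, 20, 20)] + rng.normal(0, 0.05, (20, 3))    # mutation
    pop = np.vstack([parents, children])
print(pop[np.argmin([objective(p) for p in pop])])     # best geometric parameters found
```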

Keywords: hybrid manipulator, spatial isotropy, genetic algorithm, optimum design

Procedia PDF Downloads 334
3704 3D Liver Segmentation from CT Images Using a Level Set Method Based on a Shape and Intensity Distribution Prior

Authors: Nuseiba M. Altarawneh, Suhuai Luo, Brian Regan, Guijin Tang

Abstract:

Liver segmentation from medical images poses more challenges than analogous segmentations of other organs. This contribution introduces a liver segmentation method from a series of computed tomography images. Overall, we present a novel method for segmenting the liver by coupling density matching with shape priors. Density matching signifies a tracking method which operates by maximizing the Bhattacharyya similarity measure between the photometric distribution from an estimated image region and a model photometric distribution. Density matching controls the direction of the evolution process and slows down the evolving contour in regions with weak edges. The shape prior improves the robustness of density matching and discourages the evolving contour from exceeding the liver's boundaries at regions with weak boundaries. The model is implemented using a modified distance regularized level set (DRLS) model. The experimental results show that the method achieves a satisfactory result. By comparing with the original DRLS model, it is evident that the proposed model is more effective in addressing the over-segmentation problem. Finally, we gauge the performance of our model against metrics comprising accuracy, sensitivity, and specificity.
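
The quantity that density matching maximizes is the Bhattacharyya coefficient between two intensity distributions; a minimal sketch on toy intensity histograms follows (the image data are synthetic placeholders).

```python
# Minimal sketch of the Bhattacharyya similarity between two intensity histograms.
import numpy as np

def bhattacharyya(p, q):
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(np.sqrt(p * q)))          # 1.0 for identical distributions

region = np.random.normal(100, 10, 5000)          # intensities inside the evolving contour
model = np.random.normal(102, 10, 5000)           # model (prior) liver intensities
bins = np.linspace(0, 255, 65)
bc = bhattacharyya(np.histogram(region, bins)[0].astype(float),
                   np.histogram(model, bins)[0].astype(float))
print(round(bc, 3))
```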

Keywords: Bhattacharyya distance, distance regularized level set (DRLS) model, liver segmentation, level set method

Procedia PDF Downloads 312
3703 Investigation of the Role of Friction in Reducing Pedestrian Injuries in Accidents at Intersections

Authors: Seyed Abbas Tabatabaei, Afshin Ghanbarzadeh, Mehdi Abidizadeh

Abstract:

Nowadays, road traffic accidents and their high social and economic costs are among the most fundamental problems challenging transport and traffic experts and providers. One of the most effective measures is to enhance the skid resistance of the road surface. This research studies one intersection case in Ahwaz and the effect of increasing skid resistance on reducing pedestrian injuries in accidents at intersections. In this research, a device was developed to measure the coefficient of friction whose rules and practices closely resemble those of the locked-wheel trailer. The device includes a steel frame, wheels, a hydration system, and a force gauge. The output of the device is the force registered by the gauge. By investigating these data and applying the relevant relationships, the relative surface coefficient of friction is obtained. Friction coefficient data for the current state and for the new pavement are obtained and plotted; based on the graphs, the two situations can be compared, and the speeds at the moment of collision in the two modes are compared. The results show to what extent increasing the coefficient of friction can affect the severity and number of accidents.
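
To make the link between friction and collision severity concrete, the small calculation below shows how the residual impact speed after a fixed braking distance drops as the friction coefficient rises; the speeds, distance, and friction values are assumed numbers, not the paper's measurements.

```python
# Illustrative calculation: residual speed after braking, v^2 = v0^2 - 2*mu*g*d.
def impact_speed(v0_kmh, mu, braking_distance_m, g=9.81):
    v0 = v0_kmh / 3.6                                   # km/h -> m/s
    v2 = v0**2 - 2 * mu * g * braking_distance_m
    return max(v2, 0.0) ** 0.5 * 3.6                    # km/h at the end of the braking distance

for mu in (0.45, 0.60, 0.75):                           # existing vs. improved surface friction
    print(mu, round(impact_speed(50, mu, 20), 1))       # higher mu -> lower (or zero) impact speed
```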

Keywords: intersection, coefficient of friction, skid resistance, locked wheels, accident, pedestrian

Procedia PDF Downloads 326
3702 Self-Supervised Learning for Hate-Speech Identification

Authors: Shrabani Ghosh

Abstract:

Automatic offensive language detection in social media has become a pressing task in today's NLP. Manual offensive language detection is tedious and laborious work, so automatic methods based on machine learning are the only alternatives. Previous works have performed sentiment analysis over social media in different ways, such as in supervised, semi-supervised, and unsupervised manners. Domain adaptation in a semi-supervised way has also been explored in NLP, where the source domain and the target domain are different. In domain adaptation, the source domain usually has a large amount of labeled data, while only a limited amount of labeled data is available in the target domain. Pretrained transformers like BERT and RoBERTa are fine-tuned to perform text classification and are further pre-trained on masked language modeling (MLM) tasks in an unsupervised manner. In previous work, hate speech detection has been explored on Gab.ai, a free-speech platform described as hosting extremists of varying degrees in online social media. In the domain adaptation process, Twitter data is used as the source domain and Gab data as the target domain. The performance of domain adaptation also depends on the cross-domain similarity. Different distance measures, such as L2 distance, cosine distance, Maximum Mean Discrepancy (MMD), Fisher Linear Discriminant (FLD), and CORAL, have been used to estimate domain similarity. Naturally, in-domain distances are small, and between-domain distances are expected to be large. Previous findings show that a pretrained masked language model (MLM) fine-tuned with a mixture of posts from the source and target domains gives higher accuracy. However, the in-domain accuracy of the hate classifier on Twitter data is 71.78%, and its out-of-domain performance on Gab data drops to 56.53%. Recently, self-supervised learning has received a lot of attention, as it is more applicable when labeled data are scarce. A few works have already explored applying self-supervised learning to NLP tasks such as sentiment classification. The self-supervised language representation model ALBERT focuses on modeling inter-sentence coherence and helps downstream tasks with multi-sentence inputs. A self-supervised attention learning approach shows better performance, as it exploits extracted context words in the training process. In this work, a self-supervised attention mechanism is proposed to detect hate speech on Gab.ai. The framework initially classifies the Gab dataset in an attention-based self-supervised manner. In the next step, a semi-supervised classifier is trained on the combination of labeled data from the first step and unlabeled data. The performance of the proposed framework will be compared with the results described earlier and also with optimized outcomes obtained from different optimization techniques.
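
Two of the domain-similarity measures listed above are easy to sketch on toy embeddings. The example below computes the cosine distance between domain centroids and a linear-kernel MMD estimate; the embeddings are random placeholders, not actual Twitter or Gab representations.

```python
# Illustrative sketch of cross-domain similarity: centroid cosine distance and linear MMD.
import numpy as np

rng = np.random.default_rng(1)
source = rng.normal(0.0, 1.0, (500, 64))       # stand-in for Twitter post embeddings
target = rng.normal(0.3, 1.0, (400, 64))       # stand-in for Gab post embeddings

mu_s, mu_t = source.mean(axis=0), target.mean(axis=0)
cosine_dist = 1.0 - mu_s @ mu_t / (np.linalg.norm(mu_s) * np.linalg.norm(mu_t))
mmd_linear = float(np.sum((mu_s - mu_t) ** 2))  # MMD^2 with a linear kernel
print(round(cosine_dist, 3), round(mmd_linear, 3))
```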

Keywords: attention learning, language model, offensive language detection, self-supervised learning

Procedia PDF Downloads 103
3701 ED Machining of Particulate Reinforced Metal Matrix Composites

Authors: Sarabjeet Singh Sidhu, Ajay Batish, Sanjeev Kumar

Abstract:

This paper reports the optimal process conditions for machining three different types of metal matrix composites (MMCs), 65vol%SiC/A356.2, 10vol%SiC-5vol%quartz/Al, and 30vol%SiC/A359, using the PMEDM process. Metal removal rate (MRR), tool wear rate (TWR), surface roughness (SR), and surface integrity (SI) were evaluated after each trial, and the contributing process parameters were identified. The four responses were then collectively optimized using the technique for order preference by similarity to ideal solution (TOPSIS), and optimal process conditions were identified for each type of MMC. The density of reinforcing particles shields the matrix material from the spark energy; hence, high MRR and SR were observed with the lowest reinforcement content. TWR was highest with the Cu-Gr electrode due to disintegration of the weakly bonded particles in the composite electrode. Each workpiece was examined for surface integrity and ranked according to the severity of the surface defects observed, and these rankings were used to arrive at the most suitable process settings for each workpiece.
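
The multi-response optimization step follows the standard TOPSIS recipe; a compact sketch with toy response values is shown below, treating MRR as a benefit criterion and TWR and SR as cost criteria (SI ranks omitted, weights assumed).

```python
# Compact TOPSIS sketch: weighted normalization, ideal/anti-ideal points, closeness ranking.
import numpy as np

X = np.array([[12.0, 0.8, 3.2],        # rows: candidate process settings
              [15.0, 1.1, 4.0],        # columns: MRR, TWR, SR (toy values)
              [10.0, 0.6, 2.8]])
w = np.array([0.4, 0.3, 0.3])          # assumed criterion weights
benefit = np.array([True, False, False])

R = X / np.sqrt((X**2).sum(axis=0)) * w                     # weighted, normalized matrix
ideal = np.where(benefit, R.max(axis=0), R.min(axis=0))
anti = np.where(benefit, R.min(axis=0), R.max(axis=0))
d_plus = np.linalg.norm(R - ideal, axis=1)
d_minus = np.linalg.norm(R - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)
print(np.argsort(closeness)[::-1])                          # best setting first
```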

Keywords: metal matrix composites (MMCS), metal removal rate (MRR), surface roughness (SR), surface integrity (SI), tool wear rate (TWR), technique for order preference by similarity to ideal solution (TOPSIS)

Procedia PDF Downloads 289
3700 An Attempt to Measure Afro-Polychronism Empirically

Authors: Aïda C. Terblanché-Greeff

Abstract:

Afro-polychronism is a unique amalgamated cultural value of social self-construal and time orientation. As such, the construct of Afro-polychronism is conceptually analysed by focusing on the aspects of Ubuntu as collectivism and African time as polychronism. It is argued that these cultural values have a reciprocal and thus inseparable relationship. As it is general practice to measure cultural values empirically, the author conducted empirically engaged philosophy and aimed to develop a scale to measure Afro-polychronism based on its two dimensions of Ubuntu as social self-construal and African time as time orientation. From the scale's psychometric properties, it was determined that the scale was, in fact, neither reliable nor valid. It was found that the correlation between the Ubuntu dimension and the African time dimension is moderate (albeit statistically significant). In conclusion, the author abduced why this cultural value cannot be empirically measured on the basis of its theoretical definition and indicated which different path would be more promising.

Keywords: African time, Afro-polychronism, empirically engaged African philosophy, Ubuntu

Procedia PDF Downloads 142
3699 Enhancing Word Meaning Retrieval Using FastText and Natural Language Processing Techniques

Authors: Sankalp Devanand, Prateek Agasimani, Shamith V. S., Rohith Neeraje

Abstract:

Machine translation has witnessed significant advancements in recent years, but the translation of languages with distinct linguistic characteristics, such as English and Sanskrit, remains a challenging task. This research presents the development of a dedicated English-to-Sanskrit machine translation model, aiming to bridge the linguistic and cultural gap between these two languages. Using a variety of natural language processing (NLP) approaches, including FastText embeddings, this research proposes a thorough method to improve word meaning retrieval. Data preparation, part-of-speech tagging, dictionary searches, and transliteration are all included in the methodology. The study also addresses the implementation of an interpreter pattern and uses a word similarity task to assess the quality of word embeddings. The experimental outcomes show how the suggested approach may be used to enhance word meaning retrieval tasks with greater efficacy, accuracy, and adaptability. Evaluation of the model's performance is conducted through rigorous testing, comparing its output against existing machine translation systems. The assessment includes quantitative metrics such as BLEU scores, METEOR scores, Jaccard Similarity, etc.
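
A minimal sketch of the word-similarity check on FastText embeddings is given below, using the gensim implementation as a stand-in (gensim 4 API assumed) together with the Jaccard similarity mentioned among the evaluation metrics; the corpus and tokens are toy placeholders, not the project's data.

```python
# Illustrative sketch: train tiny FastText embeddings and query word similarity,
# plus a Jaccard similarity between token sets.
from gensim.models import FastText

corpus = [["the", "king", "rules", "the", "kingdom"],
          ["the", "queen", "rules", "the", "kingdom"],
          ["the", "king", "protects", "the", "people"]]

model = FastText(sentences=corpus, vector_size=32, window=3, min_count=1, epochs=50)
print(model.wv.similarity("king", "queen"))          # cosine similarity of the two vectors

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

print(jaccard("the king rules".split(), "the king protects".split()))
```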

Keywords: machine translation, English to Sanskrit, natural language processing, word meaning retrieval, fastText embeddings

Procedia PDF Downloads 42
3698 Assessing Knowledge Management Impacts: Challenges, Limits and Base for a New Framework

Authors: Patrick Mbassegue, Mickael Gardoni

Abstract:

In a market environment centered more and more on services and the digital economy, knowledge management becomes a framework that can help organizations create value and improve their overall performance. Based on an optimal allocation of scarce resources, managers are interested in demonstrating the added value generated by knowledge management projects. One of the challenges faced by organizations is the difficulty of measuring the impacts and concrete results of knowledge management initiatives. The present article concerns the measurement of concrete results coming from knowledge management projects based on the balanced scorecard model. One of the goals is to underline what can be done based on this model but also to highlight its associated limits. The article is structured in five parts: 1) knowledge management projects and organizational impacts; 2) a framework and a methodology to measure organizational impacts; 3) an application illustrated in two case studies; 4) limits concerning the proposed framework; 5) the proposal of a new framework to measure organizational impacts.

Keywords: knowledge management, project, balance scorecard, impacts

Procedia PDF Downloads 261
3697 Investigating the Relationship between Place Attachment and Sustainable Development of Urban Spaces

Authors: Hamid Reza Zeraatpisheh, Ali Akbar Heidari, Soleiman Mohammadi Doust

Abstract:

This study examines the relationship between place attachment and the sustainable development of urban spaces. To do this, the components of place identity, emotional attachment, place attachment, and social bonding, which together constitute place attachment, were measured by means of a standardized questionnaire across three domains: cognitive (place identity), affective (emotional attachment), and behavioral (place attachment and social bonding). To measure sustainable development, its three components, namely society, economy, and environment, were considered. The study is descriptive. The assessment instrument for place attachment is the standard questionnaire of Safarnia, while a questionnaire based on the combined theoretical framework was constructed by the researcher to measure sustainable development. The statistical population of this research is the city of Shiraz, and the statistical sample is Hafeziyeh. SPSS software was used to analyze the data and examine the results of both descriptive and inferential statistics; in the inferential statistics, the Pearson correlation coefficient was used to examine the hypotheses. In this study, the variable of place attachment is at a high level, and sustainable development is also at a high level. These results suggest a positive relationship between place attachment and sustainable development.

Keywords: place attachment, sustainable development, economy-society-environment, Hafez's tomb

Procedia PDF Downloads 700
3696 Short Answer Grading Using Multi-Context Features

Authors: S. Sharan Sundar, Nithish B. Moudhgalya, Nidhi Bhandari, Vineeth Vijayaraghavan

Abstract:

Automatic short answer grading is one of the prime applications of artificial intelligence in education. Several approaches involving selective handcrafted features, graphical matching techniques, concept identification and mapping, complex deep frameworks, sentence embeddings, etc., have been explored over the years. However, keeping in mind the real-world application of the task, these solutions present an overhead in terms of computation and resources in achieving high performance. In this work, a simple and effective solution is proposed that makes use of elemental features based on statistical and linguistic properties and word-based similarity measures in conjunction with tree-based classifiers and regressors. The results for classification tasks show improvements ranging from 1% to 30%, while the regression task shows a stark improvement of 35%. The authors attribute these improvements to the addition of multiple similarity scores, which provide an ensemble of scoring criteria to the models. The authors also believe the work reaffirms that classical natural language processing techniques and simple machine learning models can be used to achieve high results for short answer grading.
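
The general recipe, simple statistical and similarity features fed to a tree-based regressor, can be sketched as follows; the feature set, answers, and grades below are illustrative assumptions, not the paper's data or features.

```python
# Illustrative sketch: hand-crafted overlap/length/similarity features plus a tree-based regressor.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def features(student, reference):
    s, r = set(student.lower().split()), set(reference.lower().split())
    overlap = len(s & r) / max(len(r), 1)
    length_ratio = len(student.split()) / max(len(reference.split()), 1)
    jaccard = len(s & r) / max(len(s | r), 1)
    return [overlap, length_ratio, jaccard]

refs = ["osmosis moves water across a membrane"] * 4
answers = ["water moves across a membrane by osmosis",
           "diffusion of water through a membrane",
           "plants grow towards light",
           "osmosis is water movement across membranes"]
scores = [5.0, 4.0, 0.0, 5.0]                       # toy human-assigned grades

X = np.array([features(a, r) for a, r in zip(answers, refs)])
model = GradientBoostingRegressor().fit(X, scores)
print(model.predict([features("water crosses the membrane via osmosis", refs[0])]))
```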

Keywords: artificial intelligence, intelligent systems, natural language processing, text mining

Procedia PDF Downloads 131
3695 Using “Eckel” Model to Measure Income Smoothing Practices: The Case of French Companies

Authors: Feddaoui Amina

Abstract:

Income smoothing represents an attempt on the part of a company's management to reduce variations in earnings through the manipulation of accounting principles. In this study, we aimed to measure income smoothing practices in a sample of 30 French joint-stock companies during the period 2007-2009. We used the dummy variables method and the Eckel model to measure income smoothing practices, and a binomial test in the SPSS program to confirm or refute our hypothesis. This study concluded that there are no statistically significant indicators of income smoothing practices in the studied sample of French companies during the period 2007-2009, so the income series of the sample is characterized by stability and non-volatility, without any intervention by management through accounting manipulation. However, this type of accounting manipulation should be taken into account, and efforts should be made by control bodies to apply the Eckel model and generalize its use at the global level.
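
In the Eckel model, a firm is classified as an income smoother when the coefficient of variation of the change in income is smaller than that of the change in sales. A minimal sketch with toy figures follows.

```python
# Minimal sketch of the Eckel classification: smoother if CV(delta income) / CV(delta sales) < 1.
import numpy as np

def cv(x):
    x = np.asarray(x, float)
    return np.std(x, ddof=1) / abs(np.mean(x))

sales = np.array([120.0, 131.0, 140.0, 152.0])      # yearly sales (toy data)
income = np.array([10.0, 10.6, 11.1, 11.8])         # yearly net income (toy data)

eckel = cv(np.diff(income)) / cv(np.diff(sales))
print(round(eckel, 3), "smoother" if eckel < 1 else "non-smoother")
```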

Keywords: income, smoothing, 'Eckel', French companies

Procedia PDF Downloads 149
3694 Network and Sentiment Analysis of U.S. Congressional Tweets

Authors: Chaitanya Kanakamedala, Hansa Pradhan, Carter Gilbert

Abstract:

Social media platforms, such as Twitter, are excellent datasets for understanding human interactions and sentiments. This report explores social dynamics among US Congressional members through a network analysis applied to a dataset of tweets spanning 2008 to 2017 from the 'US Congressional Tweets Dataset'. In this report, we perform a network analysis in which connections between users (edges) are established based on a similarity threshold: two users are connected if the tweets they post are similar. By utilizing the Natural Language Toolkit (NLTK) and NetworkX, we quantified tweet similarity and constructed a graph comprising various interconnected components, each representing a cluster of users with closely aligned content. We then perform sentiment analysis on each cluster to explore the prevalent emotions and opinions within these groups. Our findings reveal that, despite the initial expectation of distinct ideological divisions typically aligning with party lines, the analysis exposed a high degree of topical convergence across tweets from different political affiliations. The analysis performed in this report not only highlights the potential of social media as a tool for political communication but also suggests a layer of interaction that transcends traditional partisan boundaries, reflecting a complicated landscape of politics in the digital age.
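
An outline of the pipeline is sketched below, with TF-IDF cosine similarity standing in for the similarity computation and NLTK's VADER analyzer for sentiment; the tweets and the 0.5 threshold are placeholders, not the dataset or the report's exact settings.

```python
# Illustrative sketch: similarity-threshold graph over tweets, then per-component sentiment.
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from nltk.sentiment import SentimentIntensityAnalyzer   # requires the vader_lexicon resource

tweets = ["We must fix healthcare now", "Congress must fix healthcare now",
          "Tax cuts will grow the economy", "Tax cuts grow the economy fast"]

sim = cosine_similarity(TfidfVectorizer().fit_transform(tweets))
G = nx.Graph()
G.add_nodes_from(range(len(tweets)))
G.add_edges_from((i, j) for i in range(len(tweets)) for j in range(i + 1, len(tweets))
                 if sim[i, j] > 0.5)                     # similarity threshold

sia = SentimentIntensityAnalyzer()
for component in nx.connected_components(G):
    scores = [sia.polarity_scores(tweets[i])["compound"] for i in component]
    print(sorted(component), round(sum(scores) / len(scores), 3))
```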

Keywords: natural language processing, sentiment analysis, centrality analysis, topic modeling

Procedia PDF Downloads 32
3693 Passenger Flow Characteristics of Seoul Metropolitan Subway Network

Authors: Kang Won Lee, Jung Won Lee

Abstract:

Characterizing network flow is of fundamental importance for understanding the complex dynamics of networks, and the passenger flow characteristics of a subway network are highly relevant for effective transportation management in urban cities. In this study, the passenger flow of the Seoul metropolitan subway network is investigated and characterized through statistical analysis. The traditional betweenness centrality measure considers only the topological structure of the network and ignores transportation factors. This paper proposes a weighted betweenness centrality measure that incorporates monthly passenger flow volume. We apply the proposed measure to the Seoul metropolitan subway network, involving 493 stations and 16 lines. Several interesting insights about the network are derived from the new measure. Using the Kolmogorov-Smirnov test, we also find that the monthly passenger flow between any two stations follows a power-law distribution, while other traffic characteristics, such as congestion level and through-flow traffic, follow an exponential distribution.
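
How the flow volume enters the centrality is not specified in the abstract; one simple illustrative choice, an assumption rather than necessarily the authors' definition, is to use the inverse of the monthly flow as an edge distance so that heavily used links attract more shortest paths.

```python
# Illustrative sketch: plain vs. flow-aware betweenness centrality on a toy network.
import networkx as nx

edges = [("A", "B", 120_000), ("B", "C", 90_000), ("A", "C", 30_000), ("C", "D", 60_000)]
G = nx.Graph()
for u, v, flow in edges:
    G.add_edge(u, v, distance=1.0 / flow)          # inverse monthly passenger flow as distance

topological = nx.betweenness_centrality(G)                       # structure only
flow_weighted = nx.betweenness_centrality(G, weight="distance")  # flow-aware variant
print(topological, flow_weighted)
```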

Keywords: betweenness centrality, correlation coefficient, power-law distribution, Korea traffic DB

Procedia PDF Downloads 289
3692 Thermodynamics during the Deconfining Phase Transition

Authors: Amal Ait El Djoudi

Abstract:

A thermodynamical model of coexisting hadronic and quark-gluon plasma (QGP) phases is used to study the thermally driven deconfining phase transition occurring between the two phases. A color-singlet partition function is calculated for the QGP phase with two massless quarks, as in our previous work, but now the finite extensions of the hadrons are taken into account in the equation of state of the hadronic phase. In the present work, the finite-size effects on the system are examined by probing the behavior of some thermodynamic quantities, called response functions, such as the order parameter, the energy density, and their derivatives, over a range of temperatures around the transition at different volumes. It turns out that the finiteness of the system size results in a rounding of the transition and a smearing of all the singularities occurring in the thermodynamic limit, while the additional finite-size effect introduced by the requirement of exact color singletness involves a shift of the transition point. This shift, as well as the smearing of the transition region and the maxima of both the susceptibility and the specific heat, shows a scaling behavior with the volume characterized by scaling exponents. Another striking result is the large similarity noted between the behavior of these response functions and that of the cumulants of the probability density. This similarity is exploited to try to extract information concerning the occurring phase transition.

Keywords: equation of state, thermodynamics, deconfining phase transition, quark–gluon plasma (QGP)

Procedia PDF Downloads 426
3691 The Impact of Transformational Leadership and Interpersonal Interaction on Mentoring Function

Authors: Ching-Yuan Huang, Rhay-Hung Weng, Yi-Ting Chen

Abstract:

Mentoring functions improve new nurses' job performance, provide new nurses with support, and thereby reduce their turnover rate. This study explored the impact of transformational leadership and interpersonal interaction on mentoring functions. We employed a questionnaire survey to collect data and selected a sample of new nurses from three hospitals in Taiwan; a total of 306 valid surveys were obtained. Multiple regression analysis was conducted to test the study hypotheses. Inspirational motivation, idealized influence, and individualized consideration had a positive influence on the overall mentoring function, but intellectual stimulation had a positive influence on the career development function only. Perceived similarity and interaction frequency also had positive influences on mentoring functions. When the shift overlap rate exceeded 80%, the mentoring function was negatively affected. The transformational leadership of mentors thus improves mentoring functions among new staff nurses, and perceived similarity and interaction frequency between mentees and mentors also have a positive influence on mentoring functions. Managers should enhance the transformational leadership of mentors by designing leadership training and motivation programs. Furthermore, nursing managers should promote interaction between new staff nurses and their mentors, but the shift overlap rate should not exceed 80%.

Keywords: interpersonal interaction, mentoring function, mentor, new nurse, transformational leadership

Procedia PDF Downloads 330
3690 Heart Rate Variability as a Measure of Dairy Calf Welfare

Authors: J. B. Clapp, S. Croarkin, C. Dolphin, S. K. Lyons

Abstract:

Chronic pain or stress in farm animals impacts both their welfare and their productivity. Measuring chronic pain or stress through hormonal or behavioural changes can be problematic, because hormones are modulated by homeostatic mechanisms and observed behaviour can be highly subjective. We propose that heart rate variability (HRV) can quantify chronic pain or stress in farmed animals and represents a more robust and objective measure of their welfare.
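
For orientation, two standard time-domain HRV statistics computed from inter-beat (RR) intervals are sketched below; the values are synthetic, and the abstract does not specify which HRV parameters the authors use.

```python
# Minimal sketch of SDNN and RMSSD, two common time-domain HRV statistics.
import numpy as np

rr_ms = np.array([810, 795, 830, 820, 790, 805, 840, 815], float)   # RR intervals, ms

sdnn = np.std(rr_ms, ddof=1)                       # overall variability
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))      # short-term (vagally mediated) variability
print(round(sdnn, 1), round(rmssd, 1))
```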

Keywords: dairy calf, welfare, heart rate variability, non-invasive, biomonitor

Procedia PDF Downloads 597