Search results for: Face Processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2165

1385 The Experience of Iranian Architecture in Direction of Urban Passages and Forming of Urban Structures to Increase Climatic Comfort

Authors: N. Utaberta, N. Sharifi, M. Surat, A. I. Che-Ani, N.M. Tawil

Abstract:

Iran has diverse climates, and each has shaped distinct characteristics in its region. The effects of climatic factors on the lives of people living in the various regions of Iran are so extensive and intense that they cannot simply be ignored. In a large part of Iran known as the Central Plateau there is no precipitation for more than half of the year, and dry weather and the scarcity of fresh water pose an ever-present problem for the people of these regions, while in the north of Iran, on the southern shores of the Caspian Sea, people face 80% humidity caused by the sea and 2 meters of annual precipitation. This article reviews the past experience of the local architecture of Iran's various regions so that it can be used to reshape and redirect the urban areas and structures of Iran's current cities and provide environmental comfort with minimal use of fossil fuels.

Keywords: Urban Passage, Architecture in Iran, Urban Structure, Climatic Comfort

Downloads: 1623
1384 A Simple and Empirical Refraction Correction Method for UAV-Based Shallow-Water Photogrammetry

Authors: I GD Yudha Partama, A. Kanno, Y. Akamatsu, R. Inui, M. Goto, M. Sekine

Abstract:

The aerial photogrammetry of shallow water bottoms has the potential to be an efficient high-resolution survey technique for shallow-water topography, thanks to the advent of convenient UAVs and automatic image processing techniques (Structure-from-Motion (SfM) and Multi-View Stereo (MVS)). However, it suffers from a systematic overestimation of the bottom elevation due to light refraction at the air-water interface. In this study, we present an empirical method to correct for the effect of refraction after the usual SfM-MVS processing, using common software. The presented method utilizes the empirical relation between the measured true depth and the estimated apparent depth to generate an empirical correction factor. This correction factor is then used to convert the apparent water depth into a refraction-corrected (real-scale) water depth. To examine its effectiveness, we applied the method to two river sites and compared the RMS errors in the corrected bottom elevations with those obtained by three existing methods. The results show that the presented method is more effective than two of the existing methods: the method that applies no correction factor and the method that uses the refractive index of water (1.34) as the correction factor. In comparison with the remaining existing method, which adds an offset term after calculating the correction factor, the presented method performs better at Site 2 and worse at Site 1. However, we found this linear regression method to be unstable when the training data used for calibration are limited. It also suffers from a large negative bias in the correction factor when the estimated apparent water depth is affected by noise, according to our numerical experiment. Overall, the accuracy of a refraction correction method depends on various factors such as the location, image acquisition, and GPS measurement conditions. The most effective method can be selected by statistical model selection (e.g. leave-one-out cross-validation).
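
The correction-factor idea described above can be illustrated with a few lines of Python. This is only a minimal sketch, not the authors' implementation: the calibration values are invented, and the simple least-squares slope through the origin is an assumed form of the empirical relation between apparent and true depth.

```python
import numpy as np

# Calibration data: apparent depths from SfM-MVS and true depths measured in the field (metres).
# Values are illustrative only.
apparent_depth = np.array([0.20, 0.35, 0.48, 0.61, 0.80, 0.95])
true_depth     = np.array([0.27, 0.46, 0.65, 0.83, 1.07, 1.28])

# Empirical correction factor: least-squares slope of a line through the origin,
# i.e. the ratio that maps apparent depth onto true depth.
correction_factor = np.sum(apparent_depth * true_depth) / np.sum(apparent_depth ** 2)

def correct_depth(apparent):
    """Convert apparent (refraction-biased) depth to a refraction-corrected depth."""
    return correction_factor * np.asarray(apparent)

print(f"empirical factor: {correction_factor:.3f}  (cf. refractive index of water, 1.34)")
print(correct_depth([0.5, 1.0]))
```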

Keywords: Bottom elevation, multi-view stereo, river, structure-from-motion.

Downloads: 1577
1383 Watermark-based Counter for Restricting Digital Audio Consumption

Authors: Mikko Löytynoja, Nedeljko Cvejic, Tapio Seppänen

Abstract:

In this paper we introduce three watermarking methods that can be used to count the number of times that a user has played some content. The proposed methods are tested with audio content in our experimental system using the most common signal processing attacks. The test results show that the watermarking methods used enable the watermark to be extracted under the most common attacks with a low bit error rate.
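
As a rough illustration of the spread-spectrum idea mentioned in the keywords below, the sketch embeds a key-dependent pseudo-random sequence into an audio frame and detects it by correlation. It is a generic textbook scheme, not one of the three counting methods proposed in the paper; the frame length, embedding strength and detection threshold are assumptions.

```python
import numpy as np

rng = np.random.default_rng(seed=42)    # the seed plays the role of the watermark key
frame = rng.normal(size=16384)          # stand-in for one frame of audio samples
strength = 0.05                         # embedding strength (assumption)

# Key-dependent +/-1 spreading sequence shared by the embedder and the detector.
chips = rng.choice([-1.0, 1.0], size=frame.size)

# Embed: add the scaled spreading sequence to the host signal.
watermarked = frame + strength * chips

def detect(signal, chips, threshold=0.025):
    """Normalised correlation with the key-generated sequence; True if the mark is present."""
    corr = np.dot(signal, chips) / signal.size
    return corr > threshold, corr

present, corr = detect(watermarked, chips)
print(present, round(corr, 4))
```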

Keywords: Digital rights management, restricted usage, content protection, spread spectrum, audio watermarking.

Downloads: 1465
1382 Business Buyers’ Expectations in Buyer-Seller Encounters

Authors: Pia I. Hautamäki

Abstract:

Selling has changed. Selling has taken on aspects of relationship marketing, and the sales force plays a critical role in developing long-term relationships between buyers and sellers, which is seen to serve the company's targets and create success in the long run. The purpose of this study was to examine what really matters in buyer-seller encounters and to determine what expectations business buyers have. We studied 17 business buyers using qualitative interviews. We found that buyers appreciate encounters where the salesperson faces the buyer as the person he or she is, maps the buyer's real needs in order to improve the buyer's business, and builds up cooperation for a long-term relationship. This study shows that personality is a key element in satisfying business buyers' expectations.

Keywords: Business-to-Business, Business buyer-seller encounters, Business buyer, Expectations, Perceived similarity, Personal selling, Personality types.

Downloads: 2377
1381 Performance Analysis of the Subgroup Method for Collective I/O

Authors: Kwangho Cha, Hyeyoung Cho, Sungho Kim

Abstract:

As many scientific applications require large-scale data processing, the importance of parallel I/O has been increasingly recognized. Collective I/O is one of the notable features of parallel I/O and enables application programmers to easily handle large data volumes. In this paper we measured and analyzed the performance of original collective I/O and of the subgroup method, a way of using MPI collective I/O effectively. From the experimental results, we found that the subgroup method shows good performance for small data sizes.
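
A minimal sketch of the subgroup idea using mpi4py: the global communicator is split into smaller groups, and each group performs its own collective write. This only illustrates the concept under assumed group sizes and file layout; it is not the authors' benchmark code.

```python
# Run with e.g.: mpiexec -n 8 python subgroup_io.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

GROUP_SIZE = 4                      # processes per subgroup (assumption)
color = rank // GROUP_SIZE          # which subgroup this rank belongs to
subcomm = comm.Split(color, rank)   # collective I/O is then done per subgroup

# Each rank contributes a small block of data.
block = np.full(1024, rank, dtype=np.float64)

fname = f"output_group{color}.dat"  # one file per subgroup (assumption)
fh = MPI.File.Open(subcomm, fname, MPI.MODE_WRONLY | MPI.MODE_CREATE)

# Collective write: every rank in the subgroup participates,
# writing its block at a rank-dependent byte offset.
offset = subcomm.Get_rank() * block.nbytes
fh.Write_at_all(offset, block)
fh.Close()
```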

Keywords: Collective I/O, MPI, parallel file system.

Downloads: 1574
1380 Time Temperature Dependence of Long Fiber Reinforced Polypropylene Manufactured by Direct Long Fiber Thermoplastic Process

Authors: K. A. Weidenmann, M. Grigo, B. Brylka, P. Elsner, T. Böhlke

Abstract:

In order to reduce fuel consumption, the weight of automobiles has to be reduced. Fiber reinforced polymers offer the potential to reach this aim because of their high stiffness to weight ratio. Additionally, the use of fiber reinforced polymers in automotive applications has to allow for an economic large-scale production. In this regard, long fiber reinforced thermoplastics made by direct processing offer both mechanical performance and processability in injection moulding and compression moulding. The work presented in this contribution deals with long glass fiber reinforced polypropylene directly processed in compression moulding (D-LFT). For the use in automotive applications both the temperature and the time dependency of the materials properties have to be investigated to fulfill performance requirements during crash or the demands of service temperatures ranging from -40 °C to 80 °C. To consider both the influence of temperature and time, quasistatic tensile tests have been carried out at different temperatures. These tests have been complemented by high speed tensile tests at different strain rates. As expected, the increase in strain rate results in an increase of the elastic modulus which correlates to an increase of the stiffness with decreasing service temperature. The results are in good accordance with results determined by dynamic mechanical analysis within the range of 0.1 to 100 Hz. The experimental results from different testing methods were grouped and interpreted by using different time temperature shift approaches. In this regard, Williams-Landel-Ferry and Arrhenius approach based on kinetics have been used. As the theoretical shift factor follows an arctan function, an empirical approach was also taken into consideration. It could be shown that this approach describes best the time and temperature superposition for glass fiber reinforced polypropylene manufactured by D-LFT processing.
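
The two classical shift-factor models mentioned above can be written down compactly. The sketch below evaluates both for an assumed reference temperature of 23 °C and illustrative constants (the WLF "universal" constants and an assumed activation energy); the parameters actually fitted in the study are not reproduced here.

```python
import numpy as np

R = 8.314          # gas constant, J/(mol*K)
T_ref = 296.15     # reference temperature, 23 degC (assumption)

def log_aT_wlf(T, C1=17.44, C2=51.6):
    """WLF shift factor log10(aT); C1, C2 are the 'universal' constants (illustrative)."""
    return -C1 * (T - T_ref) / (C2 + (T - T_ref))

def log_aT_arrhenius(T, Ea=200e3):
    """Arrhenius shift factor log10(aT) with an assumed activation energy Ea in J/mol."""
    return (Ea / (np.log(10) * R)) * (1.0 / T - 1.0 / T_ref)

# A few service temperatures (in Kelvin) within the range discussed in the abstract.
T = np.array([0.0, 23.0, 50.0, 80.0]) + 273.15
print("WLF      :", np.round(log_aT_wlf(T), 2))
print("Arrhenius:", np.round(log_aT_arrhenius(T), 2))
```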

Keywords: Composite, long fiber reinforced thermoplastics, mechanical properties, dynamic mechanical analysis, time temperature superposition.

Downloads: 1699
1379 Academic Mobbing in Turkey

Authors: E. Yelgecen Tigrel, O. Kokalan

Abstract:

People at the workplace constantly face stress and feel it in their lives. There are many factors that create stress, and mobbing is one of them. Mobbing is psychological terror conducted systematically toward an individual by others at the same workplace. Mobbing has become a prominent subject in recent years in the U.S. and Europe. In Turkey it is a new concept, not because it does not occur, but because human nature does not readily allow admitting it. Mobbing is ignored by people, organizations and also the government in our country. The focus of this study is mobbing in Turkey, examined through workplace mobbing among Turkish academics. There are other studies about mobbing in Turkey, but none of them has studied academia. Because mobbing methods change according to sectors and occupations, it is important to analyze each sector to understand the methods used in mobbing and the reactions of victims to these actions. The concept is analyzed in detail before focusing on mobbing at universities. This paper is unique because there is no information about this specific subject in the Turkish literature. In this paper, both qualitative and quantitative methods are used to describe mobbing in the Turkish academic environment.

Keywords: Mobbing, Turkish academic environment, workplace problems

Downloads: 3762
1378 Structural and Optical Characterization of Silica@PbS Core–Shell Nanoparticles

Authors: A. Pourahmad, Sh. Gharipour

Abstract:

The present work describes the preparation and characterization of nanosized SiO2@PbS core-shell particles using a simple wet chemical route. This method utilizes silica sphere formation followed by lead sulphide shell layer formation assisted by the successive ionic layer adsorption and reaction method. The final product was characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), UV–vis spectroscopy, infrared spectroscopy (IR) and transmission electron microscopy (TEM). The morphological studies revealed uniformity in size distribution, with a core size of 250 nm and a shell thickness of 18 nm. The electron microscopy images also indicate the irregular morphology of the lead sulphide shell layer. The structural studies indicate the face-centered cubic system of the PbS shell, with no trace of impurities in the crystal structure.

Keywords: Core-shell, nanostructure, semiconductor, optical property, XRD.

Downloads: 1201
1377 Analysis of Socio-Cultural Obstacles for Dissemination of Nanotechnology from Iran's Agricultural Experts Perspective

Authors: S. M. Mirdamadi, S. Esmaeili, S. A. Tohidloo

Abstract:

The main purpose of this research was to analyze the socio-cultural obstacles to disseminating nanotechnology in Iran's agricultural sector. One hundred twenty-eight out of a total of 190 researchers, with different levels of expertise in and familiarity with nanotechnology, were randomly selected and completed questionnaires. Face validity was established through experts' suggestions and corrections, and reliability was assessed using Cronbach's alpha. The results of a factor analysis showed the variance explained by the different factors: cultural factors 19.475 percent, management factors 13.139 percent, information factors 11.277 percent, production factors 9.703 percent, social factors 9.267 percent, and attitude factors 8.947 percent. The results also indicated that socio-cultural factors were the most important obstacle to nanotechnology dissemination in the agricultural sector in Iran.

Keywords: Agriculture, Iran, nanotechnology, public perception, social-cultural obstacles.

Downloads: 1850
1376 Cloud Computing: Changing Cogitation about Computing

Authors: Mehrdad Mahdavi Boroujerdi, Soheil Nazem

Abstract:

Cloud computing is a new technology that helps us use the cloud to meet our computational needs. The cloud refers to a scalable network of computers that work together like the Internet. An important element of cloud computing is that we shift the processing, management, storage and deployment of our data from local machines into the cloud, which helps improve efficiency. Because it is a new technology, it has both advantages and disadvantages, which are scrutinized in this article. Some pioneers of this technology are then studied. We conclude that cloud computing will play an important role in our future lives.

Keywords: Cloud Computing, Grid Computing, Internet as a Platform, On-demand Computing, Software as a Service.

Downloads: 1631
1375 Production Planning and Scheduling and SME

Authors: M. Heck, H. Vettiger

Abstract:

Small and medium-sized enterprises (SME) are the backbone of central Europe’s economies and make a significant contribution to the gross domestic product. Production planning and scheduling (PPS) is still a crucial element in the manufacturing industries of the 21st century, even though this area of research is more than a century old. The topic of PPS is well researched, especially in the context of large enterprises in the manufacturing industry. However, the implementation of PPS methodologies within SME is largely unexamined. This work analyzes how PPS is implemented in SME, with a geographical focus on Switzerland and its vicinity. With more limited resources than large enterprises, SME face different challenges. The real problem areas of selected enterprises with regard to PPS are identified and evaluated. For the identified real-life problem areas of SME, clear and detailed recommendations are developed, covering concepts, best practices and the efficient use of PPS. Furthermore, the economic and entrepreneurial value for companies is outlined, together with the reasons why implementing the introduced recommendations is advisable.

Keywords: Central Europe, PPS, Production Planning, SME.

Downloads: 2177
1374 Effect of Pre-Plasma Potential on Laser Ion Acceleration

Authors: Djemai Bara, Mohamed Faouzi Mahboub, Djamila Bennaceur-Doumaz

Abstract:

In this work, we study the role of the preformed plasma created on the front face of a target irradiated by a high-intensity short-pulse laser in the ion acceleration process, modeled by the Target Normal Sheath Acceleration (TNSA) mechanism. This plasma is composed of cold ions governed by fluid equations and of non-thermal and trapped electrons whose densities are represented by a "Cairns-Gurevich" equation. The self-similar solution of the equations shows that electron trapping and the presence of non-thermal electrons in the pre-plasma both contribute to ion acceleration, as long as the proportion of energetic electrons is not too high. In the case where the majority of electrons are energetic, the electrons are accelerated directly by the ponderomotive force of the laser without the intermediary of an accelerating plasma wave.

Keywords: Cairns-Gurevich Equation, ion acceleration, plasma expansion, pre-plasma.

Downloads: 713
1373 The Use of Information Technologies in Special Education for Preparation of Individual Education Programs

Authors: Yasar Guneri Sahin, Mehmet Cudi Okur

Abstract:

In this presentation, we discuss the use of information technologies in the area of special education for teaching individuals with learning disabilities. Application software which was developed for this purpose is used to demonstrate the applicability of a database integrated information processing system to alleviate the burden of educators. The software allows the preparation of individualized education programs based on the predefined objectives, goals and behaviors.

Keywords: Special education, disabled individual, information technology, individual education programs.

Downloads: 1405
1372 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

This research aims to approximate the amount of daily rainfall using a pixel value data approach. Daily rainfall maps from the Thailand Meteorological Department for the period from January to December 2013 were used as the data in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
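
A pixel-value approach of this general kind can be sketched as a colour-to-rainfall lookup followed by an RMSE check against observed values. The colour legend, station pixels and gauge values below are purely illustrative assumptions, not the data or mapping of the study.

```python
import numpy as np

# Illustrative legend: map a rainfall-map colour (RGB) to a rainfall amount in mm.
legend = {
    (200, 220, 255): 5.0,    # light blue -> light rain
    (100, 150, 255): 20.0,   # blue       -> moderate rain
    (30,  60,  200): 50.0,   # dark blue  -> heavy rain
}
colors = np.array(list(legend.keys()), dtype=float)
amounts = np.array(list(legend.values()))

def pixel_to_rain(pixel_rgb):
    """Assign the rainfall value of the nearest legend colour to a map pixel."""
    d = np.linalg.norm(colors - np.asarray(pixel_rgb, dtype=float), axis=1)
    return amounts[np.argmin(d)]

# Approximate rainfall at a few station pixels and compare with gauge observations.
pixels = [(205, 222, 250), (95, 148, 250), (35, 65, 195)]
estimated = np.array([pixel_to_rain(p) for p in pixels])
observed = np.array([4.1, 22.5, 47.0])    # illustrative gauge data

rmse = np.sqrt(np.mean((estimated - observed) ** 2))
print(estimated, f"RMSE = {rmse:.3f} mm")
```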

Keywords: Daily rainfall, Image processing, Approximation, Pixel value data.

Downloads: 1756
1371 Improving Subjective Bias Detection Using Bidirectional Encoder Representations from Transformers and Bidirectional Long Short-Term Memory

Authors: Ebipatei Victoria Tunyan, T. A. Cao, Cheol Young Ock

Abstract:

Detecting subjectively biased statements is a vital task. This kind of bias, when present in text or other information dissemination media such as news, social media, scientific texts, and encyclopedias, can weaken trust in the information and stir conflicts amongst consumers. Subjective bias detection is also critical for many Natural Language Processing (NLP) tasks like sentiment analysis, opinion identification, and bias neutralization. Having a system that can adequately detect subjectivity in text will boost research in the above-mentioned areas significantly. It can also come in handy for platforms like Wikipedia, where the use of neutral language is important. The goal of this work is to identify subjectively biased language in text at the sentence level. With machine learning, we can solve complex AI problems, making it a good fit for the problem of subjective bias detection. A key step in this approach is to train a classifier based on BERT (Bidirectional Encoder Representations from Transformers) as the upstream model. BERT by itself can be used as a classifier; however, in this study, we use BERT as a data preprocessor as well as an embedding generator for a Bi-LSTM (Bidirectional Long Short-Term Memory) network incorporating an attention mechanism. This approach produces a deeper and better classifier. We evaluate the effectiveness of our model using the Wiki Neutrality Corpus (WNC), a benchmark dataset compiled from Wikipedia edits that removed various biased instances from sentences, and we compare our model with existing approaches. Experimental analysis indicates improved performance, as our model achieved state-of-the-art accuracy in detecting subjective bias. This study focuses on the English language, but the model can be fine-tuned to accommodate other languages.
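
A compact PyTorch sketch of the architecture described (BERT embeddings feeding a Bi-LSTM with a simple attention layer and a binary classifier). The layer sizes, the base checkpoint "bert-base-uncased", the frozen BERT weights and the additive-attention formulation are assumptions; this is not the authors' exact model.

```python
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class BertBiLSTMAttention(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", hidden=256, classes=2):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)   # embedding generator (frozen here)
        for p in self.bert.parameters():
            p.requires_grad = False
        self.lstm = nn.LSTM(self.bert.config.hidden_size, hidden,
                            batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)                # token-level attention scores
        self.fc = nn.Linear(2 * hidden, classes)            # neutral vs. subjective

    def forward(self, input_ids, attention_mask):
        emb = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        out, _ = self.lstm(emb)                              # (batch, seq, 2*hidden)
        scores = self.attn(out).squeeze(-1)                  # (batch, seq)
        scores = scores.masked_fill(attention_mask == 0, -1e9)
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        context = (weights * out).sum(dim=1)                 # attention-weighted sentence vector
        return self.fc(context)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertBiLSTMAttention()
enc = tokenizer(["The senator's ridiculous plan failed."],
                return_tensors="pt", padding=True, truncation=True)
logits = model(enc["input_ids"], enc["attention_mask"])
print(logits.softmax(dim=-1))   # class probabilities (untrained weights)
```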

Keywords: Subjective bias detection, machine learning, BERT–BiLSTM–Attention, text classification, natural language processing.

Downloads: 829
1370 64 bit Computer Architectures for Space Applications – A study

Authors: Niveditha Domse, Kris Kumar, K. N. Balasubramanya Murthy

Abstract:

More recent satellite projects/programs make extensive use of real-time embedded systems. 16-bit processors that meet the MIL-STD-1750 standard architecture have been used in on-board systems. Most space applications have been written in Ada. From a futuristic point of view, 32-bit/64-bit processors are needed in the area of spacecraft computing, and therefore an effort is desirable in the study and survey of 64-bit architectures for space applications. This will also result in significant technology development in terms of VLSI and software tools for Ada (as the legacy code is in Ada). There are several basic requirements for a special processor for this purpose. They include radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, higher memory and I/O bandwidth, predictability, a real-time operating system, and manufacturability of such processors. Further considerations include the selection of FPGA devices, selection of EDA tool chains, design flow, partitioning of the design, pin count, performance evaluation, timing analysis, etc. This project deals with a brief study of 32- and 64-bit processors readily available in the market and with designing/fabricating a 64-bit RISC processor named RISC MicroProcessor, with the added functionalities of an extended double-precision floating point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using an open core (the OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus to develop FPGA-based prototypes quickly. Commercial tools such as Xilinx ISE are also used for synthesis when appropriate.

Keywords: RISC MicroProcessor, RPC – RISC Processor Core, PBX – Processor to Block Interface part of the Interconnection Network, BPX – Block to Processor Interface part of the Interconnection Network, FPU – Floating Point Unit, SPU – Signal Processing Unit, WB – Wishbone Interface, CTU – Clock and Test Unit

Downloads: 2247
1369 Mammogram Image Size Reduction Using 16-8 bit Conversion Technique

Authors: Ayman A. AbuBaker, Rami S.Qahwaji, Musbah J. Aqel, Mohmmad H. Saleh

Abstract:

Two algorithms are proposed to reduce the storage requirements for mammogram images. The input image goes through a shrinking process that converts the 16-bit image to 8 bits using a pixel-depth conversion algorithm, followed by an enhancement process. The performance of the algorithms is evaluated objectively and subjectively. A 50% reduction in size is obtained with no loss of significant data in the breast region.
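
A minimal NumPy sketch of a pixel-depth conversion of this kind: the 16-bit grey levels are rescaled to the 8-bit range (here using the image's own minimum and maximum, an assumed mapping), and a simple percentile contrast stretch stands in for the enhancement step. It is an illustration, not the paper's two algorithms.

```python
import numpy as np

def to_8bit(img16):
    """Rescale a 16-bit mammogram to 8 bits using its own intensity range."""
    img16 = img16.astype(np.float64)
    lo, hi = img16.min(), img16.max()
    scaled = (img16 - lo) / (hi - lo) * 255.0
    return np.clip(np.round(scaled), 0, 255).astype(np.uint8)

def enhance(img8, low_pct=2, high_pct=98):
    """Very simple enhancement: percentile-based contrast stretch (illustrative only)."""
    lo, hi = np.percentile(img8, [low_pct, high_pct])
    stretched = (img8.astype(np.float64) - lo) / max(hi - lo, 1) * 255.0
    return np.clip(stretched, 0, 255).astype(np.uint8)

# Synthetic 16-bit image standing in for a mammogram.
img16 = (np.random.default_rng(0).normal(30000, 5000, size=(256, 256))
         .clip(0, 65535).astype(np.uint16))
img8 = enhance(to_8bit(img16))
print(img16.nbytes, "->", img8.nbytes, "bytes (a 50% reduction in storage)")
```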

Keywords: Breast cancer, Image processing, Image reduction, Mammograms, Image enhancement

Downloads: 2034
1368 Taiwan’s Democratic Institutions: The Electoral Rise and Recall of Kuomintang’s Han Kuo-yu Mayor

Authors: Ryan Brading

Abstract:

The results of Taiwan’s presidential election, which took place on 11 January 2020, were alarming for the Kuomintang (KMT). A party that was once the pillar of Taiwan’s institutional apparatus is now losing its direction. Since 2016, the KMT’s inability to construct a winning presidential election campaign strategy has made its Chinese ancestry an obstacle in Taiwan’s vibrant and transparent democracy. The emergence of the little-known legislator Han Kuo-yu as the leadership alternative opened the possibility of reigniting the party. Han’s victory in the Kaohsiung mayoral election in November 2018 provided hope that he could also win the presidency. Wrongly described as a populist, Han was nevertheless defeated in the January 2020 presidential race. This article analyses why Han is not a populist, his triumph in Kaohsiung, his humiliating defeat in the presidential race, and his complete ‘loss of face’ when the people of Kaohsiung democratically ousted him from the mayoral post on 6 June 2020.

Keywords: Populism, ‘1992 Consensus’, Taiwan, youth vote, Han’s recall.

Downloads: 389
1367 An Analysis of Classification of Imbalanced Datasets by Using Synthetic Minority Over-Sampling Technique

Authors: Ghada A. Alfattni

Abstract:

Analysing imbalanced datasets is one of the challenges that practitioners in the machine learning field face. Many studies have been carried out to determine the effectiveness of the synthetic minority over-sampling technique (SMOTE) in addressing this issue. The aim of this study was therefore to compare the effectiveness of SMOTE across different models on imbalanced datasets. Three classification models (Logistic Regression, Support Vector Machine and Nearest Neighbour) were tested with multiple datasets; the same datasets were then oversampled using SMOTE and applied again to the three models to compare the differences in performance. The results of the experiments show that the highest number of nearest neighbours gives the lowest error rates.
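
The experimental setup described can be approximated with scikit-learn and imbalanced-learn as below. The synthetic dataset, the 90/10 class ratio and the model hyperparameters are assumptions used only to show the before/after SMOTE comparison across the three classifier types.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import balanced_accuracy_score
from imblearn.over_sampling import SMOTE

# Synthetic imbalanced dataset (about 10% minority class) -- an assumption for illustration.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Support Vector Machine": SVC(),
    "Nearest Neighbour": KNeighborsClassifier(n_neighbors=5),
}

# Oversample only the training split, then compare each model with and without SMOTE.
X_sm, y_sm = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
for name, model in models.items():
    model.fit(X_tr, y_tr)
    err_plain = 1 - balanced_accuracy_score(y_te, model.predict(X_te))
    model.fit(X_sm, y_sm)
    err_smote = 1 - balanced_accuracy_score(y_te, model.predict(X_te))
    print(f"{name:24s} error (plain) = {err_plain:.3f}   error (SMOTE) = {err_smote:.3f}")
```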

Keywords: Imbalanced datasets, SMOTE, machine learning, logistic regression, support vector machine, nearest neighbour.

Downloads: 1313
1366 Deep-Learning Based Approach to Facial Emotion Recognition Through Convolutional Neural Network

Authors: Nouha Khediri, Mohammed Ben Ammar, Monji Kherallah

Abstract:

Recently, facial emotion recognition (FER) has become increasingly essential to understanding the state of the human mind. However, accurately classifying emotion from the face is a challenging task. In this paper, we present a facial emotion recognition approach named CV-FER that benefits from deep learning, especially CNNs and VGG16. First, the data are pre-processed with data cleaning and data rotation. Then, we augment the data and proceed to our FER model, which contains five convolutional layers and five pooling layers. Finally, a softmax classifier is used in the output layer to recognize emotions. The paper also reviews prior work on facial emotion recognition based on deep learning. Experiments show that our model outperforms the other methods using the same FER2013 database and yields a recognition rate of 92%. We also put forward some suggestions for future work.
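
A Keras sketch of a network with five convolutional layers, five pooling layers and a softmax output, as described. The filter counts, the 48x48 greyscale input (the FER2013 image size) and the seven emotion classes are assumptions; this is not the exact CV-FER architecture.

```python
from tensorflow.keras import layers, models

def build_cv_fer(input_shape=(48, 48, 1), num_classes=7):
    """Five Conv2D + five MaxPooling2D blocks followed by a softmax classifier."""
    model = models.Sequential()
    model.add(layers.Input(shape=input_shape))
    for filters in (32, 64, 128, 128, 256):          # filter counts are illustrative
        model.add(layers.Conv2D(filters, (3, 3), padding="same", activation="relu"))
        model.add(layers.MaxPooling2D((2, 2)))
    model.add(layers.Flatten())
    model.add(layers.Dense(128, activation="relu"))
    model.add(layers.Dense(num_classes, activation="softmax"))  # emotion probabilities
    return model

model = build_cv_fer()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
# Training would use the FER2013 images after cleaning, rotation and augmentation, e.g.
# model.fit(train_images, train_labels, validation_split=0.1, epochs=30)
```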

Keywords: CNN, deep-learning, facial emotion recognition, machine learning.

Downloads: 708
1365 A Linearization and Decomposition Based Approach to Minimize the Non-Productive Time in Transfer Lines

Authors: Hany Osman, M. F. Baki

Abstract:

In this paper we address the balancing problem of transfer lines, seeking the optimal line balance that minimizes the non-productive time. We focus on the tool change time and the face orientation change time, both of which influence the makespan. We consider machine capacity limitations and technological constraints associated with the manufacturing process of automotive cylinder heads. The problem is represented by a mixed integer programming model that aims at distributing the design features to workstations and sequencing the machining processes with minimum non-productive time. The proposed model is solved by an algorithm built on linearization schemes and a Benders decomposition approach. The experiments show the efficiency of the algorithm in reaching the exact solution of small and medium problem instances in reasonable time.

Keywords: Transfer line balancing, Benders' decomposition, Linearization.

Downloads: 1729
1364 Contextual SenSe Model: Word Sense Disambiguation Using Sense and Sense Value of Context Surrounding the Target

Authors: Vishal Raj, Noorhan Abbas

Abstract:

Ambiguity in NLP (Natural Language Processing) refers to the ability of a word, phrase, sentence, or text to have multiple meanings. This results in various kinds of ambiguity, such as lexical, syntactic, semantic, anaphoric and referential ambiguity. This study focuses mainly on solving the issue of lexical ambiguity. Word Sense Disambiguation (WSD) is an NLP technique that aims to resolve lexical ambiguity by determining the correct meaning of a word within a given context. Most WSD solutions rely on words for training and testing, but we have used lemma and Part of Speech (POS) tokens of words instead. The lemma adds generality, and the POS adds word properties to the token. We have designed a method to create an affinity matrix that captures the affinity between any pair of lemma_POS tokens (a token in which the lemma and POS of a word are joined by an underscore) in a given training set. Additionally, we have devised an algorithm to create sense clusters of tokens using the affinity matrix under a hierarchy of the POS of the lemma. Furthermore, three different mechanisms to predict the sense of the target word using the affinity/similarity values are devised. Each contextual token contributes some value to each sense of the target word, and whichever sense receives the highest value becomes the sense of the target word. Contextual tokens therefore play a key role in creating the sense clusters and predicting the sense of the target word; hence, the model is named the Contextual SenSe Model (CSM). CSM is notably simple and easy to interpret, in contrast to contemporary deep learning models, which are intricate, time-intensive and hard to explain. CSM is trained on the SemCor training data and evaluated on the SemEval test dataset. The results indicate that, despite the simplicity of the method, it achieves promising results when compared to the Most Frequent Sense (MFS) model.
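
The affinity-matrix idea can be illustrated with a toy sketch: affinity is approximated here as sentence-level co-occurrence counts of lemma_POS tokens, and the sense of a target word is predicted by letting each context token add its affinity with the tokens of each candidate sense cluster. The corpus, the hand-given sense clusters (which CSM induces automatically) and the scoring rule are simplifying assumptions, not the SemCor setup or the paper's three prediction mechanisms.

```python
from collections import Counter
from itertools import combinations

# Toy training corpus: each sentence is already lemmatized and POS-tagged (lemma_POS tokens).
corpus = [
    ["deposit_NOUN", "money_NOUN", "bank_NOUN", "account_NOUN"],
    ["bank_NOUN", "loan_NOUN", "interest_NOUN", "money_NOUN"],
    ["river_NOUN", "bank_NOUN", "water_NOUN", "fish_NOUN"],
    ["sit_VERB", "river_NOUN", "bank_NOUN", "grass_NOUN"],
]

# Affinity matrix as co-occurrence counts within a sentence (a simplified affinity measure).
affinity = Counter()
for sent in corpus:
    for a, b in combinations(set(sent), 2):
        affinity[frozenset((a, b))] += 1

# Hand-given sense clusters for the ambiguous token (illustrative stand-ins).
sense_clusters = {
    "bank#finance": {"money_NOUN", "loan_NOUN", "account_NOUN", "interest_NOUN"},
    "bank#river":   {"river_NOUN", "water_NOUN", "fish_NOUN", "grass_NOUN"},
}

def predict_sense(target, context):
    """Each context token adds its affinity with the cluster tokens; the highest total wins."""
    scores = {}
    for sense, cluster in sense_clusters.items():
        scores[sense] = sum(affinity[frozenset((c, t))]
                            for c in context for t in cluster if c != t)
    return max(scores, key=scores.get), scores

print(predict_sense("bank_NOUN", ["money_NOUN", "deposit_NOUN"]))
print(predict_sense("bank_NOUN", ["water_NOUN", "fish_NOUN"]))
```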

Keywords: Word Sense Disambiguation, WSD, Contextual SenSe Model, Most Frequent Sense, part of speech, POS, Natural Language Processing, NLP, OOV, out of vocabulary, ELMo, Embeddings from Language Model, BERT, Bidirectional Encoder Representations from Transformers, Word2Vec, lemma_POS, Algorithm.

Downloads: 380
1363 Dynamic Web-Based 2D Medical Image Visualization and Processing Software

Authors: Abdelhalim. N. Mohammed, Mohammed. Y. Esmail

Abstract:

Over recent decades, medical imaging has been dominated by the use of costly film media for the review and archiving of medical investigations. However, due to developments in network technologies and the common acceptance of the Digital Imaging and Communications in Medicine (DICOM) standard, another approach based on the World Wide Web has emerged. Web technologies have been used successfully in telemedicine applications, and here the combination of web technologies with DICOM is used to design a web-based, open-source DICOM viewer. The web server allows the query and retrieval of images, and the images are viewed and manipulated inside a web browser without the need to preinstall any software. The dynamic page for medical image visualization and processing was created using JavaScript and HTML5. The XAMPP ‘Apache’ server is used to create a local web server for testing and deployment of the dynamic site. The web-based viewer is connected to multiple devices through a local area network (LAN) to distribute the images inside healthcare facilities. The system offers several advantages over ordinary picture archiving and communication systems (PACS): it is easy to install and maintain, platform-independent, allows images to be displayed and manipulated efficiently, and is user-friendly and easy to integrate with an existing system that already makes use of web technologies. A wavelet-based image compression technique is applied, in which the 2-D discrete wavelet transform is used to decompose the image; the wavelet coefficients are then thresholded and transmitted with entropy encoding to decrease transmission time, storage cost and capacity. The performance of the compression was estimated using image quality metrics such as the mean square error (MSE), peak signal-to-noise ratio (PSNR) and compression ratio (CR), which reached 83.86% when the ‘coif3’ wavelet filter was used.
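
A minimal PyWavelets sketch of the compression step described: 2-D wavelet decomposition with 'coif3', hard thresholding of the detail coefficients, reconstruction, and the MSE/PSNR metrics. The test image, decomposition level and threshold are assumptions, and the "compression ratio" here is simply the share of zeroed coefficients rather than the entropy-coded figure reported in the paper.

```python
import numpy as np
import pywt

def compress(img, wavelet="coif3", level=2, thresh=20.0):
    """2-D DWT, hard-threshold the detail coefficients, reconstruct, report quality metrics."""
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    new_details = [tuple(pywt.threshold(d, thresh, mode="hard") for d in lvl) for lvl in details]
    rec = pywt.waverec2([approx] + new_details, wavelet)[:img.shape[0], :img.shape[1]]

    mse = np.mean((img.astype(float) - rec) ** 2)
    psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
    n_total = approx.size + sum(d.size for lvl in new_details for d in lvl)
    n_zero = sum(np.count_nonzero(d == 0) for lvl in new_details for d in lvl)
    zero_share = 100.0 * n_zero / n_total     # coefficients discarded by thresholding
    return rec, mse, psnr, zero_share

# Smooth synthetic 8-bit image standing in for a medical slice.
x = np.linspace(0, 255, 256)
img = (np.add.outer(x, x) / 2).astype(np.uint8)
_, mse, psnr, zero_share = compress(img)
print(f"MSE={mse:.2f}  PSNR={psnr:.2f} dB  zeroed coefficients={zero_share:.2f}%")
```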

Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN.

Downloads: 793
1362 Directing the Forensic Investigation of a Catastrophic Structure Collapse: The Jacksonville Parking Garage Collapse

Authors: W. C. Bracken

Abstract:

This paper discusses the forensic investigation of a fatality-involved catastrophic structure collapse and the special challenges faced when tasked with directing such an effort. While the paper discusses the investigation’s findings and the outcome of the event, its primary focus is on the challenges of directing a forensic investigation that requires coordinating with governmental oversight while also accommodating multiple parties’ investigative teams. In particular, the challenges discussed in this paper include maintaining on-site safety and operations while accommodating outside investigators’ interests. In addition, the paper discusses unique challenges that one may face, such as what to do about unethical conduct by interested parties’ investigative teams, “off the record” sharing of information, and clandestinely transmitted evidence.

Keywords: Catastrophic structure collapse, collapse investigation, Jacksonville parking garage collapse, forensic investigation.

Downloads: 2094
1361 Statistical Optimization of the Enzymatic Saccharification of the Oil Palm Empty Fruit Bunches

Authors: Rashid S. S., Alam M. Z.

Abstract:

A statistical optimization of the saccharification process of oil palm empty fruit bunches (EFB) was studied. The statistical analysis was done by applying a face-centered central composite design (FCCCD) under response surface methodology (RSM). In this investigation, the EFB dose, enzyme dose and saccharification period were examined, and a maximum reducing sugar yield of 53.45% (w/w) was obtained with 4% (w/v) EFB and 10% (v/v) enzyme after 120 hours of incubation. It can be calculated that the conversion rate of the cellulose content of the substrate is more than 75% (w/w), which can be considered a remarkable achievement. All the linear, quadratic and interaction coefficients were found to be highly significant, other than two coefficients, one quadratic and one interaction coefficient. The coefficient of determination (R²) is 0.9898, which confirms a satisfactory fit and indicates that approximately 98.98% of the variability in the dependent variable, the saccharification of EFB, can be explained by this model.
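
For reference, the coded design points of a face-centered central composite design in three factors (here labelled EFB dose, enzyme dose and time) can be generated directly: in an FCCCD the axial points sit on the faces of the cube (alpha = 1). The number of centre replicates below is an assumption, not the study's actual run plan.

```python
from itertools import product

factors = ["EFB dose", "enzyme dose", "time"]
k = len(factors)

# Factorial (cube) points: all combinations of coded levels -1 and +1.
cube = list(product([-1, 1], repeat=k))

# Axial (star) points: for a face-centred design alpha = 1, so they lie on the cube faces.
axial = []
for i in range(k):
    for a in (-1, 1):
        pt = [0] * k
        pt[i] = a
        axial.append(tuple(pt))

n_center = 6                                   # centre replicates (assumption)
center = [(0, 0, 0)] * n_center

design = cube + axial + center
print(f"{len(design)} runs: {len(cube)} factorial + {len(axial)} axial + {n_center} centre")
for run in design[:10]:
    print(dict(zip(factors, run)))
```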

Keywords: Face centered central composite design (FCCCD), Liquid state bioconversion (LSB), Palm oil mill effluent, Trichoderma reesei RUT C-30.

Downloads: 2250
1360 Analysis of Complex Quadrature Mirror Filter Banks

Authors: Chimin Tsai

Abstract:

This work consists of three parts. First, the alias-free condition for the conventional two-channel quadrature mirror filter bank is analyzed using complex arithmetic. Second, the approach developed in the first part is applied to the complex quadrature mirror filter bank. Accordingly, the structure is simplified and the theory is easier to follow. Finally, a new class of complex quadrature mirror filter banks is proposed. Interesting properties of this new structure are also discussed.
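
For reference, the standard alias term of the two-channel filter bank and the classical QMF choice that cancels it can be written as follows (these are the textbook real-coefficient relations, not the complex-arithmetic derivation of the paper):

```latex
\hat{X}(z) = \underbrace{\tfrac{1}{2}\bigl[H_0(z)F_0(z) + H_1(z)F_1(z)\bigr]}_{T(z)} X(z)
           + \underbrace{\tfrac{1}{2}\bigl[H_0(-z)F_0(z) + H_1(-z)F_1(z)\bigr]}_{A(z)} X(-z)
```

The bank is alias-free when $A(z) = 0$, which is guaranteed by choosing $F_0(z) = H_1(-z)$ and $F_1(z) = -H_0(-z)$; with the QMF constraint $H_1(z) = H_0(-z)$ this reduces to $F_0(z) = H_0(z)$ and $F_1(z) = -H_0(-z)$.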

Keywords: Aliasing cancellation, complex signal processing, polyphase realization, quadrature mirror filter banks.

Downloads: 2274
1359 Revisiting Distributed Protocols for Mobility at the Application Layer

Authors: N. Nouali, H. Drias, A. Doucet

Abstract:

For more than a decade, many proposals and standards have been designed to deal with mobility issues; however, solutions based on them still have some serious limitations. In this paper we discuss the possibility of handling mobility at the application layer. We do this while revisiting the conventional implementation of the Two-Phase Commit (2PC) protocol, which is a fundamental asset of transactional technology for ensuring the consistent commitment of distributed transactions. The solution is based on an execution framework providing an efficient extension that is aware of mobility and preserves the 2PC principle.
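
As background, the sketch below shows the conventional 2PC coordinator logic being revisited here: a prepare phase collecting votes, then a commit only on a unanimous yes, otherwise an abort. It is a plain in-memory illustration with invented class names and does not include the mobility-aware extension proposed in the paper.

```python
from enum import Enum

class Vote(Enum):
    YES = "yes"
    NO = "no"

class Participant:
    """A minimal participant that votes in the prepare phase and applies the decision."""
    def __init__(self, name, can_commit=True):
        self.name, self.can_commit, self.state = name, can_commit, "init"

    def prepare(self):
        self.state = "prepared" if self.can_commit else "aborted"
        return Vote.YES if self.can_commit else Vote.NO

    def commit(self):
        self.state = "committed"

    def abort(self):
        self.state = "aborted"

def two_phase_commit(participants):
    # Phase 1: ask every participant to prepare and collect the votes.
    votes = [p.prepare() for p in participants]
    # Phase 2: commit only on a unanimous YES, otherwise abort everywhere.
    if all(v is Vote.YES for v in votes):
        for p in participants:
            p.commit()
        return "committed"
    for p in participants:
        p.abort()
    return "aborted"

nodes = [Participant("db1"), Participant("db2"), Participant("mobile-client", can_commit=False)]
print(two_phase_commit(nodes), [(p.name, p.state) for p in nodes])
```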

Keywords: Application layer, distributed mobile protocols, mobility management, mobile transaction processing.

Downloads: 1612
1358 En-Face Optical Coherence Tomography and Fluorescence in Evaluation of Orthodontic Interfaces

Authors: R. O. Rominu, C. Sinescu, D.M. Pop, M. Hughes, A. Bradu, M. Rominu, A. Gh. Podoleanu

Abstract:

Bonding has become a routine procedure in several dental specialties – from prosthodontics to conservative dentistry and even orthodontics. In many of these fields it is important to be able to investigate the bonded interfaces to assess their quality. All currently employed investigative methods are invasive, meaning that samples are destroyed in the testing procedure and cannot be used again. We have investigated the interface between human enamel and bonded ceramic brackets non-invasively, introducing a combination of new investigative methods – optical coherence tomography (OCT), fluorescence OCT and confocal microscopy (CM). Brackets were conventionally bonded on conditioned buccal surfaces of teeth. The bonding was assessed using these methods. Three dimensional reconstructions of the detected material defects were developed using manual and semi-automatic segmentation. The results clearly prove that OCT, fluorescence OCT and CM are useful in orthodontic bonding investigations.

Keywords: Optical coherence tomography, Confocal Microscopy, Orthodontic Bonding.

Downloads: 1673
1357 Algorithm of Measurement of Noise Signal Power in the Presence of Narrowband Interference

Authors: Alexey V. Klyuev, Valery P. Samarin, Viktor F. Klyuev

Abstract:

An algorithm for measuring the power of the components of an input mixture of a noise signal and narrowband interference is considered, based on functional transformations of the input mixture in the post-detection processing channel. The efficiency of the algorithm has been analyzed for different interference-to-signal ratios. The performance features of the algorithm have been explored through numerical experiments.
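
To make the measurement problem concrete, the sketch below estimates the noise power of such a mixture by masking the narrowband spectral peak in a Welch PSD estimate. This is a generic spectral-masking approach chosen for illustration; it is not the functional-transformation algorithm of the paper, and the signal parameters and threshold are assumptions.

```python
import numpy as np
from scipy.signal import welch

fs = 8000.0
rng = np.random.default_rng(0)
t = np.arange(int(2 * fs)) / fs

noise = rng.normal(0, 1.0, t.size)                    # noise signal, power = 1.0
interference = 2.0 * np.sin(2 * np.pi * 1200.0 * t)   # narrowband interference
x = noise + interference                              # input mixture

# Welch PSD of the mixture; the interference appears as a narrow spectral peak.
f, pxx = welch(x, fs=fs, nperseg=1024)

# Mask bins dominated by the narrowband component (simple median-based threshold)
# and fill them with the median noise level before integrating the PSD.
mask = pxx < 10 * np.median(pxx)
noise_power = np.trapz(np.where(mask, pxx, np.median(pxx)), f)

print(f"estimated noise power: {noise_power:.3f} (true value 1.0)")
```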

Keywords: Noise signal, continuous narrowband interference, signal power, spectrum width, detection.

Downloads: 1396
1356 BPR Effect on ERP Implementation: a Comparative Case Study

Authors: Turan Erman Erkan

Abstract:

Business Process Reengineering (BPR) is an essential tool before an information system project implementation. Enterprise Resource Planning (ERP) projects definitely require the standardization and fixation of business processes from customer order to shipment. Therefore, ERP implementations are well proven to be coupled with BPR, although the extent and timing of BPR with respect to the ERP implementation differ. This study aims at analyzing the effects of BPR on ERP implementation success. Based on two Turkish ERP implementations in the pharmaceutical sector, a comparative study is performed. One of the ERP implementations took place after a BPR implementation, whereas the other was carried out without a prior BPR application. Both implementations were realized with the same consultant team, with the case that had the prior BPR implementation going live first. The results of the case study reveal that if business processes are not optimized and improved before an ERP implementation, the live ERP system will face mismatches between the actual processes and the processes automated by ERP. This suggests a definite precedence relationship between BPR and ERP applications.

Keywords: Business Process Reengineering, Enterprise Resource Planning

Downloads: 4841