Search results for: Analytic Network Process (ANP)
3418 The modeling of Brand Loyalty in the Brewing Market in Poland
Authors: Honorata Howaniec
Abstract:
Brand loyalty is a strategic asset of a company. In an era of intense competition, having loyal customers decides the market superiority of enterprises. Building buyer loyalty, however, is a lengthy process and requires an appropriate business strategy, preceded by proper market research. The purpose of the paper is to present the concept of brand loyalty, the creation of customer loyalty, and the benefits and determinants of loyalty, using the example of the brewing market in Poland.
Keywords: brand, brand loyalty, brewery market.
3417 Comparison between Turbo Code and Convolutional Product Code (CPC) for Mobile WiMAX
Authors: Ahmed Ebian, Mona Shokair, Kamal Awadalla
Abstract:
Mobile WiMAX is a broadband wireless solution that enables convergence of mobile and fixed broadband networks through a common wide-area broadband radio access technology and flexible network architecture. It adopts Orthogonal Frequency Division Multiple Access (OFDMA) for improved multi-path performance in Non-Line-Of-Sight (NLOS) environments. Scalable OFDMA (SOFDMA) is introduced in IEEE 802.16e [1]. The WiMAX system can use one of several types of channel coding, but the mandatory channel coding scheme is based on binary non-recursive Convolutional Coding (CC). There are several other optional channel coding schemes, such as block turbo codes, convolutional turbo codes, and low-density parity check (LDPC) codes. In this paper, the performance of WiMAX using turbo code is compared with that using convolutional product code (CPC) [2], and a combination of the two is also evaluated. The CPC gives good results at different SNR values compared to both the turbo system and the combination. For example, at a BER of 10^-2 with 128 subcarriers, the improvement in SNR is approximately 3 dB over the turbo code and approximately 2 dB over the combination. Several results are obtained for different modulation schemes (16QAM and 64QAM) and different numbers of subcarriers (128 and 512).
Keywords: Turbo Code, Convolutional Product Code (CPC).
3416 An Anomaly Detection Approach to Detect Unexpected Faults in Recordings from Test Drives
Authors: Andreas Theissler, Ian Dear
Abstract:
In the automotive industry, test drives are conducted during the development of new vehicle models or as part of the quality assurance of series-production vehicles. The communication on the in-vehicle network, data from external sensors, and internal data from the electronic control units are recorded by automotive data loggers during the test drives. The recordings are used for fault analysis. Since the resulting data volume is tremendous, manually analysing each recording in great detail is not feasible. This paper proposes to use machine learning to support domain experts by preventing them from contemplating irrelevant data and instead pointing them to the relevant parts of the recordings. The underlying idea is to learn the normal behaviour from available recordings, i.e. a training set, and then to autonomously detect unexpected deviations and report them as anomalies. The one-class support vector machine "support vector data description" (SVDD) is utilised to calculate distances of feature vectors. SVDDSUBSEQ is proposed as a novel approach that allows subsequences in multivariate time series data to be classified. The approach detects unexpected faults without modelling effort, as shown by experimental results on recordings from test drives.
Keywords: Anomaly detection, fault detection, test drive analysis, machine learning.
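As a rough illustration of the idea behind SVDDSUBSEQ described in the abstract above, the sketch below (our own assumption, not the authors' code) slides a window over a multivariate signal and scores each subsequence with a one-class SVM; scikit-learn's OneClassSVM is used here as a stand-in for a true SVDD, and the signals are synthetic placeholders for real test-drive recordings.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def subsequences(signal, window):
    # Flatten each sliding window of a (time x channels) signal into one feature vector.
    return np.array([signal[i:i + window].ravel()
                     for i in range(len(signal) - window + 1)])

# Hypothetical recordings: 'train' is assumed fault-free, 'test' may contain anomalies.
rng = np.random.default_rng(0)
train = rng.normal(size=(1000, 3))
test = np.vstack([rng.normal(size=(200, 3)), rng.normal(loc=4.0, size=(20, 3))])

window = 10
model = OneClassSVM(kernel="rbf", nu=0.01, gamma="scale").fit(subsequences(train, window))

# Negative decision values indicate subsequences that deviate from the learned normal behaviour.
scores = model.decision_function(subsequences(test, window))
anomalous_windows = np.where(scores < 0)[0]
print(f"{len(anomalous_windows)} of {len(scores)} windows flagged as anomalous")
```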
3415 SPA-VNDN: Enhanced Smart Parking Application by Vehicular Named Data Networking
Authors: Bassma Aldahlan, Zongming Fei
Abstract:
Recently, there has been great interest in smart parking applications. These applications are enhanced by a vehicular ad-hoc network, which helps drivers find and reserve suitable parking spaces ahead of their arrival. Named Data Networking (NDN) is a future Internet architecture that benefits vehicular ad-hoc networks because of its clean-slate design and pull-based communication model. In this paper, we propose an NDN-based framework for smart parking that involves a fog computing architecture. The proposed application has two main directions: first, we allow drivers to query the number of parking spaces in a particular parking lot; second, we introduce a technique that enables drivers to make intelligent reservations before their arrival time. We also introduce a "push-based" model supporting the NDN-based framework for smart parking applications. To evaluate the proposed solution's performance, we analyzed the function for finding parking lots with available parking spaces and the function for reserving a parking space. Our system showed high performance in terms of response time and push overhead, and the proposed reservation application performed better than the baseline approach.
Keywords: Cloud Computing, Vehicular Named Data Networking, Smart Parking Applications, Fog Computing.
3414 Greek Compounds: A Challenging Case for the Parsing Techniques of PC-KIMMO v.2
Authors: Angela Ralli, Eleni Galiotou
Abstract:
In this paper we describe the recognition process of Greek compound words using the PC-KIMMO software. We try to show certain limitations of the system with respect to the principles of compound formation in Greek. Moreover, we discuss the computational processing of phenomena such as stress and syllabification which are indispensable for the analysis of such constructions and we try to propose linguistically-acceptable solutions within the particular system.
Keywords: Morpho-phonological parsing, compound words, two-level morphology, natural language processing.
3413 Context Aware Anomaly Behavior Analysis for Smart Home Systems
Authors: Zhiwen Pan, Jesus Pacheco, Salim Hariri, Yiqiang Chen, Bozhi Liu
Abstract:
The Internet of Things (IoT) will lead to the development of advanced Smart Home services that are pervasive, cost-effective, and can be accessed by home occupants from anywhere and at any time. However, advanced smart home applications will introduce grand security challenges due to the increase in the attack surface. Current approaches do not handle cybersecurity from a holistic point of view; hence, a systematic cybersecurity mechanism needs to be adopted when designing smart home applications. In this paper, we present a generic intrusion detection methodology to detect and mitigate anomalous behaviors occurring in Smart Home Systems (SHS). By utilizing our Smart Home Context Data Structure, the heterogeneous information and services acquired from the SHS are mapped into context attributes that describe the context of smart home operation precisely and accurately. Runtime models describing the usage patterns of home assets are developed based on characterization functions. Finally, a threat-aware action management methodology is proposed to efficiently mitigate anomalous behaviors. Our preliminary experimental results show that our methodology can be used to detect and mitigate known and unknown threats, as well as to protect SHS premises and services.
Keywords: Internet of Things, network security, context awareness, intrusion detection
3412 Analysis of Linguistic Disfluencies in Bilingual Children’s Discourse
Authors: Sheena Christabel Pravin, M. Palanivelan
Abstract:
Speech disfluencies are common in spontaneous speech. The primary purpose of this study was to distinguish linguistic disfluencies from stuttering disfluencies in bilingual Tamil–English (TE) speaking children. The secondary purpose was to determine whether their disfluencies are mediated by native language dominance and/or by an early onset of developmental stuttering in childhood. A detailed study was carried out to identify the prosodic and acoustic features that uniquely represent the disfluent regions of speech. This paper focuses on statistical modeling of repetitions, prolongations, pauses and interjections in a speech corpus of bilingual spontaneous utterances from school-going children in English and Tamil. Two classifiers, the Hidden Markov Model (HMM) and the Multilayer Perceptron (MLP), a class of feed-forward artificial neural network, were compared in the classification of disfluencies. The results of the classifiers document the patterns of disfluency in spontaneous speech samples of school-aged children and distinguish between Children Who Stutter (CWS) and Children with Language Impairment (CLI). The ability of the models in classifying the disfluencies was measured in terms of F-measure, Recall, and Precision.
Keywords: Bilingual, children who stutter, children with language impairment, Hidden Markov Models, multi-layer perceptron, linguistic disfluencies, stuttering disfluencies.
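One of the two classifiers named in the abstract above, the MLP, together with the reported evaluation metrics, can be sketched as follows; the feature values are synthetic placeholders and not the authors' prosodic/acoustic features, so this is only a schematic of the classification and scoring step.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_recall_fscore_support

# Synthetic stand-in for feature vectors labelled CLI (0) vs. CWS (1).
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0.0, 1.0, size=(150, 12)), rng.normal(1.0, 1.0, size=(150, 12))])
y = np.array([0] * 150 + [1] * 150)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

# Feed-forward MLP with two hidden layers, trained by back-propagation.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0).fit(X_tr, y_tr)
precision, recall, f1, _ = precision_recall_fscore_support(y_te, clf.predict(X_te), average="binary")
print(f"Precision={precision:.2f} Recall={recall:.2f} F-measure={f1:.2f}")
```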
3411 Geovisualization of Tourist Activity Travel Patterns Using 3D GIS: An Empirical Study of Tamsui, Taiwan
Authors: Meng-Lung Lin, Chien-Min Chu, Chung-Hung Tsai, Chih-Cheng Chen, Chen-Yuan Chen
Abstract:
The study of tourist activities and the mapping of their routes in space and time has become an important issue in tourism management. Here we represent space-time paths for the tourism industry by visualizing individual tourist activities and the paths followed using a 3D Geographic Information System (GIS). Considerable attention has been devoted to the measurement of accessibility to shopping, eating, walking and other services at the tourist destination. GIS turns out to be a useful tool for studying the spatial behaviors of tourists in the area, and it is especially advantageous for space-time potential path area measures and for the accurate visualization of possible paths through existing city road networks. This study applies space-time concepts to a detailed street network map obtained from Google Maps to measure tourist paths both spatially and temporally. These paths are further determined based on data obtained from map questionnaires regarding the trip activities of 40 individuals. The analysis of the data makes it possible to determine the locations of the more popular paths. The results can be visualized using 3D GIS to show the areas and potential activity opportunities accessible to tourists during their travel time.
Keywords: Tourist activity analysis, space-time path, GIS, geovisualization, activity-travel pattern.
3410 High Securing Cover-File of Hidden Data Using Statistical Technique and AES Encryption Algorithm
Authors: A. A. Zaidan, Anas Majeed, B. B. Zaidan
Abstract:
Nowadays, the rapid development of multimedia and the internet allows for wide distribution of digital media data. It has become much easier to edit, modify and duplicate digital information; digital documents are also easy to copy and distribute, so they face many threats. This is a major security and privacy issue: with the large flood of information and the development of digital formats, it has become necessary to find appropriate protection because of the significance, accuracy and sensitivity of the information. Protection systems can be classified, more specifically, as information hiding, information encryption, or a combination of hiding and encryption that increases information security. The strength of information hiding lies in the non-existence of standard algorithms for hiding secret messages and in the randomness of hiding methods, such as combining several media (covers) with different methods to pass a secret message. In addition, there are no formal methods to be followed to discover hidden data, which makes the task of this research difficult. In this paper, a new information hiding system is presented. The proposed system aims to hide information (a data file) in any executable (EXE) file and to detect the hidden file; an implementation of a steganography system that embeds information in an executable file is described. The system tries to solve the problem of the size of the cover file and to make the result undetectable by anti-virus software. The system includes two main functions: the first is hiding the information in a Portable Executable (EXE) file through four processes (specify the cover file, specify the information file, encrypt the information, and hide the information); the second is extracting the hidden information through three processes (specify the stego file, extract the information, and decrypt the information). The system has achieved its main goals: the size of the cover file and the size of the information are made independent, and the resulting file does not raise any conflict with anti-virus software.
Keywords: Cryptography, Steganography, Portable Executable File.
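The paper's embedding method is not detailed in the abstract; as a minimal sketch of the hide/extract idea only, the snippet below encrypts a payload and appends it after a copy of the cover file, using the cryptography package's Fernet (AES-128 based) as a stand-in for the AES step. The marker and file names are hypothetical, and a trivial append like this does not reproduce the paper's claims about anti-virus compatibility.

```python
from pathlib import Path
from cryptography.fernet import Fernet

MARKER = b"--HIDDEN-PAYLOAD--"  # hypothetical delimiter, not the scheme used in the paper

def hide(cover_exe: str, secret_file: str, stego_exe: str, key: bytes) -> None:
    """Encrypt the secret file and append it after the cover file's original bytes."""
    token = Fernet(key).encrypt(Path(secret_file).read_bytes())
    Path(stego_exe).write_bytes(Path(cover_exe).read_bytes() + MARKER + token)

def extract(stego_exe: str, key: bytes) -> bytes:
    """Locate the marker, take everything after it, and decrypt."""
    data = Path(stego_exe).read_bytes()
    return Fernet(key).decrypt(data.split(MARKER, 1)[1])

key = Fernet.generate_key()
# hide("app.exe", "secret.txt", "app_stego.exe", key)   # hypothetical file names
# print(extract("app_stego.exe", key))
```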
3409 Finding Pareto Optimal Front for the Multi-Mode Time, Cost, Quality Trade-off in Project Scheduling
Authors: H. Iranmanesh, M. R. Skandari, M. Allahverdiloo
Abstract:
Project managers are ultimately responsible for the overall characteristics of a project, i.e. they should deliver the project on time, with minimum cost and with maximum quality. It is vital for any manager to decide on a trade-off between these conflicting objectives, and they would benefit from any scientific decision support tool. Our work tries to determine a set of optimal solutions (rather than a single optimal solution) from which the project manager can select the desirable choice to run the project. In this paper, the project scheduling problem notated as (1,T|cpm,disc,mu|curve:quality,time,cost) is studied. The problem is multi-objective and the purpose is finding the Pareto optimal front of time, cost and quality of a project (curve:quality,time,cost), whose activities belong to a start-to-finish activity relationship network (cpm) and can be done in different possible modes (mu) which are non-continuous or discrete (disc), each mode having a different cost, time and quality. The project is constrained by a non-renewable resource, i.e. money (1,T). Because the problem is NP-hard, a meta-heuristic is developed to solve it, based on a version of the genetic algorithm specially adapted to multi-objective problems, namely FastPGA. A sample project with 30 activities is generated and then solved by the proposed method.
Keywords: FastPGA, Multi-Execution Activity Mode, Pareto Optimality, Project Scheduling, Time-Cost-Quality Trade-Off.
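The Pareto-dominance test that FastPGA (or any multi-objective GA) relies on when ranking candidate schedules can be illustrated with a small helper; the candidate tuples below are invented for illustration and quality is negated so that all three objectives are minimized.

```python
def dominates(a, b):
    # a and b are (time, cost, -quality) tuples; all objectives are minimized.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    # Keep every solution that is not dominated by any other solution.
    return [s for s in solutions if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical candidate schedules: (duration in days, cost in $k, -quality score).
candidates = [(120, 300, -0.80), (100, 340, -0.75), (150, 260, -0.90), (120, 320, -0.70)]
print(pareto_front(candidates))  # the last candidate is dominated and drops out
```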
3408 Distributed Generator Placement for Loss Reduction and Improvement in Reliability
Authors: Priyanka Paliwal, N.P. Patidar
Abstract:
Distributed power generation has gained a lot of attention in recent times due to the constraints associated with conventional power generation and new advancements in DG technologies. The need to operate the power system economically and with optimum levels of reliability has further increased the interest in Distributed Generation. However, it is important to place a Distributed Generator at an optimal location so that the purpose of loss minimization and voltage regulation is duly served on the feeder. This paper investigates the impact of DG unit installation on electric losses, reliability and the voltage profile of distribution networks. Our aim is to find the optimal distributed generation allocation for loss reduction, subject to the constraint of voltage regulation in the distribution network. The system is further analyzed for increased levels of reliability. A Distributed Generator offers the additional advantage of an increase in reliability levels, as suggested by the improvements in various reliability indices such as SAIDI, CAIDI and AENS. Comparative studies are performed and the related results are presented. An analytical technique is used to find the optimal location of the Distributed Generator, and the suggested technique is programmed in MATLAB. The results clearly indicate that DG can reduce the electrical line loss while simultaneously improving the reliability of the system.
Keywords: AENS, CAIDI, Distributed Generation, loss reduction, Reliability, SAIDI.
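The reliability indices cited in the abstract have standard definitions (per IEEE 1366); the short sketch below computes them for made-up feeder data and is only a reference for the terms, not the authors' test system.

```python
# Per load point i: N = customers, lam = failure rate (f/yr), U = annual outage time (h/yr),
# La = average load connected (kW). Values are illustrative only.
load_points = [
    {"N": 200, "lam": 0.20, "U": 1.5, "La": 80.0},
    {"N": 150, "lam": 0.25, "U": 2.0, "La": 60.0},
    {"N": 100, "lam": 0.30, "U": 2.5, "La": 40.0},
]

total_N = sum(p["N"] for p in load_points)
saifi = sum(p["lam"] * p["N"] for p in load_points) / total_N   # interruptions / customer / yr
saidi = sum(p["U"] * p["N"] for p in load_points) / total_N     # hours / customer / yr
caidi = saidi / saifi                                           # hours / interruption
aens = sum(p["La"] * p["U"] for p in load_points) / total_N     # kWh not supplied / customer / yr

print(f"SAIFI={saifi:.3f}  SAIDI={saidi:.3f}  CAIDI={caidi:.3f}  AENS={aens:.3f}")
```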
3407 Development of Software Complex for Digitalization of Enterprise Activities
Authors: G. T. Balakayeva, K. K. Nurlybayeva, M. B. Zhanuzakov
Abstract:
In the proposed work, we have developed software and designed a software architecture for the implementation of enterprise business processes. The proposed software has a multi-level architecture using a domain-specific tool. The developed architecture is a guarantor of the availability, reliability and security of the system and of the implementation of business processes, which are the basis for effective enterprise management. Automating business processes, automating the algorithmic stages of an enterprise, developing optimal algorithms for managing activities, controlling and monitoring, reducing risks and improving results help organizations achieve strategic goals quickly and efficiently. The software described in this article can connect to the corporate information system via two methods: a desktop client and a web client. By calling the application server, the desktop client connects to the information system on the company's work PCs over a local network. Outside the organization, the user can interact with the information system via a web browser, which acts as a web client and connects to a web server. The developed software consists of several integrated modules that share resources and interact with each other through an API. The following technology stack was used during development: Node.js, React.js, MongoDB, Nginx, cloud technologies, and Python.
Keywords: Algorithms, document processing, automation, integrated modules, software architecture, software design, information system.
3406 Secure Low-Bandwidth Video Streaming through Reliable Multipath Propagation in MANETs
Authors: S. Mohideen Badhusha, K. Duraiswamy
Abstract:
Most of the existing video streaming protocols provide video services without considering security aspects in decentralized mobile ad-hoc networks. The security policies adapted to the currently existing non-streaming protocols do not comply with live video streaming protocols, resulting in considerable vulnerability, high bandwidth consumption and unreliability, which cause severe security threats, low bandwidth and error-prone transmission in video streaming applications. Therefore, a synergized methodology is required to reduce vulnerability and bandwidth consumption and to enhance reliability in video streaming applications in MANETs. To ensure the security measures with reduced bandwidth consumption and improved reliability of video streaming applications, a Secure Low-bandwidth Video Streaming through Reliable Multipath Propagation (SLVRMP) protocol architecture has been proposed, incorporating two algorithms, namely the Secure Low-bandwidth Video Streaming Algorithm and the Reliable Secure Multipath Propagation Algorithm, using layered video coding in a non-overlapping zone routing network topology. The performance of the proposed system is compared to that of the other existing secure multipath protocols Sec-MR and SPREAD using NS 2.34, and the simulation results show that the performance of the proposed system is considerably improved.
Keywords: Bandwidth consumption, layered video coding, multipath propagation, reliability, security threats, video streaming applications, vulnerability.
3405 Integrating AI Visualization Tools to Enhance Student Engagement and Understanding in AI Education
Authors: Yong W. Foo, Lai M. Tang
Abstract:
Artificial Intelligence (AI), particularly the usage of deep neural networks for hierarchical representations from data, has found numerous complex applications across various domains, including computer vision, robotics, autonomous vehicles, and other scientific fields. However, their inherent “black box” nature can sometimes make it challenging for early researchers or school students of various levels to comprehend and trust the results they produce. Consequently, there has been a growing demand for reliable visualization tools in engineering and science education to help learners understand, trust, and explain a deep learning network. This has led to a notable emphasis on the visualization of AI in the research community in recent years. AI visualization tools are increasingly being adopted to significantly improve the comprehension of complex topics in deep learning. This paper presents an approach to empower students to actively explore the inner workings of deep neural networks by integrating the student-centered learning approach of flipped classroom models with the investigative capabilities of AI visualization tools, namely, the TensorFlow Playground, the Local Interpretable Model-agnostic Explanations (LIME), and the SHapley Additive exPlanations (SHAP), for delivering an AI education curriculum. Integrating these two factors is crucial for fostering ownership, responsibility, and critical thinking skills in the age of AI.
Keywords: Deep Learning, Explainable AI, AI Visualization, Representation Learning.
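Of the three tools named in the abstract above, SHAP lends itself to a compact illustration; the sketch below (our assumption of a typical classroom exercise, using the shap package's unified Explainer API on a placeholder scikit-learn model and dataset) shows how learners can inspect which features drive a model's predictions.

```python
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Train a small model on a standard dataset purely to have something to explain.
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# The unified Explainer selects an appropriate algorithm (a tree explainer here).
explainer = shap.Explainer(model, X)
shap_values = explainer(X.iloc[:100])

# Summary plot showing which features push predictions up or down for the explained samples.
shap.plots.beeswarm(shap_values)
```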
3404 Assessment of Wastewater Reuse Potential for an Enamel Coating Industry
Authors: Guclu Insel, Efe Gumuslu, Gulten Yuksek, Nilay Sayi Ucar, Emine Ubay Cokgor, Tugba Olmez Hanci, Didem Okutman Tas, Fatos Germirli Babuna, Derya Firat Ertem, Okmen Yildirim, Ozge Erturan, Betul Kirci
Abstract:
In order to eliminate water scarcity problems, effective precautions must be taken. Growing competition for water is increasingly forcing facilities to tackle their own water scarcity problems, and at this point the application of wastewater reclamation and reuse yields considerable economic advantages. In this study, an enamel coating facility, which is one of the facilities with high water consumption, is evaluated in terms of its wastewater reuse potential. Wastewater reclamation and reuse can be defined as one of the best available techniques for this sector. Hence, process and pollution profiles, together with a detailed characterization of segregated wastewater sources, are appraised in order to find the recoverable effluent streams arising from enamel coating operations. Daily, 170 m3 of process water is required and 160 m3 of wastewater is generated. The segregated streams generated by two enamel coating processes are characterized in terms of conventional parameters. Relatively clean segregated wastewater streams (reusable wastewaters) are collected separately, and experimental treatability studies are conducted on them. The results show that the reusable wastewater fraction amounts to approximately 110 m3/day, which accounts for 68% of the total wastewater. The need for treatment applicable to the reusable wastewaters is determined by considering the water quality requirements of various operations and the characterization of the reusable wastewater streams. Ultrafiltration (UF), nanofiltration (NF) and reverse osmosis (RO) membranes are subsequently applied to the reusable effluent fraction. Adequate organic matter removal is not obtained with the mentioned treatment sequence.
Keywords: enamel coating, membrane, reuse, wastewater.
3403 Rheological Properties of Dough and Sensory Quality of Crackers with Dietary Fibers
Authors: Ljubica Dokić, Ivana Nikolić, Dragana Šoronja–Simović, Zita Šereš, Biljana Pajin, Nils Juul, Nikola Maravić
Abstract:
The possibility of applying dietary fibers in the production of crackers was investigated in this work, as well as their influence on the rheological and textural properties of the cracker dough and on the sensory properties of the obtained crackers. Three different dietary fibers, oat, potato and pea fibers, replaced 10% of the wheat flour. A long fermentation process and the baking test method were used for cracker production. The changes in the cracker dough were observed by rheological methods determining the viscoelastic dough properties and by textural measurements. The sensory quality of the obtained crackers was described using quantitative descriptive analysis (QDA) by trained members of a descriptive panel. An additional analysis of the cracker surface was performed by videometer. Based on the rheological determination, the viscoelastic properties of the cracker dough were reduced by the application of dietary fibers. Handling of the dough with 10% potato fiber was not possible, so the recipe was modified by increasing the water content to 35%. The dough compliance under constant stress decreased for the samples with dietary fibers, due to a more rigid and stiffer dough consistency compared to the control sample. The hardness of the dough for these samples also increased, and the dough extensibility decreased. The sensory properties of the final products, the crackers, were reduced compared to the control sample. The application of dietary fibers mostly affected the hardness, structure and crispness of the crackers. The observed crackers received low marks for flavor and taste, due to the influence of the fibers' specific aroma. The sample with 10% potato fiber and increased water content was the most adaptable to the applied stresses and to the production process. This sample was also close to the control sample without dietary fibers in the evaluation of sensory properties and in the results of the videometer method.
Keywords: Crackers, dietary fibers, rheology, sensory properties.
3402 Success Factors of Large Scale ERP Implementation in Thailand
Authors: Rotchanakitumnuai, Siriluck
Abstract:
The objective of the study is to examine the determinants of success in large-scale ERP implementation. The results indicate that large-scale ERP implementation success consists of eight factors: project management competence, knowledge sharing, ERP system quality, understanding, user involvement, business process re-engineering, top management support, and organization readiness.
Keywords: large scale ERP, implementation success factors, Thailand.
3401 Event Information Extraction System (EIEE): FSM vs HMM
Authors: Shaukat Wasi, Zubair A. Shaikh, Sajid Qasmi, Hussain Sachwani, Rehman Lalani, Aamir Chagani
Abstract:
Automatic extraction of event information from social text streams (emails, social network sites, blogs, etc.) is a vital requirement for many applications, such as event planning and management systems and security applications. The key information components needed from event-related text are the event title, location, participants, date and time. Emails have unique distinctions over other social text streams in terms of layout, format and conversation style, and they are the most commonly used communication channel for broadcasting and planning events; therefore, we have chosen emails as our dataset. In our work, we have employed two statistical NLP methods, Finite State Machines (FSM) and the Hidden Markov Model (HMM), for the extraction of event-related contextual information. An application has been developed providing a comparison of the two methods on the event extraction task. It comprises two modules, one for each method, and works for both bulk and direct user input. The results are evaluated using Precision, Recall and F-Score. Experiments show that both methods produce high performance and accuracy; however, HMM performed well on title extraction while FSM proved to be better for venue, date, and time.
Keywords: Emails, Event Extraction, Event Detection, Finite State Machines, Hidden Markov Model.
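The FSM side of the comparison can be illustrated in miniature: regular expressions are equivalent to finite-state machines in the formal sense, so the toy patterns below extract date, time and venue fragments from an email body. The patterns and the sample email are our own placeholders, not the authors' grammar or dataset.

```python
import re

# Toy patterns; the actual system's states and features are not published in the abstract.
DATE = re.compile(r"\b(?:Mon|Tues|Wednes|Thurs|Fri|Satur|Sun)day,?\s+"
                  r"(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)[a-z]*\.?\s+\d{1,2}\b",
                  re.IGNORECASE)
TIME = re.compile(r"\b\d{1,2}(?::\d{2})?\s?(?:am|pm)\b", re.IGNORECASE)
VENUE = re.compile(r"(?:at|in)\s+(?:the\s+)?([A-Z][\w&' ]+(?:Hall|Room|Auditorium|Center|Hotel))")

email = ("Hi all, the project kickoff is on Friday, March 14 at 10:30 am "
         "in the Innovation Center. Please be on time.")

print("Date :", DATE.findall(email))
print("Time :", TIME.findall(email))
print("Venue:", VENUE.findall(email))
```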
3400 Utilizing Biological Models to Determine the Recruitment of the Irish Republican Army
Authors: Erika Ann Schaub, Christian J Darken
Abstract:
Sociological models (e.g., social network analysis, small-group dynamic and gang models) have historically been used to predict the behavior of terrorist groups. However, they may not be the most appropriate method for understanding the behavior of terrorist organizations because the models were not initially intended to incorporate violent behavior of its subjects. Rather, models that incorporate life and death competition between subjects, i.e., models utilized by scientists to examine the behavior of wildlife populations, may provide a more accurate analysis. This paper suggests the use of biological models to attain a more robust method for understanding the behavior of terrorist organizations as compared to traditional methods. This study also describes how a biological population model incorporating predator-prey behavior factors can predict terrorist organizational recruitment behavior for the purpose of understanding the factors that govern the growth and decline of terrorist organizations. The Lotka-Volterra, a biological model that is based on a predator-prey relationship, is applied to a highly suggestive case study, that of the Irish Republican Army. This case study illuminates how a biological model can be utilized to understand the actions of a terrorist organization.
Keywords: Biological Models, Lotka-Volterra Predator-Prey Model, Terrorist Organizational Behavior, Terrorist Recruitment.
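For reference, the Lotka-Volterra model named in the abstract above is the standard pair of coupled differential equations below (textbook form); how the authors map the prey and predator populations onto recruitment-related quantities is their own contribution and is only summarised qualitatively here.

```latex
\frac{dx}{dt} = \alpha x - \beta x y, \qquad
\frac{dy}{dt} = \delta x y - \gamma y
```

Here x is the prey population and y the predator population, with α the prey growth rate, β the predation rate, γ the predator death rate, and δ the rate at which predation translates into predator growth.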
3399 Reading against the Grain: Transcodifying Stimulus Meaning
Authors: Aba-Carina Pârlog
Abstract:
The paper shows that on transferring sense from the SL to the TL, the translator’s reading against the grain determines the creation of a faulty pattern of rendering the original meaning in the receiving culture, which reflects the use of misleading transformative codes. In this case, the translator is a writer per se who decides what goes in and out of the book, how the style is to be ciphered and what elements of ideology are to be highlighted. The paper also proves that figurative language must not be flattened for the sake of clarity or naturalness. The missing figurative elements make the translated text less interesting, less challenging and less vivid, which reflects poorly on the writer. There is a close connection between style and the writer’s person. If the writer’s style is very much altered in a translation, the translation is useless, as the original writer and his or her imaginative world can no longer be discovered. The purpose of the paper is to prove that adaptation is a dangerous tool which leads to variants that sometimes reflect the original less than the reader would wish. It contradicts the very essence of the process of translation, which is that of making an original work available in a foreign language. If the adaptive transformative codes are so flexible that they encourage the translator to repeatedly leave out parts of the original work, then a subversive pattern emerges which changes the entire book. In conclusion, as a result of using adaptation, manipulative or subversive effects are created in the translated work. This is generally achieved by adding new words or connotations, creating new figures of speech or using explicitations. The additional meanings of the original work are neglected and the translator creates new meanings, implications, emphases and contexts. Again, s/he turns into a new author who enjoys the freedom of expressing his or her own ideas without the constraints of the original text. Reading against the grain is unadvisable during the process of translation; consequently, following personal common sense becomes essential in the field of translation, as everywhere else, so that translation does not become a source of fantasy.
Keywords: Speculative aesthetics, substance of expression, transformative code, translation.
3398 JaCoText: A Pretrained Model for Java Code-Text Generation
Authors: Jessica Lòpez Espejel, Mahaman Sanoussi Yahaya Alassan, Walid Dahhane, El Hassane Ettifouri
Abstract:
Pretrained transformer-based models have shown high performance in natural language generation tasks. However, a new wave of interest has surged for automatic programming language generation, a task that consists of translating natural language instructions into programming code. Despite the fact that well-known pretrained models for language generation have achieved good performance in learning programming languages, effort is still needed in automatic code generation. In this paper, we introduce JaCoText, a model based on the Transformer neural network, which aims to generate Java source code from natural language text. JaCoText leverages the advantages of both natural language and code generation models. More specifically, we study some findings from the state of the art and use them to (1) initialize our model from powerful pretrained models, (2) explore additional pretraining on our Java dataset, (3) carry out experiments combining unimodal and bimodal data in the training, and (4) scale the input and output length during the fine-tuning of the model. Experiments conducted on the CONCODE dataset show that JaCoText achieves new state-of-the-art results.
Keywords: Java code generation, Natural Language Processing, Sequence-to-sequence Models, Transformers Neural Networks.
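The text-to-Java setup described above follows the usual Hugging Face seq2seq pattern; the sketch below shows that pattern with a placeholder checkpoint name (the JaCoText weights themselves are not referenced here, so the model name is an assumption, not an official release).

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder checkpoint name; substitute the actual pretrained JaCoText (or CodeT5-style) weights.
MODEL_NAME = "your-org/jacotext-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

nl_instruction = "check if a string is a palindrome and return a boolean"
inputs = tokenizer(nl_instruction, return_tensors="pt")

# Beam-search decoding of a Java snippet from the natural-language instruction.
output_ids = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```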
3397 Evolutionary Multi-objective Optimization for Positioning of Residential Houses
Authors: Ayman El Ansary, Mohamed Shalaby
Abstract:
The current study describes a multi-objective optimization technique for positioning of houses in a residential neighborhood. The main task is the placement of residential houses in a favorable configuration satisfying a number of objectives. Solving the house layout problem is a challenging task. It requires an iterative approach to satisfy design requirements (e.g. energy efficiency, skyview, daylight, roads network, visual privacy, and clear access to favorite views). These design requirements vary from one project to another based on location and client preferences. In the Gulf region, the most important socio-cultural factor is the visual privacy in indoor space. Hence, most of the residential houses in this region are surrounded by high fences to provide privacy, which has a direct impact on other requirements (e.g. daylight and direction to favorite views). This investigation introduces a novel technique to optimally locate and orient residential buildings to satisfy a set of design requirements. The developed technique explores the search space for possible solutions. This study considers two dimensional house planning problems. However, it can be extended to solve three dimensional cases.
Keywords: Evolutionary optimization, Houses planning, Urban modeling, Daylight, Visual Privacy, Residential compounds.
3396 Microseismicity of the Tehran Region Based on Three Seismic Networks
Authors: Jamileh Vasheghani Farahani
Abstract:
The main purpose of this research is to show the currently active faults and the active tectonics of the area using three seismic networks in the Tehran region: (1) the Tehran Disaster Mitigation and Management Organization (TDMMO), (2) the Broadband Iranian National Seismic Network Center (BIN), and (3) the Iranian Seismological Center (IRSC). In this study, we analyzed microearthquakes that occurred in Tehran city and its surroundings using the Tehran networks from 1996 to 2015, and we found some active faults and trends in the region. There is a 200-year history of historical earthquakes in Tehran. Historical and instrumental seismicity show that the east of Tehran is more active than the west. The Mosha fault in the north of Tehran is one of the active faults of the central Alborz. Other major faults in the region are the Kahrizak, Eyvanakey, Parchin and North Tehran faults. An important seismicity region is the intersection of the Mosha and North Tehran fault systems (Kalan village in Lavasan), which shows a cluster of microearthquakes. According to the historical and microseismic events analyzed in this research, there is a seismic gap in the southeast of Tehran. An empirical relationship is used to assess Mmax based on the rupture length. There is a probability of occurrence of a strong earthquake of magnitude 7.0 to 7.5 in the region, based on the assessed capability of major faults such as the Parchin and Eyvanekey faults and on historical earthquakes.
Keywords: Iran, major faults, microseismicity, Tehran.
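The abstract does not name the empirical relationship used for Mmax; one widely cited option, given here only as an illustration, is the Wells and Coppersmith (1994) regression of moment magnitude on surface rupture length SRL (in km):

```latex
M_w = 5.08 + 1.16\,\log_{10}(\mathrm{SRL})
```

For example, a rupture length on the order of 100 km gives M_w of roughly 5.08 + 1.16 x 2, i.e. about 7.4, which is consistent with the 7.0 to 7.5 range quoted above.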
3395 ANN Based Currency Recognition System using Compressed Gray Scale and Application for Sri Lankan Currency Notes - SLCRec
Authors: D. A. K. S. Gunaratna, N. D. Kodikara, H. L. Premaratne
Abstract:
Automatic currency note recognition invariably depends on the currency note characteristics of a particular country, and the extraction of features directly affects the recognition ability. Sri Lanka has not been involved in any kind of research or implementation of this kind. The proposed system, “SLCRec”, offers a solution focused on minimizing the false rejection of notes. Sri Lankan currency notes undergo severe changes in image quality during usage. Hence, a special linear transformation function is adopted to wipe out noise patterns from the backgrounds without affecting the notes' characteristic images, and to make the images of interest reappear. The transformation maps the original gray-scale range into a smaller range of 0 to 125. Applying edge detection after the transformation provides better robustness to noise and a fair representation of edges for new as well as old, damaged notes. A three-layer back-propagation neural network is fed with the number of edges detected in row order of the notes, and classification is accepted into four classes of interest, namely the 100, 500, 1000 and 2000 rupee notes. The experiments showed good classification results and proved that the proposed methodology is capable of separating the classes properly under varying image conditions.
Keywords: Artificial intelligence, linear transformation, pattern recognition.
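A minimal sketch of the preprocessing described above, assuming OpenCV and a Canny detector as the edge-detection step (the paper's exact detector and thresholds are not given, and the file name is hypothetical): compress the gray-scale range to 0-125, detect edges, and count edge pixels per row as the feature vector for the network.

```python
import cv2
import numpy as np

def row_edge_counts(image_path: str) -> np.ndarray:
    # Load the note image and convert it to gray scale.
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)

    # Linear transformation compressing the 0-255 range into 0-125 to suppress background noise.
    compressed = np.round(gray.astype(np.float32) * (125.0 / 255.0)).astype(np.uint8)

    # Edge detection on the compressed image (Canny used here as an illustrative detector).
    edges = cv2.Canny(compressed, 30, 90)

    # Number of edge pixels per row, used as the feature vector fed to the neural network.
    return (edges > 0).sum(axis=1)

# features = row_edge_counts("note_100.png")  # hypothetical file name
```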
3394 TOSOM: A Topic-Oriented Self-Organizing Map for Text Organization
Authors: Hsin-Chang Yang, Chung-Hong Lee, Kuo-Lung Ke
Abstract:
The self-organizing map (SOM) is a well-known neural network model with a wide range of applications. The main characteristics of the SOM are two-fold, namely dimension reduction and topology preservation. Using a SOM, a high-dimensional data space is mapped to a low-dimensional space while the topological relations among the data are preserved. With such characteristics, the SOM is usually applied to data clustering and visualization tasks. However, the SOM has the main disadvantage that the number and structure of neurons must be known prior to training, which is difficult to determine. Several schemes have been proposed to tackle this deficiency, for example the growing/expandable SOM, the hierarchical SOM, and the growing hierarchical SOM. These schemes can dynamically expand the map, and even generate hierarchical maps, during training, and encouraging results have been reported. Basically, these schemes adapt the size and structure of the map according to the distribution of the training data; that is, they are data-driven or data-oriented SOM schemes. In this work, a topic-oriented SOM scheme which is suitable for document clustering and organization is developed. The proposed SOM automatically adapts the number as well as the structure of the map according to the identified topics. Unlike other data-oriented SOMs, our approach expands the map and generates the hierarchies according to both the topics and the characteristics of the neurons. Preliminary experiments give promising results and demonstrate the plausibility of the method.
Keywords: Self-organizing map, topic identification, learning algorithm, text clustering.
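The abstract above builds on the classical fixed-size SOM; as background only (not the TOSOM algorithm itself), the sketch below shows the standard training loop: find the best-matching unit, then pull neighbouring weights toward the input with a decaying learning rate and neighbourhood radius. The document vectors are random placeholders.

```python
import numpy as np

def train_som(data, rows=8, cols=8, iters=2000, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal rectangular SOM: returns a trained weight grid of shape (rows, cols, dim)."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.random((rows, cols, dim))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

    for t in range(iters):
        x = data[rng.integers(len(data))]
        # Best-matching unit: the neuron whose weight vector is closest to the input.
        bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(axis=2)), (rows, cols))
        # Learning rate and neighbourhood radius decay over time.
        lr = lr0 * np.exp(-t / iters)
        sigma = sigma0 * np.exp(-t / iters)
        dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=2)
        h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
        # Pull neighbouring weights toward the input, weighted by the neighbourhood function.
        weights += lr * h * (x - weights)
    return weights

# docs = np.random.default_rng(1).random((500, 20))   # stand-in for document term vectors
# som = train_som(docs)
```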
3393 Feasibility Investigation of Near Infrared Spectrometry for Particle Size Estimation of Nano Structures
Authors: A. Bagheri Garmarudi, M. Khanmohammadi, N. Khoddami, K. Shabani
Abstract:
Determination of nanoparticle size is important since particle size exerts a significant effect on various properties of nanomaterials. Accordingly, proposing non-destructive, accurate and rapid techniques for this aim is of high interest. There are conventional techniques to investigate the morphology and grain size of nanoparticles, such as scanning electron microscopy (SEM), atomic force microscopy (AFM) and X-ray diffractometry (XRD). Vibrational spectroscopy is utilized to characterize different compounds and has been applied for the evaluation of average particle size based on the relationship between particle size and near infrared spectra [1,4], but it has never been applied in the quantitative morphological analysis of nanomaterials. So far, the potential application of near-infrared (NIR) spectroscopy, with its ability to rapidly analyze powdered materials with minimal sample preparation, has been suggested for the particle size determination of powdered pharmaceuticals. The relationship between particle size and diffuse reflectance (DR) spectra in the near infrared region has been applied to introduce a method for the estimation of particle size. A back-propagation artificial neural network (BP-ANN), as a nonlinear model, was applied to estimate average particle size based on near infrared diffuse reflectance spectra. Thirty-five different nano-TiO2 samples with different particle sizes were analyzed by DR-FTNIR spectrometry and the obtained data were processed by the BP-ANN.
Keywords: near infrared, particle size, chemometrics, neural network, nano structure.
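A schematic of the BP-ANN calibration step described above, with synthetic spectra standing in for the DR-FTNIR measurements of the 35 TiO2 samples (the network size, target range and data are all placeholders, not the authors' setup):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Synthetic stand-in: 35 "spectra" of 200 wavelength points with particle size (nm) as the target.
rng = np.random.default_rng(7)
sizes = rng.uniform(10, 100, size=35)                       # hypothetical particle sizes
spectra = np.outer(sizes, np.linspace(0.5, 1.5, 200)) + rng.normal(0, 2.0, size=(35, 200))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, sizes, test_size=0.25, random_state=0)

# Back-propagation-trained feed-forward network used as a nonlinear calibration model.
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0).fit(X_tr, y_tr)
print("MAE (nm):", mean_absolute_error(y_te, model.predict(X_te)))
```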
3392 3D Modeling Approach for Cultural Heritage Structures: The Case of Virgin of Loreto Chapel in Cusco, Peru
Authors: Rony Reátegui, Cesar Chácara, Benjamin Castañeda, Rafael Aguilar
Abstract:
Nowadays, Heritage Building Information Modeling (HBIM) is considered an efficient tool to represent and manage information of Cultural Heritage (CH). The basis of this tool relies on a 3D model generally obtained from a Cloud-to-BIM procedure. There are different methods to create an HBIM model that goes from manual modeling based on the point cloud to the automatic detection of shapes and the creation of objects. The selection of these methods depends on the desired Level of Development (LOD), Level of Information (LOI), Grade of Generation (GOG) as well as on the availability of commercial software. This paper presents the 3D modeling of a stone masonry chapel using Recap Pro, Revit and Dynamo interface following a three-step methodology. The first step consists of the manual modeling of simple structural (e.g., regular walls, columns, floors, wall openings, etc.) and architectural (e.g., cornices, moldings and other minor details) elements using the point cloud as reference. Then, Dynamo is used for generative modeling of complex structural elements such as vaults, infills and domes. Finally, semantic information (e.g., materials, typology, state of conservation, etc.) and pathologies are added within the HBIM model as text parameters and generic models’ families respectively. The application of this methodology allows the documentation of CH following a relatively simple to apply process that ensures adequate LOD, LOI and GOG levels. In addition, the easy implementation of the method as well as the fact of using only one BIM software with its respective plugin for the scan-to-BIM modeling process means that this methodology can be adopted by a larger number of users with intermediate knowledge and limited resources, since the BIM software used has a free student license.
Keywords: Cloud-to-BIM, cultural heritage, generative modeling, HBIM, parametric modeling, Revit.
3391 Oil Prices Impact on Energy Policy of Kazakhstan
Authors: K. Gabdullin, Y. Bek Ali, N. Aldabek
Abstract:
This paper explores the impact of oil price changes on the energy policy of Kazakhstan in 2001-2009. It examines the role of oil income in economic development, the process of diversification of Kazakhstan's internal and external energy policy, and the changes in oil law toward subsoil users.
Keywords: diversification, internal energy policy, external energy policy, high oil prices, modernization
3389 Integrated Waste-to-Energy Approach: An Overview
Authors: Tsietsi J. Pilusa, Tumisang G. Seodigeng
Abstract:
This study evaluates the benefits of advanced waste management practices in unlocking waste-to-energy opportunities within the solid waste industry. The key drivers of sustainable waste management practices, specifically with respect to packaging waste-to-energy technology options, are discussed. The success of a waste-to-energy system depends significantly on the appropriateness of the available technologies, including those that are well established as well as those that are less so. There are hard and soft interventions to be considered when packaging an integrated waste treatment solution. Technology compatibility with variation in feedstock (waste) quality and quantities remains a key factor; it influences the technology's reliability in terms of production efficiency and product consistency, which in turn drives the supply and demand network. Waste treatment technologies rely on waste material as feedstock, and the feedstock varies in quality and quantity depending on several factors; hence, the technology may fail as a result. It is critical to design an advanced waste treatment technology in an integrated approach to minimize the possibility of technology failure due to unpredictable feedstock quality, quantities, conversion efficiencies, and inconsistent product yield or quality. An integrated waste-to-energy approach offers a secure system design that considers sustainable waste management practices.
Keywords: Emerging markets, evaluation tool, interventions, waste treatment technologies.