Search results for: object oriented analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28802


27932 Entrepreneurship Education Revised: Merging a Theory-Based and Action-Based Framework for Entrepreneurial Narratives' Impact as an Awareness-Raising Teaching Tool

Authors: Katharina Fellnhofer, Kaisu Puumalainen

Abstract:

Despite the current worldwide growth of interest in entrepreneurship education (EE), little attention has been paid to innovative web-based approaches such as the narrative approach, in which the individual stories of entrepreneurs are told via multimedia to demonstrate their impact on individuals' attitudes towards entrepreneurship. In addition, there is no consensus in this research discipline regarding the effective content of teaching materials and tools. A qualitative, hypothesis-generating research contribution is therefore required, one that draws new insights from published work in the EE field to serve future research on multimedia entrepreneurial narratives. Against this background, our effort focuses on finding support for the following introductory statement: multimedia success and failure stories of real entrepreneurs show potential to change perceptions of entrepreneurship in a positive way. This conceptual paper introduces the underlying background for the research framework. By triangulating multiple theories, we lay the foundation for multimedia-based entrepreneurial narratives, applying a learning-through-multimedia-real-entrepreneurial-narratives pedagogical tool to facilitate entrepreneurship. Our effort helps to demystify how value-oriented entrepreneurs telling their stories via multimedia can simultaneously enhance EE. The paper thereby builds new bridges between well-cited theoretical constructs to form a robust research framework.
Overall, the intended contribution seeks to direct future research towards currently under-researched issues in the EE sphere, which are considered essential not only to academia but also to business and society, with future job-providing, growth-oriented entrepreneurs in mind. The authors would like to thank the Austrian Science Fund FWF: [J3740 – G27].

Keywords: entrepreneurship education, entrepreneurial attitudes and perceptions, entrepreneurial intention, entrepreneurial narratives

Procedia PDF Downloads 245
27931 Education of Purchasing Professionals in Austria: Competence Based View

Authors: Volker Koch

Abstract:

This paper deals with the education of purchasing professionals in Austria. Equivalent, measurable criteria for this education are collected in order to enable a comparison, which in turn reveals the underlying problem. To make this comparison possible, methodologies such as the KODE competence atlas and matrix-based presentations are used. The results show the content taught and whether there are similarities or interesting differences among current Austrian purchasing curricula. The competencies that purchasing professionals learn are also illustrated in the results.

Keywords: competencies, education, purchasing professional, technological-oriented

Procedia PDF Downloads 287
27930 Effective Affordable Housing Finance in Developing Economies: An Integration of Demand and Supply Solutions

Authors: Timothy Akinwande, Eddie Hui, Karien Dekker

Abstract:

Housing the urban poor remains a persistent challenge despite many years of research attention. It is therefore pertinent to investigate the challenges of affordable housing provision with novel approaches. For innovative solutions to affordable housing constraints, housing solutions must be examined against the key elements of the housing supply value chain (HSVC): housing finance, housing construction, and land acquisition. This study takes a pragmatic approach, examining affordable housing solutions from both demand and supply perspectives in order to arrive at consolidated solutions from a bilateral viewpoint. It thoroughly examined the informal housing finance strategies of the urban poor and investigated expert opinion on affordable housing finance solutions. The research questions were: (1) What mutual ground exists between the informal housing finance solutions of the urban poor and housing experts' solutions to affordable housing finance constraints in developing economies? (2) What are effective approaches to affordable housing finance in developing economies from an integrated demand-supply perspective? Semi-structured interviews were conducted with 40 informal settlers in the five largest slums of Lagos, Nigeria, for demand-oriented solutions, while a focus group discussion and in-depth interviews were conducted with 12 housing experts in Nigeria for supply-oriented solutions. Following rigorous thematic, content, and descriptive analyses of the data using NVivo and Excel, the findings identified mutual solutions from both demand and supply standpoints that can be consolidated into more effective affordable housing finance solutions in Nigeria. Deliberate finance models that recognise and include the financial realities of the urban poor were found to be the most significant supply-side housing finance solution, representing 25.4% of all expert responses.
The findings also show that 100% of the sampled urban poor engage in vocations that earn them little irregular income or no income at all, limiting their housing finance capacity and creditworthiness. The survey revealed that the urban poor take part in community savings schemes and use microfinance institutions within the informal settlements to tackle their housing finance predicaments. These informal finance models of the urban poor reveal common ground between demand and supply solutions for affordable housing financing. An effective affordable housing approach would be to modify, institutionalise, and incorporate the informal finance strategies of the urban poor into deliberate government policies. This consolidation of solutions from demand and supply perspectives can eliminate the persistent mismatch between affordable housing demand and supply. The study provides insights into mutual housing solutions from demand and supply perspectives, and its findings inform effective approaches to affordable housing provision in developing countries. It is novel in consolidating affordable housing solutions from demand and supply viewpoints, especially in relation to housing finance as a key component of the HSVC. The framework for effective affordable housing finance in developing economies generated in this study from a consolidated viewpoint is significant for the achievement of the Sustainable Development Goals, especially Goal 11 on sustainable, resilient, and inclusive cities. The findings are vital for future housing studies.

Keywords: affordable housing, affordable housing finance, developing economies, effective affordable housing, housing policy, urban poor, sustainable development goal, sustainable affordable housing

Procedia PDF Downloads 62
27929 Optical Vortex in Asymmetric Arcs of Rotating Intensity

Authors: Mona Mihailescu, Rebeca Tudor, Irina A. Paun, Cristian Kusko, Eugen I. Scarlat, Mihai Kusko

Abstract:

Specific intensity distributions in laser beams are required in many fields: optical communications, material processing, microscopy, and optical tweezers. In optical communications, the information embedded in specific beams and the superposition of multiple beams can be used to increase the capacity of communication channels, employing spatial modulation as an additional degree of freedom besides the already available polarization and wavelength multiplexing. In this regard, optical vortices are of interest due to their potential to carry independent data, which can be multiplexed at the transmitter and demultiplexed at the receiver. Combinations of such beams have also been studied in the literature: (1) axial or perpendicular superpositions of multiple optical vortices, or (2) combinations with other beam types such as Bessel or Airy beams. Optical vortices, characterized by a stationary ring-shaped intensity and a rotating phase, are achieved using computer generated holograms (CGHs) obtained by simulating the interference between a tilted plane wave and a wave passing through a helical phase object. Here, we propose a method to combine information through the reunion of two CGHs. One is obtained using a helical phase distribution, characterized by its topological charge, m. The other is obtained using a conical phase distribution, characterized by its radial factor, r0. Each CGH is obtained using a plane wave with a different tilt: km for the CGH generated from the helical phase object and kr for that generated from the conical phase object. These reunions of two CGHs are calculated as phase optical elements, addressed on the liquid crystal display of a spatial light modulator, to optically process the incident beam so that the diffracted intensity pattern can be investigated in the far field. For the parallel reunion of two CGHs and high values of the ratio between km and kr, the bright ring in the first diffraction order, specific to optical vortices, is changed into an asymmetric intensity pattern: a number of circle arcs.
The two diffraction orders (+1 and -1) are asymmetrical relative to each other. In different planes along the optical axis, this asymmetric intensity pattern is observed to rotate around its centre: anticlockwise in the +1 diffraction order and clockwise in the -1 diffraction order. The relation between m and r0 controls the diameter of the circle arcs, and the ratio between km and kr controls the number of arcs. For the perpendicular reunion of the two CGHs and low values of the ratio between km and kr, the optical vortices are multiplied and focused in different planes, depending on the radial parameter. The first diffraction order contains information about both phase objects. It is incident on the phase masks placed at the receiver, computed using the opposite values of the topological charge or of the radial parameter and displayed successively. Overall, the proposed method is characterized in terms of its constructive parameters and offers the possibility of combining different types of beams for use in robust optical communications.
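The phase objects above can be illustrated with a minimal sketch. This is not code from the paper; the fringe formula is the standard recording of interference between a tilted plane wave and an object wave, and the averaging used for the "reunion" of two CGHs is an illustrative assumption.

```python
import math

def helical_phase(x, y, m):
    # Phase of a helical (vortex) object with topological charge m.
    return m * math.atan2(y, x)

def conical_phase(x, y, r0):
    # Phase of a conical (axicon-like) object with radial factor r0.
    return -2.0 * math.pi * math.hypot(x, y) / r0

def cgh_value(x, y, obj_phase, k_tilt):
    # Fringe intensity of the interference between a plane wave tilted
    # along x (tilt k_tilt) and the object wave; normalized to [0, 1].
    return 0.5 * (1.0 + math.cos(k_tilt * x - obj_phase(x, y)))

def reunion_cgh(x, y, m, r0, km, kr):
    # Parallel "reunion" of the two holograms: one CGH from the helical
    # object (tilt km), one from the conical object (tilt kr), averaged.
    h = cgh_value(x, y, lambda a, b: helical_phase(a, b, m), km)
    c = cgh_value(x, y, lambda a, b: conical_phase(a, b, r0), kr)
    return 0.5 * (h + c)
```

Sampling `reunion_cgh` on a grid and quantizing it would give the pattern addressed on the spatial light modulator; the helical phase winds by 2πm around the origin, which is what produces the ring (or arc) structure in the first diffraction order.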

Keywords: asymmetrical diffraction orders, computer generated holograms, conical phase distribution, optical vortices, spatial light modulator

Procedia PDF Downloads 302
27928 Hand Gesture Detection via EmguCV Canny Pruning

Authors: N. N. Mosola, S. J. Molete, L. S. Masoebe, M. Letsae

Abstract:

Hand gesture recognition is a technique used to locate, detect, and recognize hand gestures. Detection and recognition are concepts from Artificial Intelligence (AI), applicable in Human-Computer Interaction (HCI), expert systems (ES), and other areas. Hand gesture recognition can be used in sign language interpretation. Sign language is a visual communication tool used mostly by deaf communities and people with speech disorders, who face communication barriers when interacting with others. This research aims to build a hand recognition system for interpretation between Lesotho's Sesotho and English, helping to bridge the communication problems encountered by these communities. The system comprises several processing modules: a hand detection engine, an image processing engine, feature extraction, and sign recognition. Detection is the process of identifying an object. The proposed system uses Canny-pruned Haar and Haar cascade detection algorithms. Canny pruning implements Canny edge detection, an optimal image processing algorithm used to detect the edges of an object. The system also employs a skin detection algorithm, which performs background subtraction and computes the convex hull and centroid to assist the detection process. Recognition is the process of gesture classification; template matching classifies each hand gesture in real time. The system was tested in various experiments. The results show that time, distance, and lighting are factors that affect the rate of detection and, ultimately, recognition. The detection rate is directly proportional to the distance of the hand from the camera. Different lighting conditions were considered: the greater the light intensity, the faster the detection rate.
Based on the results obtained in this research, the applied methodologies are efficient and offer a plausible path towards a lightweight, inexpensive system for sign language interpretation.
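The skin detection and centroid steps can be sketched in a few lines. The abstract does not give the thresholds used, so the RGB rule below is a classic stand-in from the skin-detection literature (Peer et al.), not the paper's actual rule, and the function names are illustrative.

```python
def is_skin_rgb(r, g, b):
    # Classic uniform-daylight RGB skin rule; thresholds are a common
    # textbook choice, not taken from the paper.
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

def skin_mask(image):
    # image: rows of (r, g, b) pixel tuples -> boolean skin mask.
    return [[is_skin_rgb(r, g, b) for (r, g, b) in row] for row in image]

def centroid(mask):
    # Mean (x, y) of the skin pixels; used to localize the hand region
    # before template matching.
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    if not pts:
        return None
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))
```

In a full system, the mask would feed the convex hull computation and the Canny-pruned cascade detector would restrict where template matching is attempted.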

Keywords: canny pruning, hand recognition, machine learning, skin tracking

Procedia PDF Downloads 175
27927 Food Design as a University-Industry Collaboration Project: An Experience Design on Controlling Chocolate Consumption and Long-Term Eating Behavior

Authors: Büşra Durmaz, Füsun Curaoğlu

Abstract:

While technology-driven developments in the modern world change our perceptions of time and speed, they also strain our food consumption patterns, such as taking pleasure in what we eat and eating slowly. The habit of eating quickly and hastily causes not only an inability to appreciate the taste of the food eaten but also an inability to register the feeling of satiety, and therefore many health problems. In this context, especially over the last ten years, food manufacturers concerned with healthy living and consumption have been collaborating with industrial designers on food design. The consumers of the new century, living under uncontrolled time pressure, turn to small snacks as a source of happiness and pleasure in the brief intervals they can spare. Chocolate in particular has been a source of both happiness and pleasure for its consumers for hundreds of years. However, when the portions eaten cannot be controlled, a pleasure food such as chocolate can cause both health problems and many emotional problems, especially feelings of guilt. Fast food, that is, food prepared and consumed quickly, has also been spreading rapidly around the world in recent years. This study covers the process and results of a user-experience-based chocolate design produced through a university-industry cooperation project carried out within the scope of Eskişehir Technical University graduation projects. The aim of the project was a creative product design that enables the user to experience chocolate consumption with a healthy-eating approach.
To this end, concepts such as pleasure, satiety, and taste were discussed. Within a user-oriented design approach, the study combined a literature review covering topics such as mouth anatomy, tongue structure, taste, the brain's role in eating, hormones, and chocolate with a survey of 151 people, semi-structured face-to-face interviews with 7 people during the experience design process, video analysis, and project diaries, structured as a case study within the qualitative research paradigm. The research found that melting in the mouth is the experience users prefer for prolonging pleasure-based chocolate eating while consuming healthy portions. In this context, the study documents the production of sketches, mock-ups, and prototypes of the product. The result is a product packaging design that engages the senses of sight, smell, and hearing, where consumption begins, so that the chocolate is consumed by melting and the salivary glands, the most important stimulus, are actively engaged, providing healthy, long-lasting, pleasure-based consumption.

Keywords: chocolate, eating habit, pleasure, saturation, sense of taste

Procedia PDF Downloads 71
27926 Present State of Local Public Transportation Service in Local Municipalities of Japan and Its Effects on Population

Authors: Akiko Kondo, Akio Kondo

Abstract:

Japan faces regional problems related to its low birth rate and aging population, and under these conditions some local municipalities are losing their vitality. The aims of this study are to clarify the present state of local public transportation services in local municipalities and to quantify the relation between local public transportation services and population. We conducted a questionnaire survey on regional agendas in all local municipalities in Japan and obtained responses concerning the present state of convenience in the use of public transportation and of local public transportation services. The data gathered make it apparent that measures concerning public transportation services are needed. Convenience in the use of public transportation is a matter of public concern in many rural regions. The survey also shows that some local municipalities have introduced demand buses to promote administrative and financial efficiency, and demand taxis to secure transportation for transportation-disadvantaged people and to eliminate areas with no public transportation service. In addition, we construct a population model that includes explanatory variables describing the present state of local public transportation services. From this model, we can quantitatively clarify the relation between public transportation services and population.
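The abstract does not specify the form of the population model; as a minimal sketch of the kind of estimation involved, the following fits an ordinary least-squares line relating a single transport-service indicator to population change. The variable names are illustrative assumptions, and a real model would include several explanatory variables.

```python
def ols_fit(x, y):
    # Least-squares intercept and slope for one explanatory variable,
    # e.g. x = a public-transport convenience score per municipality,
    # y = population change rate (illustrative, not the paper's model).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) /
             sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope
```

The sign and magnitude of the fitted slope would then quantify how service levels relate to population, which is the relation the study sets out to clarify.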

Keywords: public transportation, local municipality, regional analysis, regional issue

Procedia PDF Downloads 391
27925 FPGA Based Vector Control of PM Motor Using Sliding Mode Observer

Authors: Hanan Mikhael Dawood, Afaneen Anwer Abood Al-Khazraji

Abstract:

The paper presents an investigation of a field-oriented control strategy for a Permanent Magnet Synchronous Motor (PMSM) based on hardware-in-the-loop (HIL) simulation over a wide speed range. Sensorless rotor position estimation using a sliding mode observer is illustrated, considering the effects of magnetic saturation between the d and q axes. The cross-saturation between the d and q axes was calculated by finite-element analysis, so the inductance measurements account for saturation and cross-saturation and are used to obtain suitable id characteristics in the base and flux-weakening regions. Real-time matrix multiplication is implemented in a Field Programmable Gate Array (FPGA) using floating-point arithmetic; the Quartus-II environment is used to develop the FPGA designs, which are then downloaded to a development kit. A dSPACE DS1103 is used for Pulse Width Modulation (PWM) switching and for the controller. The hardware-in-the-loop results are compared with those from the MATLAB simulation, and various dynamic conditions are investigated.
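The abstract does not give the observer equations, so the following is a hedged sketch of a common textbook form of a sliding-mode current observer for one stator axis: the switching term converges toward the back-EMF, from which rotor position can be extracted (e.g. via the arctangent of the alpha/beta EMF estimates). All symbols and gains are illustrative.

```python
def smo_step(i_hat, i_meas, v, R, L, k, dt):
    # One Euler step of a sliding-mode current observer (one axis).
    #   i_hat  : estimated stator current
    #   i_meas : measured stator current
    #   v      : applied stator voltage
    #   R, L   : stator resistance and inductance
    #   k      : sliding-mode switching gain
    err = i_hat - i_meas
    z = k * (1.0 if err > 0 else (-1.0 if err < 0 else 0.0))  # switching term
    i_hat_new = i_hat + dt * (v - R * i_hat - z) / L          # current model
    return i_hat_new, z  # z approximates the back-EMF on this axis
```

In practice the sign function is often replaced by a saturation function to reduce chattering, and a low-pass filter is applied to z before the position calculation.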

Keywords: magnetic saturation, rotor position estimation, sliding mode observer, hardware in the loop (HIL)

Procedia PDF Downloads 518
27924 Content Based Video Retrieval System Using Principal Object Analysis

Authors: Van Thinh Bui, Anh Tuan Tran, Quoc Viet Ngo, The Bao Pham

Abstract:

Video retrieval is the problem of searching for videos or clips whose content is relatively close to an input image or video. Applications of such retrieval include selecting a video in a folder or recognizing a person in security camera footage. However, recent approaches face challenges due to the diversity of video types, frame transitions, and camera positions; moreover, selecting an appropriate similarity measure for the problem remains an open question. To overcome these obstacles, we propose a content-based video retrieval system that proceeds in several main steps and achieves good performance. From an input video, we extract keyframes and principal objects using the Segmentation of Aggregating Superpixels (SAS) algorithm. Speeded Up Robust Features (SURF) are then extracted from those principal objects. Finally, a bag-of-words model combined with SVM classification is applied to obtain the retrieval result. Our system was evaluated on over 300 videos ranging from music, history, movies, sports, and natural scenes to TV shows. Its performance compares favorably with that of other approaches.
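The bag-of-words step can be sketched as follows: each descriptor is assigned to its nearest visual word in a learned vocabulary, and the normalized histogram of assignments is the feature vector passed to the SVM. This is a generic illustration, not the paper's code; vocabulary size and distance metric are assumptions.

```python
def assign_word(descriptor, vocabulary):
    # Index of the nearest visual word (squared Euclidean distance).
    return min(range(len(vocabulary)),
               key=lambda j: sum((a - b) ** 2
                                 for a, b in zip(descriptor, vocabulary[j])))

def bow_histogram(descriptors, vocabulary):
    # Normalized histogram of word assignments for one keyframe's
    # SURF-like descriptors; this vector is the SVM input.
    hist = [0] * len(vocabulary)
    for d in descriptors:
        hist[assign_word(d, vocabulary)] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]
```

In a full system the vocabulary is typically built by k-means clustering over descriptors from the training videos, and one histogram is computed per keyframe or per principal object.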

Keywords: video retrieval, principal objects, keyframe, segmentation of aggregating superpixels, speeded up robust features, bag-of-words, SVM

Procedia PDF Downloads 288
27923 Understanding Tacit Knowledge and Its Role in Military Organizations: Methods of Managing Tacit Knowledge

Authors: M. Erhan Orhan, Onur Ozdemir

Abstract:

The expansion of areas of operation and the increasing diversity of threats have forced military organizations to change in many ways. Tacit knowledge, however, remains the most fundamental component of organizational knowledge, since it is human-oriented and, in warfare, the human stands at the core of the organization. Military organizations should therefore find effective ways of systematically utilizing tacit knowledge. In this context, this article suggests methods for turning tacit knowledge into explicit knowledge in military organizations.

Keywords: tacit knowledge, military, knowledge management, warfare, technology

Procedia PDF Downloads 475
27922 Electrophoretic Light Scattering Based on Total Internal Reflection as a Promising Diagnostic Method

Authors: Ekaterina A. Savchenko, Elena N. Velichko, Evgenii T. Aksenov

Abstract:

The development of pathological processes such as cardiovascular and oncological diseases is accompanied by changes in molecular parameters in cells, tissues, and serum. The study of the behavior of protein molecules in solution is therefore of primary importance for the diagnosis of such diseases. Various physical and chemical methods are used to study molecular systems. With the advent of the laser and advances in electronics, optical methods such as scanning electron microscopy, sedimentation analysis, nephelometry, and static and dynamic light scattering have become the most universal, informative, and accurate tools for estimating the parameters of nanoscale objects. Electrophoretic light scattering is the most effective of these techniques and has high potential for the study of biological solutions and their properties. It allows one to investigate the aggregation and dissociation of different macromolecules and to obtain information on their shapes, sizes, and molecular weights. Electrophoretic light scattering is an analytical method that registers the motion of microscopic particles under the influence of an electric field by means of quasi-elastic light scattering in a homogeneous solution, with subsequent registration of the spectral or correlation characteristics of the light scattered from the moving object. We modified the technique by using the regime of total internal reflection, with the aim of increasing its sensitivity and reducing the volume of the sample to be investigated, which opens the prospect of automating simultaneous multiparameter measurements. In addition, the total internal reflection regime allows one to study biological fluids at the level of single molecules, which further increases the sensitivity and informativeness of the results, because data obtained from an individual molecule are not averaged over an ensemble; this is important in the study of biomolecular fluids.
To the best of our knowledge, electrophoretic light scattering in the total internal reflection regime is proposed here for the first time. Latex microspheres 1 μm in size were used as test objects. In this study, the total internal reflection regime was realized on a quartz prism on which the free electrophoresis regime was established. A semiconductor laser with a wavelength of 655 nm was used as the radiation source, and the light scattering signal was registered by a PIN diode, then transmitted to a digital oscilloscope and a computer. The autocorrelation functions and fast Fourier transforms, in the Brownian motion regime and under the action of the field, were calculated to obtain the parameters of the investigated object. The main result of the study is the dependence of the autocorrelation function on the concentration of microspheres and on the applied field magnitude. The heating effect became more pronounced with increasing sample concentration and electric field. The results demonstrate the applicability of the method to the examination of liquid solutions, including biological fluids.
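The autocorrelation computation at the heart of the analysis can be sketched directly. This is the standard normalized estimator, not the authors' code; in this technique the decay of the correlation reflects Brownian diffusion, while a field-induced drift adds an oscillatory (Doppler) component.

```python
def autocorrelation(signal, max_lag):
    # Normalized autocorrelation of a scattered-intensity trace for
    # lags 0..max_lag. Lag 0 is 1 by construction.
    n = len(signal)
    mean = sum(signal) / n
    var = sum((s - mean) ** 2 for s in signal) / n
    return [sum((signal[t] - mean) * (signal[t + lag] - mean)
                for t in range(n - lag)) / ((n - lag) * var)
            for lag in range(max_lag + 1)]
```

Fitting the decay (or the oscillation frequency under an applied field) of this function against lag yields the particle mobility and size parameters discussed in the abstract.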

Keywords: light scattering, electrophoretic light scattering, electrophoresis, total internal reflection

Procedia PDF Downloads 205
27921 A Corpus Study of English Verbs in Chinese EFL Learners’ Academic Writing Abstracts

Authors: Shuaili Ji

Abstract:

The correct use of verbs is an important element of high-quality research articles; thus, it is important for Chinese EFL learners to master the characteristics of verbs and to use them precisely. However, studies have shown that learners differ from native speakers in their use of verbs and have difficulty using English verbs. This corpus-based quantitative study can enhance learners' knowledge of English verbs and improve the quality of research article abstracts, and of academic writing as a whole. The aim of this study is to find the differences between learners' and native speakers' use of verbs and to examine the factors that contribute to those differences. To this end, the research question is: what are the differences between the verbs most frequently used by learners and those used by native speakers? The question is answered through a corpus-based, data-driven analysis of the verbs used by learners in their abstract writing in terms of collocation, colligation, and semantic prosody. The results show that: (1) EFL learners markedly overused 'be, can, find, make' and underused 'investigate, examine, may'; among modal verbs, learners overused 'can' while underusing 'may'. (2) Learners overused 'we find + object clause' while underusing 'noun (results, findings, data) + suggest/indicate/reveal + object clause' when expressing research results. (3) Learners tended to transfer the collocation, colligation, and semantic prosody of shǐ and zuò onto 'make'. (4) Learners overused 'BE + V-ed' and BE as a main verb; they also overused the base forms of BE (be, is, are) while underusing its past-tense inflections (was, were). These results reveal learners' lack of accuracy and idiomaticity in verb usage; owing to conceptual transfer from Chinese, the verbs in learners' abstracts show clear influence of the mother tongue.
In addition, learners have not fully mastered the use of verbs and avoid complex colligations in order to prevent errors. Based on these findings, the study has implications for English teaching, and in particular for the teaching of English academic abstract writing in China. Further research could examine verb use across whole dissertations to determine whether the characteristics of verbs in abstracts apply to the dissertation as a whole.
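The abstract reports over- and underuse but does not name the statistic used; a common choice for such keyness comparisons in corpus linguistics is Dunning's log-likelihood, sketched below as an illustration (corpus sizes and frequencies in the test are made up).

```python
import math

def log_likelihood(freq_a, size_a, freq_b, size_b):
    # Dunning-style log-likelihood keyness for one verb: compares its
    # frequency in a learner corpus (a) with a native-speaker corpus (b).
    # Values above ~3.84 indicate a difference significant at p < 0.05.
    expected_a = size_a * (freq_a + freq_b) / (size_a + size_b)
    expected_b = size_b * (freq_a + freq_b) / (size_a + size_b)
    ll = 0.0
    if freq_a:
        ll += freq_a * math.log(freq_a / expected_a)
    if freq_b:
        ll += freq_b * math.log(freq_b / expected_b)
    return 2.0 * ll
```

Running this per verb over the two corpora, with the sign of (freq_a / size_a − freq_b / size_b) indicating overuse versus underuse, yields lists like those reported in result (1).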

Keywords: academic writing abstracts, Chinese EFL learners, corpus-based, data-driven, verbs

Procedia PDF Downloads 322
27920 Pre-Service Teachers’ Conceptual Representations of Heat and Temperature

Authors: Abdeljalil Métioui

Abstract:

The purpose of this paper is to present the results of research on the conceptual representations of heat and temperature held by 128 Quebec (Canada) pre-service teachers enrolled in the third year of a university program that trains elementary teachers. To identify their conceptual representations of heat and temperature, we constructed a multiple-choice questionnaire consisting of five questions; for each question, participants had to explain their choice of answer. Methodologically, this step is essential for identifying students' conceptual representations. The selected questions were based on: (1) work done worldwide on primary and secondary students' misconceptions about heat and temperature; (2) the notions prescribed in the curriculum related to the physical world; and (3) students' everyday contexts. As illustrations, the following erroneous conceptual representations were identified in our analysis of the collected data: (1) a change of state of matter does not require a constant temperature; (2) temperature is a measure, in degrees, of the level of heat of an object or person; (3) the mercury contained in a thermometer expands when heated because the particles that constitute it expand; and (4) the sensation of cold (or warmth) is related to the difference in temperature. In conclusion, we show that it is possible to develop conflict situations dealing specifically with the limits of the analogy between heat and temperature. These situations must take into account the conceptual representations of pre-service teachers, as well as the relevant scientific understanding of the concepts of heat and temperature.

Keywords: conceptual representation, heat, temperature, pre-service teachers

Procedia PDF Downloads 125
27919 Segmenting 3D Optical Coherence Tomography Images Using a Kalman Filter

Authors: Deniz Guven, Wil Ward, Jinming Duan, Li Bai

Abstract:

Over the past two decades or so, Optical Coherence Tomography (OCT) has been used to diagnose diseases of the retina and optic nerve. The retinal nerve fibre layer, for example, is a powerful diagnostic marker for detecting and staging glaucoma. With advances in optical imaging hardware, the adoption of OCT is now commonplace in clinics. More and more OCT images are being generated, and for these images to have clinical applicability, accurate automated OCT image segmentation software is needed. OCT image segmentation is still an active research area, as OCT images are inherently noisy, with multiplicative speckle noise. Simple edge detection algorithms are unsuitable for detecting retinal layer boundaries in OCT images, and intensity fluctuations, motion artefacts, and the presence of blood vessels further degrade OCT image quality. In this paper, we introduce a new method for segmenting three-dimensional (3D) OCT images using a Kalman filter, a tool commonly used in computer vision for object tracking. The Kalman filter is applied to the 3D OCT image volume to track the retinal layer boundaries through the slices within the volume and thus segment the 3D image. Specifically, after some pre-processing of the OCT images, points on the retinal layer boundaries in the first image are identified, and curves are fitted to them so that the layer boundaries can be represented by the coefficients of the curve equations. These coefficients then form the state space for the Kalman filter, which produces an optimal estimate of the current state of the system by updating its previous state with the available measurements in a feedback control loop. The results show that the algorithm can be used to segment the retinal layers in OCT images.
One limitation of the current algorithm is that the curve representation of a retinal layer boundary does not work well where the boundary splits into two, e.g., at the optic nerve. This may be resolved by representing the boundaries differently, for example with B-splines or level sets. The use of a Kalman filter shows promise for developing accurate and effective 3D OCT segmentation methods.
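The predict/update cycle described above can be sketched for a single curve coefficient. This assumes, as a simplification not stated in the abstract, a random-walk state model (the coefficient drifts slowly from slice to slice); the measurement z is the coefficient fitted in the current slice, and q and r are the process and measurement noise variances.

```python
def kalman_step(x, p, z, q, r):
    # One predict/update cycle for one boundary-curve coefficient.
    #   x, p : current state estimate and its variance
    #   z    : coefficient fitted in the current OCT slice
    #   q, r : process and measurement noise variances
    p_pred = p + q                 # predict: random-walk model
    gain = p_pred / (p_pred + r)   # Kalman gain
    x_new = x + gain * (z - x)     # update with the slice measurement
    p_new = (1.0 - gain) * p_pred
    return x_new, p_new
```

Running one such filter per coefficient (or a vector-valued filter over all coefficients) as the slices are traversed tracks each layer boundary through the volume, which is the segmentation mechanism the abstract describes.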

Keywords: optical coherence tomography, image segmentation, Kalman filter, object tracking

Procedia PDF Downloads 475
27918 The Impact of the General Data Protection Regulation on Human Resources Management in Schools

Authors: Alexandra Aslanidou

Abstract:

The General Data Protection Regulation (GDPR), concerning the protection of natural persons within the European Union with regard to the processing of personal data and the free movement of such data, became applicable in the European Union (EU) on 25 May 2018 and transformed the way personal data were treated under the Data Protection Directive (DPD) regime, generating sweeping organizational changes for both the public sector and businesses. A social practice considerably influenced in its day-to-day operations is Human Resource (HR) management, for which the importance of the GDPR cannot be overstated, because HR processes personal data in all shapes and sizes from many different systems and sources. The proper functioning of an HR department is decisive in human-centered, service-oriented environments such as the education field, because effectively conducted HR operations in schools determine the quality of the services provided and consequently have a considerable impact on the success of the educational system. The purpose of this paper is to analyze the decisive role that the GDPR plays in HR departments operating in schools and, in order to practically evaluate the aftermath of the Regulation during the first months of its applicability, a comparative use-case analysis of five highly dynamic schools across three EU Member States was attempted.

Keywords: general data protection regulation, human resource management, educational system

Procedia PDF Downloads 94
27917 How Did a Blind Child Begin Understanding Her “Blind Self”?: A Longitudinal Analysis of Conversation between Her and Adults

Authors: Masahiro Nochi

Abstract:

This study explores the process by which a Japanese child with congenital blindness deepened her understanding of the condition of being “unable to see” and developed the idea of a “blind self,” despite having no direct experience of vision. The rehabilitation activities of a child with congenital visual impairment, video-recorded from 1 to 6 years of age, were analyzed qualitatively. The duration of the video was about 80 hours. The recordings were transcribed verbatim, and episodes in which the child used words related to the act of “looking” were extracted. Detailed transcripts were constructed referencing the notation of conversation analysis. Characteristics of the interactions in those episodes were identified and compared longitudinally. Results showed that the child used the expression “Look” within certain interaction patterns, and her bodily expressions and interaction with adults developed in conjunction with the development of language use. Four stages were identified. At the age of 1, interactions involving “Look” began to occur. The child said “Look” in the sequence: the child’s “Look,” an adult’s “I’m looking,” a certain performance by the child, and the adult’s words of praise. At the age of 3, the child began to behave in accordance with the spatial attributes of the act of “looking,” such as turning her face to the adult’s voice before saying “Look.” She also began to use the expression “Keep looking,” which seemed to reflect her understanding of the temporality of the act of “looking.” At the age of 4, the use of “Look” or “Keep looking” became three times more frequent. She also started to refer to the act of looking in the future, as in “Come and look at my puppy someday.” At the age of 5, she moved her hands toward the adults when she was holding something she wanted to show them. She seemed to understand that people can see an object more clearly when it is in close proximity.
Around that time, she began to say “I cannot see” to her mother, which suggested a heightened understanding of her own blindness. The findings indicate that as she grew up, the child came to utilize nonverbal behavior before and after the directive “Look” to make the progress of the interaction with adults more certain. As a result, actions that reflect the characteristics of a sighted person's visual experience were incorporated into the interaction chain. The purpose of “Look,” with which she at first intended to attract the adult's attention, changed into a request for a confirmation she was unable to make herself. Such a change in the use of the word, as well as in interaction with sighted adults, is considered to reflect her heightened self-awareness as someone who could not do what sighted people could do easily. A blind child can gradually deepen their understanding of their own blindness among the sighted people around them. The child can also develop a “blind self” by learning how to interact with others, even without direct visual experiences.

Keywords: blindness, child development, conversation analysis, self-concept

Procedia PDF Downloads 114
27916 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading

Authors: Robert Caulk

Abstract:

A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contain enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using that parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses of the presented technique and offers a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive-training backtest emulation, and the deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g., TA-Lib, pandas-ta).
The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides a roadmap for future development in FreqAI.
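The outlier-removal idea described above, rejecting prediction points that fall outside the parameter space spanned by the training data, can be sketched as a per-feature z-score screen. This is a simplified illustration; the function name and the threshold k are assumptions, not FreqAI's actual implementation:

```python
# Simplified sketch of screening prediction points against the training
# data's parameter space: any point with a feature more than k standard
# deviations from that feature's training mean is treated as an outlier.
# Function name and threshold are illustrative assumptions.
from statistics import mean, stdev

def remove_outliers(train_rows, candidate_rows, k=2.0):
    """Return the candidate points that lie inside the trained parameter space."""
    mus = [mean(col) for col in zip(*train_rows)]
    sds = [stdev(col) or 1.0 for col in zip(*train_rows)]  # guard zero spread
    return [row for row in candidate_rows
            if all(abs(v - mu) <= k * sd
                   for v, mu, sd in zip(row, mus, sds))]
```

A production system would typically use a multivariate distance (e.g., in a reduced principal-component space) rather than independent per-feature bounds, but the intent is the same: never trust a prediction made far from where the model was trained.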

Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration

Procedia PDF Downloads 80
27915 Low Cost LiDAR-GNSS-UAV Technology Development for PT Garam’s Three Dimensional Stockpile Modeling Needs

Authors: Mohkammad Nur Cahyadi, Imam Wahyu Farid, Ronny Mardianto, Agung Budi Cahyono, Eko Yuli Handoko, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan

Abstract:

Unmanned aerial vehicle (UAV) technology offers advantages in cost efficiency and data retrieval time. Technologies such as UAV, GNSS, and LiDAR can be combined into a single system in which each covers the others' deficiencies. This integrated system aims to increase the accuracy of calculating the volume of the salt stockpiles of PT. Garam (Salt Company). UAV applications are used to obtain geometric data and capture textures that characterize the structure of objects. This study uses the Taror 650 Iron Man drone with four propellers, which can fly for 15 minutes. Image acquisitions are processed in software utilizing photogrammetry and Structure from Motion (SfM) point cloud principles. LiDAR data acquisition enables the creation of point clouds, three-dimensional models, Digital Surface Models, contours, and orthomosaics with high accuracy. LiDAR has a drawback, however, in that its coordinate positions have only a local reference. Therefore, researchers use GNSS, LiDAR, and drone multi-sensor technology to map the salt stockpiles on open land and in warehouses, a task PT. Garam carries out twice a year and which previously used terrestrial methods and manual calculations with sacks. LiDAR needs to be combined with a UAV to overcome acquisition limitations, because a ground-based scan only covers the right and left sides of an object, particularly when applied to a salt stockpile. The UAV is flown to provide wide coverage with the help of an integrated 200-gram LiDAR system, so that an optimal viewing angle can be maintained during the flight. Using LiDAR for low-cost mapping surveys will make it easier for surveyors and academics to obtain fairly accurate data at a more economical price: as a survey tool, LiDAR is a low-priced instrument, around 999 USD, that can produce detailed data.
Therefore, to minimize the operational costs of using LiDAR, surveyors can use low-cost LiDAR, GNSS, and UAV at a price of around 638 USD. The data generated by this sensor take the form of a three-dimensional visualization of an object's shape. This study aims to combine low-cost GPS measurements with low-cost LiDAR, processed using free software. The low-cost GPS generates position data in the form of latitude and longitude coordinates, which yield the X, Y, and Z values used to georeference the detected objects. The LiDAR data capture the objects in the scene, including heights across the entire surveyed environment. The resulting data are calibrated with pitch, roll, and yaw to obtain the vertical height of the existing contours. An experiment was conducted on the roof of a building with a radius of approximately 30 meters.
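The pitch/roll/yaw calibration and georeferencing step described above amounts to a rigid-body transform: each sensor-frame LiDAR return is rotated by the platform attitude and translated by the GNSS position. A minimal sketch follows (the Z-Y-X rotation order and the function name are assumptions for illustration, not the authors' processing chain):

```python
# Illustrative rigid-body georeferencing of one LiDAR return: rotate the
# sensor-frame point by roll (about x), pitch (about y), yaw (about z),
# then translate by the GNSS-derived position. Angles in radians.
import math

def georeference(point, roll, pitch, yaw, gnss_xyz):
    """Map a sensor-frame (x, y, z) point into the GNSS reference frame."""
    x, y, z = point
    # roll about the x-axis
    y, z = (y * math.cos(roll) - z * math.sin(roll),
            y * math.sin(roll) + z * math.cos(roll))
    # pitch about the y-axis
    x, z = (x * math.cos(pitch) + z * math.sin(pitch),
            -x * math.sin(pitch) + z * math.cos(pitch))
    # yaw about the z-axis
    x, y = (x * math.cos(yaw) - y * math.sin(yaw),
            x * math.sin(yaw) + y * math.cos(yaw))
    gx, gy, gz = gnss_xyz
    return (x + gx, y + gy, z + gz)
```

In practice the GNSS latitude/longitude would first be projected to a planar coordinate system (e.g., UTM) before being used as the translation, and a lever-arm offset between antenna and scanner would be added.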

Keywords: LiDAR, unmanned aerial vehicle, low-cost GNSS, contour

Procedia PDF Downloads 80
27914 Streamlining .NET Data Access: Leveraging JSON for Data Operations in .NET

Authors: Tyler T. Procko, Steve Collins

Abstract:

New features in .NET (6 and above) permit streamlined access to information residing in JSON-capable relational databases, such as SQL Server (2016 and above). Traditional methods of data access now involve comparatively unnecessary steps which compromise system performance. This work posits that the established ORM (Object-Relational Mapping) based methods of data access in applications and APIs result in common issues, e.g., the object-relational impedance mismatch. Recent developments in C# and .NET Core, combined with a framework of modern SQL Server coding conventions, have enabled better technical solutions to the problem. As an amelioration, this work details the language features and coding conventions which enable this streamlined approach, resulting in an open-source .NET library implementation called Codeless Data Access (CODA). Canonical approaches rely on ad-hoc mapping code to perform type conversions between the client and back-end database; with CODA, no mapping code is needed, as JSON is freely mapped to SQL and vice versa. CODA streamlines API data access by improving on three aspects of immediate concern to web developers, database engineers, and cybersecurity professionals: simplicity, speed, and security. Simplicity is engendered by cutting out the “middleman” steps, effectively making API data access a white box, whereas traditional methods are black boxes. Speed is improved because fewer translational steps are taken, and security is improved as attack surfaces are minimized. An empirical evaluation of the speed of the CODA approach in comparison to ORM approaches is provided and demonstrates that the CODA approach is significantly faster. CODA presents substantial benefits for API developer workflows by simplifying data access, resulting in better speed and security and allowing developers to focus on productive development rather than being mired in data access code.
Future considerations include a generalization of the CODA method and extension outside of the .NET ecosystem to other programming languages.
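The core idea, letting the database parse JSON itself so the client needs no per-field mapping code, can be sketched outside the .NET ecosystem too. The following minimal Python/SQLite example is illustrative only (CODA itself targets C#/.NET and SQL Server, where OPENJSON/FOR JSON play the role of SQLite's json_extract):

```python
# Illustrative sketch (in Python/SQLite, not C#/SQL Server) of the
# "no mapping code" idea: the JSON payload is handed to the database,
# which extracts the typed fields itself via its JSON functions.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (id INTEGER, name TEXT)")

payload = json.dumps({"id": 1, "name": "Ada"})
# One parameterized statement; no per-field client-side type conversion.
conn.execute(
    "INSERT INTO person (id, name) "
    "SELECT json_extract(?, '$.id'), json_extract(?, '$.name')",
    (payload, payload),
)
row = conn.execute("SELECT id, name FROM person").fetchone()
```

Note that the statement remains fully parameterized, which is the security point: the JSON travels as data, never as SQL text.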

Keywords: API data access, database, JSON, .NET core, SQL server

Procedia PDF Downloads 59
27913 Study on Renewal Strategy of Old District with an Example of SQ in Shenzhen

Authors: Yun Zuo, Wenju Li

Abstract:

Shenzhen is one of China’s gates to the world. What was once a fishing village is now a metropolis of more than 10 million people. Its unprecedented pace of development also brings a series of issues, such as the self-renewal of the city. In this paper, we use Sungang-Quingshuihe (SQ) as an example. SQ is one of the oldest districts in the east of Shenzhen. Nowadays, SQ faces many challenges, because as its logistics function slowly disappears, a new identity will take its place. Accordingly, we seek a new design strategy to minimize damage to the city during the transformation process. At the same time, we hold that each district in a city has its own role, and together they form the whole city. Therefore, a district transformation should be functionally oriented and focused on improving city quality.

Keywords: old district, renewal strategy, public space, sustainable development

Procedia PDF Downloads 415
27912 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models

Authors: Ainouna Bouziane

Abstract:

The ability of Electron Tomography to recover the 3D structure of catalysts, with spatial resolution at the subnanometer scale, has been widely explored and reviewed in the last decades. A variety of experimental techniques, based either on Transmission Electron Microscopy (TEM) or Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and avoidance of imaging artifacts related to diffraction phenomena when dealing with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (Compressed Sensing - Total Variation Minimization) algorithms to extract more reliable quantitative information from 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is also an important issue that has not yet been properly addressed, because a perfectly known reference is needed. The problem becomes particularly complicated in the case of multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction/segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, which considers the influence of relevant experimental parameters like the range of tilt angles, image noise level, or object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.

Keywords: electron tomography, supported catalysts, nanometrology, error assessment

Procedia PDF Downloads 75
27911 Literary Interpretation and Systematic-Structural Analysis of the Titles of the Works “The Day Lasts More than a Hundred Years” and “Doomsday”

Authors: Bahor Bahriddinovna Turaeva

Abstract:

The article provides a structural analysis of the titles of the famous Kyrgyz writer Chingiz Aitmatov’s works “The Day Lasts More Than a Hundred Years” and “Doomsday”. The author’s creative purpose in naming a work of art, and the role of plot elements and the novels’ composition in revealing the essence of a title, are explained. The criteria important in the naming of the author’s works in different genres are classified, and titles denoting artistic time and artistic space are studied separately. In world literary studies, the chronotope is regarded as a literary-aesthetic category expressing the scope of the author’s interpretation of the universe, the author’s outlook and imagination regarding the world’s foundation, the definition of personages, and the compositional means of expressing the sequence and duration of events. A creative comprehension of the chronotope, as a means of arranging a work’s composition and constructing the epic field of the text, demands a special approach to understanding the aesthetic character of the work. Since the chronotope includes all the elements of a fictional work, it is impossible to present the plot, composition, conflict, system of characters, and the characters’ feelings and moods without describing the chronotope. In the subsequent development of scientific-theoretical thought worldwide, the chronotope has come to be accepted as one of the poetic means of demonstrating reality, as well as a literary device basic to the expression of reality in the compositional construction and illustration of the plot, relying on the writer’s intention and the ideological conception of the literary work. Literary time enables one to cognize the literary world picture created by the author in terms of the descriptive subject and object of the work.
Therefore, one of the topical tasks of modern Uzbek literary studies is to describe historical evidence, events, the lives of outstanding people, and the chronology of the recent past on the basis of literary time; on the example of the creative works of a certain period, group of writers, or individual writer, these are analyzed in a separate or comparative-typological aspect.

Keywords: novel, title, chronotope, motive, epigraph, analepsis, structural analysis, plot line, composition

Procedia PDF Downloads 70
27910 Theoretical Comparisons and Empirical Illustration of Malmquist, Hicks–Moorsteen, and Luenberger Productivity Indices

Authors: Fatemeh Abbasi, Sahand Daneshvar

Abstract:

Productivity is one of the essential goals of companies seeking to improve performance; as a strategy-oriented measure, it forms the basis of a company's economic growth. The history of productivity goes back centuries, but in the early twentieth century most researchers defined productivity as the relationship between a product and the factors used in its production. Productivity as the optimal use of available resources means that "more output using less input" can increase companies' capacity for economic growth and prosperity. A quality of life based on economic progress likewise depends on productivity growth in a society. Therefore, productivity is a national priority for any developed country. There are several methods for measuring productivity growth, which can be divided into parametric and non-parametric methods. Parametric methods rely on the existence of a functional form in their hypotheses, while non-parametric methods require no such function and rest on empirical evidence. One of the most popular non-parametric methods is Data Envelopment Analysis (DEA), which measures changes in productivity over time. DEA evaluates the productivity of decision-making units (DMUs) based on mathematical models. This method uses multiple inputs and outputs to compare the productivity of similar DMUs such as banks, government agencies, companies, airports, etc. Non-parametric methods are themselves divided into frontier and non-frontier approaches. The Malmquist productivity index (MPI) proposed by Caves, Christensen, and Diewert (1982), the Hicks–Moorsteen productivity index (HMPI) proposed by Bjurek (1996), and the Luenberger productivity indicator (LPI) proposed by Chambers (2002) are powerful tools for measuring productivity changes over time. This study compares the Malmquist, Hicks–Moorsteen, and Luenberger indices theoretically and empirically on the basis of DEA models and reviews their strengths and weaknesses.
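For reference, the output-oriented MPI between periods t and t+1 is commonly written in its geometric-mean form as a ratio of output distance functions. The notation below (inputs x, outputs y, output distance function D_o) follows common DEA usage rather than this paper's own symbols:

```latex
M_o\left(x^{t}, y^{t}, x^{t+1}, y^{t+1}\right)
  = \left[
      \frac{D_o^{t}\left(x^{t+1}, y^{t+1}\right)}{D_o^{t}\left(x^{t}, y^{t}\right)}
      \cdot
      \frac{D_o^{t+1}\left(x^{t+1}, y^{t+1}\right)}{D_o^{t+1}\left(x^{t}, y^{t}\right)}
    \right]^{1/2}
```

Values greater than one indicate productivity growth between the two periods; in the DEA setting, each of the four distance functions is obtained by solving a linear program for the DMU in question.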

Keywords: data envelopment analysis, Hicks–Moorsteen productivity index, Luenberger productivity indicator, Malmquist productivity index

Procedia PDF Downloads 184
27909 STD-NMR Based Protein Engineering of the Unique Arylpropionate-Racemase AMDase G74C

Authors: Sarah Gaßmeyer, Nadine Hülsemann, Raphael Stoll, Kenji Miyamoto, Robert Kourist

Abstract:

Enzymatic racemization allows the smooth interconversion of stereocenters under very mild reaction conditions. Racemases find frequent application in deracemization and dynamic kinetic resolutions. Arylmalonate decarboxylase (AMDase) from Bordetella bronchiseptica has high structural similarity to amino acid racemases. These cofactor-free racemases are able to break chemically strong C-H bonds under mild conditions. The racemase-like catalytic machinery of mutant G74C confers on it a unique activity in the racemization of pharmacologically relevant derivatives of 2-phenylpropionic acid (profens), which makes AMDase G74C an interesting object for the mechanistic investigation of cofactor-independent racemases. Structure-guided protein engineering achieved a variant of this unique racemase with 40-fold increased activity in the racemization of several arylaliphatic carboxylic acids. By saturation-transfer-difference NMR spectroscopy (STD-NMR), substrate binding during catalysis was investigated. All atoms of the substrate showed interactions with the enzyme. STD-NMR measurements revealed distinct nuclear Overhauser effects in experiments with and without molecular conversion. The spectroscopic analysis led to the identification of several amino acid residues whose variation increased the activity of G74C. While single amino acid exchanges increased the activity moderately, structure-guided saturation mutagenesis yielded a quadruple mutant with a 40 times higher reaction rate. This study presents STD-NMR as a versatile tool for the analysis of enzyme-substrate interactions in catalytically competent systems and for the guidance of protein engineering.

Keywords: racemase, rational protein design, STD-NMR, structure guided saturation mutagenesis

Procedia PDF Downloads 297
27908 Virtual Science Hub: An Open Source Platform to Enrich Science Teaching

Authors: Enrique Barra, Aldo Gordillo, Juan Quemada

Abstract:

This paper presents the Virtual Science Hub platform. It is an open-source platform that combines a social network, an e-learning authoring tool, a video conference service, and a learning object repository for the enrichment of science teaching. These four main functionalities fit very well together. The platform was released in April 2012 and has not stopped growing since. Finally, we present the results of the surveys conducted and the statistics gathered to validate this approach.

Keywords: e-learning, platform, authoring tool, science teaching, educational sciences

Procedia PDF Downloads 383
27907 Assessment of the Spatio-Temporal Distribution of Pteridium aquilinum (Bracken Fern) Invasion on the Grassland Plateau in Nyika National Park

Authors: Andrew Kanzunguze, Lusayo Mwabumba, Jason K. Gilbertson, Dominic B. Gondwe, George Z. Nxumayo

Abstract:

Knowledge about the spatio-temporal distribution of invasive plants in protected areas provides a base from which hypotheses explaining the proliferation of plant invasions can be made, alongside the development of relevant invasive plant monitoring programs. The aim of this study was to investigate the spatio-temporal distribution of bracken fern on the grassland plateau of Nyika National Park over the past 30 years (1986-2016), as well as to determine the current extent of the invasion. Remote sensing, machine learning, and statistical modelling techniques (object-based image analysis, image classification, and linear regression analysis) in geographical information systems were used to determine both the spatial and temporal distribution of bracken fern in the study area. Results revealed that bracken fern has been increasing in coverage on the Nyika plateau at an estimated annual rate of 87.3 hectares since 1986. This translates to an estimated net increase of 2,573.1 hectares, recorded from 1,788.1 hectares (1986) to 4,361.9 hectares (2016). As of 2017, bracken fern covered 20,940.7 hectares, approximately 14.3% of the entire grassland plateau. Additionally, it was observed that the fern was distributed most densely around Chelinda camp (on the central plateau), as well as in forest verges and roadsides across the plateau. Based on these results, it is recommended that Ecological Niche Modelling approaches be employed to (i) isolate the most important factors influencing bracken fern proliferation and (ii) identify and prioritize areas requiring immediate control interventions so as to minimize bracken fern proliferation in Nyika National Park.
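The annual rate of spread reported above is the slope of a linear trend fitted to area-versus-year observations. A minimal ordinary-least-squares sketch of that step follows (a generic illustration, not the study's GIS workflow; the function name and inputs are assumptions):

```python
# Illustrative ordinary-least-squares fit of invaded area against year;
# the slope is the estimated annual rate of spread (hectares per year).

def linear_trend(years, areas_ha):
    """Return (slope, intercept) of the least-squares line area = slope*year + b."""
    n = len(years)
    my = sum(years) / n                      # mean year
    ma = sum(areas_ha) / n                   # mean area
    num = sum((y - my) * (a - ma) for y, a in zip(years, areas_ha))
    den = sum((y - my) ** 2 for y in years)
    slope = num / den
    return slope, ma - slope * my
```

With classified areas from several satellite scenes, the fitted slope summarizes the average yearly expansion even when individual years scatter around the trend.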

Keywords: bracken fern, image classification, Landsat-8, Nyika National Park, spatio-temporal distribution

Procedia PDF Downloads 174
27906 Applying Neural Networks for Solving Record Linkage Problem via Fuzzy Description Logics

Authors: Mikheil Kalmakhelidze

Abstract:

The record linkage (RL) problem has become more and more important in recent years due to the growing interest in big data analysis. The problem can be formulated in a very simple way: given two entries a and b of a database, decide whether they represent the same object or not. There are two classical ways of solving the RL problem, deterministic and probabilistic. Using a simple Bayes classifier in many cases produces useful results, but sometimes they prove to be poor. In recent years, several successful approaches have been made towards solving specific RL problems with neural network algorithms, including the single-layer perceptron, the multilayer back-propagation network, etc. In our work, we model the RL problem for a specific dataset of student applications in fuzzy description logic (FDL), where the linkage of a specific pair (a,b) depends on the truth value of the corresponding formula A(a,b) in a canonical FDL model. As a main result, we build a neural network for deciding the truth value of FDL formulas in a canonical model and thus link the RL problem to machine learning. We apply the approach to a dataset with 10,000 entries and compare it to classical RL approaches. The results prove to be more accurate than those of the standard probabilistic approach.
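The link between FDL truth values and a neural network can be illustrated with a single sigmoid neuron that scores a record pair from per-field similarities: the output in (0, 1) plays the role of a graded truth value of A(a,b). The weights, bias, and threshold below are illustrative assumptions, not the paper's trained model:

```python
# Hypothetical single-neuron scorer: per-field similarity scores in [0, 1]
# (e.g., name, date of birth, address) are combined by a weighted sum and
# squashed by a sigmoid, read as a graded truth value for "a and b match".
import math

def match_score(similarities, weights, bias):
    """Sigmoid of the weighted sum of field similarities."""
    z = sum(w * s for w, s in zip(weights, similarities)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def is_same_entity(similarities, weights, bias, threshold=0.5):
    """Decide the linkage by thresholding the graded truth value."""
    return match_score(similarities, weights, bias) >= threshold
```

A trained multilayer network generalizes this by learning the weights from labeled pairs; the fuzzy-logic reading stays the same, since the pre-threshold output is already a value in the unit interval.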

Keywords: description logic, fuzzy logic, neural networks, record linkage

Procedia PDF Downloads 267
27905 Urban Resilience and Its Prioritised Components: Analysis of Industrial Township Greater Noida

Authors: N. Mehrotra, V. Ahuja, N. Sridharan

Abstract:

Resilience is an all-hazard and proactive approach that requires multidisciplinary input into the interrelated variables of the city system. This research aims to identify and operationalize indicators for assessment in the domains of institutions, infrastructure, and knowledge, all three operating in task-oriented community networks. This paper gives a brief account of the methodology developed for the assessment of urban resilience and its prioritized components for a target population within a newly planned urban complex integrating Surajpur and Kasna villages as nodes. People’s perception of urban resilience has been examined by conducting a questionnaire survey among the target population of Greater Noida. As defined by experts, the urban resilience of a place is considered both a product and a process of operation to regain normalcy after a disturbance of a certain level. Based on this methodology, six indicators are identified that contribute to the perception of urban resilience, both in the process of evolution and as an outcome. The relative significance of the six R’s has also been identified. The dependency factors of the various resilience indicators are also explored in this paper, which helps generate new perspectives for future research in disaster management. Based on the stated factors, this methodology can be applied to assess the urban resilience requirements of a well-planned town, which is not an end in itself but calls for new beginnings.

Keywords: disaster, resilience, system, urban

Procedia PDF Downloads 451
27904 Effect of Using PCMs and Transparency Ratios on Energy Efficiency and Thermal Performance of Buildings in Hot Climatic Regions: A Simulation-Based Evaluation

Authors: Eda K. Murathan, Gulten Manioglu

Abstract:

In the building design process, reducing heating and cooling energy consumption according to the climatic conditions of the building's region is an important consideration for providing thermal comfort in the indoor environment. Applying a phase-change material (PCM) on the surface of the building envelope is a new approach for controlling heat transfer through the envelope over the year. The transparency ratio of the window is also a determinant of the amount of solar radiation gained by the space, and thus of thermal comfort and energy expenditure. In this study, a simulation-based evaluation was carried out using EnergyPlus to determine the effect of coupling PCM and transparency ratio when integrated into the building envelope. A three-storey building with a 30 m x 30 m floor area and a 10 m x 10 m courtyard was taken as an example of the courtyard building model frequently seen in the traditional architecture of hot climatic regions. Eight zones (each 10 m x 10 m) with two exterior facades oriented in different directions were obtained on each floor. The percentage of transparent components on the PCM-applied surface was increased in steps (30%, 40%, 50%). For each differently oriented zone, annual heating and cooling energy consumption and thermal comfort based on the Fanger method were calculated. All calculations were made for the zones of the intermediate floor of the building. The study was carried out for the province of Diyarbakır, representing the hot-dry climate region, and Antalya, representing the hot-humid climate region. The increase in the transparency ratio led to a decrease in heating energy consumption but an increase in cooling energy consumption for both provinces. When PCM was applied in all developed options, heating and cooling energy consumption decreased in both Antalya (by 6.06%-19.78% and 1%-3.74%) and Diyarbakır (by 2.79%-3.43% and 2.32%-4.64%), respectively.
When the considered building is evaluated under passive conditions for 21 July, the hottest day of the year, the user feels comfortable between 11 pm and 10 am owing to the effect of night ventilation in both provinces.
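The parametric design described above (two climates, eight zone orientations on the intermediate floor, three transparency ratios, with and without PCM) can be sketched as a case matrix. The following is a minimal illustrative sketch; the variable names and the savings helper are assumptions for illustration, not artifacts of the study:

```python
from itertools import product

# Parametric study design from the abstract (illustrative sketch).
cities = ["Diyarbakir", "Antalya"]          # hot-dry and hot-humid climates
zones = [f"zone_{i}" for i in range(1, 9)]  # 8 differently oriented zones, middle floor
transparency_ratios = [0.30, 0.40, 0.50]    # share of transparent components
pcm_applied = [False, True]                 # envelope with / without PCM

# Each tuple corresponds to one EnergyPlus simulation run.
cases = list(product(cities, zones, transparency_ratios, pcm_applied))
print(len(cases))  # 2 * 8 * 3 * 2 = 96 runs

def savings_pct(baseline_kwh: float, pcm_kwh: float) -> float:
    """Percentage reduction in annual energy consumption when PCM is added."""
    return 100.0 * (baseline_kwh - pcm_kwh) / baseline_kwh
```

A reduction of the kind reported for Antalya heating energy (e.g. from 100 kWh to about 80 kWh) would register as roughly a 20% saving with this helper.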

Keywords: building envelope, heating and cooling energy consumption, phase change material, transparency ratio

Procedia PDF Downloads 168
27903 The Analysis of Indian Culture through the Lexicographical Discourse of Hindi-French Dictionary

Authors: Tanzil Ansari

Abstract:

A dictionary is often considered a list of words, arranged in alphabetical order, providing information on a language or languages: it informs us about the spelling, pronunciation, origin, gender, and grammatical functions of new and unknown words. In other words, it is first and foremost a linguistic tool. But research across the world in the fields of linguistics and lexicography has shown that a dictionary is not only a linguistic tool but also a cultural product through which a lexicographer transmits the culture of a country or a linguistic community through his or her ideology. That is, a dictionary presents not only a language and its metalinguistic functions but also its culture. Every language contains words and expressions that reflect its culture, and in this way it is impossible to dissociate a language from its culture. There is always an ideology that plays an important role in the depiction of any culture. Drawing on Edward Said's theory of Orientalism concerning representations of the East, the objective of the present research is to study the representation of Indian culture through the lexicographical discourse of the Hindi-French Dictionary of Federica Boschetti, a French lexicographer. The results show that the dictionary's representation of Indian culture is stereotypical and monolithic. It also shows India as a male-oriented country where women are exploited by a male-dominated society. The study is focused on the Hindi-French dictionary, but its line of argument can be applied to dictionaries produced in other languages.

Keywords: culture, dictionary, lexicographical discourse, stereotype image

Procedia PDF Downloads 292