Search results for: incidental information processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13648

13078 Human Machine Interface for Controlling a Robot Using Image Processing

Authors: Ambuj Kumar Gautam, V. Vasu

Abstract:

This paper introduces a head-movement-based Human Machine Interface (HMI) that uses right and left head movements to control a robot's motion. We present an approach to building an effective real-time face-orientation information system for controlling a robot, which can be used efficiently for an Electrical Powered Wheelchair (EPW). The project is essentially aimed at HMI applications. The system (machine) identifies the orientation of the face from the pixel values in certain areas of the image. We first take an image and divide it into three parts along its columns. Depending on the orientation of the face, the largest number of pixels within approximately the same range of R, G, and B values lies in one of the divided parts of the image. This information is transferred to the microcontroller through a serial communication port and used to control the robot's motion (forward, left turn, right turn, and stop) in real time by head movements.
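
A minimal sketch of the decision logic described above, assuming the frame is split into three column bands and the band with the most pixels in a face-like colour range determines the command; the colour thresholds, serial port name, and command bytes are illustrative assumptions, not details taken from the paper.

```python
# Sketch only: split the frame into three column bands, pick the band with the
# most face-coloured pixels, and send one command byte over the serial link.
import numpy as np
import cv2
import serial

LOWER_RGB = np.array([80, 40, 30])     # assumed lower bound of face pixel values
UPPER_RGB = np.array([255, 200, 180])  # assumed upper bound

def head_command(frame_bgr):
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    mask = cv2.inRange(rgb, LOWER_RGB, UPPER_RGB)   # pixels in the assumed face range
    thirds = np.array_split(mask, 3, axis=1)        # left / centre / right column bands
    counts = [int(t.sum()) for t in thirds]
    return {0: b'L', 1: b'F', 2: b'R'}[int(np.argmax(counts))]  # left / forward / right

if __name__ == "__main__":
    port = serial.Serial("/dev/ttyUSB0", 9600)      # hypothetical microcontroller link
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        port.write(head_command(frame))
```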

Keywords: electrical powered wheelchair (EPW), human machine interface (HMI), robotics, microcontroller

Procedia PDF Downloads 292
13077 Implementation of a Method of Crater Detection Using Principal Component Analysis in FPGA

Authors: Izuru Nomura, Tatsuya Takino, Yuji Kageyama, Shin Nagata, Hiroyuki Kamata

Abstract:

We propose a method of crater detection from images of the lunar surface captured by a small space probe. We use principal component analysis (PCA) to detect craters. However, considering the severe environment of space, it is impossible to use a generic computer in practice. Accordingly, we have to implement the method in an FPGA. This paper compares the FPGA implementation with a generic computer in terms of the processing time of the PCA-based crater detection method.

Keywords: crater, PCA, eigenvector, strength value, FPGA, processing time

Procedia PDF Downloads 554
13076 Agricultural Cooperative Model: A Panacea for Economic Development of Small Scale Business Farmers in Ilesha, Osun State, Nigeria

Authors: Folasade Adegbaju, Olusola Arowolo, Olufisayo Onawumi

Abstract:

The Owolowo ile-ege garri processing industry, a small-scale cassava processing business located in Ilesha, Osun State, was purposively selected as a case study because it is a cooperative business. The industry was established in 1991 by eight (8) men who were mostly retirees. A researcher-made questionnaire was used to collect information from thirty (30) respondents: the manager, four official staff, and 25 randomly selected processors in the industry. The study found that within twelve years of utilizing their self-raised initial capital of N240,000 (two hundred and forty thousand naira), this cassava-based industry had made an impact and attracted the involvement of many more people: within the period of the study (2007-2011) the processors had nearly quadrupled in number (from 8 to 30), and the facilities (equipment) in use had increased from one machine and a frying pot to many. This translated into the ability to produce large quantities of fried garri, fufu, and starch for marketing to people in Ilesha and neighbouring cities such as Ibadan and Lagos, which is indicative of economic growth. The industry also became a source of employment for community members; at the time of the study, four staff members were employed to work in and coordinate the industry. It was observed that despite all the odds facing small-scale industry and the problem of rural-urban migration, this agro-based industry still existed successfully in the community, and many such industries can be replicated by agricultural cooperative groups nationwide to further boost productivity as well as the economy of the area and the nation at large. However, government and individuals still have major roles to play in ensuring the growth and development of the nation in this respect. Local agricultural cooperative groups should form regional cooperative consortia with more networking among farmers, in order to create more jobs for young people and to increase agricultural productivity in the country, resulting in a better and more sustainable economy.

Keywords: agricultural cooperative, cassava processing industry, model, small scale enterprise

Procedia PDF Downloads 290
13075 Using Technology to Enhance the Student Assessment Experience

Authors: Asim Qayyum, David Smith

Abstract:

The use of information tools is a common activity for students at any educational stage when they encounter online learning activities. This paper investigates how a group of student participants used information tools to find the information relevant to particular learning tasks. The paper describes and discusses the results, with particular implications for use in higher education; the findings suggest that improvement in assessment design and subsequent student learning may be achieved by structuring the purposefulness of university students' information tool usage and online reading behaviours.

Keywords: information tools, assessment, online learning, student assessment experience

Procedia PDF Downloads 560
13074 A Study on Sentiment Analysis Using Various ML/NLP Models on Historical Data of Indian Leaders

Authors: Sarthak Deshpande, Akshay Patil, Pradip Pandhare, Nikhil Wankhede, Rushali Deshmukh

Abstract:

Sentiment analysis is one of the most significant tasks for any language and a key area of NLP that has recently made impressive strides. Several models and datasets are available for this task in popular and commonly used languages such as English, Russian, and Spanish. While sentiment analysis research is performed extensively for these, it lags behind for low-resource regional languages such as Hindi and Marathi. Marathi is one of the languages included in the 8th Schedule of the Indian Constitution and is the third most widely spoken language in the country, spoken primarily in the Deccan region, which encompasses Maharashtra and Goa. There is insufficient study of sentiment analysis methods based on Marathi text due to the lack of available resources and information. Therefore, this project proposes the use of different ML/NLP models for the analysis of Marathi data from comments below YouTube content, tweets, or Instagram posts. We aim to produce a short and precise analysis and summary of the related data using our dataset (dates, names, root words) and lexicons to locate exact information.

Keywords: multilingual sentiment analysis, Marathi, natural language processing, text summarization, lexicon-based approaches

Procedia PDF Downloads 74
13073 Expansive-Restrictive Style: Conceptualizing Knowledge Workers

Authors: Ram Manohar Singh, Meenakshi Gupta

Abstract:

Various terms such as 'learning style', 'cognitive style', 'conceptual style', 'thinking style', and 'intellectual style' are used in the literature to refer to an individual's characteristic and consistent approach to organizing and processing information. However, style concepts are criticized for mutually overlapping definitions and confusing classification. This confusion should be addressed at the conceptual as well as the empirical level. This paper is an attempt to bridge this gap in the literature by proposing a new concept, the expansive-restrictive intellectual style, based on phenomenological analysis of an auto-ethnography and interviews of 26 information technology (IT) professionals working in knowledge-intensive organizations (KIOs) in India. The expansive style is an individual's preference to expand his/her horizon of knowledge and understanding by grasping the real meaning and structure of his/her work. In contrast, the restrictive style is characterized by an individual's preference for a minimalist approach at work, reflected in executing a job efficiently without attempting to understand its real meaning and structure. The analysis suggests that the expansive-restrictive style has three dimensions: (1) field dependence-independence, (2) cognitive involvement, and (3) epistemological beliefs.

Keywords: expansive, knowledge workers, restrictive, style

Procedia PDF Downloads 424
13072 Enhancement of Mechanical Properties for Al-Mg-Si Alloy Using Equal Channel Angular Pressing

Authors: W. H. El Garaihy, A. Nassef, S. Samy

Abstract:

Equal channel angular pressing (ECAP) of a commercial Al-Mg-Si alloy was conducted using two strain rates. ECAP processing was conducted at room temperature and at 250 °C. Route A was adopted for up to a total of four passes in the present work. The structural evolution of the aluminum alloy discs was investigated before and after ECAP processing using optical microscopy (OM). Following ECAP, simple compression tests and Vickers hardness measurements were performed. OM micrographs showed that the average grain size of the as-received Al-Mg-Si disc tends to be larger than that of the ECAP-processed discs. Moreover, a significant difference in the grain morphologies of the as-received and processed discs was observed. The intensity of deformation was observed via the alignment of the Al-Mg-Si consolidated particles (grains) in the direction of shear, which increased with the number of ECAP passes. Increasing the number of passes up to 4 increased the grain aspect ratio up to ~5. It was found that the pressing temperature has a significant influence on the microstructure, Hv-values, and compressive strength of the processed discs. Hardness measurements demonstrated that 1 pass resulted in an increase of the Hv-value by 42% compared to that of the as-received alloy; 4 passes of ECAP processing resulted in an additional increase in the Hv-value. A similar trend was observed for the yield and compressive strength. Experimental data on the Hv-values demonstrated a lack of any significant dependence on the processing strain rate.

Keywords: Al-Mg-Si alloy, equal channel angular pressing, grain refinement, severe plastic deformation

Procedia PDF Downloads 435
13071 Data Analysis Tool for Predicting Water Scarcity in Industry

Authors: Tassadit Issaadi Hamitouche, Nicolas Gillard, Jean Petit, Valerie Lavaste, Celine Mayousse

Abstract:

Water is a fundamental resource for the industry. It is taken from the environment either from municipal distribution networks or from various natural water sources such as the sea, ocean, rivers, aquifers, etc. Once used, water is discharged into the environment, reprocessed at the plant or treatment plants. These withdrawals and discharges have a direct impact on natural water resources. These impacts can apply to the quantity of water available, the quality of the water used, or to impacts that are more complex to measure and less direct, such as the health of the population downstream from the watercourse, for example. Based on the analysis of data (meteorological, river characteristics, physicochemical substances), we wish to predict water stress episodes and anticipate prefectoral decrees, which can impact the performance of plants and propose improvement solutions, help industrialists in their choice of location for a new plant, visualize possible interactions between companies to optimize exchanges and encourage the pooling of water treatment solutions, and set up circular economies around the issue of water. The development of a system for the collection, processing, and use of data related to water resources requires the functional constraints specific to the latter to be made explicit. Thus the system will have to be able to store a large amount of data from sensors (which is the main type of data in plants and their environment). In addition, manufacturers need to have 'near-real-time' processing of information in order to be able to make the best decisions (to be rapidly notified of an event that would have a significant impact on water resources). Finally, the visualization of data must be adapted to its temporal and geographical dimensions. In this study, we set up an infrastructure centered on the TICK application stack (for Telegraf, InfluxDB, Chronograf, and Kapacitor), which is a set of loosely coupled but tightly integrated open source projects designed to manage huge amounts of time-stamped information. The software architecture is coupled with the cross-industry standard process for data mining (CRISP-DM) data mining methodology. The robust architecture and the methodology used have demonstrated their effectiveness on the study case of learning the level of a river with a 7-day horizon. The management of water and the activities within the plants -which depend on this resource- should be considerably improved thanks, on the one hand, to the learning that allows the anticipation of periods of water stress, and on the other hand, to the information system that is able to warn decision-makers with alerts created from the formalization of prefectoral decrees.
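
A minimal sketch of the forecasting step only (the CRISP-DM modelling phase applied to the 7-day river-level horizon mentioned above); the column names, lag depth, choice of regressor, and CSV export are assumptions for illustration and do not reproduce the authors' pipeline.

```python
# Sketch: predict the river level 7 days ahead from lagged levels and rainfall.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

def make_features(df, lags=14, horizon=7):
    """df holds daily columns 'level' and 'rain'; target is the level at t + horizon."""
    feats = pd.DataFrame(index=df.index)
    for k in range(lags):
        feats[f"level_lag{k}"] = df["level"].shift(k)
        feats[f"rain_lag{k}"] = df["rain"].shift(k)
    target = df["level"].shift(-horizon)
    data = pd.concat([feats, target.rename("y")], axis=1).dropna()
    return data.drop(columns="y"), data["y"]

# Hypothetical daily export of the time-stamped sensor data (e.g. from InfluxDB).
df = pd.read_csv("river_daily.csv", parse_dates=["date"], index_col="date")
X, y = make_features(df)
split = int(len(X) * 0.8)                      # simple chronological train/test split
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[:split], y[:split])
print("7-day-ahead R^2 on held-out data:", model.score(X[split:], y[split:]))
```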

Keywords: data mining, industry, machine learning, shortage, water resources

Procedia PDF Downloads 121
13070 Dairy Value Chain: Assessing the Interlinkage of Dairy Farms and Small-Scale Dairy Processing in Tigray: Case Study of Mekelle City

Authors: Weldeabrha Kiros Kidanemaryam, DepaTesfay Kelali Gidey, Yikaalo Welu Kidanemariam

Abstract:

Dairy services are considered sources of income, employment, nutrition, and health for smallholder rural and urban farmers. The main objective of this study is to assess the interlinkage of dairy farms and small-scale dairy processing in Mekelle, Tigray. To achieve the stated objective, a descriptive research approach was employed in which data were collected from 45 dairy farmers and 40 small-scale processors and analyzed by calculating mean values and percentages. Findings show that the dairy business in the study area is characterized by a shortage of feed and water for the farm. Dairy farming is dominated by hybrid breeds, followed by the so-called 'begait'. Though the farms have access to medication and vaccination for the cattle, they fall short on hygiene practices, reliable shade for the cattle, and separate space for the calves. The value chain at the milk production stage is characterized by a low production rate, the selling of raw milk without adding value, and very meager traditional processing practices. Furthermore, small-scale milk processors are characterized by collecting milk from farmers and producing cheese, butter, ghee, and sour milk. They do not engage in modern milk processing such as pasteurized milk, yogurt, and table butter; most are engaged in traditional production systems. Additionally, the milk consumption and marketing part of the chain is dominated by the informal market (channel), where market problems, lack of skill and technology, shortage of loans, and weak policy support are the main challenges faced. Based on the findings, recommendations and future research areas are put forward.

Keywords: value-chain, dairy, milk production, milk processing

Procedia PDF Downloads 32
13069 Algorithm for Information Retrieval Optimization

Authors: Kehinde K. Agbele, Kehinde Daniel Aruleba, Eniafe F. Ayetiran

Abstract:

When using Information Retrieval Systems (IRS), users often present search queries made of ad-hoc keywords. It is then up to the IRS to obtain a precise representation of the user's information need and the context of the information. This paper investigates the optimization of IRS to individual information needs in order of relevance. The study addressed the development of algorithms that optimize the ranking of documents retrieved from an IRS. It discusses and describes a Document Ranking Optimization (DROPT) algorithm for information retrieval (IR) in an Internet-based or designated-database environment. As the volume of information available online and in designated databases grows continuously, ranking algorithms can play a major role in the context of search results. In this paper, a DROPT technique for documents retrieved from a corpus is developed with respect to document index keywords and the query vectors. This is based on calculating the weight (
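
The abstract breaks off just where the DROPT weight is defined, so the sketch below only illustrates the generic idea it builds on, ranking documents by weights computed from index keywords and query vectors (here plain TF-IDF weights and cosine similarity); it is not the DROPT weight itself, and the example documents are invented.

```python
# Sketch: rank documents against a query using keyword weights and a query vector.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = ["information retrieval with ad-hoc keyword queries",
        "ranking documents by relevance to the user context",
        "database indexing and storage engines"]
query = ["ad-hoc keyword relevance ranking"]

vec = TfidfVectorizer()
doc_vectors = vec.fit_transform(docs)      # keyword weights per document
query_vector = vec.transform(query)        # query vector in the same keyword space
scores = cosine_similarity(query_vector, doc_vectors).ravel()
ranking = scores.argsort()[::-1]           # documents ordered by descending score
print(list(ranking), scores[ranking])
```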

Keywords: information retrieval, document relevance, performance measures, personalization

Procedia PDF Downloads 241
13068 How Can Information Sharing Improve Organizational Performance?

Authors: Syed Abdul Rehman Khan

Abstract:

In today’s world, information sharing plays a vital role in the successful operation of supply chains and boosts the profitability of organizations (end-to-end supply chains). Much research has been completed on the role of information sharing in the supply chain. In this research article, we investigate how information sharing can boost the profitability and productivity of the organization; for this purpose, we developed a conceptual model and tested it against data collected from companies. We sent a questionnaire to 369 companies; completed forms were received from 172 firms, a response rate of almost 47%. For the data analysis, we used regression in SPSS. In the research findings, all our hypotheses were accepted as significant: owing to information sharing between suppliers and manufacturers, quality of material and timely delivery increase, and collaboration and trust become stronger, and all these factors contribute to the company's profitability directly and indirectly. Unfortunately, however, companies cannot avail themselves of all the benefits of information sharing because of the fear of compromised confidentiality or leakage of information.

Keywords: collaboration, information sharing, risk factor, timely delivery

Procedia PDF Downloads 417
13067 Systematic Literature Review of Therapeutic Use of Autonomous Sensory Meridian Response (ASMR) and Short-Term ASMR Auditory Training Trial

Authors: Christine H. Cubelo

Abstract:

This study consists of two parts: a systematic review of current publications on the therapeutic use of autonomous sensory meridian response (ASMR) and a within-subjects auditory training trial using ASMR videos. The main intent is to explore ASMR as potentially therapeutically beneficial for those with atypical sensory processing. Many hearing-related disorders and mood or anxiety symptoms overlap with symptoms of sensory processing issues. For this reason, the inclusion and exclusion criteria of the systematic review were generated to produce optimal search outcomes and avoid overly confined criteria that would limit the yielded results. The criteria for inclusion in the review for Part 1 are (1) adult participants diagnosed with hearing loss or atypical sensory processing, (2) inclusion of measures related to ASMR as a treatment method, and (3) publication between 2000 and 2022. A total of 1,088 publications were found in the preliminary search, and a total of 13 articles met the inclusion criteria. A total of 14 participants completed the trial and post-trial questionnaire. Of all responses, 64.29% agreed that the duration of the auditory training sessions was reasonable. In addition, 71.43% agreed that the training improved their perception of music. Lastly, 64.29% agreed that the training improved their perception of a primary talker when other talkers or background noises are present.

Keywords: autonomous sensory meridian response, auditory training, atypical sensory processing, hearing loss, hearing aids

Procedia PDF Downloads 55
13066 Robustness of MIMO-OFDM Schemes for Future Digital TV to Carrier Frequency Offset

Authors: D. Sankara Reddy, T. Kranthi Kumar, K. Sreevani

Abstract:

This paper investigates the impact of carrier frequency offset (CFO) on the performance of different MIMO-OFDM schemes with high spectral efficiency for the next generation of terrestrial digital TV. We show that all studied MIMO-OFDM schemes are sensitive to CFO when it is greater than 1% of the intercarrier spacing. We also show that the Alamouti scheme is the most CFO-sensitive of the MIMO schemes studied.

Keywords: MIMO-OFDM, signal processing for transmission, carrier frequency offset, future digital TV, imaging and signal processing

Procedia PDF Downloads 487
13065 Iris Cancer Detection System Using Image Processing and Neural Classifier

Authors: Abdulkader Helwan

Abstract:

Iris cancer, so-called intraocular melanoma, is a cancer that starts in the iris, the colored part of the eye that surrounds the pupil. There is a need for an accurate and cost-effective iris cancer detection system, since the techniques currently available are still not efficient. The combination of image processing and artificial neural networks has great efficiency for the diagnosis and detection of iris cancer. Image processing techniques improve the diagnosis of the cancer by enhancing the quality of the images so that physicians can diagnose properly, while neural networks can help in deciding whether the eye is cancerous or not. This paper aims to develop an intelligent system that simulates human visual detection of intraocular melanoma, the so-called iris cancer. The suggested system combines both image processing techniques and neural networks. The images are first converted to grayscale, filtered, and then segmented using the Prewitt edge detection algorithm to detect the iris, the sclera circles, and the cancer. Principal component analysis is used to reduce the image size and to extract features. Those features are then used as inputs for a neural network, which is capable of deciding whether the eye is cancerous or not based on the experience acquired over many training iterations on different normal and abnormal eye images during the training phase. Normal images were obtained from a public database available on the internet, “Mile Research”, while the abnormal ones were obtained from another database, “eyecancer”. The experimental results for the proposed system show a high accuracy of 100% for detecting cancer and making the right decision.
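
A minimal sketch of the pipeline described above, assuming the stated order of steps (grayscale conversion, filtering, Prewitt edge detection, PCA feature reduction, neural classifier); the file names, image size, filter choice, and network settings are assumptions, and the two example files stand in for the normal and abnormal databases mentioned in the abstract.

```python
# Sketch: grayscale -> filter -> Prewitt edges -> PCA features -> neural classifier.
import numpy as np
import cv2
from scipy.ndimage import prewitt
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

def preprocess(path, size=(64, 64)):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    gray = cv2.medianBlur(cv2.resize(gray, size), 3)                 # simple denoising filter
    edges = np.hypot(prewitt(gray, axis=0), prewitt(gray, axis=1))   # Prewitt edge magnitude
    return edges.ravel()

paths = ["normal_eye_01.png", "melanoma_eye_01.png"]  # hypothetical files from the two databases
labels = [0, 1]                                       # 0 = normal, 1 = cancerous

X = np.array([preprocess(p) for p in paths])
X_red = PCA(n_components=min(30, len(X))).fit_transform(X)   # reduce size / extract features
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000).fit(X_red, labels)
```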

Keywords: iris cancer, intraocular melanoma, cancerous, prewitt edge detection algorithm, sclera

Procedia PDF Downloads 503
13064 Robust Image Registration Based on an Adaptive Normalized Mutual Information Metric

Authors: Huda Algharib, Amal Algharib, Hanan Algharib, Ali Mohammad Alqudah

Abstract:

Image registration is an important topic for many imaging systems and computer vision applications. Standard image registration techniques such as mutual information / normalized mutual information-based methods have limited performance because they do not consider the spatial information or the relationships between neighbouring pixels or voxels. In addition, the amount of image noise may significantly affect the registration accuracy. Therefore, this paper proposes an efficient method that explicitly considers the relationships between adjacent pixels: the gradient information of the reference and scene images is extracted first, and then the cosine similarity of the extracted gradient information is computed and used to improve the accuracy of the standard normalized mutual information measure. Our experimental results on different data types (i.e., CT, MRI, and thermal images) show that the proposed method outperforms a number of image registration techniques in terms of accuracy.
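
A sketch of the metric idea, assuming standard NMI from a joint histogram plus cosine similarity of the two gradient-magnitude images; the gradient operator, bin count, and the simple product used to fuse the two terms are assumptions, since the paper's exact combination is not reproduced here.

```python
# Sketch: NMI from a joint histogram, weighted by gradient cosine similarity.
import numpy as np
from scipy.ndimage import sobel   # any gradient operator would do here

def nmi(a, b, bins=32):
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    hxy = -np.sum(pxy[pxy > 0] * np.log(pxy[pxy > 0]))
    return (hx + hy) / hxy                      # standard normalized mutual information

def grad_mag(img):
    return np.hypot(sobel(img, axis=0), sobel(img, axis=1))

def gradient_aware_similarity(reference, scene):
    g_ref, g_scn = grad_mag(reference).ravel(), grad_mag(scene).ravel()
    cos = np.dot(g_ref, g_scn) / (np.linalg.norm(g_ref) * np.linalg.norm(g_scn) + 1e-12)
    return nmi(reference, scene) * cos          # assumed fusion of the two terms

ref = np.random.rand(128, 128)                  # stand-ins for reference and scene images
scn = np.roll(ref, 2, axis=1)
print(gradient_aware_similarity(ref, scn))
```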

Keywords: image registration, mutual information, image gradients, image transformations

Procedia PDF Downloads 248
13063 On Exploring Search Heuristics for Improving the Efficiency in Web Information Extraction

Authors: Patricia Jiménez, Rafael Corchuelo

Abstract:

Nowadays the World Wide Web is the most popular source of information, comprising billions of online documents. Web mining is used to crawl through these documents, collect the information of interest, and process it by applying data mining tools, so that the gathered information can be used in the best interest of a business, enabling companies to promote themselves. Unfortunately, it is not easy to extract the information a web site provides automatically when it lacks an API that transforms the user-friendly data in web documents into a structured, machine-readable format. Rule-based information extractors are tools intended to extract the information of interest automatically and offer it in a structured format that allows mining tools to process it. However, the performance of an information extractor strongly depends on the search heuristic employed, since bad choices regarding how to learn a rule may easily result in loss of effectiveness and/or efficiency. Improving search heuristics with regard to efficiency is of utmost importance in the field of Web Information Extraction, since typical datasets are very large. In this paper, we employ an information extractor based on a classical top-down algorithm that uses the so-called Information Gain heuristic introduced by Quinlan and Cameron-Jones. Unfortunately, the Information Gain suffers from some well-known problems, so we analyse an intuitive alternative, Termini, that is clearly more efficient; we also analyse other proposals in the literature and conclude that none of them outperforms this alternative.
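
For reference, the FOIL-style information gain of Quinlan and Cameron-Jones mentioned above is usually written as follows; this is the standard formulation rather than a restatement of the paper's exact variant.

```latex
% FOIL-style information gain for a rule refinement: p_0, n_0 (resp. p_1, n_1)
% are the positive/negative examples covered before (resp. after) the
% refinement, and t is the number of positives covered by both.
\[
\mathrm{Gain} \;=\; t \left( \log_2 \frac{p_1}{p_1 + n_1} \;-\; \log_2 \frac{p_0}{p_0 + n_0} \right)
\]
```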

Keywords: information extraction, search heuristics, semi-structured documents, web mining

Procedia PDF Downloads 335
13062 Exploring the Potential of Replika: An AI Chatbot for Mental Health Support

Authors: Nashwah Alnajjar

Abstract:

This research paper provides an overview of Replika, an AI chatbot application that uses natural language processing technology to engage in conversations with users. The app was developed to provide users with a virtual AI friend who can converse with them on various topics, including mental health. This study explores the experiences of Replika users using quantitative research methodology. A survey was conducted with 12 participants to collect data on their demographics, usage patterns, and experiences with the Replika app. The results showed that Replika has the potential to play a role in mental health support and well-being.

Keywords: Replika, chatbot, mental health, artificial intelligence, natural language processing

Procedia PDF Downloads 86
13061 Waste Derived from Refinery and Petrochemical Plants Activities: Processing of Oil Sludge through Thermal Desorption

Authors: Anna Bohers, Emília Hroncová, Juraj Ladomerský

Abstract:

Oil sludge, whose main characteristic is high acidity, is a waste product generated from the operation of refinery and petrochemical plants. A former refinery and petrochemical plant, Petrochema Dubová, is present in Slovakia as well. Its activity was to process crude oil through sulfonation and adsorption technology for the production of lubricating and special oils, synthetic detergents, and special white oils for cosmetic and medical purposes. Seventy years ago, in the period when this historical acid sludge burden was created, production took precedence over environmental awareness. That is why, as in many countries, a historical environmental burden is still present in Slovakia: 229,211 m3 of oil sludge in the middle of the Nízke Tatry National Park. None of the treatment methods tried, biological or otherwise, proved suitable for processing or recovery owing to various factors: strong aggressivity, difficulty of handling because of the sludgy, liquid state, and the like. Incineration was also tested as a potential solution, but it was not proven suitable, as the concentration of SO2 in the combustion gases was too high and could not be decreased below the acceptable value of 2000 mg.mn-3. That is why the operation of the incineration plant was terminated, and the acid sludge landfills remain to this day. The objective of this paper is to present a new possibility for the processing and valorization of this acid sludgy waste. The processing of oil sludge was performed through effective separation by thermal desorption, through which the sludgy material can be split into the matrix (soil, sediments) and organic contaminants. In order to boost the efficiency of processing acid sludge through thermal desorption, the work will present the possibility of applying an original technology, the Method of Blowing Decomposition, for recovering organic matter into technological lubricating oil.

Keywords: hazardous waste, oil sludge, remediation, thermal desorption

Procedia PDF Downloads 200
13060 Global Mittag-Leffler Stability of Fractional-Order Bidirectional Associative Memory Neural Network with Discrete and Distributed Transmission Delays

Authors: Swati Tyagi, Syed Abbas

Abstract:

Fractional-order Hopfield neural networks are generally used to model information processing among interacting neurons. To show the constancy of the processed information, it is required to analyze the stability of these systems. In this work, we perform a Mittag-Leffler stability analysis for the corresponding Caputo fractional-order bidirectional associative memory (BAM) neural networks with various time delays. We derive sufficient conditions to ensure the existence and uniqueness of the equilibrium point by using topological degree theory. By applying the fractional Lyapunov method and Mittag-Leffler functions, we derive sufficient conditions for global Mittag-Leffler stability, which further implies the global asymptotic stability of the network equilibrium. Finally, we present two suitable examples to show the effectiveness of the obtained results.
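
For orientation, the two standard definitions underlying this kind of analysis are recalled below, the two-parameter Mittag-Leffler function and the Caputo fractional derivative of order α in (0,1); the paper's specific BAM network equations are not reproduced here.

```latex
% Two-parameter Mittag-Leffler function and Caputo derivative (order 0 < \alpha < 1).
\[
E_{\alpha,\beta}(z) \;=\; \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + \beta)},
\qquad
{}^{C}D^{\alpha}_{t}\, x(t) \;=\; \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{x'(s)}{(t-s)^{\alpha}}\, ds .
\]
```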

Keywords: bidirectional associative memory neural network, existence and uniqueness, fractional-order, Lyapunov function, Mittag-Leffler stability

Procedia PDF Downloads 364
13059 Design of Incident Information System in IoT Virtualization Platform

Authors: Amon Olimov, Umarov Jamshid, Dae-Ho Kim, Chol-U Lee, Ryum-Duck Oh

Abstract:

This paper proposes an incident information system based on an IoT virtualization platform. The IoT-information-based environment is a platform that was developed for the purpose of collecting a variety of data by managing regionally scattered IoT devices easily and conveniently, in addition to analyzing data collected from roads. Moreover, this paper configures the platform for the purpose of providing incident information based on sensed data. The platform provides the same input/output interface as UNIX and Linux by mapping IoT devices onto the directories and files of the file system. In addition, it supports a variety of ways of accessing the devices; thus, it can be applied not only to incident information but also to other platforms. This paper proposes an incident information system that identifies and provides various data in real time on urgent matters on roads, based on the existing USN/M2M and the IoT virtualization platform.

Keywords: incident information system, IoT, virtualization platform, USN, M2M

Procedia PDF Downloads 351
13058 Development of Enhanced Data Encryption Standard

Authors: Benjamin Okike

Abstract:

There is a need to hide information along the information superhighway. Today, information relating to the survival of individuals, organizations, or government agencies is transmitted from one point to another. Adversaries are always on the watch along the superhighway to intercept any information that would enable them to inflict psychological 'injuries' on their victims. But with information encryption, this can be prevented completely or, at worst, reduced to the barest minimum. There is no doubt that many encryption techniques have been proposed, and some of them are already being implemented. However, adversaries keep discovering loopholes in them to perpetrate their plans. In this work, we propose the enhanced data encryption standard (EDES), which deploys randomly generated numbers as an encryption method. Each time encryption is to be carried out, a new set of random numbers is generated, thereby making it almost impossible for cryptanalysts to decrypt any information encrypted with this newly proposed method.
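
The sketch below only illustrates the general idea stated above (a fresh set of random numbers generated for every encryption); it is a one-time-pad-style XOR example, not the authors' EDES construction, whose internal details are not given in the abstract.

```python
# Sketch of the general idea: generate new random numbers for every encryption.
import secrets

def encrypt(plaintext: bytes):
    key = secrets.token_bytes(len(plaintext))          # fresh random numbers each time
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key                             # the key must be shared securely

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

ct, key = encrypt(b"message along the superhighway")
assert decrypt(ct, key) == b"message along the superhighway"
```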

Keywords: encryption, enhanced data encryption, encryption techniques, information security

Procedia PDF Downloads 150
13057 Trust: The Enabler of Knowledge-Sharing Culture in an Informal Setting

Authors: Emmanuel Ukpe, S. M. F. D. Syed Mustapha

Abstract:

Trust in an organization has been perceived as one of the key factors behind knowledge sharing, particularly in an unstructured work environment. In an informal working environment, instilling trust among individuals is a challenge, and even more so in a virtual environment. The study contributes a framework for building trust in an unstructured organization for performing knowledge sharing in a virtual environment. An artifact called KAPE (Knowledge Acquisition, Processing, and Exchange) was developed for knowledge sharing in the informal organization, and the framework was incorporated into it. It is applied to cassava farmers to facilitate knowledge sharing using a web-based platform. A survey was conducted; data were collected from 382 farmers from 21 farm communities. The multiple regression technique, Cronbach's alpha reliability test, Tukey's honestly significant difference (HSD) analysis, one-way analysis of variance (ANOVA), and all trust acceptable measures (TAM) were used to test the hypotheses and to determine noteworthy relationships. The results show a significant difference in knowledge sharing between farmers who are high on the trust acceptable factors found in the model (M = 3.66, SD = .93) and those who are low on the trust acceptable factors (M = 2.08, SD = .28), (t(48) = 5.69, p = .00). Furthermore, when applying cognitive expectancy theory, the farmers with cognitive consonance show a higher level of trust and satisfaction with knowledge and information from KAPE, as compared with those with a low level of cognitive dissonance. These results imply that the adopted trust model KAPE positively improved knowledge-sharing activities in an informal environment amongst rural farmers.

Keywords: trust, knowledge, sharing, knowledge acquisition, processing and exchange, KAPE

Procedia PDF Downloads 120
13056 [Keynote Speech]: Bridge Damage Detection Using Frequency Response Function

Authors: Ahmed Noor Al-Qayyim

Abstract:

During the past decades, bridge structures have come to be considered very important portions of transportation networks due to fast urban sprawl. Failures of bridges under operating conditions have led to a focus on updating the default bridge inspection methodology. Structural health monitoring (SHM) using the vibration response has appeared as a promising method to evaluate the condition of structures. The rapid development in sensor technology and in condition assessment techniques based on vibration-based damage detection has made SHM an efficient and economical way to assess bridges. SHM is set up to assess the state of designated bridges and to predict probable failures. This paper presents the frequency response function method, which uses captured vibration test information on structures to evaluate structural condition. Furthermore, the main steps of assessing a bridge using vibration information are presented. The frequency response function method is applied to the experimental data of a full-scale bridge.
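
For reference, the standard definition of the frequency response function used in vibration-based assessment is the ratio of the measured response spectrum to the input force spectrum; shifts in its resonant peaks between measurement campaigns are what damage-detection methods of this kind look for. This is the textbook definition, not a restatement of the paper's particular processing chain.

```latex
% Frequency response function: X(\omega) is the Fourier transform of the measured
% response and F(\omega) that of the excitation force.
\[
H(\omega) \;=\; \frac{X(\omega)}{F(\omega)}
\]
```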

Keywords: bridge assessment, health monitoring, damage detection, frequency response function (FRF), signal processing, structure identification

Procedia PDF Downloads 347
13055 Educational Related Information Technology Department Transformation: A Case Study

Authors: P. Joongsiri, K. Pattanapisuth, P. Siwatintuko, S. Vasupongayya

Abstract:

This paper presents a case study of developing a four-year plan for the information technology department at the Faculty of Engineering, Prince of Songkla University, Thailand. This work can be used as a case study for other in-house information technology departments in a higher-education environment. The result of this paper is a guideline for the four-year plan creation process, generated by analyzing the related theories and several best practices.

Keywords: strategic plan, management information system, information technology department governance, best practices, organization transformation

Procedia PDF Downloads 458
13054 Extraction of Urban Building Damage Using Spectral, Height and Corner Information

Authors: X. Wang

Abstract:

Timely and accurate information on urban building damage caused by earthquakes is an important basis for disaster assessment and emergency relief. Very high resolution (VHR) remotely sensed imagery containing abundant fine-scale information offers a large quantity of data for detecting and assessing urban building damage in the aftermath of earthquake disasters. However, the accuracy obtained using spectral features alone is comparatively low, since building damage, intact buildings, and pavements are spectrally similar. Therefore, it is of great significance to detect urban building damage effectively using multi-source data. Considering that the height or geometric structure of buildings generally changes dramatically in devastated areas, a novel multi-stage urban building damage detection method, using bi-temporal spectral, height, and corner information, was proposed in this study. The pre-event height information was generated using stereo VHR images acquired from two different satellites, while the post-event height information was produced from airborne LiDAR data. The corner information was extracted from pre- and post-event panchromatic images. The proposed method can be summarized as follows. To reduce the classification errors caused by spectral similarity and errors in extracting height information, ground surface, shadows, and vegetation were first extracted using the post-event VHR image and height data and were masked out. Two different types of building damage were then extracted from the remaining areas: the height difference between pre- and post-event was used for detecting building damage showing significant height change, and the difference in the density of corners between pre- and post-event was used for extracting building damage showing drastic change in geometric structure. The initial building damage result was generated by combining the above two building damage results. Finally, a post-processing procedure was adopted to refine the obtained initial result. The proposed method was quantitatively evaluated and compared to two existing methods in Port-au-Prince, Haiti, which was heavily hit by an earthquake in January 2010, using a pre-event GeoEye-1 image, a pre-event WorldView-2 image, a post-event QuickBird image, and post-event LiDAR data. The results showed that the method proposed in this study significantly outperformed the two comparative methods in terms of urban building damage extraction accuracy. The proposed method provides a fast and reliable way to detect urban building collapse, which is also applicable to relevant applications.
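
A minimal sketch of the two damage cues described above, applied per building footprint after ground, shadow, and vegetation pixels have been masked out; the height-loss threshold, corner detector settings, and the simple OR-combination are assumptions for illustration, and the refinement post-processing step is omitted.

```python
# Sketch: flag a building as damaged if (1) its mean pre/post height drop is large
# or (2) its corner density collapses between pre- and post-event panchromatic images.
import numpy as np
import cv2

def damaged(pre_height, post_height, pre_pan, post_pan,
            height_drop_m=2.0, corner_ratio=0.5):
    """All inputs are 2-D arrays clipped to one building footprint."""
    # Cue 1: mean height loss between the pre-event DSM and post-event LiDAR surface
    height_cue = (pre_height - post_height).mean() > height_drop_m
    # Cue 2: corner density (Harris corners per pixel) drops drastically after the event
    pre_c = cv2.cornerHarris(np.float32(pre_pan), 2, 3, 0.04)
    post_c = cv2.cornerHarris(np.float32(post_pan), 2, 3, 0.04)
    pre_density = (pre_c > 0.01 * pre_c.max()).mean()
    post_density = (post_c > 0.01 * post_c.max()).mean()
    corner_cue = post_density < corner_ratio * pre_density
    return height_cue or corner_cue   # initial result; a post-processing step would refine it
```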

Keywords: building damage, corner, earthquake, height, very high resolution (VHR)

Procedia PDF Downloads 213
13053 Development of Concurrent Engineering through the Application of Software Simulations of Metal Production Processing and Analysis of the Effects of Application

Authors: D. M. Eric, D. Milosevic, F. D. Eric

Abstract:

Concurrent engineering technologies are a modern concept in manufacturing engineering. One of the key goals in designing modern technological processes is the further reduction of production costs, both in the prototype and preparatory stages and during serial production. Thanks to many segments of concurrent engineering, these goals can be accomplished much more easily. In this paper, we give an overview of the advantages of using modern software simulations in relation to the classical aspects of designing technological processes of metal deformation. Significant savings are achieved thanks to the electronic simulation and software detection of all possible irregularities in the functional working regime of the technological process. For the expected results to be optimal, the input parameters must be very objective and must reliably represent the values of these parameters in real conditions. Since metal deformation processing is treated here, the particularly important parameters are the coefficient of internal friction between the working material and the tools, as well as the parameters related to the flow curve of the material being processed. The paper presents the experimental determination of some of these parameters.

Keywords: production technologies, metal processing, software simulations, effects of application

Procedia PDF Downloads 235
13052 Plasma Technology for Hazardous Biomedical Waste Treatment

Authors: V. E. Messerle, A. L. Mosse, O. A. Lavrichshev, A. N. Nikonchuk, A. B. Ustimenko

Abstract:

One of the most serious environmental problems today is pollution by biomedical waste (BMW), which in most cases has undesirable properties such as toxicity, carcinogenicity, mutagenicity, and fire hazard. Sanitary and hygienic surveys of typical solid BMW, made in Belarus, Kazakhstan, Russia, and other countries, show that its risk to the environment is significantly higher than that of most chemical wastes. Utilization of toxic BMW requires the most universal methods to ensure disinfection and disposal of any of its components. One such technology is plasma processing of BMW. To implement this technology, a thermodynamic analysis of the plasma processing of BMW was performed and a plasma box furnace was developed. The studies were conducted on the example of the processing of bone. The Terra software package was used to perform the thermodynamic calculations. Calculations were carried out in the temperature range 300-3000 K at a pressure of 0.1 MPa. It is shown that the final products contain no toxic substances. From the organic mass of the BMW, synthesis gas containing 77.4-84.6% combustible components was produced, and the mineral part consists mainly of calcium oxide and contains no carbon. The degree of gasification of carbon reaches 100% by a temperature of 1250 K. The specific power consumption for BMW processing increases with temperature throughout its range and reaches 1 kWh/kg. To realize plasma processing of BMW, an experimental installation with a 30 kW DC plasma torch was developed. The experiments allowed verifying the thermodynamic calculations. Wastes are packed in boxes weighing 5-7 kg, which are placed in the box furnace. Under the influence of the air plasma flame, the average temperature in the box reaches 1800 °C; the organic part of the waste is gasified, and the inorganic part is melted. The resulting synthesis gas is continuously withdrawn from the unit through the cooling and cleaning system. The molten mineral part of the waste is removed from the furnace after it has been stopped. The experimental studies allowed determining the operating modes of the plasma box furnace; the exhaust gases were analyzed, and samples of the condensed products were collected and their chemical composition determined. The gas at the outlet of the plasma box furnace has the following composition (vol.%): CO - 63.4, H2 - 6.2, N2 - 29.6, S - 0.8. The total concentration of synthesis gas (CO + H2) is 69.6%, which agrees well with the thermodynamic calculation. The experiments confirmed the absence of toxic substances in the final products.

Keywords: biomedical waste, box furnace, plasma torch, processing, synthesis gas

Procedia PDF Downloads 525
13051 Explaining the Steps of Designing and Calculating the Content Validity Ratio Index of the Screening Checklist of Preschool Students (5 to 7 Years Old) Exposed to Learning Difficulties

Authors: Sajed Yaghoubnezhad, Sedygheh Rezai

Abstract:

Background and Aim: Since students with learning disabilities in Iran are currently identified only after entering school, under the approach based on the gap between IQ and academic achievement, the purpose of this study is to design and calculate the content validity of a screening checklist for preschool children (5 to 7 years old) exposed to learning difficulties. Methods: This research is a fundamental study, and in terms of data collection method, it is quantitative research with a descriptive approach. In order to design the checklist, after reviewing the research background and theoretical foundations, cognitive abilities (visual processing, auditory processing, phonological awareness, executive functions, visual-spatial working memory, and fine motor skills) were considered the basic variables of school learning. The basic items and worksheets of the screening checklist for preschool students aged 5 to 7 with learning difficulties were compiled on the basis of the abilities mentioned and provided to specialists in order to calculate the content validity ratio (CVR) index. Results: The CVR index of the background information checklist is 0.9, and the CVR index of the performance checklist for preschool children (5 to 7 years) is 0.78; overall, the CVR index of the checklist is 0.84. The results of this study provide good evidence for the validity of the preschool screening checklist (5 to 7 years) for children exposed to learning difficulties.
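
For reference, a CVR index of this kind is conventionally computed with Lawshe's formula, given below; the paper does not restate the formula, so this is the standard definition rather than the authors' own derivation. Note that the reported overall index of 0.84 is simply the mean of the two checklist values (0.9 and 0.78).

```latex
% Lawshe's content validity ratio: n_e is the number of panel experts rating an
% item "essential" and N the total number of experts.
\[
\mathrm{CVR} \;=\; \frac{n_e - N/2}{N/2}
\]
```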

Keywords: checklist, screening, preschoolers, learning difficulties

Procedia PDF Downloads 102
13050 Optimising Transcranial Alternating Current Stimulation

Authors: Robert Lenzie

Abstract:

Transcranial electrical stimulation (tES) has received significant attention in the research literature. However, the effects of tES on brain activity are still poorly understood at the surface level, at the Brodmann area level, and in terms of the impact on neural networks. Using a method like electroencephalography (EEG) in conjunction with tES might make it possible to comprehend in more depth the brain response and the mechanisms behind the alterations reported in the literature. Using a method to directly see the effect of tES on EEG may offer high-temporal-resolution data on the brain activity changes/modulations brought on by tES that correspond to the various processing stages within the brain. This paper provides unpublished information on a cutting-edge methodology that may reveal details about the dynamics of how the human brain works beyond what is now achievable with existing methods.

Keywords: tACS, frequency, EEG, optimal

Procedia PDF Downloads 81
13049 A Recommender System for Job Seekers to Show up Companies Based on Their Psychometric Preferences and Company Sentiment Scores

Authors: A. Ashraff

Abstract:

The increasing importance of the web as a medium for electronic and business transactions has served as a catalyst, or rather a driving force, for the introduction and implementation of recommender systems. Recommender systems play a major role in processing and analyzing thousands of data rows or reviews and help humans make a purchase decision about a product or service. They also have the ability to predict whether a particular user would rate a product or service, based on the user's behavioral profile. At present, recommender systems are used extensively in every domain known to us; they are said to be ubiquitous. In the field of recruitment, however, they are not being utilized extensively. Recent statistics show an increase in staff turnover, which has negatively impacted both organizations and employees; the reasons include company culture, working flexibility (work-from-home opportunities), lack of learning advancement, and pay scale. Further investigation revealed a lack of guidance or support to help a job seeker find the company that would suit him or her best, and though information about companies is available, job seekers cannot read all the reviews by themselves and reach an analytical decision. In this paper, we propose an approach that studies the available review data on IT companies (scoring their reviews based on user review sentiments), gathers information on job seekers, including their psychometric evaluations, and then presents the job seeker with useful outputs on which company is most suitable for them. The theoretical approach, the algorithmic approach, and the importance of such a system are discussed in this paper.
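
A minimal sketch of how the two signals named above could be blended into a single suitability score, with a company's review-sentiment score combined with how well its attribute profile matches the job seeker's psychometric profile; the trait names, 0-1 scaling, and equal weighting are assumptions for illustration, not the paper's algorithm.

```python
# Sketch: blend psychometric fit with the company's review-sentiment score.
import numpy as np

def suitability(seeker_profile, company_profile, company_sentiment, alpha=0.5):
    """Profiles are dicts of trait -> score in [0, 1]; sentiment is in [0, 1]."""
    traits = sorted(seeker_profile)
    s = np.array([seeker_profile[t] for t in traits])
    c = np.array([company_profile[t] for t in traits])
    match = 1.0 - np.abs(s - c).mean()            # psychometric fit between the two profiles
    return alpha * match + (1 - alpha) * company_sentiment

seeker = {"teamwork": 0.8, "autonomy": 0.4, "pace": 0.6}   # hypothetical psychometric profile
acme = {"teamwork": 0.7, "autonomy": 0.5, "pace": 0.9}     # hypothetical company profile
print(round(suitability(seeker, acme, company_sentiment=0.72), 3))
```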

Keywords: psychometric tests, recommender systems, sentiment analysis, hybrid recommender systems

Procedia PDF Downloads 106