Search results for: incidental information processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13665


11385 The Role of Journalism in Society, Informing, Educating, and Holding Power Accountable within the Yaoundé Region of Cameroon

Authors: Ita Noh Nkwain

Abstract:

Journalism plays a critical role in today's society by providing accurate and reliable information to the public. Through various mediums such as print, television, and online news outlets, journalists inform and educate the public on important issues and events happening around the world. Additionally, journalism serves as a watchdog by holding those in power accountable for their actions and decisions. However, with the rise of social media and the decline of traditional news sources, the future of journalism is uncertain. Despite these challenges, the importance of quality journalism cannot be overstated in a world where information is readily available but not always trustworthy.

Keywords: journalism, accountability, education, television, public

Procedia PDF Downloads 41
11383 Reinforcement Learning for Agile CNC Manufacturing: Optimizing Configurations and Sequencing

Authors: Huan Ting Liao

Abstract:

In a typical manufacturing environment, computer numerical control (CNC) machining is essential for automating production through precise computer-controlled tool operations, significantly enhancing efficiency and ensuring consistent product quality. However, traditional CNC production lines often rely on manual loading and unloading, limiting operational efficiency and scalability. Although automated loading systems have been developed, they frequently lack sufficient intelligence and configuration efficiency, requiring extensive setup adjustments for different products and impacting overall productivity. This research addresses the job shop scheduling problem (JSSP) in CNC machining environments, aiming to minimize total completion time (makespan) and maximize CNC machine utilization. We propose a novel approach using reinforcement learning (RL), specifically the Q-learning algorithm, to optimize scheduling decisions. The study simulates the JSSP, incorporating robotic arm operations, machine processing times, and work order demand allocation to determine optimal processing sequences. The Q-learning algorithm enhances machine utilization by dynamically balancing workloads across CNC machines, adapting to varying job demands and machine states. This approach offers robust solutions for complex manufacturing environments by automating decision-making processes for job assignments. Additionally, we evaluate various layout configurations to identify the most efficient setup. By integrating RL-based scheduling optimization with layout analysis, this research aims to provide a comprehensive solution for improving manufacturing efficiency and productivity in CNC-based job shops. The proposed method's adaptability and automation potential promise significant advancements in tackling dynamic manufacturing challenges.
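
As a concrete illustration of the tabular Q-learning scheduler described above, the sketch below dispatches jobs in a toy job-shop instance and rewards short makespans; the job data, state encoding, and reward shaping are illustrative assumptions, not taken from the paper.

```python
import random
from collections import defaultdict

# Toy JSSP: each job is a list of (machine, processing_time) operations.
JOBS = [[(0, 3), (1, 2)], [(1, 4), (0, 1)], [(0, 2), (1, 3)]]

def simulate(sequence):
    """Return the makespan of dispatching jobs in the given order of next operations."""
    machine_free = defaultdict(int)   # time each machine becomes free
    job_free = defaultdict(int)       # time each job finishes its previous operation
    next_op = [0] * len(JOBS)
    for job in sequence:
        machine, duration = JOBS[job][next_op[job]]
        start = max(machine_free[machine], job_free[job])
        machine_free[machine] = job_free[job] = start + duration
        next_op[job] += 1
    return max(machine_free.values())

def q_learning(episodes=5000, alpha=0.1, gamma=1.0, eps=0.2):
    Q = defaultdict(float)            # Q[(state, action)], state = ops completed per job
    n_ops = sum(len(j) for j in JOBS)
    for _ in range(episodes):
        state, sequence = (0,) * len(JOBS), []
        while len(sequence) < n_ops:
            actions = [j for j in range(len(JOBS)) if state[j] < len(JOBS[j])]
            a = random.choice(actions) if random.random() < eps else \
                max(actions, key=lambda j: Q[(state, j)])
            sequence.append(a)
            next_state = tuple(s + 1 if j == a else s for j, s in enumerate(state))
            done = len(sequence) == n_ops
            reward = -simulate(sequence) if done else 0.0   # short makespan = high reward
            future = 0.0 if done else max(Q[(next_state, j)] for j in range(len(JOBS))
                                          if next_state[j] < len(JOBS[j]))
            Q[(state, a)] += alpha * (reward + gamma * future - Q[(state, a)])
            state = next_state
    return Q

Q = q_learning()
```

After training, a greedy pass over Q yields a dispatch order whose simulated makespan can be compared across candidate layouts.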

Keywords: job shop scheduling problem, reinforcement learning, operations sequence, layout optimization, q-learning

Procedia PDF Downloads 25
11382 Geo-Collaboration Model between a City and Its Inhabitants to Develop Complementary Solutions for Better Household Waste Collection

Authors: Abdessalam Hijab, Hafida Boulekbache, Eric Henry

Abstract:

According to several research studies, the city as a whole is a complex, spatially organized system; its modeling must take into account several factors (socio-economic, political, and geographical) acting at multiple scales of observation and according to varied temporalities. Sustainable management and protection of the environment in this complex system require significant human and technical investment, particularly for monitoring and maintenance. The objective of this paper is to propose an intelligent approach based on the coupling of Geographic Information System (GIS) and Information and Communications Technology (ICT) tools in order to integrate the inhabitants into the processes of sustainable management and protection of the urban environment, specifically in the processes of household waste collection in urban areas. We are discussing a collaborative 'city/inhabitant' space. Indeed, it is a geo-collaborative approach, based on the spatialization and real-time geo-localization of topological and multimedia data captured by the 'active' inhabitant, in the form of geo-localized alerts related to household waste issues in their city. Our proposal provides a good understanding of the extent to which civil society (the inhabitants) can help and contribute to the development of complementary solutions for the collection of household waste and the protection of the urban environment. Moreover, it allows the inhabitant to contribute to the enrichment of a data bank for future uses. Our geo-collaborative model will be tested in the Lamkansa sampling district of the city of Casablanca in Morocco.

Keywords: geographic information system, GIS, information and communications technology, ICT, geo-collaboration, inhabitants, city

Procedia PDF Downloads 117
11381 Investigating the Encouraging Factors for Scholarly Works Contribution towards Institutional Repository: A Case Study at a Malaysian University

Authors: Mohd Rashid bin Ab Hamid, Noor Azura binti Omar, Zainol Bin Mustafa

Abstract:

Purpose: The aim of this paper is to study the factors that encourage scholarly works contribution towards the institutional repository among academicians at a Malaysian university. Methods: This paper uses a questionnaire to collect data on the respondents' level of perception of the institutional repository efforts at the university under study. Several encouraging factors were identified and measured using descriptive statistics. The factors relate to content contribution, i.e. personal, professional, organizational and technological factors. Findings: The study found that all four encouraging factors are related to the contribution of scholarly works by academicians at the university. Research Limitations: This study used a case study design, and generalization to all Malaysian universities should be made with caution. Practical implications: The university library should look into these four encouraging factors in order to enhance contributions from academicians to the repository. Originality/value: This research paper provides basic information for knowledge management officers in the university to guide further efforts to attract more contributions.

Keywords: institutional repository, information retrieval, information storage and retrieval

Procedia PDF Downloads 561
11380 Intuition in Negotiation within Ghanaian Social Contexts: Exploring Female Leadership Strategies for Conflict Transformation

Authors: Nadia Naadu Nartey, Esther A.O.G. Tetteh

Abstract:

Male negotiator representations and the appreciation of masculine traits in negotiation contexts dominate negotiation research in the field of conflict management and resolution. This study switched focus to pay attention to rarely examined gendered criteria and social contexts in negotiation research by investigating how intuition has been used in negotiations by female leaders toward conflict transformation in Ghanaian social contexts. Using the theoretical lenses of Klein’s Recognition-Primed Decisions (RPD) and Unconscious Information Processing (UIP) models, this study employs narrative inquiry in qualitative research. Semi-structured interviews of five (5) female leaders of Ghanaian social contexts in the United States (US) revealed that the use of intuition is necessary for effective negotiation outcomes due to its primal focus on relationship-building toward transforming conflicts. The knowledge added to the body of research by this study is summed up in the study’s conceptual framework. Female leaders, in negotiation situations where there are conflicting parties, prioritize the greater need for stronger relationships and win-win outcomes. The participant female leaders in negotiation contexts utilize their intuition as a bonding mechanism by effectively timing their actions, using an appropriate communication tone, emphasizing relationship building, and drawing from experience to make sound situational judgments (as in assessing a situation in the RPD model). Female leaders’ use of intuition in negotiations then translates to creating a force that bridges the gap between the conflicting parties. That force is noticed as conflict transformation that manifests as a reduction in anger and a promotion of trust and mutual understanding toward strengthening relationships. Future studies can expand the scope of the findings of this research by conducting a comparative analysis between male and female leaders on their use of intuition in negotiations in Ghanaian contexts.

Keywords: intuition, negotiation, conflict transformation, female leaders, Ghanaian social contexts

Procedia PDF Downloads 14
11379 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture

Authors: Sajjad Akbar, Rabia Bashir

Abstract:

With the growth in the number of users, Internet usage has evolved, and owing to its key design principles the Internet has expanded enormously in size. This tremendous growth has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e. a content distribution environment, mobility, ubiquity, security and trust. Users are more interested in contents than in the peer nodes they communicate with. The current Internet architecture is a host-centric networking approach, which is not well suited to this type of application. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient because it depends on physical location; for this reason, Information Centric Networking (ICN) is considered the potential future Internet architecture. ICN is an approach that introduces uniquely named data as a core Internet principle. It uses a receiver-oriented rather than sender-oriented approach and introduces a name-based information system at the network layer. Although ICN is considered a future Internet architecture, it has drawn considerable criticism, mainly concerning how ICN will manage the most relevant content. Web Content Mining (WCM) approaches can help with appropriate data management in ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas an agent-based approach from Web Content Mining is selected to find the most appropriate data.
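
To make the receiver-driven, named-data exchange of CCN concrete, here is a minimal sketch of an in-memory content store answering Interests by name-prefix matching; the class, method names, and matching rule are illustrative assumptions rather than the architecture proposed in the paper.

```python
class ContentStore:
    """Toy CCN-style content store: data is requested by name, not by host address."""

    def __init__(self):
        self._store = {}  # hierarchical name -> content object

    def publish(self, name, data):
        self._store[name] = data

    def satisfy_interest(self, interest_name):
        """Return any cached object whose full name extends the Interest name (prefix match)."""
        prefix = interest_name.rstrip("/")
        for name, data in self._store.items():
            if name == prefix or name.startswith(prefix + "/"):
                return name, data
        return None  # a real node would forward the Interest upstream via its FIB

cs = ContentStore()
cs.publish("/university/library/paper42/segment0", b"...bytes...")
print(cs.satisfy_interest("/university/library/paper42"))
```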

Keywords: agent based web content mining, content centric networking, information centric networking

Procedia PDF Downloads 475
11378 Building Information Modeling Acting as Protagonist and Link between the Virtual Environment and the Real-World for Efficiency in Building Production

Authors: Cristiane R. Magalhaes

Abstract:

Advances in Information and Communication Technologies (ICT) have led to changes in different sectors particularly in architecture, engineering, construction, and operation (AECO) industry. In this context, the advent of BIM (Building Information Modeling) has brought a number of opportunities in the field of the digital architectural design process bringing integrated design concepts that impact on the development, elaboration, coordination, and management of ventures. The project scope has begun to contemplate, from its original stage, the third dimension, by means of virtual environments (VEs), composed of models containing different specialties, substituting the two-dimensional products. The possibility to simulate the construction process of a venture in a VE starts at the beginning of the design process offering, through new technologies, many possibilities beyond geometrical digital modeling. This is a significant change and relates not only to form, but also to how information is appropriated in architectural and engineering models and exchanged among professionals. In order to achieve the main objective of this work, the Design Science Research Method will be adopted to elaborate an artifact containing strategies for the application and use of ICTs from BIM flows, with pre-construction cut-off to the execution of the building. This article intends to discuss and investigate how BIM can be extended to the site acting as a protagonist and link between the Virtual Environments and the Real-World, as well as its contribution to the integration of the value chain and the consequent increase of efficiency in the production of the building. The virtualization of the design process has reached high levels of development through the use of BIM. Therefore it is essential that the lessons learned with the virtual models be transposed to the actual building production increasing precision and efficiency. Thus, this paper discusses how the Fourth Industrial Revolution has impacted on property developments and how BIM could be the propellant acting as the main fuel and link between the virtual environment and the real production for the structuring of flows, information management and efficiency in this process. The results obtained are partial and not definite up to the date of this publication. This research is part of a doctoral thesis development, which focuses on the discussion of the impact of digital transformation in the construction of residential buildings in Brazil.

Keywords: building information modeling, building production, digital transformation, ICT

Procedia PDF Downloads 122
11377 Decision Tree Model for the Recommendation of Digital and Alternate Payment Methods for SMEs

Authors: Arturo J. Anci Alméstar, Jose D. Fernandez Huapaya, David Mauricio

Abstract:

Companies make erroneous decisions when they enter electronic commerce without first reviewing the inherent difficulties and the digital and alternative payment methods currently available. For this reason, it is very important for businesses to have reliable, complete and integrated information on current digital and alternative means of payment so that they can decide which of these to use. However, there is no such consolidated information, nor criteria that companies can use to make decisions about payment methods according to their needs. In this paper, we propose a decision tree model based on a taxonomy that categorizes digital and alternative means of payment, together with a high-level visualization of the flow of information from the company to the resulting recommendation. This allows a company to make the most appropriate decision about implementing the digital or alternative payment method best suited to its needs, which reduces the cost and complexity of the payment process. Likewise, the efficiency of the proposed model was evaluated through a satisfaction survey administered to company personnel, confirming the satisfactory quality of the recommendations produced by the model.
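
A minimal sketch of how a taxonomy-driven decision tree of this kind could be encoded and walked to produce a recommendation; the questions, categories, and recommendations below are illustrative placeholders, not the criteria used in the paper.

```python
# Each internal node asks a yes/no question about the business; leaves recommend a category.
TREE = {
    "question": "Does the business sell online?",
    "yes": {
        "question": "Are most customers unbanked?",
        "yes": {"recommendation": "cash-on-delivery / agent network"},
        "no": {
            "question": "Is recurring billing required?",
            "yes": {"recommendation": "card-on-file or direct-debit gateway"},
            "no": {"recommendation": "digital wallet / payment link"},
        },
    },
    "no": {"recommendation": "QR-code or mobile point-of-sale payments"},
}

def recommend(node, answers):
    """Walk the tree using a dict of {question: True/False} answers."""
    while "recommendation" not in node:
        node = node["yes"] if answers.get(node["question"], False) else node["no"]
    return node["recommendation"]

print(recommend(TREE, {"Does the business sell online?": True,
                       "Are most customers unbanked?": False,
                       "Is recurring billing required?": True}))
```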

Keywords: digital payment medium, decision tree, decision making, digital payments taxonomy

Procedia PDF Downloads 179
11376 Assessment of Digital Literacy Skills of Librarians in Tertiary Institutions in Niger State

Authors: Mustapha Abdulkadir Gana, Jibrin Attahiru Alhassan, Adamu Musa Baba

Abstract:

The exponential growth of information sources and resources and the continued Information and Communication Technology (ICT) sophistication of libraries all over the world call for capable and ICT-compliant librarians in Nigeria. This article assesses the digital literacy skills of librarians in tertiary institutions in Niger State. The survey research method was applied in the study, using a random sampling technique to draw the sample. Fifty-eight copies of the questionnaire were administered, while forty-nine copies were completed, returned, and used in the study, representing a response rate of 84%. Two research questions were answered, and data were analyzed using the Statistical Package for the Social Sciences (SPSS). The findings revealed that the librarians lack the requisite digital literacy skills to access the wealth of digital information resources available. The study recommends several steps to turn the situation around: librarians must be empowered with all necessary digital literacy skills and embark on rigorous training and retraining programs, workshops, conferences, and seminars, and there should be a coherent training policy for librarians on a sustainable basis to increase their requisite digital literacy skills.

Keywords: digital, information, literacy, skills

Procedia PDF Downloads 151
11375 The Use of Information and Communication Technologies in Electoral Procedures: Comments on Electronic Voting Security

Authors: Magdalena Musiał-Karg

Abstract:

The expansion of telecommunication and the progress of electronic media are important elements of our times. The recent worldwide convergence of information and communication technologies (ICT) and the dynamic development of the mass media are leading to noticeable changes in the functioning of contemporary states and societies. Currently, modern technologies play more and more important roles and filter down to almost every field of contemporary human life, resulting in a growth of online interactions that can be observed in the remarkable increase in the number of people with home PCs and Internet access. Proof of this is undoubtedly the emergence and use of concepts such as e-society, e-banking, e-services, e-government, e-participation and e-democracy. The newly coined word e-democracy is evidence that modern technologies have also been widely used in politics. Without any doubt, in most countries all actors of the political market (politicians, political parties, servants in the political/public sector, the media) use modern forms of communication with society. Most of these modern technologies improve the processes of getting and sending information to citizens and communication with the electorate, and also, which seems to be the biggest advantage, electoral procedures. Thanks to the implementation of ICT, the interaction between politicians and the electorate is improved. The main goal of this text is to analyze electronic voting (e-voting), one of the important forms of electronic democracy, in terms of security aspects. The author of this paper aims to answer questions about the security of electronic voting as an additional form of participation in elections and referenda.

Keywords: electronic democracy, electronic voting, security of e-voting, information and communication technology (ICT)

Procedia PDF Downloads 241
11374 Integration of Big Data to Predict Transportation for Smart Cities

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Intelligent transportation systems are essential to building smarter cities. Machine-learning-based transportation prediction is a highly promising approach because it makes invisible aspects visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology. Among urban transportation systems, this research focuses on the bus system. The research problem is that existing headway models cannot respond to dynamic transportation conditions, so bus delays often occur. To overcome this problem, a prediction model is presented to find patterns of bus delay by applying machine learning to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the results. The prototype model is built on real-time bus data gathered through public data portals and real-time Application Program Interfaces (APIs) provided by the government. These data are the fundamental resources for organizing interval pattern models of bus operations with respect to traffic environment factors (road speeds, station conditions, weather, and real-time operating information of buses). The prototype model is designed with a machine learning tool (RapidMiner Studio), and tests were conducted for bus delay prediction. This research presents experiments to increase the prediction accuracy of bus headway by analyzing urban big data. Big data analysis is important for predicting the future and for finding correlations by processing huge amounts of data. Based on this analysis method, this research demonstrates an effective use of machine learning and urban big data to understand urban dynamics.
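
A minimal sketch of the kind of supervised delay model described above, trained on traffic, weather, and bus-status features; the feature set, the synthetic data, and the use of scikit-learn (the study itself used RapidMiner Studio) are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Illustrative feature matrix: [road_speed_kmh, rain_mm, passengers_waiting, scheduled_headway_min]
rng = np.random.default_rng(0)
X = rng.uniform([10, 0, 0, 5], [60, 20, 40, 15], size=(500, 4))
# Synthetic target: delay grows with rain and crowding, shrinks with road speed.
y = 0.3 * X[:, 1] + 0.1 * X[:, 2] - 0.05 * X[:, 0] + rng.normal(0, 0.5, 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("MAE (min):", mean_absolute_error(y_test, model.predict(X_test)))
```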

Keywords: big data, machine learning, smart city, social cost, transportation network

Procedia PDF Downloads 260
11373 Virtualizing Attendance and Reducing Impacts on the Environment with a Mobile Application

Authors: Paulo R. M. Andrade, Adriano B. Albuquerque, Otávio F. Frota, Robson V. Silveira, Fátima A. da Silva

Abstract:

Information technology has been gaining more and more space, whether in industry, commerce or personal use, but its misuse brings harm to the environment and to human health. To contribute to the sustainability of the planet is to compensate the environment for all or part of what is withdrawn from it. Green computing has also emerged to propose practices for using IT in an environmentally responsible way in support of strategic management and communication. This work focuses on showing how a mobile application can help businesses reduce costs and the environmental impacts caused by their processes, through a case study of a public company in Brazil.

Keywords: green computing, information technology, e-government, sustainable development, mobile computing

Procedia PDF Downloads 419
11372 Knowledge Creation Environment in the Iranian Universities: A Case Study

Authors: Mahdi Shaghaghi, Amir Ghaebi, Fariba Ahmadi

Abstract:

Purpose: The main purpose of the present research is to analyze the knowledge creation environment at a Iranian University (Alzahra University) as a typical University in Iran, using a combination of the i-System and Ba models. This study is necessary for understanding the determinants of knowledge creation at Alzahra University as a typical University in Iran. Methodology: To carry out the present research, which is an applied study in terms of purpose, a descriptive survey method was used. In this study, a combination of the i-System and Ba models has been used to analyze the knowledge creation environment at Alzahra University. i-System consists of 5 constructs including intervention (input), intelligence (process), involvement (process), imagination (process), and integration (output). The Ba environment has three pillars, namely the infrastructure, the agent, and the information. The integration of these two models resulted in 11 constructs which were as follows: intervention (input), infrastructure-intelligence, agent-intelligence, information-intelligence (process); infrastructure-involvement, agent-involvement, information-involvement (process); infrastructure-imagination, agent-imagination, information-imagination (process); and integration (output). These 11 constructs were incorporated into a 52-statement questionnaire and the validity and reliability of the questionnaire were examined and confirmed. The statistical population included the faculty members of Alzahra University (344 people). A total of 181 participants were selected through the stratified random sampling technique. The descriptive statistics, binomial test, regression analysis, and structural equation modeling (SEM) methods were also utilized to analyze the data. Findings: The research findings indicated that among the 11 research constructs, the levels of intervention, information-intelligence, infrastructure-involvement, and agent-imagination constructs were average and not acceptable. The levels of infrastructure-intelligence and information-imagination constructs ranged from average to low. The levels of agent-intelligence and information-involvement constructs were also completely average. The level of infrastructure-imagination construct was average to high and thus was considered acceptable. The levels of agent-involvement and integration constructs were above average and were in a highly acceptable condition. Furthermore, the regression analysis results indicated that only two constructs, viz. the information-imagination and agent-involvement constructs, positively and significantly correlate with the integration construct. The results of the structural equation modeling also revealed that the intervention, intelligence, and involvement constructs are related to the integration construct with the complete mediation of imagination. Discussion and conclusion: The present research suggests that knowledge creation at Alzahra University relatively complies with the combination of the i-System and Ba models. Unlike this model, the intervention, intelligence, and involvement constructs are not directly related to the integration construct and this seems to have three implications: 1) the information sources are not frequently used to assess and identify the research biases; 2) problem finding is probably of less concern at the end of studies and at the time of assessment and validation; 3) the involvement of others has a smaller role in the summarization, assessment, and validation of the research.

Keywords: i-System, Ba model, knowledge creation, knowledge management, knowledge creation environment, Iranian universities

Procedia PDF Downloads 101
11371 Time Temperature Dependence of Long Fiber Reinforced Polypropylene Manufactured by Direct Long Fiber Thermoplastic Process

Authors: K. A. Weidenmann, M. Grigo, B. Brylka, P. Elsner, T. Böhlke

Abstract:

In order to reduce fuel consumption, the weight of automobiles has to be reduced. Fiber reinforced polymers offer the potential to reach this aim because of their high stiffness to weight ratio. Additionally, the use of fiber reinforced polymers in automotive applications has to allow for an economic large-scale production. In this regard, long fiber reinforced thermoplastics made by direct processing offer both mechanical performance and processability in injection moulding and compression moulding. The work presented in this contribution deals with long glass fiber reinforced polypropylene directly processed in compression moulding (D-LFT). For the use in automotive applications both the temperature and the time dependency of the materials properties have to be investigated to fulfill performance requirements during crash or the demands of service temperatures ranging from -40 °C to 80 °C. To consider both the influence of temperature and time, quasistatic tensile tests have been carried out at different temperatures. These tests have been complemented by high speed tensile tests at different strain rates. As expected, the increase in strain rate results in an increase of the elastic modulus which correlates to an increase of the stiffness with decreasing service temperature. The results are in good accordance with results determined by dynamic mechanical analysis within the range of 0.1 to 100 Hz. The experimental results from different testing methods were grouped and interpreted by using different time temperature shift approaches. In this regard, Williams-Landel-Ferry and Arrhenius approach based on kinetics have been used. As the theoretical shift factor follows an arctan function, an empirical approach was also taken into consideration. It could be shown that this approach describes best the time and temperature superposition for glass fiber reinforced polypropylene manufactured by D-LFT processing.
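
For reference, the two shift approaches named above have standard forms. Assuming the usual notation (shift factor a_T, reference temperature T_ref, WLF constants C_1 and C_2, activation energy E_a, gas constant R), they read:

```latex
% Williams-Landel-Ferry (WLF) shift factor
\log a_T = \frac{-C_1\,(T - T_{\mathrm{ref}})}{C_2 + (T - T_{\mathrm{ref}})}

% Arrhenius-type shift factor
\log a_T = \frac{E_a}{2.303\,R}\left(\frac{1}{T} - \frac{1}{T_{\mathrm{ref}}}\right)
```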

Keywords: composite, dynamic mechanical analysis, long fibre reinforced thermoplastics, mechanical properties, time temperature superposition

Procedia PDF Downloads 199
11370 A CORDIC Based Design Technique for Efficient Computation of DCT

Authors: Deboraj Muchahary, Amlan Deep Borah, Abir J. Mondal, Alak Majumder

Abstract:

A discrete cosine transform (DCT) is described and a technique to compute it using fast Fourier transform (FFT) is developed. In this work, DCT of a finite length sequence is obtained by incorporating CORDIC methodology in radix-2 FFT algorithm. The proposed methodology is simple to comprehend and maintains a regular structure, thereby reducing computational complexity. DCTs are used extensively in the area of digital processing for the purpose of pattern recognition. So the efficient computation of DCT maintaining a transparent design flow is highly solicited.
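
A compact sketch of the technique described: the DCT-II is obtained from a radix-2 FFT after Makhoul's even/odd reordering, with the post-FFT twiddle factors evaluated by CORDIC rotations. This is an illustration under those assumptions, not the authors' implementation; numpy's FFT stands in for a hardware radix-2 FFT.

```python
import math
import numpy as np

def cordic_cos_sin(theta, iterations=32):
    """(cos, sin) of theta via CORDIC rotations (shift-and-add friendly in hardware)."""
    angles = [math.atan(2.0 ** -i) for i in range(iterations)]
    gain_inv = 1.0
    for i in range(iterations):
        gain_inv /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = 1.0, 0.0, theta
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return x * gain_inv, y * gain_inv

def dct2_via_fft(x):
    """Unnormalized DCT-II of x using an FFT (Makhoul reordering) and CORDIC twiddles."""
    N = len(x)
    v = np.concatenate([x[::2], x[1::2][::-1]])   # even samples, then reversed odd samples
    V = np.fft.fft(v)
    out = np.empty(N)
    for k in range(N):
        c, s = cordic_cos_sin(-math.pi * k / (2 * N))
        out[k] = 2.0 * (V[k].real * c - V[k].imag * s)  # 2*Re(e^{-j*pi*k/(2N)} * V[k])
    return out

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
print(dct2_via_fft(x))
```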

Keywords: DCT, DFT, CORDIC, FFT

Procedia PDF Downloads 478
11369 Development of the Drug Abuse Health Information System in Thai Community

Authors: Waraporn Boonchieng, Ekkarat Boonchieng, Sivaporn Aungwattana, Decha Tamdee, Wongamporn Pinyavong

Abstract:

Drug addiction represents one of the most important public health issues in both developed and developing countries. The purpose of this study was to develop a drug abuse health information system in a community in Northern Thailand, using a developmental research design. The researchers carried out four phases to develop the system: 1) synthesizing knowledge related to drug abuse prevention and identifying the components of the system; 2) developing the system as a mobile application and website; 3) implementing the system in the rural community; and 4) evaluating its feasibility. Data collection involved both qualitative and quantitative procedures, analyzed using content analysis and descriptive statistics, respectively. The findings of this study showed that the drug abuse health information system consisted of five sections: drug-related prevention knowledge for teens, drug-related knowledge for adults and professionals, a database of drug dependence treatment centers, self-administered questionnaires, and supportive counseling. First, for drug-related prevention knowledge for teens, the researchers designed four infographics and an animation covering types of illegal drugs, causes of drug abuse, consequences of drug abuse, drug abuse diagnosis and treatment, and drug abuse prevention. Second, for drug-related knowledge for adults and professionals, the researchers developed documents in the form of PDF files covering types of illegal drugs, causes of drug abuse, drug abuse prevention, and relapse prevention guidelines. Third, the database of drug dependence treatment centers included the location, direction map, operating hours, and contact details of all drug dependence treatment centers in Thailand. Fourth, the self-administered questionnaires comprised a preventive drug behavior questionnaire, a drug abuse knowledge questionnaire, the stages of change readiness and treatment eagerness scale for drug use, a substance use behavior questionnaire, a tobacco use behavior questionnaire, a stress screening, and a depression screening. Finally, for supportive counseling, the researchers designed a chat box through which each user could write and send their concerns to counselors individually. Results from the evaluation process showed that 651 participants used the system via the mobile application and website. Among all users, 48.8% were male and 51.2% were female. More than half (55.3%) were 15-20 years old, and most of them (88.0%) were Buddhists. Most users reported having previously received knowledge related to drugs (86.1%) and drinking alcohol (94.2%), while some of them (6.9%) reported having used tobacco. Regarding satisfaction with the system, more than half of the users reported that its contents were interesting (59%), up to date (61%), and highly useful for their self-study (59%). In addition, half of them were satisfied with the design in terms of the infographics (54%) and animation (51%). Thus, this drug abuse health information system can be adopted to explore the drug abuse situation and serves as a tool to prevent drug abuse and addiction among Thai community people.

Keywords: drug addiction, health informatics, big data, development research

Procedia PDF Downloads 112
11368 UNIX Source Code Leak: Evaluation and Feasible Solutions

Authors: Gu Dongxing, Li Yuxuan, Nong Tengxiao, Burra Venkata Durga Kumar

Abstract:

Since computers are widely used in business, more and more companies choose to store important information on computers to improve productivity. However, this information can be compromised in many cases, for example when it is stored locally on the company's computers or when it is transferred between servers and clients. Of these information leaks, source code leaks are probably the most costly. Because the source code often represents the core technology of a company, especially for Internet companies, source code leakage may even cause the company's core products to lose market competitiveness and ultimately lead to the bankruptcy of the company. In recent years, large companies such as Microsoft and AMD have experienced source code leakage events and suffered huge losses. This reveals the importance and necessity of preventing source code leakage. This paper aims to find ways to prevent source code leakage from the perspective of the operating system and, given that most companies use Linux or Linux-like systems to realize the interconnection between server and client, discusses how to reduce the possibility of source code leakage during data transmission.

Keywords: data transmission, Linux, source code, operating system

Procedia PDF Downloads 271
11367 Operating Parameters and Costs Assessments of a Real Fishery Wastewater Effluent Treated by Electrocoagulation Process

Authors: Mirian Graciella Dalla Porta, Humberto Jorge José, Danielle de Bem Luiz, Regina de F. P. M. Moreira

Abstract:

Like most processing industries, fish processing produces large volumes of wastewater, which contains especially organic contaminants, salts and oils dispersed therein. Different processes have been used for the treatment of fishery wastewaters, but the most commonly used are chemical coagulation and flotation. These techniques are well known, but sometimes the characteristics of the treated effluent do not comply with legal standards for discharge. Electrocoagulation (EC) is an electrochemical process that can be used to treat wastewaters in terms of both organic matter and nutrient removal. The process is based on the use of sacrificial electrodes, such as aluminum, iron or zinc, that are oxidized to produce metal ions that coagulate and react with the organic matter and nutrients in the wastewater. While EC processes are effective for the treatment of several types of wastewaters, applications have been limited due to high energy demands and high current densities. Generally, the EC process can be performed without additional chemicals or pre-treatment, but its costs should be reduced for EC processes to become more widely applicable. In this work, we studied the treatment of a real wastewater from the fishmeal industry by the electrocoagulation process. Removal efficiencies for chemical oxygen demand (COD), total organic carbon (TOC), turbidity, phosphorous and nitrogen concentration were determined as a function of the operating conditions, such as pH, current density and operating time. The optimum operating conditions were determined to be an operating time of 10 minutes, a current density of 100 A.m-2, and an initial pH of 4.0. COD, TOC, phosphorous concentration, and turbidity removal efficiencies at the optimum operating conditions were higher than 90% for the aluminum electrode. Operating costs at the optimum conditions were calculated as US$ 0.37/m3 (US$ 0.038/kg COD) for the Al electrode. These results demonstrate that the EC process is a promising technology for removing nutrients from fishery wastewaters, as the process has both high nutrient removal efficiency and low energy requirements.
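
As an illustration of how such operating costs are typically estimated, the sketch below combines the electrical energy consumption with the electrode mass dissolved according to Faraday's law; the unit prices and operating values in the example are assumptions, not the figures used in the study.

```python
FARADAY = 96485.0  # C/mol

def ec_operating_cost(voltage_v, current_a, time_s, treated_volume_m3,
                      molar_mass_g=26.98, z=3,          # aluminium: M = 26.98 g/mol, z = 3
                      energy_price=0.10, electrode_price=2.0):
    """Estimate EC operating cost per m3 from energy and electrode consumption.

    energy_price is assumed in US$/kWh and electrode_price in US$/kg (illustrative values).
    """
    energy_kwh = voltage_v * current_a * time_s / 3600.0 / 1000.0              # kWh consumed
    electrode_kg = current_a * time_s * molar_mass_g / (z * FARADAY) / 1000.0  # Faraday's law
    cost = (energy_price * energy_kwh + electrode_price * electrode_kg) / treated_volume_m3
    return cost, energy_kwh / treated_volume_m3, electrode_kg / treated_volume_m3

# Example: 10 min at 5 A and 10 V, treating 1 L (0.001 m3) of effluent.
print(ec_operating_cost(voltage_v=10.0, current_a=5.0, time_s=600, treated_volume_m3=0.001))
```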

Keywords: electrocoagulation, fish, food industry, wastewater

Procedia PDF Downloads 249
11366 Sorting Fish by Hu Moments

Authors: J. M. Hernández-Ontiveros, E. E. García-Guerrero, E. Inzunza-González, O. R. López-Bonilla

Abstract:

This paper presents the implementation of an algorithm that identifies and counts different fish species: Catfish, Sea bream, Sawfish, Tilapia, and Totoaba. The main contribution of the method is the fusion of the position, rotation and scale invariance properties of the Hu moments with the proper counting of fish. The identification and counting are performed on an image under different noise conditions. The experimental results obtained indicate the potential of the proposed algorithm to be applied in different scenarios of aquaculture production.
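
A minimal sketch of the Hu-moment pipeline described above, using OpenCV: each segmented blob is reduced to its seven log-scaled Hu moments and matched to the nearest reference species; the nearest-reference rule and threshold are illustrative assumptions, since the paper does not detail its classifier here.

```python
import cv2
import numpy as np

def hu_features(binary_mask):
    """Log-scaled Hu moments: invariant to translation, scale, and rotation of the blob."""
    moments = cv2.moments(binary_mask)
    hu = cv2.HuMoments(moments).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)   # compress the dynamic range

def count_and_classify(gray_image, reference_features, threshold=0.5):
    """Segment blobs, compute Hu moments per blob, assign each to the nearest reference species."""
    _, binary = cv2.threshold(gray_image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    n_labels, labels = cv2.connectedComponents(binary)
    counts = {name: 0 for name in reference_features}
    for label in range(1, n_labels):                      # label 0 is the background
        mask = (labels == label).astype(np.uint8)
        feats = hu_features(mask)
        name, dist = min(((n, np.linalg.norm(feats - ref))
                          for n, ref in reference_features.items()), key=lambda t: t[1])
        if dist < threshold:
            counts[name] += 1
    return counts
```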

Keywords: counting fish, digital image processing, invariant moments, pattern recognition

Procedia PDF Downloads 409
11365 Image Segmentation: New Methods

Authors: Flaurence Benjamain, Michel Casperance

Abstract:

We present in this paper, first, a comparative study of three mathematical theories for achieving the fusion of information sources. This study aims to identify the characteristics inherent in the theory of possibilities, the theory of belief functions (DST), and the theory of plausible and paradoxical reasoning, in order to establish a strategy of choice that allows us to adopt the most appropriate theory for solving a fusion problem, taking into account the acquired information and the imperfections that accompany it. Using the new theory of plausible and paradoxical reasoning, also called Dezert-Smarandache Theory (DSmT), to fuse multi-source information requires, as a first step, the generation of composite events, which is in general difficult. Thus, we present in this paper a new approach for constructing pertinent paradoxical classes based on gray-level histograms, which also allows the cardinality of the hyper-powerset to be reduced. Secondly, we developed a new technique for ordering and coding generalized focal elements. This method is exploited, in particular, to calculate the Dezert-Smarandache cardinality. Then, we present an experiment on the classification of a remote sensing image that illustrates the given methods, and we compare the result obtained by the DSmT with those resulting from the use of the DST and the theory of possibilities.
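
For orientation, the DST baseline referred to in the comparison combines two basic belief assignments with Dempster's rule, renormalising the conflicting mass; a minimal sketch over a two-class frame is given below (DSmT would instead keep that conflicting mass on paradoxical intersections). The frame and mass values are illustrative.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic belief assignments defined on frozensets of a common frame (DST)."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2   # DSmT would assign this mass to a paradoxical class instead
    return {k: v / (1.0 - conflict) for k, v in combined.items()}, conflict

water, land = frozenset({"water"}), frozenset({"land"})
both = water | land
m_sensor1 = {water: 0.6, both: 0.4}
m_sensor2 = {land: 0.5, water: 0.3, both: 0.2}
print(dempster_combine(m_sensor1, m_sensor2))
```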

Keywords: segmentation, image, approach, vision computing

Procedia PDF Downloads 276
11364 Elemental Graph Data Model: A Semantic and Topological Representation of Building Elements

Authors: Yasmeen A. S. Essawy, Khaled Nassar

Abstract:

With the rapid increase of complexity in the building industry, professionals in the A/E/C industry were forced to adopt Building Information Modeling (BIM) in order to enhance the communication between the different project stakeholders throughout the project life cycle and create a semantic object-oriented building model that can support geometric-topological analysis of building elements during design and construction. This paper presents a model that extracts topological relationships and geometrical properties of building elements from an existing fully designed BIM, and maps this information into a directed acyclic Elemental Graph Data Model (EGDM). The model incorporates BIM-based search algorithms for automatic deduction of geometrical data and topological relationships for each building element type. Using graph search algorithms, such as Depth First Search (DFS) and topological sortings, all possible construction sequences can be generated and compared against production and construction rules to generate an optimized construction sequence and its associated schedule. The model is implemented in a C# platform.
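
A minimal sketch of the topological-sorting step described above, using Kahn's algorithm over an illustrative element graph; the element names and dependencies are placeholders, and Python is used here for brevity although the paper's model is implemented in C#.

```python
from collections import defaultdict, deque

def construction_sequence(elements, depends_on):
    """Kahn-style topological sort of building elements.

    depends_on maps an element to the elements that must be built before it
    (e.g. a slab depends on the columns supporting it).
    """
    indegree = {e: 0 for e in elements}
    supports = defaultdict(list)
    for element, prerequisites in depends_on.items():
        for pre in prerequisites:
            supports[pre].append(element)
            indegree[element] += 1

    ready = deque(e for e in elements if indegree[e] == 0)
    order = []
    while ready:
        e = ready.popleft()
        order.append(e)
        for nxt in supports[e]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(elements):
        raise ValueError("cyclic dependency: no valid construction sequence")
    return order

elements = ["foundation", "column_A", "column_B", "slab_1"]
depends_on = {"column_A": ["foundation"], "column_B": ["foundation"],
              "slab_1": ["column_A", "column_B"]}
print(construction_sequence(elements, depends_on))
```

Each generated order can then be scored against production and construction rules to pick the optimized sequence and its schedule.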

Keywords: building information modeling (BIM), elemental graph data model (EGDM), geometric and topological data models, graph theory

Procedia PDF Downloads 382
11363 Worldwide GIS Based Earthquake Information System/Alarming System for Microzonation/Liquefaction and It’s Application for Infrastructure Development

Authors: Rajinder Kumar Gupta, Rajni Kant Agrawal, Jaganniwas

Abstract:

One of the most frightening phenomena of nature is the occurrence of earthquakes, which have terrible and disastrous effects. Many earthquakes occur every day worldwide, so there is a need for knowledge regarding the trends in earthquake occurrence worldwide. The recording and interpretation of data obtained from the establishment of a worldwide network of seismological stations have made this possible. From the analysis of recorded earthquake data, the earthquake parameters and source parameters can be computed and earthquake catalogues can be prepared. These catalogues provide information on the origin time, epicenter locations (in terms of latitude and longitude), focal depths, magnitude and other details of the recorded earthquakes, and they are used for seismic hazard estimation. Manual interpretation and analysis of these data are tedious and time consuming. A geographical information system (GIS) is a computer-based system designed to store, analyze and display geographic information. The implementation of integrated GIS technology provides an approach that permits rapid evaluation of complex inventory databases under a variety of earthquake scenarios and allows the user to view results interactively, almost immediately. GIS technology provides a powerful tool for displaying outputs and permits users to see the graphical distribution of the impacts of different earthquake scenarios and assumptions. An endeavor has been made in the present study to compile earthquake data for the whole world in Visual Basic on the ArcGIS platform so that it can be used easily for further analysis by earthquake engineers. The basic data on the time of occurrence, location and size of earthquakes have been compiled for querying based on various parameters. A preliminary analysis tool is also provided in the user interface to interpret earthquake recurrence in a region. The user interface also includes the seismic hazard information already worked out under the GSHAP program, giving the seismic hazard for the world in terms of the probability of exceedance for defined return periods. The seismic zones of the Indian region from IS 1893-2002, the code on earthquake-resistant design of buildings, are included in the user interface as well. City-wise satellite images have been inserted in the map, and based on actual data the following information can be extracted in real time: analysis of soil parameters and their effect; microzonation information; seismic hazard and strong ground motion; soil liquefaction and its effect on the surrounding area; impacts of liquefaction on buildings and infrastructure; occurrence of future earthquakes and their effect on existing soil; and propagation of ground vibration due to the occurrence of an earthquake. The GIS-based earthquake information system has been prepared for the whole world in Visual Basic on the ArcGIS platform and further extended to the micro level based on actual soil parameters. Individual tools have been developed for liquefaction, earthquake frequency, etc. All of this information can be used for the development of infrastructure, i.e. multi-storey structures, irrigation dams and their components, hydro-power plants, etc., in real time, for the present and the future.

Keywords: GIS based earthquake information system, microzonation, analysis and real time information about liquefaction, infrastructure development

Procedia PDF Downloads 316
11362 An Approach to Autonomous Drones Using Deep Reinforcement Learning and Object Detection

Authors: K. R. Roopesh Bharatwaj, Avinash Maharana, Favour Tobi Aborisade, Roger Young

Abstract:

Presently, there are few cases of complete automation of drones and their allied intelligence capabilities; in essence, the potential of the drone has not yet been fully utilized. This paper presents feasible methods to build an intelligent drone with smart capabilities such as self-driving and obstacle avoidance. It does this through advanced reinforcement learning techniques and performs object detection using recent algorithms capable of processing lightweight models with fast training in real time. For the scope of this paper, after researching and comparing various algorithms, we implemented the Deep Q-Network (DQN) algorithm in the AirSim simulator. In future work, we plan to implement more advanced self-driving and object detection algorithms, and we also plan to implement voice-based speech recognition for the entire drone operation, which would provide an option of speech communication between users (people) and the drone in unavoidable circumstances, thus making drones interactive, intelligent, robotic, voice-enabled service assistants. This proposed drone has a wide scope of usability and is applicable in scenarios such as disaster management, air transport of essentials, agriculture, manufacturing, monitoring people's movements in public areas, and defense. Also discussed is drone communication based on satellite broadband Internet technology for faster computation and seamless communication service, providing an uninterrupted network during disasters and remote-location operations. This paper explains the feasible algorithms required to achieve this goal and serves as a reference for future researchers going down this path.
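
A minimal sketch of the Deep Q-Network components mentioned above: a small Q-network, epsilon-greedy action selection, and one temporal-difference update toward a target network. The state dimension, action count, and network sizes are illustrative assumptions; the AirSim interaction loop and replay buffer handling are omitted.

```python
import random

import torch
import torch.nn as nn

class QNetwork(nn.Module):
    """Maps a state vector (e.g. depth-image features plus velocity) to one Q-value per action."""
    def __init__(self, state_dim, n_actions):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim, 128), nn.ReLU(),
                                 nn.Linear(128, n_actions))

    def forward(self, state):
        return self.net(state)

def select_action(q_net, state, n_actions, epsilon):
    """Epsilon-greedy policy over the discrete action set (e.g. velocity commands)."""
    if random.random() < epsilon:
        return random.randrange(n_actions)                    # explore
    with torch.no_grad():
        return int(q_net(state.unsqueeze(0)).argmax(dim=1))   # exploit

def dqn_update(q_net, target_net, optimizer, batch, gamma=0.99):
    """One DQN step: regress Q(s, a) toward r + gamma * max_a' Q_target(s', a')."""
    states, actions, rewards, next_states, dones = batch
    q_sa = q_net(states).gather(1, actions.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = rewards + gamma * (1 - dones) * target_net(next_states).max(dim=1).values
    loss = nn.functional.smooth_l1_loss(q_sa, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return float(loss)

# Illustrative sizes: a 16-dimensional state and 5 discrete actions.
q_net, target_net = QNetwork(16, 5), QNetwork(16, 5)
target_net.load_state_dict(q_net.state_dict())
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-4)
```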

Keywords: convolution neural network, natural language processing, obstacle avoidance, satellite broadband technology, self-driving

Procedia PDF Downloads 251
11361 Discussion on Microstructural Changes Caused by Deposition Temperature of LZO Doped Mg Piezoelectric Films

Authors: Cheng-Ying Li, Sheng-Yuan Chu

Abstract:

In this article, LZO-doped Mg piezoelectric thin films were deposited via RF sputtering, and their microstructure and electrical characteristics were observed while varying the deposition temperature. The XRD analysis results indicate that the LZO-doped Mg films exhibit excellent (002) orientation with no presence of ZnO (100). Influenced by the temperature's effect on the lattice constant, the (002) peak intensity increases with rising temperature. Finally, we conducted deformation intensity analysis on the films, revealing an over fourfold increase in deformation at a processing temperature of 500 °C.

Keywords: RF sputtering, piezoelectricity, ZnO, Mg

Procedia PDF Downloads 43
11360 Unsupervised Classification of DNA Barcodes Species Using Multi-Library Wavelet Networks

Authors: Abdesselem Dakhli, Wajdi Bellil, Chokri Ben Amar

Abstract:

A DNA barcode is a short mitochondrial DNA fragment made up of nucleotides, each consisting of a phosphate group, a sugar, and a nucleic base (A, T, C, or G). DNA barcodes provide a good source of the information needed to classify living species, an intuition that has been confirmed by many experimental results. Species classification with DNA barcode sequences has been studied by several researchers. The classification problem assigns unknown species to known ones by analyzing their barcodes, and this task has to be supported by reliable methods and algorithms. To analyze species regions or entire genomes, it becomes necessary to use sequence similarity methods. A large set of sequences can be compared simultaneously using multiple sequence alignment, which is known to be NP-complete; to make this type of analysis feasible, heuristics such as progressive alignment have been developed. Another tool for similarity search against a database of sequences is BLAST, which outputs shorter regions of high similarity between a query sequence and matched sequences in the database. However, all these methods are still computationally very expensive and require significant computational infrastructure. Our goal is to build predictive models that are highly accurate and interpretable, and that avoid the complex problem of form and structure in different classes of organisms. The models are evaluated on empirical data, and their classification performance is compared with other methods. Our system consists of three phases. The first, called transformation, is composed of three steps: Electron-Ion Interaction Pseudopotential (EIIP) codification of DNA barcodes, Fourier transform, and power spectrum signal processing. The second, called approximation, is empowered by the use of Multi-Library Wavelet Neural Networks (MLWNN). The third is the classification of DNA barcodes, which is realized by applying a hierarchical classification algorithm.
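
A minimal sketch of the transformation phase described above: the barcode is mapped to an EIIP numerical signal, and its FFT power spectrum is taken as the feature vector passed on to the wavelet network; the EIIP values used are the commonly cited ones, assumed rather than taken from the paper.

```python
import numpy as np

# Commonly cited EIIP values (electron-ion interaction pseudopotential) per nucleotide.
EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def barcode_power_spectrum(sequence):
    """Map a DNA barcode to its EIIP numerical signal and return the FFT power spectrum."""
    signal = np.array([EIIP[base] for base in sequence.upper() if base in EIIP])
    signal = signal - signal.mean()                 # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal)) ** 2     # power spectrum
    return spectrum / spectrum.sum()                # normalized, ready for the wavelet network

features = barcode_power_spectrum("ACTTTGGACCTGA" * 10)
print(features[:5])
```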

Keywords: DNA barcode, electron-ion interaction pseudopotential, Multi Library Wavelet Neural Networks (MLWNN)

Procedia PDF Downloads 318
11359 Effect of Roasting Treatment on Milling Quality, Physicochemical, and Bioactive Compounds of Dough Stage Rice Grains

Authors: Chularat Leewuttanakul, Khanitta Ruttarattanamongkol, Sasivimon Chittrakorn

Abstract:

Rice during the grain development stage is a rich source of many bioactive compounds. Dough stage rice contains high amounts of phytochemicals and can be used by rice milling industries. However, rice grain at the dough stage has low milling quality due to its high moisture content. Thermal processing can be applied to rice grain to improve milled rice yield. This experiment was conducted to study the chemical and physical properties of dough stage rice grain after roasting treatment. Rice was roasted with two different methods: traditional pan roasting at 140 °C for 60 minutes and an electrical roasting machine at 140 °C for 30, 40, and 50 minutes. The chemical and physical properties and bioactive compounds of brown rice and milled rice were evaluated. The results of this experiment showed that the moisture content of brown and milled rice was less than 10% and amylose contents were in the range of 26-28%. Rice grains roasted for 30 min using the electrical roasting machine had a high head rice yield, and the length and breadth of the grain after milling were close to those of traditional pan roasting (p > 0.05). The lightness (L*) of the rice was not affected by the roasting treatment (p > 0.05), and the a* value indicated that the yellowness of milled rice was lower than that of brown rice. The bioactive compounds of brown and milled rice significantly decreased with increasing drying time. Brown rice roasted for 30 minutes had the highest total phenolic content, antioxidant activity, α-tocopherol, and ɤ-oryzanol content. Volume expansion and elongation of cooked rice decreased as roasting time increased, and the quality of cooked rice roasted for 30 min was comparable to that of traditional pan roasting. The hardness of cooked rice, as measured by a texture analyzer, increased with increasing roasting time. The results indicated that rice grains at the dough stage, containing a high amount of bioactive compounds, have great potential for rice milling industries, and the electrical roasting machine can be used as an alternative to pan roasting, decreasing processing time and labor costs.

Keywords: bioactive compounds, cooked rice, dough stage rice grain, grain development, roasting

Procedia PDF Downloads 164
11358 Media Regulation and Public Sphere in the Digital Age: An Analysis in the Light of Constructive Democracy

Authors: Carlos Marden Cabral Coutinho, Jose Luis Bolzan de Morais

Abstract:

This article analyzes the possibility (and conditions) of a media regulation law in a democratic rule of law in the twenty-first century. To do so, the idea of the public sphere (Jürgen Habermas) is presented first, showing how it serves as an interface between the citizen and the state (or the private and the public) and how important it is in a deliberative democracy. Based on this paradigm, the traditional perception of the role of public information (as a functional element of the system) and the possibility of media regulation, owing to the public nature of media activity, are set out. A critical argument is then developed from two different perspectives: a) the formal function of current media information, considering that the digital age has fragmented access to information; b) the concept of a constructive democracy, which reduces the need for representation and changes the strategic importance of the public sphere. The question to be addressed (based on comparative law) is whether regulation is justified in a polycentric democracy, especially when it operates in the digital age (with immediate and virtual communication). The proposal advanced is that even in the twenty-first century the media in a democratic rule of law still play an extremely important role and may be subject to regulation, but this should be on terms very different from (and narrower than) those usually defended.

Keywords: constructive democracy, media, digital age, public sphere

Procedia PDF Downloads 380
11357 Importance of an E-Learning Program in Stress Field for Postgraduate Courses of Doctors

Authors: Ramona-Niculina Jurcau, Ioana-Marieta Jurcau

Abstract:

Background: Preparation in the stress field (SF) is increasingly a concern for doctors of different specialties. Aims: The aim was to evaluate the importance of an e-learning program in SF for doctors' postgraduate courses. Methods: Doctors (n = 40 male, 40 female) of different specialties and ages (31-71 years), who attended postgraduate courses in SF, voluntarily responded to a questionnaire that included the following themes: importance of SF courses for the specialty practiced by each respondent doctor (using a visual analogue scale, VAS); which SF themes would be suitable as e-learning (EL); preferred form of assimilating SF information: classical lectures (CL), EL or a combination of these methods (CL+EL); which aspects of the SF course are facilitated by the EL model versus CL; and, in their view, the first four advantages and the first four disadvantages of EL compared to CL for SF. Results: For most respondents, the SF courses are important for the specialty they practice (VAS average of 4). The SF themes suggested for EL were: stress mechanisms; stress factor models for different medical specialties; stress assessment methods; and primary stress management methods for different specialties. The preferred form of information assimilation was CL+EL. Aspects of the course facilitated by the EL model versus CL: active reading of theoretical information, with fast access to keyword details; watching documentaries in everyone's favorite order; practicing through tests and rapid checking of results. The first four EL advantages mentioned for SF were: autonomy in managing the time allocated to study; saving time by not traveling to the venue; the ability to read information in various contexts of time and space; and communication with colleagues at times convenient for everyone. The first three EL disadvantages mentioned for SF were: it decreases the capacity for group discussion and mobilization for active participation; access to EL information may depend on a power source and/or the Internet; and learning may slow down because of the temptation to postpone implementation. Answers to the questions were partially influenced by the respondent's age and gender. Conclusions: 1) Postgraduate courses in SF are of interest to doctors of different specialties. 2) The majority of participating doctors preferred EL, but combined with CL (CL+EL). 3) Preference for EL was expressed mainly by young or middle-aged male doctors. 4) It is important to find the right formula for the chosen EL so that it is as efficient, interesting, useful and agreeable as possible.

Keywords: stress field, doctors’ postgraduate courses, classical lectures, e-learning lecture

Procedia PDF Downloads 238
11356 A Qualitative Exploration of the Strategic Management of Employee Resistance to Organisational Change

Authors: Muneeb Banday, Anukriti Dixit

Abstract:

Change in organizations is viewed as a conversion process of organizational functioning. One of the crucial elements of this conversion process is employee resistance to organizational change. The existing literature on change resistance has generally treated resistance as a barrier or an opportunity for the successful implementation of change. However, there is little empirical research exploring how resistance to change is managed. This may be partially due to the difficulty of getting information on resistance to change: top management does not divulge such information in order to avoid negative evaluation, whereas employees face huge risks in sharing information related to resistance. The focus of the study is to understand how the organization under study dealt with employee resistance to change. The conversion process is a story of how the organization went from one stage to another, and we therefore used a narrative approach to change. Data were collected through company visits and interviews. The interviews were transcribed and coded, and themes were identified. We focused on the strands that left large scope for alternative interpretations beyond the dominant narrative of change prevalent in the organization. The study reveals that top management strategically uses the legitimacy of leadership, the roles of key employees, and the rationality of change to manage resistance.

Keywords: employee resistance, legitimacy of leadership, narrative analysis, organisational change

Procedia PDF Downloads 275