Search results for: incidental information processing
12400 Microwave Sintering and Its Application on Cemented Carbides
Authors: Rumman M. D. Raihanuzzaman, Lee Chang Chuan, Zonghan Xie, Reza Ghomashchi
Abstract:
Cemented carbides, owing to their excellent mechanical properties, have been of immense interest in the field of hard materials for the past few decades. A number of processing techniques have been developed to obtain high-quality carbide tools with a wide range of grain sizes, depending on the application and requirements. Microwave sintering is a heating process that has been used on a wide range of materials, including ceramics. A complete understanding of microwave sintering, its contribution to the control of grain growth, and its effect on the deformation of the resulting carbide materials requires further study and attention. In addition, the behaviour of binder materials as a function of microwave sintering is another area that requires clear understanding. This review focuses on microwave sintering, explaining how the process works and what types of materials it is best suited for. In addition, a closer look is taken at some microwave-sintered Tungsten Carbide-Cobalt samples, addressing some of the key issues and challenges faced in the research.
Keywords: cemented carbides, consolidation, microwave sintering, mechanical properties
Procedia PDF Downloads 597
12399 Performance Evaluation of Refinement Method for Wideband Two-Beams Formation
Authors: C. Bunsanit
Abstract:
This paper presents a refinement method for the two-beam formation of a wideband smart antenna. The refinement of the weighting coefficients is based on fully spatial signal processing using the Inverse Discrete Fourier Transform (IDFT), and simulation results are presented using MATLAB. The radiation pattern is created by multiplying the incoming signal by real weights and then summing the products. These real weighting coefficients are computed by the IDFT method; however, the range of the weight values is relatively wide, so the refinement method is used to reduce it. The radiation pattern is controlled by five input parameters: the maximum weighting coefficient, the wideband signal, the direction of the main beam, the beamwidth, and the maximum minor lobe level. Comparison of the simulation results obtained with the refinement method against those from the IDFT alone shows that the refinement method works well for wideband two-beam formation.
Keywords: fully spatial signal processing, beam forming, refinement method, smart antenna, weighting coefficient, wideband
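The weight computation described above can be sketched briefly. The refinement step shown here (clipping the IDFT weights into a narrower range and renormalizing) is only an assumption for illustration, not the paper's exact procedure, and the pattern, `w_max`, and beam-bin choices are hypothetical.

```python
import numpy as np

def idft_weights(desired_pattern):
    """Real array weights obtained as the IDFT of a sampled desired
    pattern, mirroring the real-weight beamforming in the abstract."""
    return np.real(np.fft.ifft(desired_pattern))

def refine(weights, w_max):
    """Illustrative refinement: clip the weights into [-w_max, w_max]
    to narrow their range, then renormalize to unit peak (an assumed
    procedure, not the paper's exact method)."""
    clipped = np.clip(weights, -w_max, w_max)
    return clipped / np.max(np.abs(clipped))

# Desired pattern sampled at 16 angular bins, two beams (two active bins).
pattern = np.zeros(16)
pattern[2] = 1.0   # first beam direction
pattern[5] = 1.0   # second beam direction
w = refine(idft_weights(pattern), w_max=0.1)
```

Clipping bounds the spread of the raw IDFT weights while the renormalization keeps the peak weight usable as a reference.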
Procedia PDF Downloads 226
12398 An Unsupervised Domain-Knowledge Discovery Framework for Fake News Detection
Authors: Yulan Wu
Abstract:
With the rapid development of social media, the issue of fake news has gained considerable prominence, drawing the attention of both the public and governments. The widespread dissemination of false information poses a tangible threat across multiple domains of society, including politics, the economy, and health. However, much research has concentrated on supervised models trained within specific domains, and their effectiveness diminishes when applied to identifying fake news across multiple domains. To solve this problem, some approaches based on domain labels have been proposed: by assigning news to its specific area in advance, a classifier specialized in the corresponding field may judge fake news more accurately. However, these approaches disregard the fact that a news record can pertain to multiple domains, resulting in a significant loss of valuable information. In addition, the datasets used for training must all be domain-labeled, which creates unnecessary complexity. To solve these problems, an unsupervised domain-knowledge discovery framework for fake news detection is proposed. First, to effectively retain the multi-domain knowledge of the text, a low-dimensional domain-embedding vector is generated for each news text. Subsequently, a feature extraction module utilizing the unsupervisedly discovered domain embeddings extracts comprehensive features of the news. Finally, a classifier determines the authenticity of the news. To verify the proposed framework, tests are conducted on existing widely used datasets, and the experimental results demonstrate that this method improves detection performance for fake news across multiple domains. Moreover, even on datasets that lack domain labels, this method can still effectively transfer domain knowledge, reducing the time consumed by tagging without sacrificing detection accuracy.
Keywords: fake news, deep learning, natural language processing, multiple domains
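The idea of a low-dimensional domain embedding can be illustrated with a minimal sketch. The framework in the abstract discovers domains without labels; the seed vocabularies below are hypothetical and serve only to show how one vector can score a text on several domains at once, which is what retains multi-domain information.

```python
from collections import Counter

# Illustrative seed vocabularies for three domains (hypothetical lists,
# not from the paper; the paper discovers domains without labels).
DOMAINS = {
    "politics": {"election", "government", "policy", "senate"},
    "health":   {"vaccine", "virus", "hospital", "doctor"},
    "economy":  {"market", "stock", "inflation", "trade"},
}

def domain_embedding(text):
    """Map a news text to a low-dimensional vector: one score per
    domain, the fraction of the text's tokens found in that domain's
    vocabulary. A text can score on several domains at once."""
    tokens = Counter(text.lower().split())
    total = sum(tokens.values())
    return [
        sum(c for t, c in tokens.items() if t in vocab) / total
        for vocab in DOMAINS.values()
    ]

emb = domain_embedding("Government policy on vaccine rollout moved the market")
```

Here the single sentence scores on all three domains simultaneously, instead of being forced into one domain label.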
Procedia PDF Downloads 97
12397 A Critical Evaluation of Building Information Modelling in New Zealand: Deepening Our Understanding of the Benefits and Drawbacks
Authors: Garry Miller, Thomas Alexander, Cameron Lee
Abstract:
There is a belief that Building Information Modelling (BIM) will improve the performance of the New Zealand (NZ) Architecture, Engineering and Construction (AEC) sector; however, widespread use of BIM is yet to be seen. Previous research indicates there are many issues affecting the uptake of BIM in NZ; nevertheless, the underlying benefits, drawbacks, and barriers preventing more widespread uptake are not fully understood. This investigation aimed to understand these factors more clearly and make suggestions on how to improve the uptake of BIM in NZ. Semi-structured interviews were conducted with a range of industry professionals to gather a qualitative understanding. Findings indicated that the ability to incorporate better information into a BIM model could drive many benefits; however, scepticism and a lack of positive incentives in NZ are limiting its widespread use. The study concluded that there is a need for the government to produce a number of BIM case studies and to develop a set of BIM standards to resolve payment issues surrounding BIM use. This study provides useful information for those interested in BIM and for members of government interested in improving the performance of the construction industry. It may also be of interest to small, developed countries such as NZ where the level of BIM maturity is relatively low.
Keywords: BIM, New Zealand, AEC sector, building information modelling
Procedia PDF Downloads 518
12396 The Impact of Project-Based Learning on Underrepresented Minority Students
Authors: Shwadhin Sharma
Abstract:
As there has been increasing focus on the shorter attention span of millennial students, there is a relative absence of instructional tools and behavioral assessments for learning information technology skills within the information systems field and its textbooks. This study uses project-based learning, in which students gain knowledge and skills related to information technology by working on an extended project that requires them to find a real business problem, design an information system based on information collected from the company, and develop a system that solves the company's problem. Eighty students from two sections of the same course engaged in the project from the first week of class to the sixteenth week to deliver a small-business information system, allowing them to employ all the skills and knowledge they learned in the class in the systems they created. Computer information systems courses are often difficult to understand and process, especially for underrepresented minority students who have limited academic experience with computers or information systems. Project-based learning demands the constant attention of the students and forces them to apply knowledge learned in the class to a project, which helps retain that knowledge. To test this assumption, we administered a pre-test and a post-test measuring the students' learning of skills through the project. The results showed that almost 90% of the students from the two sections scored higher on the post-test than on the pre-test.
Based on this premise, we conducted a further survey that measured students' job-search preparation, knowledge of data analysis, involvement with the course, satisfaction with the course, overall reaction to the course, and ability to meet the traditional learning goals of the course.
Keywords: project-based learning, job-search preparation, satisfaction with course, traditional learning goals
Procedia PDF Downloads 206
12395 Pilot-Free Image Transmission System of Joint Source Channel Based on Multi-Level Semantic Information
Authors: Linyu Wang, Liguo Qiao, Jianhong Xiang, Hao Xu
Abstract:
In semantic communication, existing pilot-free joint source-channel coding (JSCC) wireless communication systems have unstable transmission performance and cannot effectively capture the global and positional information of images. In this paper, a pilot-free image transmission system based on joint source-channel coding with multi-level semantic information (multi-level JSCC) is proposed. The transmitter of the system is composed of two networks: a feature extraction network extracts the high-level semantic features of the image, compressing the transmitted information and improving bandwidth utilization, while a feature retention network preserves low-level semantic features and image details to improve communication quality. The receiver is also composed of two networks: the received high-level semantic features are fused, in the same dimension, with the low-level semantic features after the latter pass through a feature enhancement network; the image dimensions are then restored through a feature recovery network, and the positional information is effectively used for image reconstruction. This paper verifies that the proposed multi-level JSCC algorithm can effectively transmit and recover image information over both the AWGN channel and the Rayleigh fading channel, and the peak signal-to-noise ratio (PSNR) is improved by 1 to 2 dB compared with other algorithms under the same simulation conditions.
Keywords: deep learning, JSCC, pilot-free picture transmission, multilevel semantic information, robustness
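The PSNR metric used for the comparison above is a standard computation; the array values below are hypothetical and serve only to illustrate the 10·log10(peak²/MSE) formula.

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB between an original image
    and its reconstruction, as used to compare JSCC variants."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

# Hypothetical 8x8 images: a uniform error of 5 gray levels gives MSE = 25.
a = np.full((8, 8), 100.0)
b = a + 5.0
value = psnr(a, b)   # 10 * log10(255^2 / 25) ≈ 34.15 dB
```

A 1 to 2 dB PSNR gain, as reported in the abstract, corresponds to a noticeable reduction in mean squared reconstruction error.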
Procedia PDF Downloads 120
12394 Effect of Some Metal Ions on the Activity of Lipase Produced by Aspergillus niger Cultured on Vitellaria paradoxa Shells
Authors: Abdulhakeem Sulyman, Olukotun Zainab, Hammed Abdulquadri
Abstract:
Lipases (triacylglycerol acyl hydrolases, EC 3.1.1.3) are a class of enzymes that catalyse the hydrolysis of triglycerides to glycerol and free fatty acids. They account for up to 10% of the enzyme market and have a wide range of applications in biofuel production, detergent formulation, leather processing, and the food and feed processing industry. This research studied the effect of some metal ions on the activity of purified lipase produced by Aspergillus niger cultured on Vitellaria paradoxa shells. Purified lipase in 12.5 mM p-NPL was incubated with different metal ions (Zn²⁺, Ca²⁺, Mn²⁺, Fe²⁺, Na⁺, K⁺ and Mg²⁺) at final concentrations of 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9 and 1.0 mM. The results showed that Zn²⁺, Ca²⁺, Mn²⁺ and Fe²⁺ increased lipase activity by up to 3.0-, 3.0-, 1.0-, and 26.0-fold, respectively. Lipase activity was partially inhibited by Na⁺ and Mg²⁺, with up to 88.5% and 83.7% loss of activity, respectively, and by K⁺, with up to a 56.7% loss of activity compared with the control without metal ions. The study concluded that lipase produced by Aspergillus niger cultured on Vitellaria paradoxa shells is activated by Zn²⁺, Ca²⁺, Mn²⁺ and Fe²⁺ and inhibited by Na⁺, K⁺ and Mg²⁺.
Keywords: Aspergillus niger, Vitellaria paradoxa, lipase, metal ions
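The reported fold increases and percentage losses are relative to a metal-free control and reduce to a one-line calculation each; the raw activity values below are hypothetical, chosen only to reproduce the 26-fold (Fe²⁺) and 88.5% (Na⁺) figures.

```python
def fold_change(activity, control):
    """Fold increase in lipase activity relative to the metal-free control."""
    return activity / control

def percent_inhibition(activity, control):
    """Percentage loss of activity relative to the metal-free control."""
    return 100.0 * (control - activity) / control

control = 0.8                                 # hypothetical control activity (U/mL)
fe_fold = fold_change(20.8, control)          # 26-fold, as reported for Fe2+
na_loss = percent_inhibition(0.092, control)  # 88.5 % loss, as reported for Na+
```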
Procedia PDF Downloads 150
12393 Online Monitoring Rheological Property of Polymer Melt during Injection Molding
Authors: Chung-Chih Lin, Chien-Liang Wu
Abstract:
Detecting the state of the polymer melt during the manufacturing process is regarded as an efficient way to control molded part quality in advance. Online monitoring of the rheological properties of the polymer melt during processing provides a way to understand the melt state immediately. Rheological properties reflect the melt state at different processing parameters and are especially important in injection molding. This study proposes an approach that demonstrates how to calculate the rheological properties of a polymer melt through in-process measurement, using injection molding as an example. The system consists of two sensors and a data acquisition module that processes the measured data, which are used to calculate the rheological properties of the melt. The rheological properties discussed in this study are shear rate and viscosity, which are investigated with respect to injection speed and melt temperature. The results show that the effect of injection speed on the rheological properties is apparent, especially at high melt temperatures, and should be considered in precision molding processes.
Keywords: injection molding, melt viscosity, shear rate, monitoring
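A common way to obtain shear rate and viscosity from two in-mold pressure sensors is the slit-channel approximation sketched below; the geometry, the measured values, and the omission of corrections such as the Rabinowitsch correction are assumptions for illustration, not the authors' exact procedure.

```python
def apparent_shear_rate(Q, W, H):
    """Apparent wall shear rate (1/s) in a slit channel of width W (m)
    and gap H (m) for volumetric flow rate Q (m^3/s)."""
    return 6.0 * Q / (W * H ** 2)

def wall_shear_stress(dP, H, L):
    """Wall shear stress (Pa) from the pressure drop dP (Pa) measured
    between two sensors a distance L (m) apart along the slit."""
    return dP * H / (2.0 * L)

def viscosity(dP, Q, W, H, L):
    """Apparent melt viscosity (Pa.s) = wall stress / apparent shear rate."""
    return wall_shear_stress(dP, H, L) / apparent_shear_rate(Q, W, H)

# Hypothetical in-process values: dP = 5 MPa over L = 0.05 m,
# Q = 2e-5 m^3/s, slit W = 0.02 m wide with an H = 0.002 m gap.
eta = viscosity(5e6, 2e-5, 0.02, 0.002, 0.05)
```

Sweeping Q (injection speed) and repeating at several melt temperatures reproduces the kind of study described in the abstract.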
Procedia PDF Downloads 382
12392 A Comprehensive Metamodel of an Urbanized Information System: Experimental Case
Authors: Leila Trabelsi
Abstract:
The urbanization of information systems (IS) is an effective approach to mastering the complexity of an organization. It strengthens the coherence of the IS and aligns it with the business strategy. Moreover, this approach has significant advantages, such as reducing information technology (IT) costs, enhancing the position of the IS in a competitive environment, and ensuring its scalability through the integration of technological innovations. Urbanization is therefore considered a strategic business decision, and embedding it becomes a necessity for improving IS practice. However, there is a lack of experimental cases studying the meta-modelling of an urbanized information system (UIS). This paper proposes a new urbanization meta-model that permits modelling, testing, and taking organizational aspects into consideration. The methodological framework is structured according to two main abstraction levels, a conceptual level and an operational level, and for each of these levels different models are proposed and presented. The proposed model has been empirically tested on a company. The findings present an experimental study of the urbanization meta-model and point out the significant relationships between its dimensions and their evolution.
Keywords: urbanization, information systems, enterprise architecture, meta-model
Procedia PDF Downloads 438
12391 Photogrammetry and Topographic Information for Urban Growth and Change in Amman
Authors: Mahmoud M. S. Albattah
Abstract:
Urbanization results in the expansion of administrative boundaries, mainly at the periphery, ultimately leading to changes in land cover. Agricultural land, naturally vegetated land, and other land types are converted into residential areas with a high density of constructs such as transportation systems and housing. In urban regions of rapid growth and change, urban planners need regular, up-to-date information on ground change. Amman, the capital of Jordan, is growing at unprecedented rates, creating extensive urban landscapes, and planners interact with these changes without having a global view of their impact. The use of aerial photographs and satellite image data, combined with topographic information and field surveys, could provide effective input for developing an inventory of urban change and growth, which could in turn yield a very important signature of built-up area changes.
Keywords: highway design, satellite technologies, remote sensing, GIS, image segmentation, classification
Procedia PDF Downloads 444
12390 Detection of Image Blur and Its Restoration for Image Enhancement
Authors: M. V. Chidananda Murthy, M. Z. Kurian, H. S. Guruprasad
Abstract:
Image restoration in the process of communication is one of the emerging fields in image processing. Motion analysis is the simplest case of detecting motion in an image, and its applications are widespread in areas such as surveillance, remote sensing, the film industry, and the navigation of autonomous vehicles. A scene may contain multiple moving objects; motion analysis techniques can compensate for the blur caused by the movement of these objects by filling in occluded regions and reconstructing transparent objects, and can also remove motion blurring. This paper presents the design and comparison of various motion detection and enhancement filters: the median filter, linear image deconvolution, the inverse filter, the pseudo-inverse filter, the Wiener filter, the Lucy-Richardson filter, and blind deconvolution are used to remove the blur. In this work, different types and different amounts of blur are considered for the analysis. Mean Square Error (MSE) and Peak Signal to Noise Ratio (PSNR) are used to evaluate the performance of the filters. The designed system has been implemented in MATLAB and tested on synthetic and real-time images.
Keywords: image enhancement, motion analysis, motion detection, motion estimation
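Of the filters compared, the Wiener filter admits a compact frequency-domain sketch; the blur kernel, test image, and noise-to-signal constant `K` below are hypothetical illustrative choices, not the paper's settings.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, K=0.01):
    """Frequency-domain Wiener filter: multiply by conj(H) / (|H|^2 + K),
    where H is the FFT of the blur kernel and K approximates the
    noise-to-signal power ratio (here a hand-tuned constant)."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + K)
    return np.real(np.fft.ifft2(G * W))

# Synthetic test: blur an impulse image with a 3x3 box kernel, then restore.
img = np.zeros((32, 32))
img[16, 16] = 1.0
psf = np.ones((3, 3)) / 9.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))
restored = wiener_deconvolve(blurred, psf)
```

For small K the filter approaches the inverse filter; larger K trades sharpness for noise suppression, which is the tuning knob that separates Wiener from pure inverse filtering in the comparison above.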
Procedia PDF Downloads 288
12389 Adopting Collaborative Business Processes to Prevent the Loss of Information in Public Administration Organisations
Authors: A. Capodieci, G. Del Fiore, L. Mainetti
Abstract:
Recently, the use of Web 2.0 tools has increased in companies and public administration organizations. This phenomenon, known as "Enterprise 2.0", has de facto modified common organizational and operative practices, leading "knowledge workers" to change their working practices through the use of Web 2.0 communication tools. Unfortunately, these tools have not been integrated with existing enterprise information systems, a situation that could potentially lead to a loss of information. This is an important problem in an organizational context because knowledge of the information exchanged within the organization is needed to increase its efficiency and competitiveness. In this article, we demonstrate that it is possible to capture this knowledge using collaboration processes, which are processes of abstraction created in accordance with design patterns and applied to new organizational operative practices.
Keywords: business practices, business process patterns, collaboration tools, enterprise 2.0, knowledge workers
Procedia PDF Downloads 359
12388 Efficiency of a Heat-Based, Combustion-Free Technology for the Remediation of Soil Contaminated by Petroleum
Authors: Gavin Hutama Farandiarta, Hegi Adi Prabowo, Istiara Rizqillah Hanifah, Millati Hanifah Saprudin, Raden Iqrafia Ashna
Abstract:
The increase in the rate of petroleum consumption encourages industries to optimize and increase their activity in processing crude oil into petroleum products. Although the result gives many benefits to humans worldwide, it also has a negative impact on the environment. One negative impact of processing crude oil is that the soil becomes contaminated by petroleum sewage sludge. This sludge contains hydrocarbon compounds, quantified as Total Petroleum Hydrocarbon (TPH), and is classified as hazardous and toxic waste. Soil contamination caused by petroleum sludge is very hard to remove; however, contaminated soil can be managed by using heat (thermal desorption) in the remediation process. Several factors affect the success rate of heat-assisted remediation: temperature, time, and the air pressure in the desorption column. Remediation using heat is an alternative for recovering soil from petroleum pollution that is highly effective, cheap, and environmentally friendly, producing uncontaminated soil and petroleum that can be used again.
Keywords: petroleum sewage sludge, remediation soil, thermal desorption, total petroleum hydrocarbon (TPH)
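The success of thermal desorption is typically judged by the drop in TPH, which is a one-line calculation; the concentrations below are hypothetical.

```python
def tph_removal_efficiency(tph_initial, tph_final):
    """Percentage of Total Petroleum Hydrocarbon removed from the soil
    by the heat-based (thermal desorption) treatment."""
    return 100.0 * (tph_initial - tph_final) / tph_initial

# Hypothetical TPH measurements in mg per kg of soil, before and after.
eff = tph_removal_efficiency(50000.0, 2500.0)  # 95.0 % removal
```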
Procedia PDF Downloads 247
12387 A Comprehensive Study of Camouflaged Object Detection Using Deep Learning
Authors: Khalak Bin Khair, Saqib Jahir, Mohammed Ibrahim, Fahad Bin, Debajyoti Karmaker
Abstract:
Object detection is a computer technology that deals with searching digital images and videos for occurrences of semantic elements of a particular class; it is associated with image processing and computer vision. On top of standard object detection, we detect camouflaged objects within an image using deep learning techniques. Deep learning is a subset of machine learning based essentially on multi-layer neural networks. Over 6,500 images that possess camouflage properties were gathered from various internet sources and divided into four categories to compare the results. The images were labeled and then trained and tested using the VGG16 architecture in a Jupyter notebook on the TensorFlow platform. The architecture was further customized using transfer learning, which provides methods for transferring information from one or more source tasks to improve learning in a related target task. The purpose of these transfer learning methodologies is to aid the evolution of machine learning toward the point where it is as efficient as human learning.
Keywords: deep learning, transfer learning, TensorFlow, camouflage, object detection, architecture, accuracy, model, VGG16
Procedia PDF Downloads 149
12386 Tsada-MobiMinder: A Location Based Alarm Mobile Reminder
Authors: Marylene S. Eder
Abstract:
Existing location-based alarm applications are unable to give the user directions to a specified destination and do not display scenic spots between the current location and the destination. To address this problem, a location-based alarm mobile reminder was developed. The application is implemented on Android-based smartphones to provide services such as routing information and help in finding nearby hotels, restaurants, and scenic spots, offering mobile users many advantages in retrieving information about their current location and processing that data into more useful information about their surroundings. It reminds the user when the user enters some predefined location. All the user needs is a mobile phone running Android 4.0 or above; the user can then select a destination and find it in the application. The main objective of the project is to develop a location-based application that provides tourists with real-time information on scenic spots and raises an alarm at a specified destination. This mobile application service will act as an assistant for frequent travelers visiting new places around the city.
Keywords: location based alarm, mobile application, mobile reminder, tourist's spots
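The trigger at the heart of such a reminder reduces to a geofence test on the great-circle distance between the current GPS fix and the destination; the radius and coordinates below are hypothetical, not taken from the application.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes in kilometres."""
    p1, p2 = radians(lat1), radians(lat2)
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def should_alarm(current, destination, radius_km=0.5):
    """Fire the reminder once the user enters the predefined radius
    (the geofence) around the destination."""
    return haversine_km(*current, *destination) <= radius_km

# Hypothetical coordinates: two fixes a few hundred metres apart.
fired = should_alarm((8.4800, 124.6470), (8.4822, 124.6472))
```

In the app this check would run on each location update from the Android location service.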
Procedia PDF Downloads 382
12385 Analysis of Secondary School Students' Perceptions about Information Technologies through a Word Association Test
Authors: Fetah Eren, Ismail Sahin, Ismail Celik, Ahmet Oguz Akturk
Abstract:
The aim of this study is to discover secondary school students' perceptions of information technologies and the connections between concepts in their cognitive structures. A word association test consisting of six concepts related to information technologies was used to collect data from 244 secondary school students. Concept maps presenting the students' cognitive structures were drawn with the help of frequency data, and the data were analyzed and interpreted according to the connections obtained from the concept maps. Of the given concepts, students associated most with computer, Internet, and communication, and least with computer-assisted education and information technologies. These results show that the concepts Internet, communication, and computer are an important part of students' cognitive structures. In addition, students most often answered with computer, phone, game, Internet, and Facebook as the key concepts, which shows that students regard information technologies as a means of entertainment and a free-time activity, not as a means of education.
Keywords: word association test, cognitive structure, information technology, secondary school
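The frequency analysis behind such concept maps can be sketched in a few lines; the responses and the cut-off of two are hypothetical, not the study's data.

```python
from collections import Counter

# Hypothetical responses to one stimulus concept from a word
# association test (illustrative only, not the study's raw data).
responses = ["computer", "phone", "game", "Facebook", "computer",
             "communication", "game", "computer"]

def association_frequencies(words):
    """Frequency of each response word; in a word association study,
    these counts determine which links are drawn on the concept map
    and at which cut-off level."""
    return Counter(words)

freq = association_frequencies(responses)
strong_links = [w for w, c in freq.items() if c >= 2]  # links above cut-off
```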
Procedia PDF Downloads 413
12384 Deep Learning Based Road Crack Detection on an Embedded Platform
Authors: Nurhak Altın, Ayhan Kucukmanisa, Oguzhan Urhan
Abstract:
It is important for traffic safety that highways are in good condition, as road defects (cracks, erosion of lane markings, etc.) can cause accidents by affecting driving. Image processing based methods for detecting road cracks are available in the literature. In this paper, a deep learning based road crack detection approach is proposed, with YOLO (You Only Look Once) adopted as its core component. The YOLO network, originally developed for general object detection, is trained on road crack images as a new class not previously used in YOLO. The performance of the proposed method is compared across different training methods: training from randomly initialized weights and fine-tuning pre-trained weights (transfer learning). A similar training approach is applied to the simplified version of the YOLO network model (Tiny YOLO), and its performance is examined. The developed system is able to process 8 fps on an NVIDIA Jetson TX1 development kit.
Keywords: deep learning, embedded platform, real-time processing, road crack detection
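A YOLO-style detector's raw output is usually post-processed by confidence thresholding and non-maximum suppression before the crack boxes are reported; the thresholds and boxes below are typical illustrative values, not the paper's settings.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def non_max_suppression(detections, conf_thr=0.5, iou_thr=0.45):
    """Drop low-confidence crack detections, then greedily keep the
    highest-scoring box and discard boxes that overlap it too much."""
    dets = sorted((d for d in detections if d["conf"] >= conf_thr),
                  key=lambda d: d["conf"], reverse=True)
    kept = []
    for d in dets:
        if all(iou(d["box"], k["box"]) < iou_thr for k in kept):
            kept.append(d)
    return kept

# Hypothetical raw detections for one frame.
dets = [
    {"box": (0, 0, 10, 10), "conf": 0.9},
    {"box": (1, 1, 11, 11), "conf": 0.8},    # overlaps the first box
    {"box": (50, 50, 60, 60), "conf": 0.7},  # a separate crack
    {"box": (0, 0, 5, 5), "conf": 0.3},      # below the confidence threshold
]
kept = non_max_suppression(dets)
```

On an embedded target like the Jetson TX1, this post-processing cost is negligible compared with the network's forward pass.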
Procedia PDF Downloads 339
12383 Exploring the Intersection Between the General Data Protection Regulation and the Artificial Intelligence Act
Authors: Maria Jędrzejczak, Patryk Pieniążek
Abstract:
The European legal reality is on the eve of significant change. In European Union law, there is talk of a “fourth industrial revolution” driven by massive data resources linked to powerful algorithms and powerful computing capacity. This is closely linked to technological developments in the area of artificial intelligence, which have prompted analyses covering the legal environment as well as the economic and social impact, also from an ethical perspective. The discussion on the regulation of artificial intelligence is one of the most serious yet widely held at both the European Union and Member State level. The literature expects legal solutions to guarantee security for fundamental rights, including privacy, in artificial intelligence systems. There is no doubt that personal data have been increasingly processed in recent years; it would be impossible for artificial intelligence to function without processing large amounts of data, both personal and non-personal. The main driving force behind the current development of artificial intelligence is advances in computing, but also the increasing availability of data. High-quality data are crucial to the effectiveness of many artificial intelligence systems, particularly those using techniques involving model training. The use of computers and artificial intelligence technology increases the speed and efficiency of the actions taken, but also creates security risks of an unprecedented magnitude for the data processed. The proposed regulation in the field of artificial intelligence therefore requires analysis of its impact on the personal data protection regulation: it is necessary to determine the mutual relationship between these regulations and which areas of the personal data protection regulation are particularly important for the processing of personal data in artificial intelligence systems.
The adopted axis of consideration is a preliminary assessment of two issues: 1) which principles of data protection should be applied during the processing of personal data in artificial intelligence systems, and 2) how liability for personal data breaches in such systems should be regulated. The need to change the regulations regarding the rights and obligations of data subjects and of entities processing personal data cannot be excluded; in particular, changes may be required in the provisions on the assignment of liability for breaches of personal data processed in artificial intelligence systems. The research process concerns the identification of areas of personal data protection that are particularly important, and may require re-regulation, due to the introduction of the proposed legal regulation on artificial intelligence. The main question the authors want to answer is how the European Union's regulation of data protection breaches in artificial intelligence systems is taking shape. The answer will include examples illustrating the practical implications of these legal regulations.
Keywords: data protection law, personal data, AI law, personal data breach
Procedia PDF Downloads 65
12382 Software-Defined Networks in Utility Power Networks
Authors: Ava Salmanpour, Hanieh Saeedi, Payam Rouhi, Elahe Hamzeil, Shima Alimohammadi, Siamak Hossein Khalaj, Mohammad Asadian
Abstract:
A software-defined network (SDN) is a network architecture designed to control the network centrally using software applications. This ability enables remote control of the whole network regardless of the underlying network technology: network intelligence is separated from the physical infrastructure, meaning that required network components can be implemented virtually using software applications. Today, power networks are characterized by a high degree of complexity, with a large number of intelligent devices processing both huge amounts of data and important information. Therefore, reliable and secure communication networks are required, and SDNs are the best choice to meet this need. In this paper, the capabilities and characteristics of SDNs are reviewed and different basic controllers are compared. The importance of using SDNs to increase efficiency and reliability in utility power networks is discussed, and SDN-based power networks are compared with traditional networks.
Keywords: software-defined network, SDNs, utility network, open flow, communication, gas and electricity, controller
Procedia PDF Downloads 113
12381 Gender Analysis of the Influence of Sources of Information on the Adoption of Tenera Oil Palm Technology among Smallholder Farmers in Edo State, Nigeria
Authors: Cornelius Michael Ekenta
Abstract:
This research made a gender-comparative analysis of the influence of sources of information on the adoption of the improved tenera oil palm variety. Purposive, stratified, and random sampling techniques were used to sample a total of 292 farmers (155 males and 137 females) for the study. A structured questionnaire was used to obtain the primary data, which were analyzed with descriptive statistics and logit regression analysis. Findings revealed that radio, the extension office, television, and farmers' groups were the information sources most preferred by both male and female farmers. Males perceived information from radio (92%) and farmers' groups (84%) as available and information from research institutes as credible (95%), while females perceived information from research institutes as reliable (70%). The study showed that 38% of the men and 25% of the women adopted the variety, for an overall adoption rate of 32% in the study area. Logit regression analysis indicated that radio, the extension office, television, farmers' groups, and research institutes were significant at the 0.5% level of probability for both male and female farmers. The study concluded that adoption of the improved tenera oil palm variety was low among both male and female farmers, though men adopted more than women. It was therefore recommended that the Agricultural Development Programmes (ADPs) in other states of the country partner with their state radio and television stations to broadcast agricultural programmes periodically, to ensure efficient dissemination of agricultural information to farmers.
Keywords: analysis, Edo, gender, influence, information, sources, tenera
Procedia PDF Downloads 112
12380 An ERP Study of Chinese Pseudo-Object Structures
Authors: Changyin Zhou
Abstract:
Verb-argument relations are an important aspect of syntax-semantics interaction in sentence processing. Previous ERP (event-related potential) studies in this field have mainly concentrated on the relation between the verb and its core arguments. The present study aims to reveal the ERP pattern of Chinese pseudo-object structures (SOSs), in which a peripheral argument is promoted to occupy the position of the patient object, as compared with patient object structures (POSs). The ERP data were collected while participants performed acceptability judgments on Chinese phrases. Our results show that, similar to previous studies of number-of-argument violations, Chinese SOSs elicit a bilaterally distributed N400 effect. But unlike all previous studies of verb-argument relations, Chinese SOSs also elicit a sustained anterior positivity (SAP). This SAP, the first such effect reported in relation to the complexity of argument structure operations, reflects the integration difficulty of the newly promoted arguments and the progressive nature of well-formedness checking in the processing of Chinese SOSs.
Keywords: Chinese pseudo-object structures, ERP, sustained anterior positivity, verb-argument relation
Procedia PDF Downloads 434
12379 Personalization of Context Information Retrieval Model via User Search Behaviours for Ranking Document Relevance
Authors: Kehinde Agbele, Longe Olumide, Daniel Ekong, Dele Seluwa, Akintoye Onamade
Abstract:
A major problem with most existing information retrieval systems (IRS) is that they provide the same access and retrieval results to every user, based solely on the query terms issued to the system. When using an IRS, users often present search queries made of ad-hoc keywords, and it is then up to the IRS to obtain a precise representation of the user's information need and the context of that information. Meanwhile, the volume and range of Internet documents are growing exponentially, which makes it difficult for a user to obtain information that precisely matches his or her interest. This is due, firstly, to the fact that users often do not present queries that optimally represent the information they want, and secondly, to the fact that the measure of a document's relevance is highly subjective across users. In this paper, we address the problem by investigating the optimization of an IRS for individual information needs, in order of relevance, through the development of algorithms that optimize the ranking of retrieved documents. The paper takes a two-fold approach to retrieving domain-specific documents. The first is the design of the context of information: the context of a query determines the relevance of retrieved information using personalization and context-awareness, so executing the same query in diverse contexts often leads to diverse result rankings reflecting user preferences. The second is incorporating the relevant context aspects in a way that supports the knowledge domain representing users' interests. Evolutionary algorithms are incorporated to improve the effectiveness of the IRS. A context-based information retrieval system that learns individual needs from user-provided relevance feedback is developed, and its retrieval effectiveness is evaluated using precision and recall metrics. The results demonstrate how attributes of user interaction behavior can be used to improve IR effectiveness.
Keywords: context, document relevance, information retrieval, personalization, user search behaviors
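The precision and recall metrics used for the evaluation are simple set operations over the retrieved and relevant document sets. A minimal sketch (document identifiers are made up for illustration):

```python
def precision_recall(retrieved, relevant):
    """Precision and recall for one query.
    retrieved: documents the system returned; relevant: ground-truth relevant set."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)  # relevant documents actually returned
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical run: the system returns 4 documents, 3 of which are relevant,
# out of 6 relevant documents overall.
p, r = precision_recall(["d1", "d2", "d3", "d7"],
                        ["d1", "d2", "d3", "d4", "d5", "d6"])
print(p, r)  # 0.75 0.5
```

A personalized ranking is judged better when it raises both numbers for the same user across queries.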
Procedia PDF Downloads 463
12378 Re-Defining Academic Literacy: An Information Literacy Approach to Helping Chinese International Students Succeed in American Colleges
Authors: Yi Ding
Abstract:
With the upsurge of Chinese international students in American higher education, the serious academic problems these students face have become striking. While most practice and research in higher education focus on the role of professors, writing centers, and tutoring centers in helping international students succeed in college, this study focuses on a more fundamental skill neglected in most of these conversations: information literacy, which is usually addressed by academic librarians. Transitioning from an East Asian educational system that values authority and set knowledge over independent thinking and scholarly conversation, Chinese international students need support from academic librarians to acquire information literacy, which is crucial for understanding the expectations of a Western academic setting and thus for succeeding in college. This study illustrates how academic librarians can play an integral role in helping Chinese international students acclimate to the expectations of American higher education by teaching information literacy as an academic literacy unique to the Western academic setting. The six frames of information literacy put forward by the Association of College and Research Libraries, namely 'Authority Is Constructed and Contextual', 'Information Creation as a Process', 'Information Has Value', 'Research as Inquiry', 'Scholarship as Conversation', and 'Searching as Strategic Exploration', are analyzed through the lens of the Chinese educational system and students' backgrounds. Based on this analysis, as well as on results from surveys and interviews with academic librarians, professors, and international students, the research further examines current practices in a wide range of academic libraries and, finally, provides evidence-based recommendations for academic librarians to use information literacy instruction to help Chinese international students succeed in American higher education.
Keywords: academic librarians, Chinese international students, information literacy, student success
Procedia PDF Downloads 244
12377 Broadband Ultrasonic and Rheological Characterization of Liquids Using Longitudinal Waves
Authors: M. Abderrahmane Mograne, Didier Laux, Jean-Yves Ferrandis
Abstract:
Rheological characterization of complex liquids such as polymer solutions is of great scientific interest in many fields, including biology, the food industry, and chemistry. In order to establish master curves (elastic moduli vs. frequency), which can give information about microstructure, classical rheometers or viscometers (such as Couette systems) are used. For broadband characterization of a sample, temperature is varied over a very large range, leading to equivalent frequency shifts via the time-temperature superposition principle. For many liquids undergoing phase transitions, however, this approach is not applicable, which is why the development of broadband spectroscopic methods around room temperature has become a major concern. Many solutions have been proposed in the literature but, to our knowledge, there is no experimental bench giving a complete rheological characterization from a few Hz (hertz) to many MHz (megahertz). Consequently, our goal is to investigate, nondestructively and over a very broad frequency band (a few Hz to hundreds of MHz), rheological properties using longitudinal ultrasonic waves (L waves), a single experimental bench, and a specific container for the liquid: a test tube. More specifically, we aim to estimate the three viscosities (longitudinal, shear, and bulk) and the complex elastic moduli M*, G*, and K* (longitudinal, shear, and bulk moduli, respectively). We use only L waves, conditioned in two ways: bulk L waves in the liquid or guided L waves in the test tube walls. In this paper, we first present results for very low frequencies, obtained by ultrasonic tracking of a ball falling in the test tube. This leads to an estimation of shear viscosity from a few mPa·s to a few Pa·s (pascal seconds). Corrections due to the small dimensions of the tube are applied and discussed with regard to the size of the falling ball.
Then, the use of bulk L wave propagation in the liquid and the development of specific signal processing to assess longitudinal velocity and attenuation lead to the evaluation of longitudinal viscosity in the MHz frequency range. Finally, first results concerning the propagation, generation, and processing of guided compressional waves in the test tube walls are discussed. All these approaches and results are compared to standard methods available and already validated in our laboratory.
Keywords: nondestructive measurement for liquid, piezoelectric transducer, ultrasonic longitudinal waves, viscosities
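The falling-ball estimate of shear viscosity rests on Stokes' law, η = 2r²(ρₛ − ρₗ)g / (9v); a common first-order wall correction for a ball of diameter d falling on the axis of a tube of diameter D multiplies the measured velocity by (1 + 2.104·d/D) (the Ladenburg correction). A minimal sketch with illustrative numbers, not the authors' actual correction scheme or data:

```python
def shear_viscosity(radius, rho_sphere, rho_liquid, v_measured, tube_diameter,
                    g=9.81):
    """Stokes falling-ball viscosity with a first-order (Ladenburg) wall
    correction for the finite tube diameter. SI units throughout."""
    d = 2.0 * radius
    # Infinite-medium terminal velocity recovered from the measured one
    v_infinite = v_measured * (1.0 + 2.104 * d / tube_diameter)
    return 2.0 * radius**2 * (rho_sphere - rho_liquid) * g / (9.0 * v_infinite)

# Illustrative: 1 mm-radius steel ball (7800 kg/m^3) in a glycerol-like liquid
# (1260 kg/m^3), falling at 13 mm/s in a 16 mm-diameter tube.
eta = shear_viscosity(1e-3, 7800.0, 1260.0, 13e-3, 16e-3)
print(round(eta, 3))  # shear viscosity in Pa.s, of order 1
```

Stokes' law assumes creeping flow, so in practice one would also check that the Reynolds number of the falling ball stays well below 1.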
Procedia PDF Downloads 265
12376 Thermo-Mechanical Processing Scheme to Obtain Micro-Duplex Structure Favoring Superplasticity in an As-Cast and Homogenized Medium Alloyed Nickel Base Superalloy
Authors: K. Sahithya, I. Balasundar, Pritapant, T. Raghua
Abstract:
A Ni-based superalloy with nominal composition Ni-14% Cr-11% Co-5.8% Mo-2.4% Ti-2.4% Nb-2.8% Al-0.26% Fe-0.032% Si-0.069% C (all in wt%) is used for turbine discs in a variety of aero engines. Like any other superalloy, the primary processing of the as-cast material poses a major challenge due to its complex alloy chemistry. This challenge was circumvented by characterizing the different phases present in the material, optimizing the homogenization treatment, and identifying a suitable thermomechanical processing window using dynamic materials modeling. The as-cast material was homogenized at 1200°C for a soaking period of 8 hours and quenched using different media. Water quenching (WQ) after homogenization resulted in very fine spherical γ′ precipitates 30-50 nm in size, whereas furnace cooling (FC) resulted in a bimodal distribution of precipitates (primary gamma prime of about 300 nm and secondary gamma prime of 5-10 nm). MC-type primary carbides, stable up to the melting point of the material, were found in both WQ and FC samples. The deformation behaviour of both materials below (1000-1100°C) and above (1100-1175°C) the gamma prime solvus was evaluated by subjecting them to a series of compression tests at different constant true strain rates (0.0001/s to 1/s). A detailed TEM examination of the precipitate-dislocation interaction mechanisms revealed precipitate shearing and Orowan looping as the mechanisms governing deformation in WQ and FC, respectively. Incoherent/semi-coherent gamma prime precipitates in the FC material facilitate better workability, whereas the coherent precipitates in the WQ material contribute to higher resistance to deformation. Both materials exhibited discontinuous dynamic recrystallization (DDRX) above the gamma prime solvus temperature, with slower recrystallization kinetics in the WQ material: very fine grain boundary carbides (≤ 300 nm) retarded the recrystallization kinetics in WQ, while coarse carbides (1-5 µm) facilitated particle-stimulated nucleation in FC. The FC material was cogged (primary hot working) at 1120°C and 0.03/s, resulting in significant grain refinement from 3000 μm to 100 μm. The primary processed material was then subjected to intensive thermomechanical deformation, reducing the temperature by 50°C in each processing step, with intermittent heterogenization treatments at selected temperatures aimed at simultaneously coarsening the gamma prime precipitates and refining the gamma matrix grains. The heterogeneous annealing treatment resulted in gamma grains of 10 μm and gamma prime precipitates of 1-2 μm. Further thermomechanical processing at 1025°C increased the homogeneity of the obtained micro-duplex structure.
Keywords: superalloys, dynamic material modeling, nickel alloys, dynamic recrystallization, superplasticity
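The retarding effect of fine grain-boundary carbides on grain growth and recrystallization is commonly rationalized through Zener pinning, where one classical form of the limiting grain size is d_z = (4/3)·r/f for particle radius r and volume fraction f. The abstract does not report carbide volume fractions, so the numbers below are purely assumed, back-of-envelope values:

```python
def zener_limit_grain_size(particle_radius_um, volume_fraction):
    """Classical Zener limiting grain size d_z = (4/3) * r / f, one common form
    of the pinning relation. Input radius in micrometres, output in micrometres."""
    return (4.0 / 3.0) * particle_radius_um / volume_fraction

# Assumed values: 0.15 um carbides at 1% volume fraction (WQ-like fine carbides)
# versus 2 um carbides at the same 1% fraction (FC-like coarse carbides).
fine = zener_limit_grain_size(0.15, 0.01)   # strong pinning -> small limiting size
coarse = zener_limit_grain_size(2.0, 0.01)  # weak pinning -> large limiting size
print(fine < coarse)
```

The comparison only illustrates the scaling: at equal volume fraction, finer particles pin boundaries to a much smaller limiting grain size, consistent with the slower recrystallization kinetics observed in the WQ material.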
Procedia PDF Downloads 121
12375 Functional Neural Network for Decision Processing: A Racing Network of Programmable Neurons Where the Operating Model Is the Network Itself
Authors: Frederic Jumelle, Kelvin So, Didan Deng
Abstract:
In this paper, we introduce a model of artificial general intelligence (AGI), the functional neural network (FNN), for modeling human decision-making processes. The FNN is composed of multiple artificial mirror neurons (AMNs) racing in the network. Each AMN has a similar structure, programmed independently by the users, and is composed of an intention wheel, a motor core, and a sensory core racing at a specific velocity. The mathematics of the node's formulation and the racing mechanism of multiple nodes in the network are discussed, and the group decision process with fuzzy logic and the transformation of these conceptual methods into practical methods of simulation and operation are developed. Finally, we describe some possible future research directions in the fields of finance, education, and medicine, including the opportunity to design an intelligent learning agent with applications in AGI. We believe that the FNN has promising potential to transform the way we compute decision-making and to lead to a new generation of AI chips for seamless human-machine interactions (HMI).
Keywords: neural computing, human-machine interaction, artificial general intelligence, decision processing
Procedia PDF Downloads 125
12374 Information Communication Technology (ICT) Using Management in Nursing College under the Praboromarajchanok Institute
Authors: Suphaphon Udomluck, Pannathorn Chachvarat
Abstract:
Information and communication technology (ICT) management is essential for effective decision-making in an organization. The Concerns-Based Adoption Model (CBAM) was employed as the conceptual framework. The purpose of the study was to assess the state of ICT management in the colleges of nursing under the Praboromarajchanok Institute. The sample was a multi-stage sample drawn from 10 participating colleges of nursing and included directors, vice directors, heads of learning groups, teachers, system administrators, and staff responsible for ICT, for a total of 280 participants. The instrument was a questionnaire consisting of four parts: general information, ICT management, the Stages of Concern (SoC) questionnaire, and the Levels of Use (LoU) of ICT questionnaire. Reliability was tested; the alpha coefficients were 0.967 for ICT management, 0.884 for SoC, and 0.945 for LoU. The data were analyzed by frequency, percentage, mean, standard deviation, Pearson product-moment correlation, and multiple regression. The findings were as follows. The overall score for ICT management was at a high level, and its components were administration, hardware, software, and peopleware. The overall SoC of ICT score was at a high level, and the overall LoU of ICT score was at a moderate level. ICT management had a positive relationship with both the SoC of ICT and the LoU of ICT (p < .01). Multiple regression revealed that administration, hardware, software, and peopleware could predict the SoC of ICT (18.5%) and the LoU of ICT (20.8%). The factor that significantly influenced SoC was peopleware; the factors that significantly influenced the LoU of ICT were administration, hardware, and peopleware.
Keywords: information communication technology (ICT), management, the concerns-based adoption model (CBAM), stage of concern (SoC), the levels of use (LoU)
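Reliability coefficients like the 0.967 reported above are Cronbach's alpha values, computed from the item variances and the variance of the summed scores. A minimal sketch (the item responses below are made up, not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a questionnaire scale.
    items: one list of scores per item, all over the same participants."""
    k = len(items)
    n = len(items[0])

    def variance(xs):  # population variance, as is conventional for alpha
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]  # per-person sums
    item_var_sum = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Hypothetical responses of 4 participants to 3 questionnaire items (1-5 scale)
items = [[4, 5, 3, 2], [4, 4, 3, 2], [5, 5, 3, 1]]
alpha = cronbach_alpha(items)
print(round(alpha, 3))  # high alpha: the three items move together
```

Values above roughly 0.7 are conventionally taken as acceptable internal consistency, which the coefficients reported in the abstract comfortably exceed.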
Procedia PDF Downloads 318
12373 Artificial Intelligence for Safety Related Aviation Incident and Accident Investigation Scenarios
Authors: Bernabeo R. Alberto
Abstract:
With the tremendous improvements in the processing power of computers, artificial intelligence will increasingly be used in aviation, making autonomous flight, preventive maintenance, air traffic management (ATM) optimization, and the training of pilots, cabin crew, ground staff, and airport staff possible in a cheaper, less time-consuming, and less polluting way. Through the use of artificial intelligence, we foresee an interviewing scenario in which the interviewee interacts with an artificial intelligence tool that contextualizes the character and the necessary information in a way that aligns reasonably with the character and the scenario. We are creating simulated scenarios connected with aviation incidents or accidents to enhance the training of future accident/incident investigators, integrating artificial intelligence and augmented reality tools. The project's goal is to improve the learning and teaching scenario through academic and professional expertise in aviation and in the field of artificial intelligence. We thus intend to contribute to the needed high innovation capacity and to the development and management of artificial intelligence skills and training, supported by appropriate regulations and attention to ethical problems.
Keywords: artificial intelligence, aviation accident, aviation incident, risk, safety
Procedia PDF Downloads 22
12372 The Role of Social Media on Political Behaviour in Malaysia
Authors: Ismail Sualman, Mohd Khairuddin Othman
Abstract:
General elections are the backbone of democracy, permitting people to choose their representatives as they deem fit. The support preferences of voters differ from one to another, particularly in a plural society like Malaysia. The high turnout of young voters during Malaysia's 14th General Election has been attributed to social media, including Facebook, Twitter, WhatsApp, Instagram, YouTube, Telegram, WeChat, and SMS/MMS. It has been observed that, besides serving as an interaction tool among friends, social media is also an important source of information about issues, politics, and politicians. This paper examines the role of social media in providing political information to young voters before an election and during the election campaign, and how this information is translated into electoral support. A total of 799 young Malay respondents in Selangor were surveyed and interviewed. The study revealed that social media has become the source of political information among young Malay voters and suggested that it had a significant effect on support during the election. Social media plays an important role in carrying information such as current issues, voting trends, candidate imagery, and matters that may influence the views of young voters; the information obtained from social media is translated into a voting decision.
Keywords: social media, political behaviour, voters' choice, election
Procedia PDF Downloads 146
12371 Using Analytical Hierarchy Process and TOPSIS Approaches in Designing a Finite Element Analysis Automation Program
Authors: Ming Wen, Nasim Nezamoddini
Abstract:
Sophisticated numerical simulations like finite element analysis (FEA) involve a complicated process, from model setup to post-processing tasks, that requires replicating time-consuming steps. Utilizing an FEA automation program simplifies the complexity of the involved steps while minimizing human errors in analysis setup, calculations, and results processing. One of the main challenges in designing FEA automation programs is to identify user requirements and link them to possible design alternatives. This paper presents a decision-making framework for designing a Python-based FEA automation program for modal analysis, frequency response analysis, and random vibration fatigue (RVF) analysis procedures. The analytical hierarchy process (AHP) and the technique for order preference by similarity to ideal solution (TOPSIS) are applied to evaluate design alternatives, considering feedback received from experts and program users.
Keywords: finite element analysis, FEA, random vibration fatigue, process automation, analytical hierarchy process, AHP, TOPSIS, multiple-criteria decision-making, MCDM
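The AHP-plus-TOPSIS pipeline described above can be sketched compactly: AHP turns expert pairwise judgments into criterion weights (here via the row geometric-mean approximation rather than the full eigenvector method), and TOPSIS ranks alternatives by closeness to the ideal solution. The criteria, pairwise judgments, and design scores below are illustrative assumptions, not the paper's data:

```python
def prod(xs):
    p = 1.0
    for x in xs:
        p *= x
    return p

def ahp_weights(pairwise):
    """Approximate AHP priority vector via normalized row geometric means."""
    gms = [prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gms)
    return [g / total for g in gms]

def topsis(matrix, weights, benefit):
    """Closeness of each alternative to the ideal solution (higher is better).
    benefit[j] is True for benefit criteria, False for cost criteria."""
    n_crit = len(weights)
    # vector-normalize each column, then apply the AHP weights
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]

    def dist(row, ref):
        return sum((a - b) ** 2 for a, b in zip(row, ref)) ** 0.5

    return [dist(r, anti) / (dist(r, anti) + dist(r, ideal)) for r in v]

# Three hypothetical criteria: ease of use, run time, maintenance effort.
pairwise = [[1, 3, 5],        # ease of use judged 3x and 5x more important
            [1 / 3, 1, 2],
            [1 / 5, 1 / 2, 1]]
w = ahp_weights(pairwise)

# Two hypothetical program designs scored on those criteria
# (run time and maintenance effort are cost criteria: lower is better).
designs = [[8, 120, 3], [6, 60, 2]]
scores = topsis(designs, w, benefit=[True, False, False])
best = max(range(len(scores)), key=scores.__getitem__)
print(best)  # index of the preferred design
```

With these assumed judgments, the heavy weight on ease of use lets the first design win despite its worse run time; in a full AHP study one would also compute the consistency ratio of the pairwise matrix before trusting the weights.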
Procedia PDF Downloads 112