Search results for: computer generated music
3928 An Analytical Approach of Computational Complexity for the Method of Multifluid Modelling
Authors: A. K. Borah, A. K. Singh
Abstract:
In this paper, we deal with the building blocks of the computer simulation of multiphase flows. The whole simulation procedure can be viewed as two super-procedures: the implementation of the VOF method and the solution of the Navier-Stokes equations. Moreover, a sequential code for a Navier-Stokes solver has been studied.
Keywords: bi-conjugate gradient stabilized (Bi-CGSTAB), ILUT function, Krylov subspace, multifluid flows, preconditioner, SIMPLE algorithm
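To make the solver components named in the keywords concrete, the sketch below (a minimal illustration, not the authors' code) solves a toy sparse system with SciPy's Bi-CGSTAB routine preconditioned by an incomplete LU factorization standing in for the ILUT function; the matrix is a generic tridiagonal operator, not the discretized Navier-Stokes system.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import bicgstab, spilu, LinearOperator

# Toy sparse system standing in for one linearized step of a Navier-Stokes solver:
# a 1-D Poisson-type operator on n nodes (tridiagonal, diagonally dominant).
n = 200
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Incomplete LU factorization used as a preconditioner (the drop tolerance
# mimics the behaviour of an ILUT-style factorization).
ilu = spilu(A, drop_tol=1e-4, fill_factor=10)
M = LinearOperator((n, n), matvec=ilu.solve)

# Preconditioned Bi-CGSTAB iteration in the Krylov subspace of A.
x, info = bicgstab(A, b, M=M)
print("converged" if info == 0 else f"info={info}", np.linalg.norm(A @ x - b))
```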
Procedia PDF Downloads 528
3927 A Clustering-Based Approach for Weblog Data Cleaning
Authors: Amine Ganibardi, Cherif Arab Ali
Abstract:
This paper addresses the data cleaning issue as a part of web usage data preprocessing within the scope of Web Usage Mining. Weblog data recorded by web servers within log files reflect usage activity, i.e., end-users’ clicks and underlying user-agents’ hits. As Web Usage Mining is interested in end-users’ behavior, user-agents’ hits are referred to as noise to be cleaned off before mining. Filtering hits from clicks is not trivial for two reasons: (i) a server records requests interlaced in sequential order regardless of their source or type, and (ii) website resources may be set up as requestable interchangeably by end-users and user-agents. The current methods are content-centric, based on filtering heuristics of relevant/irrelevant items in terms of some cleaning attributes, i.e., website resources’ filetype extensions, website resources pointed to by hyperlinks/URIs, HTTP methods, user-agents, etc. These methods need exhaustive extra-weblog data and prior knowledge of the relevant and/or irrelevant items to be assumed as clicks or hits within the filtering heuristics. Such methods are not appropriate for the dynamic/responsive Web for three reasons: (i) resources may be set up as clickable by end-users regardless of their type, (ii) website resources are indexed by frame names without filetype extensions, and (iii) web contents are generated and cancelled differently from one end-user to another. In order to overcome these constraints, a clustering-based cleaning method centered on the logging structure is proposed. This method focuses on the statistical properties of the logging structure at the requested- and referring-resources attribute levels. It is insensitive to logging content and does not need extra-weblog data. The statistical property used captures the structure of the logging records generated by webpage requests in terms of clicks and hits. Since a webpage consists of a single URI and several components, this feature results in a single-click-to-multiple-hits ratio in terms of the requested and referring resources. Thus, the clustering-based method is meant to identify two clusters based on the application of an appropriate distance to the frequency matrix at the requested- and referring-resources levels. As the clicks-to-hits ratio is single to multiple, the clicks cluster is the smaller one in number of requests. Hierarchical agglomerative clustering based on a pairwise distance (Gower) and average linkage has been applied to four logfiles of dynamic/responsive websites whose clicks-to-hits ratios range from 1/2 to 1/15. The optimal clustering, selected on the basis of average linkage and maximum inter-cluster inertia, always results in two clusters. The evaluation of the smallest cluster, referred to as the clicks cluster, in terms of confusion matrix indicators yields a 97% true positive rate. The content-centric cleaning methods, i.e., conventional and advanced cleaning, resulted in a lower rate of 91%. Thus, the proposed clustering-based cleaning outperforms the content-centric methods for dynamic and responsive web design without the need for any extra-weblog data. Such an improvement in cleaning quality is likely to refine dependent analyses.
Keywords: clustering approach, data cleaning, data preprocessing, weblog data, web usage data
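A minimal sketch of the clustering step described above, assuming a small toy frequency matrix and a Euclidean metric in place of the Gower distance over the real weblog attributes: hierarchical agglomerative clustering with average linkage is applied, the dendrogram is cut into two clusters, and the smaller cluster is taken as the clicks cluster.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Toy frequency matrix: one row per logged request, columns are frequencies of the
# requested and referring resources (stand-ins for the real weblog attributes).
X = np.array([
    [1, 1], [1, 2],                              # low-frequency rows -> expected clicks
    [8, 9], [9, 8], [10, 9], [9, 10], [8, 8],    # high-frequency rows -> hits
], dtype=float)

# Pairwise distances (Euclidean here; Gower would be used for mixed attributes),
# then hierarchical agglomerative clustering with average linkage.
D = pdist(X, metric="euclidean")
Z = linkage(D, method="average")
labels = fcluster(Z, t=2, criterion="maxclust")   # cut the dendrogram into 2 clusters

# The clicks cluster is identified as the smaller one (single click vs. multiple hits).
sizes = {lab: int(np.sum(labels == lab)) for lab in set(labels)}
clicks_label = min(sizes, key=sizes.get)
print("cluster sizes:", sizes, "-> clicks cluster:", clicks_label)
```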
Procedia PDF Downloads 170
3926 A High-Throughput Enzyme Screening Method Using Broadband Coherent Anti-Stokes Raman Spectroscopy
Authors: Ruolan Zhang, Ryo Imai, Naoko Senda, Tomoyuki Sakai
Abstract:
Enzymes have attracted increasing attention in industrial manufacturing for their applicability in catalyzing complex chemical reactions under mild conditions. Directed evolution has become a powerful approach to optimize enzymes and exploit their full potential under circumstances of insufficient structure-function knowledge. With the incorporation of cell-free synthetic biotechnology, rapid enzyme synthesis can be realized because no cloning procedure such as transfection is needed. Its open environment also enables direct enzyme measurement. These properties of cell-free biotechnology lead to excellent throughput of enzyme generation. However, the capabilities of current screening methods have limitations. A fluorescence-based assay needs an applicable fluorescent label, and the reliability of the acquired enzymatic activity is influenced by the fluorescent label’s binding affinity and photostability. To acquire the natural activity of an enzyme, another method is to combine a pre-screening step with high-performance liquid chromatography (HPLC) measurement, but its throughput is limited by the necessary time investment. Hundreds of variants are selected from libraries, and their enzymatic activities are then identified one by one by HPLC. The turnaround time is 30 minutes per sample by HPLC, which limits the acquirable enzyme improvement within a reasonable time. To achieve truly high-throughput enzyme screening, i.e., to obtain reliable enzyme improvement within a reasonable time, a widely applicable high-throughput measurement of enzymatic reactions is highly demanded. Here, a high-throughput screening method using broadband coherent anti-Stokes Raman spectroscopy (CARS) is proposed. CARS is a form of coherent Raman spectroscopy, which can identify label-free chemical components specifically from their inherent molecular vibrations. These characteristic vibrational signals are generated from different vibrational modes of chemical bonds. With broadband CARS, the chemicals in one sample can be identified from their signals in one broadband CARS spectrum. Moreover, it can magnify signal levels to several orders of magnitude greater than spontaneous Raman systems and therefore has the potential to evaluate a chemical's concentration rapidly. As a demonstration of screening with CARS, alcohol dehydrogenase, which converts ethanol and the oxidized form of nicotinamide adenine dinucleotide (NAD+) to acetaldehyde and the reduced form (NADH), was used. The signal of NADH at 1660 cm⁻¹, which is generated from the nicotinamide in NADH, was used to measure its concentration. The evaluation time for the CARS signal of NADH was determined to be as short as 0.33 seconds, with a system sensitivity of 2.5 mM. The time course of the alcohol dehydrogenase reaction was successfully measured from the increasing signal intensity of NADH. This CARS measurement result was consistent with the result of a conventional method, UV-Vis. CARS is expected to find application in high-throughput enzyme screening and to realize more reliable enzyme improvement within a reasonable time.
Keywords: coherent anti-Stokes Raman spectroscopy, CARS, directed evolution, enzyme screening, Raman spectroscopy
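As a rough illustration of how the 1660 cm⁻¹ NADH band could be turned into a concentration readout, the sketch below applies a linear calibration and follows a rising reaction time course; the calibration standards and intensity values are hypothetical, not the paper's data.

```python
import numpy as np

# Hypothetical linear calibration: CARS intensity at 1660 cm^-1 vs. NADH concentration.
calib_conc = np.array([0.0, 5.0, 10.0, 20.0])          # mM (assumed standards)
calib_intensity = np.array([0.02, 0.51, 1.00, 2.01])   # a.u. (assumed readings)
slope, offset = np.polyfit(calib_conc, calib_intensity, 1)

def nadh_concentration(intensity):
    """Invert the linear calibration to estimate the NADH concentration in mM."""
    return (intensity - offset) / slope

# Hypothetical time course of the alcohol dehydrogenase reaction, one point per 0.33 s.
t = np.arange(0, 10, 0.33)
intensity = 2.0 * (1 - np.exp(-0.3 * t)) + 0.02        # rising NADH signal (assumed)
conc = nadh_concentration(intensity)
print(f"NADH after {t[-1]:.2f} s: {conc[-1]:.2f} mM")
```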
Procedia PDF Downloads 141
3925 Quality of Today's Teachers: Post-Certified Teachers' Competence in Alleviating Poverty towards a Sustainable Development
Authors: Sudirman
Abstract:
Competence is a term describing a capability that correlates with a person’s occupation. The competence of a teacher consists of four components: pedagogical, professional, personality, and social competence. These four components are applied while interacting with students to motivate them and improve their achievement. The objective of this qualitative study is to explore the roles and contributions of certified teachers in alleviating poverty to promote sustainable development. The data comprise primary and secondary data generated from observation, interviews, documentation, and library research. Furthermore, this study offers in-depth information regarding the performance of the teachers in coping with poverty and sustaining development. The results show that a teacher’s competence positively contributes to the improvement of students’ achievement. This helps prepare students for real work experience, which results in better income and therefore alleviates poverty. All in all, the quality of today’s teachers can be measured by their contribution to enhancing students’ competence prior to entering real work, resulting in a wealthier society. This is how teachers deal with poverty and contribute to the concept of sustainable development.
Keywords: competence, development, poverty, teachers
Procedia PDF Downloads 151
3924 Optimized Preprocessing for Accurate and Efficient Bioassay Prediction with Machine Learning Algorithms
Authors: Jeff Clarine, Chang-Shyh Peng, Daisy Sang
Abstract:
Bioassay is the measurement of the potency of a chemical substance by its effect on living animal or plant tissue. Bioassay data and chemical structures from pharmacokinetic and drug metabolism screening are mined from, and housed in, multiple databases. Bioassay predictions are computed accordingly to determine further advancement. This paper proposes a four-step preprocessing of datasets for improving bioassay predictions. The first step is instance selection, in which the dataset is categorized into training, testing, and validation sets. The second step is discretization, which partitions the data in consideration of accuracy vs. precision. The third step is normalization, where data are normalized between 0 and 1 for subsequent machine learning processing. The fourth step is feature selection, where key chemical properties and attributes are generated. The streamlined results are then analyzed for prediction effectiveness with various machine learning tools, including Pipeline Pilot, R, Weka, and Excel. Experiments and evaluations reveal the effectiveness of various combinations of preprocessing steps and machine learning algorithms in producing more consistent and accurate predictions.
Keywords: bioassay, machine learning, preprocessing, virtual screen
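A minimal sketch of the four preprocessing steps on synthetic data using scikit-learn; the specific transformers, parameters, and the synthetic descriptors are illustrative assumptions rather than the exact pipeline used in the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import KBinsDiscretizer, MinMaxScaler
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic stand-in for bioassay descriptors: 300 compounds x 20 chemical attributes.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))
y = rng.integers(0, 2, size=300)            # active / inactive label

# Step 1: instance selection -> training, testing, and validation sets.
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_test, X_val, y_test, y_val = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

# Step 2: discretization (binning trades precision for robustness).
disc = KBinsDiscretizer(n_bins=5, encode="ordinal", strategy="uniform").fit(X_train)
# Step 3: normalization of the discretized values to the [0, 1] range.
scaler = MinMaxScaler().fit(disc.transform(X_train))
# Step 4: feature selection of the most informative attributes.
selector = SelectKBest(f_classif, k=8).fit(scaler.transform(disc.transform(X_train)), y_train)

def preprocess(X_raw):
    return selector.transform(scaler.transform(disc.transform(X_raw)))

print(preprocess(X_test).shape)             # -> (60, 8)
```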
Procedia PDF Downloads 274
3923 Political Communication in Twitter Interactions between Government, News Media and Citizens in Mexico
Authors: Jorge Cortés, Alejandra Martínez, Carlos Pérez, Anaid Simón
Abstract:
The presence of government, news media, and the general citizenry in social media allows considering interactions between them as a form of political communication (i.e., the public exchange of contradictory discourses about politics). Twitter’s asymmetrical following model (users can follow, mention, or reply to other users that do not follow them) could foster alternative democratic practices and have an impact on Mexican political culture, which has been marked by a lack of direct communication channels between these actors. The research aim is to assess Twitter’s role in political communication practices through the analysis of interaction dynamics between government, news media, and citizens, by extracting and visualizing data from Twitter’s API to observe general behavior patterns. The hypothesis is that, despite the fact that Twitter’s features enable direct and horizontal interactions between actors, users repeat traditional dynamics of interaction without taking full advantage of the possibilities of this medium. Through an interdisciplinary team including Communication Strategies, Information Design, and Interaction Systems, the activity on Twitter generated by the controversy over the presence of Uber in Mexico City was analysed; an issue of public interest involving aspects such as public opinion, economic interests, and a legal dimension. This research includes techniques from social network analysis (SNA), a methodological approach focused on the comprehension of the relationships between actors through the visual representation and measurement of network characteristics. The analysis of the Uber event comprised data extraction, data categorization, corpus construction, corpus visualization, and analysis. In the data-recovery stage, TAGS, a Google Sheets template, was used to extract tweets that included the hashtags #UberSeQueda and #UberSeVa, posts containing the string Uber, and tweets directed to @uber_mx. Using scripts written in Python, the data were filtered, discarding tweets with no interaction (replies, retweets, or mentions) and locations outside of Mexico. Considerations regarding bots and the omission of anecdotal posts were also taken into account. The utility of graphs for observing interactions of political communication in general was confirmed by the analysis of visualizations generated with programs such as Gephi and NodeXL. However, some aspects require improvements to obtain more useful visual representations for this type of research. For example, link crossings complicate following the direction of an interaction, forcing users to manipulate the graph to see it clearly. It was concluded that some practices prevalent in political communication in Mexico are replicated on Twitter. Media actors tend to group together instead of interacting with others. The political system tends to tweet as an advertising strategy rather than to generate dialogue. However, some actors were identified as bridges establishing communication between the three spheres, generating a more democratic exercise and taking advantage of Twitter’s possibilities. Although interactions on Twitter could become an alternative form of political communication, this potential depends on the intentions of the participants and on the extent to which they are aiming for collaborative and direct communication.
Further research is needed to gain a deeper understanding of the political behavior of Twitter users and of the possibilities of SNA for its analysis.
Keywords: interaction, political communication, social network analysis, Twitter
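A small sketch of the filtering step mentioned above (keeping only tweets that contain an interaction and originate in Mexico); the record fields and the location check are simplified assumptions about the TAGS export, not the project's actual scripts.

```python
# Hypothetical rows as exported from a TAGS sheet; field names are assumptions.
tweets = [
    {"id": 1, "text": "#UberSeQueda @uber_mx gracias", "in_reply_to": None,
     "retweeted": False, "mentions": ["uber_mx"], "location": "Ciudad de México"},
    {"id": 2, "text": "Uber...", "in_reply_to": None,
     "retweeted": False, "mentions": [], "location": "Madrid"},
]

def has_interaction(t):
    """A tweet counts as an interaction if it is a reply, a retweet, or mentions someone."""
    return t["in_reply_to"] is not None or t["retweeted"] or bool(t["mentions"])

def in_mexico(t):
    """Naive location filter; a real script would use a proper geocoding step."""
    keywords = ("méxico", "mexico", "cdmx")
    return any(k in t["location"].lower() for k in keywords)

corpus = [t for t in tweets if has_interaction(t) and in_mexico(t)]
print([t["id"] for t in corpus])   # -> [1]
```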
Procedia PDF Downloads 221
3922 Dehydration of Residues from WTP for Application in Building Materials and Reuse of Water from the Waste Treatment: A Feasible Solution to Complete Treatment Systems
Authors: Marco Correa, Flavio Araujo, Paulo Scalize, Antonio Albuquerque
Abstract:
The increasing reduction of the volumes of the surface water sources which supply most municipalities, as well as the continued rise in demand for treated water, combined with the disposal of effluents from the washing of decanters and filters of water treatment plants, generates a continuous search for environmentally sound solutions to these problems. The effluents generated by the water treatment industry need to be suitably processed for return to the environment or for re-use. This article shows an alternative for the dehydration of sludge from water treatment plants (WTP) and the eventual disposal of the drained sludge. Using a simple design methodology, we present a full-scale case study of drainage in geotextile tanks, involving five sludge drainage tanks at the WTP of the city of Rio Verde. The aim is to reuse the water drained from the sludge, both at the beginning of the treatment process at the WTP and in less noble services such as watering the gardens of the local town hall. The sludge will be used for the production of building materials.
Keywords: re-use, residue, sustainable, water treatment plants, sludge
Procedia PDF Downloads 490
3921 Deposition of Diamond Like Carbon Thin Film by Pulse Laser Deposition for Surgical Instruments
Authors: M. Khalid Alamgir, Javed Ahsan Bhatti, M. Zafarullah Khan
Abstract:
A thin film of amorphous carbon (DLC) was deposited on 316 steel using an Nd:YAG laser with an energy of 300 mJ. Pure graphite was used as the target. The vacuum in the deposition chamber was generated in the range of 10⁻⁶ mbar by a turbomolecular pump. The ratio of sp³ to sp² content shows the amorphous nature of the film. This was confirmed by Raman spectra having two peaks, around 1300 cm⁻¹ (the D-band) and 1700 cm⁻¹ (the G-band). If the sp³ bonding ratio is high, the films behave as diamond-like, whereas with high sp² content, the films are graphite-like. The ratio of sp³ and sp² contents in the film depends upon the deposition method, the hydrogen content, and the system parameters. The structural study of the film was carried out by XRD. The hardness of the film, as measured by a Vickers hardness tester, was found to be 28 GPa. The EDX result shows a high carbon content on the surface, and the optical microscopy result shows the smoothness of the film on the substrate. The film possesses good adhesion and can be used to coat surgical instruments.
Keywords: DLC, thin film, Raman spectroscopy, XRD, EDX
Procedia PDF Downloads 564
3920 Automatic Content Curation of Visual Heritage
Authors: Delphine Ribes Lemay, Valentine Bernasconi, André Andrade, Lara Défayes, Mathieu Salzmann, Frédéric Kaplan, Nicolas Henchoz
Abstract:
Digitization and preservation of large heritage collections induce high maintenance costs to keep up with technical standards and ensure sustainable access. Creating impactful usage is instrumental to justify the resources for long-term preservation. The Museum für Gestaltung of Zurich holds one of the biggest poster collections in the world, from which 52,000 posters were digitised. In the process of building a digital installation to valorize the collection, one objective was to develop an algorithm capable of predicting the next poster to show according to the ones already displayed. The work presented here describes the steps to build an algorithm able to automatically create sequences of posters reflecting associations performed by curators and professional designers. This challenge has similarities with the domain of song playlist algorithms. Recently, artificial intelligence techniques and, more specifically, deep-learning algorithms have been used to facilitate their generation. Promising results were found thanks to recurrent neural networks (RNNs) trained on manually generated playlists and paired with clusters of features extracted from songs. We used the same principles to create the proposed algorithm but applied them to a challenging medium, posters. First, a convolutional autoencoder was trained to extract features of the posters. The 52,000 digital posters were used as the training set. Poster features were then clustered. Next, an RNN learned to predict the next cluster according to the previous ones. The RNN training set was composed of poster sequences extracted from a collection of books from the Gestaltung Museum of Zurich dedicated to displaying posters. Finally, within the predicted cluster, the poster with the best proximity to the previous poster is selected. The mean square distance between poster features was used to compute proximity. To validate the predictive model, we compared sequences of 15 posters produced by our model to randomly and manually generated sequences. Manual sequences were created by a professional graphic designer. We asked 21 participants working as professional graphic designers to sort the sequences from the one with the strongest graphic line to the one with the weakest and to motivate their answer with a short description. The sequences produced by the designer were ranked first 60%, second 25%, and third 15% of the time. The sequences produced by our predictive model were ranked first 25%, second 45%, and third 30% of the time. The sequences produced randomly were ranked first 15%, second 29%, and third 55% of the time. Compared to designer sequences, and as reported by participants, model and random sequences lacked thematic continuity. According to the results, the proposed model is able to generate better poster sequencing than random sampling. Occasionally, our algorithm is even able to outperform a professional designer. As a next step, the proposed algorithm should include the possibility to create sequences according to a selected theme. To conclude, this work shows the potential of artificial intelligence techniques to learn from existing content and provide a tool to curate large sets of data, with a permanent renewal of the presented content.
Keywords: artificial intelligence, digital humanities, serendipity, design research
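The final selection step can be sketched as follows: within the cluster predicted by the RNN, the unseen poster whose feature vector has the smallest mean square distance to the previously shown poster is chosen. The feature vectors, cluster labels, and the stand-in for the RNN prediction are random placeholders for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
features = rng.normal(size=(100, 16))          # stand-in for autoencoder poster features
clusters = rng.integers(0, 5, size=100)        # stand-in for the feature clusters

def next_poster(prev_idx, predicted_cluster, shown):
    """Within the cluster predicted by the RNN, pick the unseen poster whose features
    have the smallest mean square distance to the previously shown poster."""
    candidates = [i for i in np.where(clusters == predicted_cluster)[0] if i not in shown]
    if not candidates:                         # fallback if the cluster is exhausted
        candidates = [i for i in range(len(features)) if i not in shown]
    dists = [np.mean((features[i] - features[prev_idx]) ** 2) for i in candidates]
    return candidates[int(np.argmin(dists))]

sequence = [0]                                  # start from an arbitrary poster
for _ in range(14):                             # build a 15-poster sequence
    predicted = rng.integers(0, 5)              # placeholder for the RNN's prediction
    sequence.append(next_poster(sequence[-1], predicted, set(sequence)))
print(sequence)
```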
Procedia PDF Downloads 184
3919 The Importance of Visual Communication in Artificial Intelligence
Authors: Manjitsingh Rajput
Abstract:
Visual communication plays an important role in artificial intelligence (AI) because it enables machines to understand and interpret visual information, similar to how humans do. This abstract explores the importance of visual communication in AI and highlights various applications such as computer vision, object recognition, image classification, and autonomous systems. Going deeper into the deep learning techniques and neural networks that underpin visual understanding, the abstract also discusses challenges facing visual interfaces for AI, such as data scarcity, domain optimization, and interpretability. The integration of visual communication with other modalities, such as natural language processing and speech recognition, is also explored, emphasizing the critical role that visual communication plays in advancing AI capabilities and enabling machines to perceive and understand the world around them. The methodology examines the importance of visual communication in AI development and implementation, highlighting its potential to enhance the effectiveness and accessibility of AI systems, and it provides a comprehensive approach to integrating visual elements into AI systems, making them more user-friendly and efficient. In conclusion, visual communication is crucial in AI systems for object recognition, facial analysis, and augmented reality, but challenges like data quality, interpretability, and ethics must be addressed. Visual communication enhances user experience, decision-making, accessibility, and collaboration, and developers can integrate visual elements to build efficient and accessible AI systems.
Keywords: visual communication AI, computer vision, visual aid in communication, essence of visual communication
Procedia PDF Downloads 95
3918 Secure Network Coding against Content Pollution Attacks in Named Data Network
Authors: Tao Feng, Xiaomei Ma, Xian Guo, Jing Wang
Abstract:
Named Data Network (NDN) is one of the future Internet architectures, in which all nodes (i.e., hosts, routers) are allowed to have a local cache used to satisfy incoming requests for content. However, this dependence on caching allows an adversary to perform attacks that are very effective and relatively easy to implement, such as the content pollution attack. In this paper, we use a method of secure network coding based on a homomorphic signature system to solve this problem. Firstly, we use a dynamic public key technique, so that our scheme authenticates each generation without updating the initial secret key used. Secondly, employing the homomorphism of the hash function, the intermediate nodes and the destination node verify the signature of the received message. In addition, when the network topology of NDN is simple and fixed, the code coefficients in our scheme are generated by a pseudorandom number generator in each node, so that distribution of the coefficients is also avoided. In short, our scheme not only can efficiently prevent intra/inter-GPAs, but can also counter the content poisoning attack in NDN.
Keywords: named data networking, content pollution attack, network coding signature, internet architecture
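The coefficient-generation idea can be sketched as follows: each node derives the coding coefficients of a generation from a pseudorandom generator seeded with a shared secret and the generation identifier, so identical coefficients are reproduced locally without ever being distributed. The seed derivation and field size are illustrative assumptions; the finite-field arithmetic and the homomorphic signature itself are omitted.

```python
import hashlib
import random

def generation_coefficients(shared_seed: bytes, generation_id: int, n_packets: int):
    """Derive the coding coefficient vector for one generation deterministically,
    so every node regenerates the same coefficients locally (no distribution needed)."""
    seed = hashlib.sha256(shared_seed + generation_id.to_bytes(4, "big")).digest()
    prng = random.Random(seed)
    # Nonzero coefficients over GF(2^8); the field arithmetic itself is not shown here.
    return [prng.randrange(1, 256) for _ in range(n_packets)]

# An intermediate node and the destination derive identical coefficients independently.
at_router = generation_coefficients(b"ndn-demo-secret", generation_id=7, n_packets=8)
at_consumer = generation_coefficients(b"ndn-demo-secret", generation_id=7, n_packets=8)
assert at_router == at_consumer
print(at_router)
```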
Procedia PDF Downloads 337
3917 Experimental Investigation on the Optimal Operating Frequency of a Thermoacoustic Refrigerator
Authors: Kriengkrai Assawamartbunlue, Channarong Wantha
Abstract:
This paper presents the effects of the mean operating pressure on the optimal operating frequency, based on temperature differences across the stack ends, in a thermoacoustic refrigerator. In addition to the length of the resonance tube, the components of the thermoacoustic refrigerator have an influence on the operating frequency due to their acoustic properties, i.e., absorptivity, reflectivity, and transmissivity. Interference of waves occurs and distorts the original frequency generated by the driver, so that the optimal operating frequency differs from the design. These acoustic properties are not parameters in the design, and it is very complicated to infer their responses. A prototype thermoacoustic refrigerator was constructed and used to investigate its optimal operating frequency, compared to the design, at various operating pressures. Helium and air were used as working fluids during the experiments. The results indicate that the optimal operating frequency of the prototype thermoacoustic refrigerator using helium is at 6 bar and 490 Hz, approximately 20% away from the design frequency. The optimal operating frequency at other mean pressures differs from the design in an unpredictable manner; however, the optimal operating frequency and pressure can be identified by testing.
Keywords: acoustic properties, Carnot’s efficiency, interference of waves, operating pressure, optimal operating frequency, stack performance, standing wave, thermoacoustic refrigerator
Procedia PDF Downloads 486
3916 Identification and Force Control of a Two Chambers Pneumatic Soft Actuator
Authors: Najib K. Dankadai, Ahmad 'Athif Mohd Faudzi, Khairuddin Osman, Muhammad Rusydi Muhammad Razif, Ili Najaa Aimi Mohd Nordin
Abstract:
Research in soft actuators is now growing rapidly because of their suitability for application in sectors like medicine, agriculture, biology, and welfare. This paper presents system identification (SI) and control of the force generated by a two-chamber pneumatic soft actuator (PSA). A mathematical model of the actuator force was identified experimentally using a data acquisition card and the MATLAB SI toolbox. Two control techniques, predictive functional control (PFC) and a conventional proportional-integral-derivative (PID) scheme, are proposed and compared based on the identified model of the soft actuator's flexible mechanism. The results of this study show that both of the proposed controllers ensure accurate tracking when the closed-loop system is tested with step, sinusoidal, and multi-step reference inputs in MATLAB simulation, although PFC provides a better response than PID.
Keywords: predictive functional control (PFC), proportional integral and derivative (PID), soft actuator, system identification
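As a rough illustration of the conventional PID scheme used for comparison, the sketch below closes a discrete PID loop around a first-order model standing in for the identified force dynamics; the model parameters and controller gains are assumptions, and the PFC counterpart is not shown.

```python
# Assumed first-order discrete model standing in for the identified force dynamics:
# F[k+1] = a*F[k] + b*u[k]   (u = chamber pressure command, F = generated force)
a, b, dt = 0.92, 0.15, 0.01

# Assumed PID gains; in practice these would be tuned against the identified model.
Kp, Ki, Kd = 4.0, 25.0, 0.02

force, integral, prev_err = 0.0, 0.0, 0.0
reference = 1.0                                # step reference force (arbitrary units)
history = []
for k in range(500):
    err = reference - force
    integral += err * dt
    derivative = (err - prev_err) / dt
    u = Kp * err + Ki * integral + Kd * derivative
    prev_err = err
    force = a * force + b * u                  # propagate the assumed plant model
    history.append(force)

print(f"force after 5 s of simulated time: {history[-1]:.3f} (reference {reference})")
```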
Procedia PDF Downloads 325
3915 Tribological Study of TiC Powder Cladding on 6061 Aluminum Alloy
Authors: Yuan-Ching Lin, Sin-Yu Chen, Pei-Yu Wu
Abstract:
This study reports the improvement in the wear performance of A6061 aluminum alloy clad with mixed powders of titanium carbide (TiC), copper (Cu), and aluminum (Al) using the gas tungsten arc welding (GTAW) method. The wear performance of the A6061 clad layers was evaluated by performing pin-on-disc wear tests. The experimental results clearly indicate an enhancement of the hardness of the clad layer to about two times that of the A6061 substrate without cladding. The wear test demonstrated a significant improvement in the wear performance of the clad layer compared with the A6061 substrate without cladding. Moreover, the interface between the clad layer and the A6061 substrate exhibited superior metallurgical bonding. Due to this bonding, the clad layer did not spall during the wear test; as such, massive wear loss was prevented. Additionally, massive oxidized particulate debris was generated on the worn surface during the wear test; this resulted in three-body abrasive wear and degraded the wear behavior of the clad surface.
Keywords: GTAW, A6061 aluminum alloy, surface modification, tribological study, TiC powder cladding
Procedia PDF Downloads 463
3914 Changing Human Resources Policies in Companies after the COVID-19 Pandemic
Authors: Murat Çolak, Elifnaz Tanyıldızı
Abstract:
Today, human mobility under globalization has increased the interaction between countries significantly; although this contact has advanced societies in terms of civilization, it has also increased the likelihood of pandemics. The coronavirus (COVID-19) pandemic, which caused the greatest loss of life among them, turned into a global epidemic by covering the whole world in a short time. While there was an explosion in demand in some businesses around the world, other businesses temporarily stopped or had to stop their activities. The businesses affected by the crisis had to adapt to the new legal regulations and had to make changes in matters such as their working styles, human resources practices, and policies. One of the measures taken into account is the reduction of the workforce. The current COVID-19 crisis has posed serious challenges for many organizations and has generated an unprecedented wave of termination notices. This study examined examples of companies affected by the pandemic that changed their working policies afterwards. The study aims to reveal the impact of the global COVID-19 pandemic on human resources policies and employees and how these situations will affect businesses in the future.
Keywords: human resource management, crisis management, COVID-19, business function
Procedia PDF Downloads 96
3913 ROOP: Translating Sequential Code Fragments to Distributed Code Fragments Using Deep Reinforcement Learning
Authors: Arun Sanjel, Greg Speegle
Abstract:
Every second, massive amounts of data are generated, and Data Intensive Scalable Computing (DISC) frameworks have evolved into effective tools for analyzing such massive amounts of data. Since the underlying architecture of these distributed computing platforms is often new to users, building a DISC application can be time-consuming and prone to errors. The automated conversion of a sequential program to a DISC program would consequently improve productivity significantly. However, synthesizing a user’s intended program from an input specification is complex, with several important applications, such as distributed program synthesis and code refactoring. Existing works such as Tyro and Casper rely entirely on deductive synthesis techniques or similar program synthesis approaches. Our approach is to develop a data-driven synthesis technique that identifies sequential components and translates them to equivalent distributed operations. We emphasize using reinforcement learning and unit testing as feedback mechanisms to achieve our objectives.
Keywords: program synthesis, distributed computing, reinforcement learning, unit testing, DISC
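To make the translation target concrete, here is a hand-written example of the kind of rewrite such a system aims to produce: a sequential sum of squares and an equivalent DISC version written with PySpark. PySpark is used only as a representative DISC framework and is assumed to be installed; it is not necessarily the backend targeted by ROOP.

```python
# Sequential fragment.
def sum_of_squares(values):
    total = 0
    for v in values:
        total += v * v
    return total

# Equivalent DISC fragment (PySpark), the kind of target a translator would emit.
from pyspark import SparkContext

def sum_of_squares_distributed(values):
    sc = SparkContext.getOrCreate()
    return sc.parallelize(values).map(lambda v: v * v).reduce(lambda a, b: a + b)

data = list(range(1, 1001))
# A unit test comparing both versions, the same kind of feedback signal ROOP relies on.
assert sum_of_squares(data) == sum_of_squares_distributed(data)
print(sum_of_squares(data))
```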
Procedia PDF Downloads 107
3912 Study of Atmospheric Cascades Generated by Primary Cosmic Rays, from Simulations in CORSIKA for the City of Tunja in Colombia
Authors: Tathiana Yesenia Coy Mondragón, Jossitt William Vargas Cruz, Cristian Leonardo Gutiérrez Gómez
Abstract:
The study of cosmic rays is based on two fundamental pillars: the detection of secondary cosmic rays on the Earth's surface and the determination of the source and origin of the cascade. In addition, the constant flux of cosmic rays generates a lot of interest for study due to its role in various natural phenomena, which makes it relevant to characterize the incidence parameters in order to determine their effect not only at subsoil or surface levels but also throughout the atmosphere. To determine the physical parameters of the primary cosmic ray, the implementation of robust algorithms capable of reconstructing the cascade from the measured values, with a high level of reliability, is required. Therefore, it is proposed to build a machine learning system fed by cosmic ray simulations in CORSIKA at energies in the range [10⁹-10¹²] eV, in order to generate a trained particle- and pattern-recognition system and obtain greater efficiency when inferring the nature of the origin of the cascade for extensive air showers (EAS) in the atmosphere, considering atmospheric models.
Keywords: CORSIKA, cosmic rays, EAS, Colombia
Procedia PDF Downloads 81
3911 Low Density Parity Check Codes
Authors: Kassoul Ilyes
Abstract:
The field of error-correcting codes has been revolutionized by the introduction of iteratively decoded codes. Among these, LDPC codes are now a preferred solution thanks to their remarkable performance and low complexity. The binary version of LDPC codes showed even better performance, although its decoding introduced greater complexity. This thesis studies the performance of binary LDPC codes using simplified weighted decisions. Information is transported between a transmitter and a receiver by digital transmission systems, either by propagating over a radio channel or by using a transmission medium such as a transmission line. The purpose of the transmission system is then to carry the information from the transmitter to the receiver as reliably as possible. These codes did not initially generate enough interest within the coding theory community; this neglect lasted until the introduction of Turbo codes and the iterative principle. It was then proposed to adopt Pearl's belief propagation (BP) algorithm for decoding these codes. Subsequently, Luby introduced irregular LDPC codes characterized by their parity-check matrix. Finally, we study simplifications of binary LDPC codes. Thus, we propose a method to make the exact calculation of the a posteriori probability (APP) simpler. This method leads to a simpler implementation of the system.
Keywords: LDPC, parity check matrix, 5G, BER, SNR
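To make the decoding idea concrete, here is a minimal hard-decision bit-flipping decoder on a toy parity-check matrix; it is deliberately much simpler than the weighted (soft-decision) belief-propagation decoding studied in the thesis, and the matrix is illustrative rather than a real LDPC code.

```python
import numpy as np

# Toy parity-check matrix of a short code (illustrative only, not a real LDPC design).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]])

def bit_flip_decode(r, H, max_iter=20):
    """Hard-decision bit-flipping decoding: flip the bit involved in the
    largest number of unsatisfied parity checks until the syndrome is zero."""
    c = r.copy()
    for _ in range(max_iter):
        syndrome = H @ c % 2
        if not syndrome.any():
            return c, True            # all parity checks satisfied
        # count, for each bit, how many unsatisfied checks it participates in
        votes = (H * syndrome[:, None]).sum(axis=0)
        c[np.argmax(votes)] ^= 1      # flip the most suspicious bit
    return c, False

codeword = np.array([1, 1, 0, 0, 1, 0])    # satisfies H @ c = 0 (mod 2)
received = codeword.copy()
received[2] ^= 1                           # inject a single bit error
decoded, ok = bit_flip_decode(received, H)
print(decoded, ok)                         # -> [1 1 0 0 1 0] True
```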
Procedia PDF Downloads 154
3910 Vehicle Speed Estimation Using Image Processing
Authors: Prodipta Bhowmik, Poulami Saha, Preety Mehra, Yogesh Soni, Triloki Nath Jha
Abstract:
In India, the smart city concept is growing day by day, so for smart city development, a better traffic management and monitoring system is a very important requirement. Nowadays, road accidents are increasing due to the growing number of vehicles on the road, and reckless driving is mainly responsible for a huge number of accidents. An efficient traffic management system is therefore required for all kinds of roads to control traffic speed. The speed limit varies on a road-to-road basis. Previously there were radar systems, but due to their high cost and limited precision, radar has been unable to become favorable in traffic management systems. Traffic management systems face different types of problems every day, and how to solve them has become a research topic. This paper proposes a computer vision and machine learning-based automated system for multiple vehicle detection, tracking, and speed estimation using image processing. Detecting vehicles and estimating their speed from real-time video is a difficult task. The objective of this paper is to detect vehicles and estimate their speed as accurately as possible. For this, a real-time video is first captured, the frames are extracted from that video, the vehicles are detected in those frames, tracking of the vehicles then starts, and finally the speed of the moving vehicles is estimated. The goal of this method is to develop a cost-friendly system that is able to detect multiple types of vehicles at the same time.
Keywords: OpenCV, Haar Cascade classifier, DLIB, YOLOV3, centroid tracker, vehicle detection, vehicle tracking, vehicle speed estimation, computer vision
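A minimal sketch of the speed-estimation step once a vehicle has been tracked: the displacement of its centroid between frames is converted from pixels to metres with an assumed calibration factor and divided by the elapsed time. The frame rate, calibration factor, and centroid track are hypothetical.

```python
import numpy as np

FPS = 30.0                 # assumed camera frame rate
METERS_PER_PIXEL = 0.05    # assumed calibration of the road scene

def estimate_speed_kmh(centroids):
    """Estimate speed from a tracked vehicle's centroid positions (one per frame)."""
    pts = np.asarray(centroids, dtype=float)
    pixel_dist = np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()
    meters = pixel_dist * METERS_PER_PIXEL
    seconds = (len(pts) - 1) / FPS
    return meters / seconds * 3.6              # m/s -> km/h

# Hypothetical centroid track produced by a centroid tracker over 5 consecutive frames.
track = [(100, 400), (112, 398), (124, 396), (137, 395), (150, 393)]
print(f"{estimate_speed_kmh(track):.1f} km/h")
```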
Procedia PDF Downloads 84
3909 Disablism in Saudi Mainstream Schools: Disabled Teachers’ Experiences and Perspectives
Authors: Ali Aldakhil
Abstract:
This paper explores the many faces of the barriers and the exclusionary attitudes and practices that disabled teachers and students experience in the school where they teach or which they attend. Critical disability studies and inclusive education theory were used to conceptualise this inquiry and ground it in the literature. These theories were used because they magnify and expose the problems of disability/disablism as located within society instead of within the individual. Similarly, disability-first language was used in this study because it seeks to expose the social oppression and discrimination of disabled people. Data were generated through in-depth semi-structured interviews with six disabled teachers who teach disabled children in a Saudi mainstream school. Thematic analysis of the data concludes that the school is fettered by disabling barriers, attitudes, and practices, which reflect the dominant culture of disablism that disabled people encounter in Saudi society on a daily basis. This leads to the conclusion that an overall deconstruction and reformation of Saudi mainstream schools is needed, including non-disabled people's attitudes, policy, spaces, and the overall arrangements of teaching and learning.
Keywords: disablism, disability studies, mainstream schools, Saudi Arabia
Procedia PDF Downloads 159
3908 Mitigation of Electromagnetic Interference Generated by GPIB Control-Network in AC-DC Transfer Measurement System
Authors: M. M. Hlakola, E. Golovins, D. V. Nicolae
Abstract:
The field of instrumentation electronics is undergoing explosive growth due to its wide range of applications. The proliferation of electrical devices in close working proximity means they can negatively influence each other's performance. This degradation in performance is due to electromagnetic interference (EMI). This paper investigates the negative effects of electromagnetic interference originating in the General Purpose Interface Bus (GPIB) control network of an ac-dc transfer measurement system. Remedial measures for reducing measurement errors and the failure of a range of industrial devices due to EMI have been explored. The ac-dc transfer measurement system was analyzed for common-mode (CM) EMI effects. Further investigation of the coupling path, as well as more accurate identification of the noise propagation mechanism, has been outlined. To prevent the occurrence of common-mode ground loops, which were identified between the GPIB system control circuit and the measurement circuit, a microcontroller-driven GPIB switching isolator device was designed, prototyped, programmed, and validated. This mitigation technique has been explored to reduce EMI effectively.
Keywords: CM, EMI, GPIB, ground loops
Procedia PDF Downloads 288
3907 Fast High Voltage Solid State Switch Using Insulated Gate Bipolar Transistor for Discharge-Pumped Lasers
Authors: Nur Syarafina Binti Othman, Tsubasa Jindo, Makato Yamada, Miho Tsuyama, Hitoshi Nakano
Abstract:
A novel method to produce a fast high-voltage solid-state switch using insulated gate bipolar transistors (IGBTs) is presented for discharge-pumped gas lasers. The IGBTs are connected in series to achieve a high voltage rating. An avalanche transistor is used as the gate driver. The fast pulse generated by the avalanche transistor quickly charges the large input capacitance of the IGBT, resulting in the switching out of a fast high-voltage pulse. The switching characteristics of the fast high-voltage solid-state switch have been estimated for the multi-stage series-connected IGBTs at applied voltages of several tens of kV. The electrical circuit diagram and the methodology of the fast high-voltage solid-state switch, as well as the experimental results obtained, are presented.
Keywords: high voltage, IGBT, solid state switch, bipolar transistor
Procedia PDF Downloads 552
3906 Feature-Based Summarizing and Ranking from Customer Reviews
Authors: Dim En Nyaung, Thin Lai Lai Thein
Abstract:
Due to the rapid growth of the Internet, web opinion sources dynamically emerge that are useful to both potential customers and product manufacturers for prediction and decision purposes. These are user-generated contents written in natural language in an unstructured, free-text form. Therefore, opinion mining techniques have become popular for automatically processing customer reviews to extract product features and the user opinions expressed over them. Since customer reviews may contain both opinionated and factual sentences, a supervised machine learning technique is applied for subjectivity classification to improve the mining performance. In this paper, we dedicate our work to the task of opinion summarization. Product feature and opinion extraction is critical to opinion summarization because its effectiveness significantly affects the identification of semantic relationships. The polarity and numeric score of all the features are determined by the SentiWordNet lexicon. The problem of opinion summarization concerns how to relate the opinion words to a certain feature. A probabilistic, supervised learning model improves the results and is more flexible and effective.
Keywords: opinion mining, opinion summarization, sentiment analysis, text mining
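The polarity scoring mentioned above can be sketched with NLTK's SentiWordNet interface, averaging the positive-minus-negative score over a word's synsets; the averaging rule and the example feature-opinion pairs are illustrative choices, and the lexicons must be downloaded once.

```python
import nltk
from nltk.corpus import sentiwordnet as swn

# One-time downloads of the required lexicons (safe to call repeatedly).
nltk.download("wordnet", quiet=True)
nltk.download("sentiwordnet", quiet=True)

def polarity(word, pos="a"):
    """Average SentiWordNet positive-minus-negative score over a word's synsets."""
    synsets = list(swn.senti_synsets(word, pos))
    if not synsets:
        return 0.0
    return sum(s.pos_score() - s.neg_score() for s in synsets) / len(synsets)

# Toy feature-opinion pairs as might be extracted from reviews (illustrative).
for feature, opinion in [("battery", "excellent"), ("screen", "dim"), ("price", "reasonable")]:
    print(feature, opinion, round(polarity(opinion), 3))
```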
Procedia PDF Downloads 332
3905 The Effects of Computer Game-Based Pedagogy on Graduate Students Statistics Performance
Authors: Eva Laryea, Clement Yeboah
Abstract:
A pretest-posttest, within-subjects experimental design was employed to examine the effects of a computerized basic-statistics learning game on the achievement and statistics-related anxiety of students enrolled in an introductory graduate statistics course. Participants (N = 34) were graduate students in a variety of programs at a state-funded research university in the southeastern United States. We analyzed pretest-posttest differences using paired-samples t-tests for achievement and for statistics anxiety. The results of the t-test for knowledge in statistics were statistically significant, indicating significant mean gains in statistical knowledge as a function of the game-based intervention. Likewise, the results of the t-test for statistics-related anxiety were also statistically significant, indicating a decrease in anxiety from pretest to posttest. The implications of the present study are significant for both teachers and students. For teachers, using computer games developed by the researchers can help to create a more dynamic and engaging classroom environment, as well as improve student learning outcomes. For students, playing these educational games can help to develop important skills such as problem solving, critical thinking, and collaboration. Students can develop interest in the subject matter and spend quality time learning the course as they play the game, without even realizing that they are learning a supposedly hard course. The future directions of the present study are promising as technology continues to advance and become more widely available. Potential future developments include the integration of virtual and augmented reality into educational games, the use of machine learning and artificial intelligence to create personalized learning experiences, and the development of new and innovative game-based assessment tools. It is also important to consider the ethical implications of computer game-based pedagogy, such as the potential for games to perpetuate harmful stereotypes and biases. As the field continues to evolve, it will be crucial to address these issues and work towards creating inclusive and equitable learning experiences for all students. This study has the potential to revolutionize the way graduate students learn basic statistics and offers exciting opportunities for future development and research. It is an important area of inquiry for educators, researchers, and policymakers and will continue to be a dynamic and rapidly evolving field for years to come.
Keywords: pretest-posttest within subjects, experimental design, achievement, statistics-related anxiety
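The paired-samples analysis described above can be reproduced in outline with SciPy; the pretest and posttest scores below are made-up numbers, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical pretest/posttest statistics-knowledge scores for the same students.
pretest = np.array([52, 61, 47, 70, 58, 66, 49, 73, 55, 60], dtype=float)
posttest = np.array([63, 70, 58, 78, 66, 79, 57, 80, 65, 72], dtype=float)

# Paired-samples t-test on the within-subject gains.
t_stat, p_value = stats.ttest_rel(posttest, pretest)
gain = (posttest - pretest).mean()
print(f"mean gain = {gain:.1f}, t = {t_stat:.2f}, p = {p_value:.4f}")
```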
Procedia PDF Downloads 58
3904 Automated CNC Part Programming and Process Planning for Turned Components
Authors: Radhey Sham Rajoria
Abstract:
Pressure to increase competitiveness in the manufacturing sector, and to survive in the market, has led to the development of machining centres, which enhance productivity, improve quality, shorten lead times, and reduce manufacturing costs. With the innovation of machining centres in the manufacturing sector, production lines have been replaced by machining centres, which have the ability to perform various machining processes on the same part with multiple tooling and an automatic tool changer (ATC). Process plans can also be easily generated for complex components. Some means are required to utilize the machining centre at its best. The present work concentrates on automated part-program generation, and in turn automated process-plan generation, for turned components on a Denford “MIRAC” 8-station ATC lathe machining centre. A package in C++ on the DOS platform has been developed which generates the complete CNC part program, process plan, and process sequence for turned components. The input to this system is a blueprint in graphical format with machining parameters and variables, and the output is the CNC part program, which is stored in a .mir file, ready for execution on the machining centre.
Keywords: CNC, MIRAC, ATC, process planning
Procedia PDF Downloads 269
3903 Absence of Arbitrator Duty of Disclosure under the English Arbitration Act 1996
Authors: Qusai Alshahwan
Abstract:
The arbitrator’s duties of independence and impartiality play a significant role in delivering arbitral awards that legitimize the fundamental concepts of arbitration. For this reason, international and national arbitration rules require arbitrators to be independent and impartial in order to resolve arbitration disputes fairly between the parties. However, resolving disputes fairly also requires arbitrators to disclose any existing conflicts of interest with the parties to avoid misunderstanding and late challenges. In contrast with the international and national arbitration rules, the English Arbitration Act 1996 does not include independence as a separate ground for an arbitrator's removal, and, importantly, the English Arbitration Act 1996 is deliberately silent on the arbitrator's duty of disclosure. The absence of an arbitrator duty of disclosure is an issue that has generated uncertainty and concern for the arbitration community under the English jurisdiction, particularly when the English courts rejected the IBA guidelines on arbitrator conflicts of interest, as in the case of Halliburton v Chubb, for example. This article highlights the legal consequences of the absence of an arbitrator duty of disclosure under the English Arbitration Act 1996 and the arbitrator's contractual obligations.
Keywords: arbitration, impartiality, independence, duty of disclosure, English Arbitration Act 1996
Procedia PDF Downloads 131
3902 Retrofitting of Bridge Piers against the Scour Damages: Case Study of the Marand-Soofian Route Bridge
Authors: Shatirah Akib, Hossein Basser, Hojat Karami, Afshin Jahangirzadeh
Abstract:
Bridge piers which are constructed in the track of high-water rivers cause variations in the flow patterns. This variation is mostly a result of the changes in the river cross-section: by decreasing the river section, bridge piers significantly affect the flow patterns. Once the flow approaches the piers, the streamlines change their order, causing the appearance of different flow patterns around the bridge piers. New flow patterns are created according to the geometry and the other technical characteristics of the piers. One of the most significant consequences of this event is the scour generated around the bridge piers, which threatens the safety of the structure. In order to determine the properties of scour holes, finding the maximum depth of the scour is an important factor. In this manuscript, a numerical simulation of the scour around the Marand-Soofian route bridge piers has been carried out using the SSIIM 2.0 software, and the maximum scour depth has been obtained subsequently. Finally, methods for retrofitting bridge piers against scour, as well as methods for decreasing the amount of scour, are offered.
Keywords: scour, bridge pier, numerical simulation, SSIIM 2.0
Procedia PDF Downloads 473
3901 Characteristics of Domestic Sewage in Small Urban Communities
Authors: Shohreh Azizi, Memory Tekere, Wag Nel
Abstract:
An evaluation of the characteristics of wastewater generated from small communities was carried out in relation to a decentralized approach for domestic sewage treatment plants and the design of a biological nutrient removal system. The study included a survey of the waste from various individual communities such as a hotel, a residential complex, an office premises, and an educational institute. The results indicate that the concentration of organic pollutants in wastewater from the residential complex is higher than in the waste from all the other communities, with COD of 664 mg/l, BOD of 370.2 mg/l, and TSS of 248.8 mg/l. The wastewater from the office premises indicates a low organic load, with COD of 428 mg/l, BOD of 232 mg/l, and TSS of 157 mg/l. The wastewater from the residential complex was studied under the activated sludge process to evaluate this technology for decentralized wastewater treatment. The activated sludge process was operated at hydraulic retention times ranging from 12 to 4 hrs, and the optimum HRT of 6 hrs was selected, at which average reductions of COD (85.92%) and BOD (91.28%) were achieved. The issues of sludge recycling, maintenance of biomass concentration, and the large reactor volume (10 L) at high HRT make the system impractical for smaller communities.
Keywords: wastewater, small communities, activated sludge process, decentralized system
Procedia PDF Downloads 357
3900 Effects of Humidity and Silica Sand Particles on Vibration Generation by Friction Materials of Automotive Brake System
Authors: Mostafa M. Makrahy, Nouby M. Ghazaly, G. T. Abd el-Jaber
Abstract:
This paper presents an experimental study of the vibration generated by the friction materials of an automotive disc brake system using a brake test rig. The effects of silica sand particles, which are present on the road surface as an environmental condition, with sizes varying from 150 μm to 600 μm, are evaluated. Also, the vibration of the brake disc against the friction material is examined under humid environmental conditions at variable rotational speed. The experimental results show that the silica sand particles make a significant contribution to the vibration amplitude, which increases with increasing silica sand particle size at the different speed conditions. It is also noticed that the friction material is sensitive to humidity, and the vibration magnitude increases under wet testing conditions. Moreover, it can be reported that with increasing applied pressure and rotational speed of the braking system, the vibration amplitudes decrease in all cases.
Keywords: disc brake vibration, friction-induced vibration, silica sand particles, brake operational and environmental conditions
Procedia PDF Downloads 151
3899 The Feasibility of Using Milled Glass Wastes in Concrete to Resist Freezing-Thawing Action
Authors: Raed Abendeh, Mousa Bani Baker, Zaydoun Abu Salem, Hesham Ahmad
Abstract:
The use of waste materials in the construction industry can reduce the dependence on natural aggregates, which will eventually be depleted. Glass waste is generated in huge amounts, which can make its disposal in the concrete industry effective not only as a green solution but also as an advantage for enhancing the mechanical properties and durability of concrete. This article reports the performance of concrete specimens containing different percentages of milled glass waste as a partial replacement of cement (powder) when they are subjected to cycles of freezing and thawing. The tests were conducted on 75-mm cubes and 75 x 75 x 300-mm prisms. Compressive strength tests based on laboratory testing and non-destructive ultrasonic pulse velocity tests were performed during the action of the freezing-thawing (F/T) cycles. The results revealed that the incorporation of glass waste in concrete mixtures is not only feasible but also generally shows better strength and durability performance than the control concrete mixture. It may be said that the recycling of waste glass in concrete mixes is not only a disposal route but can also be exploited in the concrete industry.
Keywords: durability, glass waste, freeze-thaw cycles, non-destructive test
Procedia PDF Downloads 378