Search results for: traditional scheduling algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7098

3558 Disrupting Traditional Industries: A Scenario-Based Experiment on How Blockchain-Enabled Trust and Transparency Transform Nonprofit Organizations

Authors: Michael Mertel, Lars Friedrich, Kai-Ingo Voigt

Abstract:

Based on principal-agent theory, an information asymmetry exists in the traditional donation process. Consumers cannot verify whether nonprofit organizations (NPOs) use raised funds according to the designated cause after the transaction has taken place (hidden action). Therefore, charity organizations have tried to appear transparent and gain trust by using the same marketing instruments for decades (e.g., releasing project success reports). However, none of these measures can guarantee consumers that charities will use their donations for the designated purpose. With awareness of the misuse of donations rising due to the Ukraine conflict (e.g., funding crime), consumers are increasingly concerned about the destination of their charitable contributions. Therefore, innovative charities like the Human Rights Foundation have started to offer donations via blockchain. Blockchain technology has the potential to establish profound trust and transparency in the donation process: Consumers can publicly track the progress of their donation at any time after deciding to donate. This ensures that the charity is not using donations against their original intent. Hence, the aim is to investigate the effect of blockchain-enabled transactions on the willingness to donate. Sample and Design: To investigate consumers' behavior, we use a scenario-based experiment. After removing participants (e.g., due to failed attention checks), 3192 potential donors participated (47.9% female, 62.4% bachelor's degree or above). Procedure: We randomly assigned the participants to one of two scenarios. In all conditions, the participants read a scenario about a fictive charity organization called "Helper NPO." Afterward, the participants answered questions regarding their perception of the charity. Manipulation: The first scenario (n = 1405) represents a typical donation process, where consumers donate money without any option to track and trace. The second scenario (n = 1787) represents a donation process via blockchain, where consumers can track and trace their donations. Using t-statistics, the findings demonstrate a positive effect of donating via blockchain on participants’ willingness to donate (mean difference = 0.667, p < .001, Cohen’s d effect size = 0.482). A mediation analysis shows significant effects for the mediation of transparency (Estimate = 0.199, p < .001), trust (Estimate = 0.144, p < .001), and transparency and trust (Estimate = 0.158, p < .001). The total effect of blockchain usage on participants’ willingness to donate (Estimate = 0.690, p < .001) consists of the direct effect (Estimate = 0.189, p < .001) and the indirect effects of transparency and trust (Estimate = 0.501, p < .001). Furthermore, consumers' affinity for technology moderates the direct effect of blockchain usage on participants' willingness to donate (Estimate = 0.150, p < .001). Donating via blockchain is a promising way for charities to engage consumers for several reasons: (1) Charities can emphasize trust and transparency in their advertising campaigns. (2) Established charities can target new customer segments by specifically engaging technology-affine consumers in the future. (3) Charities can raise international funds without previous barriers (e.g., setting up bank accounts). Nevertheless, increased transparency can also backfire (e.g., disclosure of costs). Such cases require further research.
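
The statistics reported above (group mean difference, t-test, Cohen's d) follow standard formulas; the sketch below reproduces that arithmetic on synthetic data with assumed group means and spread, not the authors' raw responses.

```python
import numpy as np
from scipy import stats

# Synthetic stand-ins for the two experimental groups (not the authors' data).
rng = np.random.default_rng(42)
control = rng.normal(loc=4.0, scale=1.4, size=1405)      # traditional donation scenario
blockchain = rng.normal(loc=4.67, scale=1.4, size=1787)  # blockchain donation scenario

# Independent-samples t-test on willingness to donate.
t_stat, p_value = stats.ttest_ind(blockchain, control)

# Cohen's d with a pooled standard deviation.
n1, n2 = len(blockchain), len(control)
pooled_sd = np.sqrt(((n1 - 1) * blockchain.var(ddof=1) + (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2))
cohens_d = (blockchain.mean() - control.mean()) / pooled_sd
print(f"mean difference={blockchain.mean() - control.mean():.3f}, t={t_stat:.2f}, p={p_value:.4f}, d={cohens_d:.3f}")
```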

Keywords: blockchain, social sector, transparency, trust

Procedia PDF Downloads 99
3557 Smart Disassembly of Waste Printed Circuit Boards: The Role of IoT and Edge Computing

Authors: Muhammad Mohsin, Fawad Ahmad, Fatima Batool, Muhammad Kaab Zarrar

Abstract:

The integration of the Internet of Things (IoT) and edge computing devices offers a transformative approach to electronic waste management, particularly in the dismantling of printed circuit boards (PCBs). This paper explores how these technologies optimize operational efficiency and improve environmental sustainability by addressing challenges such as data security, interoperability, scalability, and real-time data processing. Proposed solutions include advanced machine learning algorithms for predictive maintenance, robust encryption protocols, and scalable architectures that incorporate edge computing. Case studies from leading e-waste management facilities illustrate benefits such as improved material recovery efficiency, reduced environmental impact, improved worker safety, and optimized resource utilization. The findings highlight the potential of IoT and edge computing to revolutionize e-waste dismantling and make the case for a collaborative approach between policymakers, waste management professionals, and technology developers. This research provides important insights into the use of IoT and edge computing to make significant progress in the sustainable management of electronic waste.

Keywords: internet of Things, edge computing, waste PCB disassembly, electronic waste management, data security, interoperability, machine learning, predictive maintenance, sustainable development

Procedia PDF Downloads 31
3556 Gamification to Enhance Learning Using Gagne's Learning Model

Authors: M. L. McLain, R. Sreelakshmi, Abhishek, Rajeshwaran, Bhavani Rao, Kamal Bijlani, R. Jayakrishnan

Abstract:

Technology-enhanced learning has brought drastic changes to the field of education in the modern world. In this study, we explore a novel way to improve how high school students learn by building a serious game that uses a pedagogical model developed by Robert Gagne. Integrating a serious game with the principles of Gagne’s learning model can provide engaging and meaningful instruction to students. The game developed in this study is a waste sorting game that can easily and succinctly demonstrate the principles of this learning model. All the tasks in the game that the player has to accomplish correspond to Gagne’s “Nine Events of Learning”. A quiz is incorporated in order to gather data on the progress made by the player in understanding the concept, as well as to assess them. Additionally, an experimental study was conducted which demonstrates that game-based learning using Gagne’s events is more effective than a traditional classroom setup.

Keywords: game based learning, sorting and recycling of waste, Gagne’s learning model, e-Learning, technology enhanced learning

Procedia PDF Downloads 631
3555 Efficient Fake News Detection Using Machine Learning and Deep Learning Approaches

Authors: Chaima Babi, Said Gadri

Abstract:

The volume of fake news continues to grow at a very fast rate; this requires implementing efficient techniques that allow testing the reliability of online content. For that, the current research strives to illuminate the fake news problem using deep learning (DL) and machine learning (ML) approaches. We developed the traditional LSTM (long short-term memory) model and the bidirectional BiLSTM model. The process is to train on most of the samples of the dataset, validate the model on a held-out subset (the test set) to provide an unbiased evaluation of the final model fit on the training dataset, then compute the classification accuracy and compare the results. For the programming stage, we used the TensorFlow and Keras libraries in Python with support for Graphics Processing Units (GPUs), which are widely used for developing deep learning applications.
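
As a rough illustration of the BiLSTM approach described above, the following Keras sketch builds a bidirectional LSTM classifier on placeholder token sequences; the vocabulary size, sequence length, and labels are assumptions, not the authors' dataset.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Toy stand-in for tokenised news articles: 1,000 sequences of 200 word indices with binary labels.
vocab_size, seq_len = 20000, 200
rng = np.random.default_rng(0)
X = rng.integers(1, vocab_size, size=(1000, seq_len))
y = rng.integers(0, 2, size=1000)

# Bidirectional LSTM classifier (fake vs. real).
model = keras.Sequential([
    layers.Input(shape=(seq_len,)),
    layers.Embedding(vocab_size, 128),
    layers.Bidirectional(layers.LSTM(64)),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=64, validation_split=0.2)
```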

Keywords: machine learning, deep learning, natural language, fake news, Bi-LSTM, LSTM, multiclass classification

Procedia PDF Downloads 95
3554 Use of Benin Laterites for the Mix Design of Structural Concrete

Authors: Yemalin D. Agossou, Andre Lecomte, Remi Boissiere, Edmond C. Adjovi, Abdelouahab Khelil

Abstract:

This paper presents a mix design trial of structural concretes with laterites from Benin. These materials are often the only granular resources readily available in many tropical regions. In the first step, concretes were designed with raw laterites, but the performances obtained were rather disappointing in spite of high cement dosages. A detailed physical characterization of these materials then showed that they contained a significant proportion of fine clays and that the coarsest fraction (gravel) contained a variety of facies, some of which were not very dense or indurated. Washing these laterites, and even the elimination of the most friable grains of the gravel fraction, made it possible to obtain concretes with satisfactory properties in terms of workability, density and mechanical strength. However, they were found to be slightly less stiff than concretes made with more traditional aggregates. It is, therefore, possible to obtain structural concretes with only laterites and cement but at the cost of eliminating some of their granular constituents.

Keywords: laterites, aggregates, concretes, mix design, mechanical properties

Procedia PDF Downloads 160
3553 Performance Evaluation of Routing Protocols for Video Conference over MPLS VPN Network

Authors: Abdullah Al Mamun, Tarek R. Sheltami

Abstract:

Video conferencing is a highly demanded facility nowadays owing to its real-time characteristics, but fast communication is the primary requirement of this technology. Multi-Protocol Label Switching (MPLS) IP Virtual Private Network (VPN) addresses this problem and is able to make communication faster than other techniques. This paper studies the performance of video traffic under two routing protocols, namely the Enhanced Interior Gateway Routing Protocol (EIGRP) and Open Shortest Path First (OSPF). The combination of traditional routing and MPLS improves the forwarding mechanism, scalability, and overall network performance. We use GNS3 and OPNET Modeler 14.5 to simulate many different scenarios, and metrics such as delay, jitter, and mean opinion score (MOS) are measured. The simulation results show that OSPF with BGP-MPLS VPN offers the best performance for the video conferencing application.

Keywords: OSPF, BGP, EIGRP, MPLS, video conference, provider router, edge router, layer 3 VPN

Procedia PDF Downloads 331
3552 Parallel Pipelined Conjugate Gradient Algorithm on Heterogeneous Platforms

Authors: Sergey Kopysov, Nikita Nedozhogin, Leonid Tonkov

Abstract:

The article presents a parallel iterative solver for large sparse linear systems which can be used on a heterogeneous platform. Traditionally, the problem of solving linear systems does not scale well on multi-CPU/multi-GPU clusters. For example, most attempts to implement the classical conjugate gradient method at best kept the solution time constant as the problem size was enlarged. The paper proposes the pipelined variant of the conjugate gradient method (PCG), a formulation that is potentially better suited for hybrid CPU/GPU computing since it requires only one synchronization point per iteration instead of two for standard CG. The standard and pipelined CG methods need the vector entries generated by the current GPU and other GPUs for matrix-vector products, so the communication between GPUs becomes a major performance bottleneck on a multi-GPU cluster. The article presents an approach to minimize the communications between parallel parts of algorithms. Additionally, computation and communication can be overlapped to reduce the impact of data exchange. Using the pipelined version of the CG method with one synchronization point, together with asynchronous calculations and communications and load balancing between the CPU and GPU, allows large linear systems to be solved scalably. The algorithm is implemented with the combined use of the following technologies: MPI, OpenMP, and CUDA. We show that almost optimum speedup on 8 CPUs/2 GPUs may be reached (relative to a single-GPU execution). The parallelized solver achieves a speedup of up to 5.49 times on 16 NVIDIA Tesla GPUs, as compared to one GPU.
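
A minimal serial sketch of the pipelined CG recurrence described above, assuming an unpreconditioned symmetric positive-definite system; the single-synchronization property is only indicated in comments, since this NumPy version has no distributed communication.

```python
import numpy as np

def pipelined_cg(A, b, tol=1e-8, max_iter=1000):
    """Unpreconditioned pipelined conjugate gradient (single-reduction variant).

    In a parallel setting, the two dot products (gamma, delta) are fused into one
    global reduction and overlapped with the matrix-vector product n = A @ w,
    giving one synchronization point per iteration. Here everything runs serially.
    """
    x = np.zeros_like(b)
    r = b - A @ x
    w = A @ r
    z = s = p = np.zeros_like(b)
    gamma_old = alpha_old = 1.0
    for i in range(max_iter):
        gamma = r @ r              # | one fused reduction in a distributed run,
        delta = w @ r              # | overlapped with the product on the next line
        n = A @ w
        if i == 0:
            beta, alpha = 0.0, gamma / delta
        else:
            beta = gamma / gamma_old
            alpha = gamma / (delta - beta * gamma / alpha_old)
        z = n + beta * z           # z = A*s
        s = w + beta * s           # s = A*p
        p = r + beta * p
        x = x + alpha * p
        r = r - alpha * s
        w = w - alpha * z          # w = A*r, kept up to date by recurrence
        gamma_old, alpha_old = gamma, alpha
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
    return x

# Small SPD test system as a stand-in for a large sparse matrix.
rng = np.random.default_rng(0)
M = rng.normal(size=(50, 50))
A = M @ M.T + 50 * np.eye(50)
b = rng.normal(size=50)
print(np.linalg.norm(A @ pipelined_cg(A, b) - b))
```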

Keywords: conjugate gradient, GPU, parallel programming, pipelined algorithm

Procedia PDF Downloads 165
3551 Healing Performances: Ethnographic Concepts and Emic Perspectives

Authors: S. Ishak, M. G. Nasuruddin

Abstract:

This paper looks at healing performances as ethnographic expressions of local knowledge and culture embedded within the Malay psyche and gemeinschaft. As society develops and progresses, these healing performances are caught within conflicting trajectories which become compounded by the contestations of tradition, religious concerns, locality and modernity. As exemplifications of the Malay ethos, these performances practice common rituals, cater to the innate needs of the practitioners and serve the targeted, closed, local community. This paper traces the ethnographic methods in documenting these practices as rituals of healing in a post-modern world. It delineates the ethnographic concepts used to analyze these rituals, and to semiotically read the varied binarial oppositions and juxtapositions. The paper concludes by highlighting the reconciliatory processes involved in maintaining these ritual performances as exemplifications of the Malay ethos playing an important role in the re-aligning, re-balancing and healing of the Malay community’s psyche.

Keywords: angina, winds, semangat, spirits, traditional theatres, trance

Procedia PDF Downloads 351
3550 Precision Assessment of the Orthometric Heights Determination in the Northern Part of Libya

Authors: Jamal A. Gledan, Akrm H. Algnin

Abstract:

The Global Positioning System (GPS) satellite-based technology has been utilized extensively in the last few years in a wide range of Geomatics and Geographic Information Systems (GIS) applications. One of the main challenges of dealing with GPS-based heights consists of converting them into Mean Sea Level (MSL) heights, which are used in surveying and mapping. In this research work, height differences at 50 points in the northern part of Libya were determined using both ordinary levelling (in which the geoid is the reference datum) and GPS techniques (in which the ellipsoid is the reference datum). In addition, this study utilized the EGM2008 model to obtain the geoid undulation values between the ellipsoidal and orthometric heights. From these undulations, together with the ellipsoidal heights obtained from GPS observations, the orthometric heights were computed. This research presents a suitable and economical alternative to the expensive traditional levelling technique, particularly for topographic mapping.
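
For reference, the conversion described above follows the standard relation H ≈ h − N between orthometric height H, ellipsoidal height h, and geoid undulation N; the sketch below uses made-up values, not the Libyan benchmark data.

```python
def orthometric_height(h_ellipsoidal: float, undulation_n: float) -> float:
    """Convert a GPS ellipsoidal height to an orthometric (MSL) height.

    Uses the standard approximation H = h - N, where N is the geoid
    undulation (here assumed to come from the EGM2008 model).
    """
    return h_ellipsoidal - undulation_n

# Hypothetical example values for one benchmark point:
h_gps = 142.87   # ellipsoidal height from GPS, in metres (assumed)
n_egm = 17.32    # EGM2008 geoid undulation, in metres (assumed)
print(orthometric_height(h_gps, n_egm))  # ~125.55 m above MSL
```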

Keywords: geoid undulation, GPS, ordinary and geodetic levelling, orthometric height

Procedia PDF Downloads 445
3549 Agroecology Approaches Towards Sustainable Agriculture and Food System: Reviewing and Exploring Selected Policies and Strategic Documents through an Agroecological Lens

Authors: Dereje Regasa

Abstract:

The global food system is at a crossroads, which requires prompt action to minimize the effects of the crises. Agroecology is gaining prominence due to its contributions to sustainable food systems. To support efforts in mitigating the crises, the Food and Agriculture Organization (FAO) established alternative approaches for sustainable agri-food systems. Agroecological elements and principles were developed to guide and support measures that countries need to achieve the Sustainable Development Goals (SDGs). The SDGs require the systemic integration of practices for a smart intensification or adaptation of traditional or industrial agriculture. As one of the countries working towards the SDGs, the agricultural practices in Ethiopia need to be guided by these agroecological elements and principles. Aiming at the identification of challenging aspects of a sustainable agri-food system and the characterization of an enabling environment for agroecology, as well as exploring to what extent the existing policies and strategies support the agroecological transition process, four policy and strategy documents were reviewed. These documents are the Rural Development Policy and Strategy, the Environment Policy, the Biodiversity Policy, and the Soil Strategy of the Ministry of Agriculture (MoA). Using the Agroecology Criteria Tool (ACT), the contents were reviewed, focusing on agroecological requirements and the inclusion of sustainable practices. ACT is designed to support a self-assessment of elements supporting agroecology. For each element, binary values were assigned based on the inclusion of the minimum requirements index and then validated through discussion with the document owners. The results showed that the documents were well below the requirements for an agroecological transition of the agri-food system. The Rural Development Policy and Strategy meets the requirements only for Human and Social Value (83%) and does not support the transition concerning the other elements. The Biodiversity Policy and the Soil Strategy suffice regarding the inclusion of Co-creation and Sharing of Knowledge (100%), while the remaining elements were not considered sufficiently. In contrast, the Environment Policy supports the transition with three elements accounting for 100%: Resilience, Recycling, and Human and Social Care. However, when the four documents were combined, elements such as Synergies, Diversity, Efficiency, Human and Social Value, Responsible Governance, and Co-creation and Sharing of Knowledge were identified as fully supported (100%). This showed that the policies and strategies complemented one another to a certain extent. However, the evaluation results call for improvements concerning elements like Culture and Food Traditions, Circular and Solidarity Economy, Resilience, Recycling, and Regulation and Balance, since the majority of the elements were not sufficiently observed. Consequently, guidance for the smart intensification of local practices is needed, as well as traditional knowledge enriched with advanced technologies. Ethiopian agricultural and environmental policies and strategies should provide sufficient support and guidance for the intensification of sustainable practices and should provide a framework for an agroecological transition towards a sustainable agri-food system.

Keywords: agroecology, diversity, recycling, sustainable food system, transition

Procedia PDF Downloads 87
3548 3D Design of Orthotic Braces and Casts in Medical Applications Using Microsoft Kinect Sensor

Authors: Sanjana S. Mallya, Roshan Arvind Sivakumar

Abstract:

Orthotics is the branch of medicine that deals with the provision and use of artificial casts or braces to alter the biomechanical structure of the limb and provide support for the limb. Custom-made orthoses provide more comfort and can correct issues better than those available over the counter. However, they are expensive and require intricate modelling of the limb. Traditional methods of modelling involve creating a plaster of Paris mould of the limb. Lately, CAD/CAM and 3D printing processes have improved the accuracy and reduced the production time. Ordinarily, digital cameras are used to capture the features of the limb from different views to create a 3D model. We propose a system to model the limb using the Microsoft Kinect2 sensor. The Kinect can capture RGB and depth frames simultaneously at up to 30 fps with sufficient accuracy. The region of interest is captured from three views, each shifted by 90 degrees. The RGB and depth data are fused into a single RGB-D frame. The resolution of the RGB frame is 1920 x 1080 px, while the resolution of the depth frame is 512 x 424 px. As the resolutions of the frames are not equal, RGB pixels are mapped onto the depth pixels to make sure data is not lost even though the resolution is lower. The resulting RGB-D frames are collected, and using the depth coordinates, a three-dimensional point cloud is generated for each view of the Kinect sensor. A common reference system was developed to merge the individual point clouds from the Kinect sensors. The reference system consisted of 8 coloured cubes, connected by rods to form a skeleton-cube with the coloured cubes at the corners. For each Kinect, the region of interest is the square formed by the centres of the four cubes facing the Kinect. The point clouds are merged by considering one of the cubes as the origin of a reference system. Depending on the relative distance from each cube, the three-dimensional coordinate points from each point cloud are aligned to the reference frame to give a complete point cloud. The RGB data is used to correct for any errors in the depth data of the point cloud. A triangular mesh is generated from the point cloud by applying Delaunay triangulation, which generates the rough surface of the limb. This technique forms an approximation of the surface of the limb. The mesh is then smoothed to obtain a smooth outer layer and an accurate model of the limb. The model of the limb is used as a base for designing the custom orthotic brace or cast. It is transferred to a CAD/CAM design file to design the brace over the surface of the limb. The proposed system would be more cost-effective than current systems that use MRI or CT scans for generating 3D models, would be quicker than traditional plaster of Paris cast modelling, and has a low overall setup time. Preliminary results indicate that the accuracy of the Kinect2 is satisfactory for modelling.
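
A rough sketch of the merge-and-mesh step described above, using synthetic point clouds and assumed sensor offsets in place of the Kinect data; the 2-D Delaunay triangulation over the projected points is a simplification of the surface reconstruction.

```python
import numpy as np
from scipy.spatial import Delaunay

# Two synthetic partial point clouds (placeholders for the per-Kinect clouds).
rng = np.random.default_rng(0)
cloud_a = rng.uniform(0, 1, size=(500, 3))
cloud_b = rng.uniform(0, 1, size=(500, 3))

# Assumed rigid offsets of each sensor relative to the reference cube / origin.
offset_a = np.array([0.0, 0.0, 0.0])
offset_b = np.array([0.5, 0.0, 0.0])

# Align both clouds to the common reference frame and merge them.
merged = np.vstack([cloud_a + offset_a, cloud_b + offset_b])

# Approximate the surface by triangulating the (x, y) projection of the merged points.
tri = Delaunay(merged[:, :2])
print(f"{len(merged)} points, {len(tri.simplices)} triangles")
```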

Keywords: 3d scanning, mesh generation, Microsoft kinect, orthotics, registration

Procedia PDF Downloads 191
3547 The “Bright Side” of COVID-19: Effects of Livestream Affordances on Consumer Purchase Willingness: Explicit IT Affordances Perspective

Authors: Isaac Owusu Asante, Yushi Jiang, Hailin Tao

Abstract:

Live streaming marketing, the new electronic commerce element, became an optional marketing channel following the COVID-19 pandemic. Many sellers have leveraged the features presented by live streaming to increase sales. Studies on live streaming have focused on gaming and consumers’ loyalty to brands through live streaming, using interview questionnaires. This study, however, was conducted to measure real-time observable interactions between consumers and sellers. Based on affordance theory, this study conceptualized constructs representing the interactive features and examined how they drive consumers’ purchase willingness during live streaming sessions, using 1238 records from Amazon Live obtained through manual observation of transaction records. Using structural equation modeling, the ordinary least squares regression suggests that live viewers, new followers, live chats, and likes positively affect purchase willingness. The Sobel and Monte Carlo tests show that new followers, live chats, and likes significantly mediate the relationship between live viewers and purchase willingness. The study introduces a new way of measuring interactions in live streaming commerce and proposes a way to manually gather data on consumer behaviors on live streaming platforms when the application programming interface (API) of such platforms does not support data mining algorithms.
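
As an illustration of the regression-and-mediation analysis described above, the sketch below runs an ordinary least squares mediation with a Sobel test on synthetic data; the variable relationships are assumed, not the Amazon Live records.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-in for the live-stream data (viewers -> chats -> purchase willingness).
rng = np.random.default_rng(1)
n = 1238
viewers = rng.normal(size=n)
chats = 0.5 * viewers + rng.normal(size=n)               # mediator
willingness = 0.3 * chats + 0.2 * viewers + rng.normal(size=n)

# Path a: mediator regressed on the independent variable.
m_a = sm.OLS(chats, sm.add_constant(viewers)).fit()
# Path b (and direct effect c'): outcome regressed on mediator plus IV.
X = sm.add_constant(np.column_stack([chats, viewers]))
m_b = sm.OLS(willingness, X).fit()

a, sa = m_a.params[1], m_a.bse[1]
b, sb = m_b.params[1], m_b.bse[1]

# Sobel test statistic for the indirect effect a*b.
sobel_z = (a * b) / np.sqrt(b**2 * sa**2 + a**2 * sb**2)
print("indirect effect:", a * b, "Sobel z:", sobel_z)
```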

Keywords: livestreaming marketing, live chats, live viewers, likes, new followers, purchase willingness

Procedia PDF Downloads 81
3546 Digital Joint Equivalent Channel Hybrid Precoding for Millimeterwave Massive Multiple Input Multiple Output Systems

Authors: Linyu Wang, Mingjun Zhu, Jianhong Xiang, Hanyu Jiang

Abstract:

To address the problem that the spectral efficiency of hybrid precoding (HP) is too low in current millimeter-wave (mmWave) massive multiple-input multiple-output (MIMO) systems, this paper proposes a digital joint equivalent channel hybrid precoding algorithm based on iteration of the digital precoding matrix. First, the objective function is expanded to obtain the relation equation, and the pseudo-inverse iterative function of the analog precoder is derived using the pseudo-inverse method, which solves the problem of the greatly increased amount of computation caused by the rank deficiency of the digital precoding matrix and reduces the overall complexity of hybrid precoding. Secondly, the analog precoding matrix and the millimeter-wave sparse channel matrix are combined into an equivalent channel, the equivalent channel is subjected to Singular Value Decomposition (SVD) to obtain a digital precoding matrix, and the derived pseudo-inverse iterative function is then used to iteratively regenerate the analog precoding matrix. The simulation results show that the proposed algorithm improves the system spectral efficiency by 10-20% compared with other algorithms, and the stability is also improved.
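
A minimal sketch of the equivalent-channel idea described above: a random placeholder channel and constant-modulus analog precoder are combined, and the digital precoder is taken from the SVD of the equivalent channel; this omits the authors' pseudo-inverse iteration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tx, n_rf, n_streams = 64, 8, 4   # antennas, RF chains, data streams (assumed sizes)

# Placeholder mmWave channel (n_rx x n_tx) and constant-modulus analog precoder (n_tx x n_rf).
H = (rng.normal(size=(16, n_tx)) + 1j * rng.normal(size=(16, n_tx))) / np.sqrt(2)
F_rf = np.exp(1j * rng.uniform(0, 2 * np.pi, size=(n_tx, n_rf))) / np.sqrt(n_tx)

# Equivalent channel seen by the digital precoder.
H_eq = H @ F_rf

# Digital precoder from the right singular vectors of the equivalent channel.
_, _, Vh = np.linalg.svd(H_eq)
F_bb = Vh.conj().T[:, :n_streams]

# Normalise the total transmit power of the hybrid precoder F_rf @ F_bb.
F_bb *= np.sqrt(n_streams) / np.linalg.norm(F_rf @ F_bb, 'fro')
print(np.linalg.norm(F_rf @ F_bb, 'fro') ** 2)  # ~ n_streams
```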

Keywords: mmWave, massive MIMO, hybrid precoding, singular value decomposition, equivalent channel

Procedia PDF Downloads 96
3545 Defect Classification of Hydrogen Fuel Pressure Vessels using Deep Learning

Authors: Dongju Kim, Youngjoo Suh, Hyojin Kim, Gyeongyeong Kim

Abstract:

Acoustic Emission Testing (AET) is widely used to test the structural integrity of an operational hydrogen storage container, and clustering algorithms are frequently used in pattern recognition methods to interpret AET results. However, the interpretation of AET results can vary from user to user as the tuning of the relevant parameters relies on the user's experience and knowledge of AET. Therefore, it is necessary to use a deep learning model to identify patterns in acoustic emission (AE) signal data that can be used to classify defects instead. In this paper, a deep learning-based model for classifying the types of defects in hydrogen storage tanks, using AE sensor waveforms, is proposed. As hydrogen storage tanks are commonly constructed using carbon fiber reinforced polymer composite (CFRP), a defect classification dataset is collected through a tensile test on a specimen of CFRP with an AE sensor attached. The performance of the classification model, using one-dimensional convolutional neural network (1-D CNN) and synthetic minority oversampling technique (SMOTE) data augmentation, achieved 91.09% accuracy for each defect. It is expected that the deep learning classification model in this paper, used with AET, will help in evaluating the operational safety of hydrogen storage containers.
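
As an illustration of the classification pipeline described above, the sketch below combines SMOTE oversampling with a small 1-D CNN on placeholder waveforms; the signal length, class count, and network depth are assumptions, not the authors' configuration.

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from tensorflow import keras
from tensorflow.keras import layers

# Synthetic stand-in for AE waveforms: 1,000 samples of 1,024 points, 4 defect classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 1024))
y = rng.integers(0, 4, size=1000)

# Balance the minority classes before training.
X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
X_res = X_res[..., np.newaxis]           # (samples, length, channels) for Conv1D

# Minimal 1-D CNN classifier.
model = keras.Sequential([
    layers.Input(shape=(1024, 1)),
    layers.Conv1D(32, kernel_size=16, activation="relu"),
    layers.MaxPooling1D(4),
    layers.Conv1D(64, kernel_size=8, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(4, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X_res, y_res, epochs=5, batch_size=64, validation_split=0.2)
```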

Keywords: acoustic emission testing, carbon fiber reinforced polymer composite, one-dimensional convolutional neural network, SMOTE data augmentation

Procedia PDF Downloads 93
3544 Reconstructability Analysis for Landslide Prediction

Authors: David Percy

Abstract:

Landslides are a geologic phenomenon that affects a large number of inhabited places and are constantly being monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data that works exclusively with discrete data. While RA has been used extensively in medical applications and social science, we are introducing it to the spatial sciences through applications like landslide prediction. Since RA works exclusively with discrete data, such as soil classification or bedrock type, working with continuous data, such as porosity, requires that these data are binned for inclusion in the model. RA constructs models of the data which pick out the most informative elements, independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as a primary encoding of the data. Unlike other machine learning algorithms that force the data into one-hot encoding schemes, RA works directly with the data as it is encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields accuracy that is similar but with the advantage of a completely transparent model. The results of an RA session with a data set are a report on every combination of variables and their probability of landslide events occurring. In this way, every informative combination of variable states can be examined.
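
A small sketch of the binning step that RA requires for continuous layers; the porosity values and bin edges below are placeholders.

```python
import pandas as pd

# Placeholder continuous raster values for a porosity layer.
porosity = pd.Series([0.05, 0.12, 0.31, 0.44, 0.27, 0.08])

# Discretise into three ordinal classes so RA can treat the layer as discrete.
porosity_binned = pd.cut(
    porosity,
    bins=[0.0, 0.15, 0.35, 1.0],
    labels=["low", "medium", "high"],
)
print(porosity_binned.value_counts())
```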

Keywords: reconstructability analysis, machine learning, landslides, raster analysis

Procedia PDF Downloads 66
3543 Swedish–Nigerian Extrusion Research: Channel for Traditional Grain Value Addition

Authors: Kalep Filli, Sophia Wassén, Annika Krona, Mats Stading

Abstract:

Meeting the food security challenge of the growing population in Sub-Saharan Africa centers on agricultural transformation, as about 70% of the region's population is directly involved in farming. Research input can create economic opportunities, reduce malnutrition and poverty, and generate faster, fairer growth. Africa is discarding $4 billion worth of grain annually due to pre- and post-harvest losses. Grains and tubers play a central role in the food supply of the region, but their production has generally lagged behind because there has been no robust scientific input to meet the challenge. African grains are still chronically underutilized to the detriment of the well-being of the people of Africa and elsewhere. The major reason for their underutilization is that they are under-researched. Any commitment by the scientific community to intervene needs creative solutions focused on innovative approaches that will support economic growth. In order to overcome this hurdle, co-creation activities and initiatives are necessary. An example of such initiatives has been launched by Modibbo Adama University of Technology, Yola, Nigeria and RISE (the Research Institutes of Sweden), Gothenburg, Sweden. An exchange of expertise in research activities is in place under the 'Traditional Grain Network' programme as a channel for adding value to agricultural commodities in the region. Process technologies, such as extrusion, offer the possibility of creating products in the food and feed sectors with better storage stability, added value, lower transportation cost, and new markets. The Swedish–Nigerian initiative has focused on the development of high-protein pasta. Dry microscopy of the pasta samples shows a continuous structural framework of protein and starch matrix. The water absorption index (WAI) results showed that water was absorbed steadily and followed the master curve pattern. The WAI values ranged between 250 and 300%. In all aspects, the water absorption history was within a narrow range for all eight samples. The total cooking time for all eight samples in our study ranged between 5 and 6 minutes, with dry sample diameters ranging between 1.26 and 1.35 mm. The percentage water solubility index (WSI) ranged from 6.03 to 6.50%, which is a narrow range; the cooking loss, a measure related to WSI, is considered one of the main parameters taken into consideration during the assessment of pasta quality. The protein contents of the samples ranged between 17.33 and 18.60%. The cooked pasta firmness ranged from 0.28 to 0.86 N. The results show that increasing the ratio of cowpea flour and the level of pregelatinized cowpea tends to increase the firmness of the pasta. The breaking strength, an index of the toughness of the dry pasta, ranged from 12.9 to 16.5 MPa.

Keywords: cowpea, extrusion, gluten free, high protein, pasta, sorghum

Procedia PDF Downloads 196
3542 Glaucoma Detection in Retinal Tomography Using the Vision Transformer

Authors: Sushish Baral, Pratibha Joshi, Yaman Maharjan

Abstract:

Glaucoma is a chronic eye condition that causes irreversible vision loss. Early detection and treatment are critical to prevent vision loss because the disease can be asymptomatic. Multiple deep learning algorithms are used for the identification of glaucoma. Transformer-based architectures, which use the self-attention mechanism to encode long-range dependencies and acquire extremely expressive representations, have recently become popular. Convolutional architectures, on the other hand, lack knowledge of long-range dependencies in the image due to their intrinsic inductive biases. The aforementioned statements inspire this thesis to look at transformer-based solutions and investigate the viability of adopting transformer-based network designs for glaucoma detection. Using retinal fundus images of the optic nerve head to develop a viable algorithm to assess the severity of glaucoma necessitates a large number of well-curated images. Initially, data are generated by augmenting ocular pictures. After that, the ocular images are pre-processed to make them ready for further processing. The system is trained using the pre-processed images, and it classifies the input images as normal or glaucoma based on the features retrieved during training. The Vision Transformer (ViT) architecture is well suited to this task, as its self-attention mechanism supports structural modeling of the image. Extensive experiments are run on a common dataset, and the results are thoroughly validated and visualized.
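
As a rough illustration of the ViT architecture described above, the Keras sketch below builds a patch embedding and a single transformer encoder block for binary fundus classification; the patch size, embedding dimension, and depth are illustrative, not the thesis configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers

image_size, patch_size, dim, num_heads = 224, 16, 64, 4   # illustrative hyperparameters
num_patches = (image_size // patch_size) ** 2

class PatchEmbedding(layers.Layer):
    """Split the fundus image into patches, project them, and add positional embeddings."""
    def __init__(self):
        super().__init__()
        self.proj = layers.Conv2D(dim, kernel_size=patch_size, strides=patch_size)
        self.pos = layers.Embedding(input_dim=num_patches, output_dim=dim)
    def call(self, images):
        x = self.proj(images)                              # (batch, 14, 14, dim)
        x = tf.reshape(x, (-1, num_patches, dim))          # (batch, 196, dim) patch tokens
        return x + self.pos(tf.range(num_patches))

inputs = layers.Input(shape=(image_size, image_size, 3))
x = PatchEmbedding()(inputs)

# One pre-norm transformer encoder block (self-attention + MLP with residual connections).
y = layers.LayerNormalization()(x)
y = layers.MultiHeadAttention(num_heads=num_heads, key_dim=dim // num_heads)(y, y)
x = x + y
y = layers.LayerNormalization()(x)
y = layers.Dense(dim * 2, activation="gelu")(y)
y = layers.Dense(dim)(y)
x = x + y

x = layers.GlobalAveragePooling1D()(x)
outputs = layers.Dense(2, activation="softmax")(x)         # normal vs. glaucoma
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```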

Keywords: glaucoma, vision transformer, convolutional architectures, retinal fundus images, self-attention, deep learning

Procedia PDF Downloads 191
3541 PaSA: A Dataset for Patent Sentiment Analysis to Highlight Patent Paragraphs

Authors: Renukswamy Chikkamath, Vishvapalsinhji Ramsinh Parmar, Christoph Hewel, Markus Endres

Abstract:

Given a patent document, identifying distinct semantic annotations is an interesting research aspect. Text annotation helps patent practitioners such as examiners and patent attorneys quickly identify the key arguments of an invention, subsequently providing timely markup of the patent text. In the process of manual patent analysis, marking paragraphs to capture their semantic information is common practice for better readability. This semantic annotation process is laborious and time-consuming. To alleviate this problem, we propose a dataset for training machine learning algorithms to automate the highlighting process. The contributions of this work are: i) we developed a multi-class dataset of 150k samples by traversing USPTO patents over a decade, ii) we articulated the statistics and distributions of the data using exploratory data analysis, iii) baseline machine learning models were developed to utilize the dataset for the patent paragraph highlighting task, and iv) a future path to extend this work using deep learning and domain-specific pre-trained language models to develop a highlighting tool is provided. This work assists patent practitioners in highlighting semantic information automatically and aids in creating sustainable and efficient patent analysis using the aptitude of machine learning.
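
As an illustration of a simple baseline for the highlighting task described above, the sketch below trains a TF-IDF plus logistic regression classifier on toy paragraphs; the example texts and label names are invented, not the PaSA classes.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy stand-ins for patent paragraphs and their semantic labels (not the PaSA data).
paragraphs = [
    "The invention relates to a hinge assembly for foldable displays.",
    "In a preferred embodiment, the sensor is coupled to the controller.",
    "Prior art devices fail to address thermal expansion of the housing.",
    "The claimed method reduces latency by caching intermediate results.",
]
labels = ["technical-field", "embodiment", "prior-art", "advantage"]

# Baseline multi-class classifier: TF-IDF features + logistic regression.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(paragraphs, labels)
print(clf.predict(["The disclosed apparatus is attached to the frame in one embodiment."]))
```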

Keywords: machine learning, patents, patent sentiment analysis, patent information retrieval

Procedia PDF Downloads 91
3540 Test Suite Optimization Using an Effective Meta-Heuristic BAT Algorithm

Authors: Anuradha Chug, Sunali Gandhi

Abstract:

Regression testing is a very expensive and time-consuming process carried out to ensure the validity of modified software. Due to insufficient resources to re-execute all the test cases in a time-constrained environment, efforts are ongoing to generate test data automatically without human effort. Many search-based techniques have been proposed to generate efficient, effective, and optimized test data so that the overall cost of software testing can be minimized. The generated test data should be able to uncover all potential lapses that exist in the software or product. Inspired by the natural foraging behavior of bats, the current study employed a meta-heuristic, search-based bat algorithm for optimizing the test data on the basis of certain parameters without compromising their effectiveness. Mathematical functions are also applied that can effectively filter out the redundant test data. As many as 50 Java programs are used to check the effectiveness of the proposed test data generation, and it has been found that an 86% saving in testing effort can be achieved using the bat algorithm while covering 100% of the software code for testing. The bat algorithm was found to be more efficient in terms of simplicity and flexibility when the results were compared with other nature-inspired algorithms such as the Firefly Algorithm (FA), Hill Climbing (HC), and Ant Colony Optimization (ACO). The output of this study would be useful to testers as they can achieve 100% path coverage for testing with a minimum number of test cases.
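
A compact sketch of the bat algorithm's core loop (frequency tuning, velocity update, local random walk, loudness-gated acceptance); the sphere objective stands in for a real test-coverage fitness function and the parameter values are assumptions.

```python
import numpy as np

def bat_algorithm(fitness, dim, n_bats=20, n_iter=100, fmin=0.0, fmax=2.0,
                  loudness=0.5, pulse_rate=0.5, lower=-5.0, upper=5.0, seed=0):
    """Minimal bat algorithm for minimising `fitness` over a box-constrained search space."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lower, upper, size=(n_bats, dim))   # bat positions (candidate test data)
    v = np.zeros((n_bats, dim))                         # velocities
    fit = np.apply_along_axis(fitness, 1, x)
    best = x[np.argmin(fit)].copy()

    for _ in range(n_iter):
        for i in range(n_bats):
            freq = fmin + (fmax - fmin) * rng.random()           # frequency tuning
            v[i] += (x[i] - best) * freq
            candidate = np.clip(x[i] + v[i], lower, upper)
            if rng.random() > pulse_rate:                         # local random walk around the best bat
                candidate = np.clip(best + 0.01 * rng.normal(size=dim), lower, upper)
            f_new = fitness(candidate)
            if f_new <= fit[i] and rng.random() < loudness:       # accept improved solutions
                x[i], fit[i] = candidate, f_new
            if f_new <= fit.min():
                best = candidate.copy()
    return best, fitness(best)

# Example: minimise a simple sphere function as a stand-in for a test-coverage objective.
best_x, best_f = bat_algorithm(lambda z: float(np.sum(z**2)), dim=3)
print(best_x, best_f)
```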

Keywords: regression testing, test case selection, test case prioritization, genetic algorithm, bat algorithm

Procedia PDF Downloads 381
3539 The Emotional Experience of Urban Ruins and the Exploration of Urban Memory

Authors: Yan Jia China

Abstract:

Ruins are a kind of historical intention and, at the same time, a present reality of the developing city. The Zen culture of ancient China carries a profound aesthetic emotion; similarly, the West established the concept of an aesthetics of ruins alongside the Romantics' (such as Rousseau) sentiment toward historical ruins at the end of the 18th century. Nowadays, with the decline of traditional industrial society and the rise of the post-industrial age, contemporary society must face the problem of ruins and waste left behind by industrial society. Commencing from the perspective of emotion and memory, this paper analyzes the importance of emotional needs as well as the existing status of several projects, such as the Capital Steelworks in Beijing (industrial devastation), the Shibati old section in Chongqing (urban slums), and the Old Hurva Synagogue in Jerusalem (ruins of war). It emphasizes urban design that starts from emotion, and the sustainable development of city memory through managing the urban ruins, which people often view critically, from the perspective of ecology and art.

Keywords: cultural heritage, urban ruins, ecology, emotion, sustainable urban memory

Procedia PDF Downloads 440
3538 A Literature Review of Servant Leadership and Criticism of Advanced Research

Authors: So-Jung Kim, Kyoung-Seok Kim, Yeong-Gyeong Choi

Abstract:

Although there are many theories and discussions of leadership, the necessity of a new leadership paradigm has been emphasized. The existing leadership style of instruction and control has revealed its limitations. Market competition is becoming fierce and economic recession persists worldwide. Among the leadership theories, servant leadership was introduced recently and is in line with the environmental changes of the organization. Servant leadership is a combination of two words, 'servant' and 'leader', and can be defined as the role of the leader who focuses on doing voluntary work for others with altruistic ethics, makes members, customers, and local communities a priority, and makes a commitment to satisfying their needs. This leadership received attention as one field of leadership in the late 1990s and secured its legitimacy. This study discusses the existing research trends in leadership; the concept, behavioral characteristics, and sub-dimensions of servant leadership; compares servant leadership with existing leadership research; and assesses whether servant leadership is a useful concept for further leadership research. Finally, this study criticizes the limitations of the existing research on servant leadership.

Keywords: leadership philosophy, leadership theory, servant leadership, traditional leadership

Procedia PDF Downloads 363
3537 Detection of Adulterants in Milk Using IoT

Authors: Shaik Mohammad Samiullah Shariff, Siva Sreenath, Sai Haripriya, Prathyusha, M. Padma Lalitha

Abstract:

The Internet of Things (IoT) is an emerging technology that has been utilized to extend the possibilities of smart dairy farming (SDF). Milk consumption is continually increasing due to the world's growing population. As a result, some providers are prone to using dishonest measures to close the supply-demand imbalance, such as adding adulterants to milk. To identify the presence of adulterants in milk, traditional testing methods necessitate the use of particular chemicals and equipment. While efficient, these methods have the disadvantage of yielding difficult and time-consuming qualitative results. Furthermore, the same milk sample cannot be tested for other adulterants later. As a result, this study proposes an IoT-based approach for identifying adulterants in milk by measuring electrical conductivity (EC) or Total Dissolved Solids (TDS) and pH. To achieve this, an Arduino UNO microcontroller is used to assess the contaminants. When there is no adulteration, the pH and TDS values of milk range from 6.45 to 6.67 and 750 to 780 ppm, respectively, according to this study. Finally, the data is uploaded to the cloud via an IoT device connected to the Ubidots web platform.
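
A small sketch of the decision logic implied by the thresholds above; the pH range 6.45-6.67 and TDS range 750-780 ppm come from the abstract, while the sample readings are hypothetical.

```python
def is_adulterated(ph: float, tds_ppm: float) -> bool:
    """Flag a milk sample whose readings fall outside the unadulterated ranges
    reported in the study (pH 6.45-6.67, TDS 750-780 ppm)."""
    ph_ok = 6.45 <= ph <= 6.67
    tds_ok = 750 <= tds_ppm <= 780
    return not (ph_ok and tds_ok)

# Hypothetical readings pushed from the Arduino to the cloud dashboard.
samples = [(6.55, 762), (6.90, 915), (6.48, 730)]
for ph, tds in samples:
    print(ph, tds, "adulterated" if is_adulterated(ph, tds) else "normal")
```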

Keywords: internet of things (IoT), pH sensor, TDS sensor, EC sensor, industry 4.0

Procedia PDF Downloads 78
3536 Soft Pneumatic Actuators Fabricated Using Soluble Polymer Inserts and a Single-Pour System for Improved Durability

Authors: Alexander Harrison Greer, Edward King, Elijah Lee, Safa Obuz, Ruhao Sun, Aditya Sardesai, Toby Ma, Daniel Chow, Bryce Broadus, Calvin Costner, Troy Barnes, Biagio DeSimone, Yeshwin Sankuratri, Yiheng Chen, Holly Golecki

Abstract:

Although a relatively new field, soft robotics is experiencing a rise in applicability in the secondary school setting through The Soft Robotics Toolkit, shared fabrication resources and a design competition. Exposing students outside of university research groups to this rapidly growing field allows for development of the soft robotics industry in new and imaginative ways. Soft robotic actuators have remained difficult to implement in classrooms because of their relative cost or difficulty of fabrication. Traditionally, a two-part molding system is used; however, this configuration often results in delamination. In an effort to make soft robotics more accessible to young students, we aim to develop a simple, single-mold method of fabricating soft robotic actuators from common household materials. These actuators are made by embedding a soluble polymer insert into silicone. These inserts can be made from hand-cut polystyrene, 3D-printed polyvinyl alcohol (PVA) or acrylonitrile butadiene styrene (ABS), or molded sugar. The insert is then dissolved using an appropriate solvent such as water or acetone, leaving behind a negative form which can be pneumatically actuated. The resulting actuators are seamless, eliminating the instability of adhering multiple layers together. The benefit of this approach is twofold: it simplifies the process of creating a soft robotic actuator, and in turn, increases its effectiveness and durability. To quantify the increased durability of the single-mold actuator, it was tested against the traditional two-part mold. The single-mold actuator could withstand actuation at 20psi for 20 times the duration when compared to the traditional method. The ease of fabrication of these actuators makes them more accessible to hobbyists and students in classrooms. After developing these actuators, they were applied, in collaboration with a ceramics teacher at our school, to a glove used to transfer nuanced hand motions used to throw pottery from an expert artist to a novice. We quantified the improvement in the users’ pottery-making skill when wearing the glove using image analysis software. The seamless actuators proved to be robust in this dynamic environment. Seamless soft robotic actuators created by high school students show the applicability of the Soft Robotics Toolkit for secondary STEM education and outreach. Making students aware of what is possible through projects like this will inspire the next generation of innovators in materials science and robotics.

Keywords: pneumatic actuator fabrication, soft robotic glove, soluble polymers, STEM outreach

Procedia PDF Downloads 134
3535 Aerodynamic Modelling of Unmanned Aerial System through Computational Fluid Dynamics: Application to the UAS-S45 Balaam

Authors: Maxime A. J. Kuitche, Ruxandra M. Botez, Arthur Guillemin

Abstract:

As Unmanned Aerial Systems have found diverse uses in both military and civil aviation, the need to obtain accurate aerodynamic models has attracted enormous and growing interest. Recent modeling techniques are procedures using optimization algorithms and statistics that require many flight tests and are therefore extremely demanding in terms of cost. This paper presents a procedure to estimate the aerodynamic behavior of an unmanned aerial system from a numerical approach using computational fluid dynamics analysis. The study was performed using an unstructured mesh obtained from a grid convergence analysis at a Mach number of 0.14 and at an angle of attack of 0°. The flow around the aircraft was described using a standard k-ω turbulence model, and the Reynolds-Averaged Navier-Stokes (RANS) equations were solved using the ANSYS FLUENT software. The method was applied to the UAS-S45 designed and manufactured by Hydra Technologies in Mexico. The lift, drag, and pitching moment coefficients were obtained at different angles of attack for several flight conditions defined in terms of altitudes and Mach numbers. The results obtained from the computational fluid dynamics analysis were compared with the results obtained using the DATCOM semi-empirical procedure. This comparison indicated that our approach is highly accurate and that the aerodynamic model obtained could be useful for estimating the flight dynamics of the UAS-S45.
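
For reference, the aerodynamic coefficients extracted from such CFD runs follow the standard non-dimensionalisation; the sketch below uses placeholder force and geometry values, not the UAS-S45 results.

```python
def aero_coefficients(lift, drag, moment, rho, v, s_ref, c_ref):
    """Standard non-dimensionalisation of CFD force and moment outputs."""
    q = 0.5 * rho * v**2          # dynamic pressure
    cl = lift / (q * s_ref)
    cd = drag / (q * s_ref)
    cm = moment / (q * s_ref * c_ref)
    return cl, cd, cm

# Placeholder values: sea-level air density, Mach 0.14 (~47.6 m/s), assumed reference geometry.
rho, a_sound = 1.225, 340.0
v = 0.14 * a_sound
print(aero_coefficients(lift=1200.0, drag=95.0, moment=-60.0,
                        rho=rho, v=v, s_ref=2.0, c_ref=0.6))
```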

Keywords: aerodynamic modelling, CFD Analysis, ANSYS FLUENT, UAS-S45

Procedia PDF Downloads 375
3534 On the Framework of Contemporary Intelligent Mathematics Underpinning Intelligent Science, Autonomous AI, and Cognitive Computers

Authors: Yingxu Wang, Jianhua Lu, Jun Peng, Jiawei Zhang

Abstract:

The fundamental demand in contemporary intelligent science towards Autonomous AI (AI*) is the creation of unprecedented formal means of Intelligent Mathematics (IM). It is discovered that natural intelligence is inductively created rather than exhaustively trained. Therefore, IM is a family of algebraic and denotational mathematics encompassing Inference Algebra, Real-Time Process Algebra, Concept Algebra, Semantic Algebra, Visual Frame Algebra, etc., developed in our labs. IM plays indispensable roles in training-free AI* theories and systems beyond traditional empirical data-driven technologies. A set of applications of IM-driven AI* systems will be demonstrated in contemporary intelligence science, AI*, and cognitive computers.

Keywords: intelligence mathematics, foundations of intelligent science, autonomous AI, cognitive computers, inference algebra, real-time process algebra, concept algebra, semantic algebra, applications

Procedia PDF Downloads 61
3533 Conceptualizing the Knowledge to Manage and Utilize Data Assets in the Context of Digitization: Case Studies of Multinational Industrial Enterprises

Authors: Martin Böhmer, Agatha Dabrowski, Boris Otto

Abstract:

The trend of digitization significantly changes the role of data for enterprises. Data turn from an enabler to an intangible organizational asset that requires management and qualifies as a tradeable good. The idea of a networked economy has gained momentum in the data domain as collaborative approaches for data management emerge. Traditional organizational knowledge consequently needs to be extended by comprehensive knowledge about data. The knowledge about data is vital for organizations to ensure that data quality requirements are met and data can be effectively utilized and sovereignly governed. As this specific knowledge has been paid little attention to so far by academics, the aim of the research presented in this paper is to conceptualize it by proposing a “data knowledge model”. Relevant model entities have been identified based on a design science research (DSR) approach that iteratively integrates insights of various industry case studies and literature research.

Keywords: data management, digitization, industry 4.0, knowledge engineering, metamodel

Procedia PDF Downloads 356
3532 Creativity and Stereotype Threat: Analysis of the Impact of Creativity on Eliminating the Stereotype Threat in the Educational Setting

Authors: Aleksandra Gajda

Abstract:

Among students between 12 and 13 years of age, the probability of activating stereotype threat increases noticeably. Girls consider themselves weaker in science, while boys consider themselves weaker in language skills. This phenomenon is disturbing because it may result in wrong choices of the further path of education that are not consistent with the actual competences of the students. Meanwhile, the negative effects of stereotype threat, observable in the loss of focus on the task and its transfer to dealing with fear of failure, can be reduced by various factors. The study examined the impact of creativity on eliminating stereotype threat. An experiment in the form of a 2 (gender: male vs. female) x 3 (traditional gender roles: neutral version vs. nontraditional gender roles) x 2 (creativity: low vs. high) factorial design was conducted. The results showed that a high level of creative ability may reduce the negative effects of stereotype threat in an educational setting.

Keywords: creativity, education, language skills, mathematical skills, stereotype threat

Procedia PDF Downloads 119
3531 Constructions of Linear and Robust Codes Based on Wavelet Decompositions

Authors: Alla Levina, Sergey Taranov

Abstract:

The classical approach to providing noise immunity and integrity for information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient algorithms for encoding and decoding information, but these codes concentrate their detection and correction abilities on certain error configurations. Robust codes can protect against any configuration of errors at a predetermined probability. This is accomplished by the use of perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. The wavelet transform is applied in various fields of science. Some of its applications are cleaning a signal from noise, data compression, and spectral analysis of signal components. The article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions, we build generator and check matrices that contain the scaling function coefficients of the wavelet. Based on linear wavelet codes, we develop robust codes that provide uniform protection against all errors. In the article, we propose two constructions of robust codes. The first class of robust codes is based on the multiplicative inverse in a finite field. In the second robust code construction, the redundancy part is the cube of the information part. This paper also investigates the characteristics of the proposed robust and linear codes.
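
As an illustration of the generator/check-matrix mechanics mentioned above, the sketch below uses a standard binary (7, 4) example; the authors' matrices instead contain wavelet scaling-function coefficients, which this toy code does not reproduce.

```python
import numpy as np

# Systematic generator matrix G = [I | P] for a toy (7, 4) binary linear code
# (the parity part P is the standard Hamming(7,4) choice, used only as an illustration).
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])
H = np.hstack([P.T, np.eye(3, dtype=int)])     # parity-check matrix H = [P^T | I]

assert not (G @ H.T % 2).any()                 # G and H are consistent over GF(2)

message = np.array([1, 0, 1, 1])
codeword = message @ G % 2                     # encode

received = codeword.copy()
received[5] ^= 1                               # inject a single-bit error
syndrome = received @ H.T % 2                  # non-zero syndrome flags the error
print("codeword:", codeword, "syndrome:", syndrome)
```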

Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability

Procedia PDF Downloads 489
3530 Unsupervised Feature Learning by Pre-Route Simulation of Auto-Encoder Behavior Model

Authors: Youngjae Jin, Daeshik Kim

Abstract:

This paper describes cycle-accurate simulation results of weight values learned by an auto-encoder behavior model in pre-route simulation. Given the results, we visualized the first-layer representations with natural images. Many common deep learning threads have focused on learning high-level abstractions of unlabeled raw data by unsupervised feature learning. However, in the process of handling such a huge amount of data, the learning methods' computational complexity and time have limited advanced research. These limitations came from the fact that these algorithms were computed using only single-core CPUs. For this reason, parallel hardware, namely FPGAs, was seen as a possible solution to overcome these limitations. We adopted and simulated a ready-made auto-encoder to design a behavior model in Verilog HDL before designing the hardware. With the auto-encoder behavior model pre-route simulation, we obtained the cycle-accurate results of the parameters of each hidden layer by using MODELSIM. Cycle-accurate results are a very important factor in designing parallel digital hardware. Finally, this paper shows the appropriate operation of behavior-model-based pre-route simulation. Moreover, we visualized the learned latent representations of the first hidden layer with the Kyoto natural image dataset.

Keywords: auto-encoder, behavior model simulation, digital hardware design, pre-route simulation, Unsupervised feature learning

Procedia PDF Downloads 446
3529 Coupling Large Language Models with Disaster Knowledge Graphs for Intelligent Construction

Authors: Zhengrong Wu, Haibo Yang

Abstract:

In the context of escalating global climate change and environmental degradation, the complexity and frequency of natural disasters are continually increasing. Confronted with an abundance of information regarding natural disasters, traditional knowledge graph construction methods, which rely heavily on grammatical rules and prior knowledge, demonstrate suboptimal performance in processing complex, multi-source disaster information. This study, drawing upon past natural disaster reports, disaster-related literature in both English and Chinese, and data from various disaster monitoring stations, constructs question-answer templates based on large language models. Utilizing the P-Tuning method, the ChatGLM2-6B model is fine-tuned, leading to the development of a disaster knowledge graph based on large language models. This serves as a knowledge base supporting disaster emergency response.

Keywords: large language model, knowledge graph, disaster, deep learning

Procedia PDF Downloads 56