Search results for: dynamic capability approach
15096 Social Semantic Web-Based Analytics Approach to Support Lifelong Learning
Authors: Khaled Halimi, Hassina Seridi-Bouchelaghem
Abstract:
The purpose of this paper is to describe how learning analytics approaches based on social semantic web techniques can be applied to enhance lifelong learning experiences from a connectivist perspective. To this end, a prototype of a system called SoLearn (Social Learning Environment) that supports this approach has been developed. We observed and studied literature related to lifelong learning systems, the social semantic web and ontologies, connectivism theory, and learning analytics approaches, and reviewed implemented systems based on these fields to draw conclusions about the features necessary for enhancing the lifelong learning process. The semantic analytics of learning can be used for viewing, studying and analysing the massive data generated by learners, which helps them to understand, through recommendations, charts and figures, their learning and behaviour, and to detect where they have weaknesses or limitations. This paper emphasises that implementing a learning analytics approach based on social semantic web representations can enhance the learning process. On the one hand, the analysis process leverages the meaning expressed by the semantics presented in the ontology (relationships between concepts). On the other hand, it exploits the discovery of new knowledge by means of the inference mechanisms of the semantic web.
Keywords: connectivism, learning analytics, lifelong learning, social semantic web
Procedia PDF Downloads 215
15095 Enframing the Smart City: Utilizing Heidegger's 'The Question Concerning Technology' as a Framework to Interpret Smart Urbanism
Authors: Will Brown
Abstract:
Martin Heidegger is considered to be one of the leading philosophical lights of the 20th century, and his lecture/essay 'The Question Concerning Technology' has proved an invaluable text in the study of technology and the understanding of how technology influences the world it is set upon. However, this text has not yet been applied to the rapid rise and proliferation of ‘smart’ cities. This article applies the aforementioned text to the smart city in order to provide a fresh, if not critical, analysis and interpretation of this phenomenon. The first section below provides a brief literature review of smart urbanism in order to lay the groundwork necessary to apply Heidegger’s work to the smart city, from which a framework is developed to interpret the infusion of digital sensing technologies into the urban milieu. This framework comprises four concepts put forward in Heidegger’s text: circumscribing, bringing-forth, challenging, and standing-reserve. A concluding chapter is based upon the notion of enframement, arguing that once the rubric of data collection is placed within the urban system, future systems will require the capability to harvest data, resulting in an ever-renewing smart city.
Keywords: air quality sensing, big data, Martin Heidegger, smart city
Procedia PDF Downloads 208
15094 How Supply Chains Can Benefit from Open Innovation: Inspiration from Toyota Production System
Authors: Sam Solaimani, Jack A. A. van der Veen, Mehdi Latifi
Abstract:
Considering the increasingly VUCA (Volatile, Uncertain, Complex, Ambiguous) business market, innovation is the name of the game in contemporary business. Innovation is not created solely within the organization itself; its 'network environment' appears to be equally important for innovation. There are at least two streams of literature that emphasize the idea of using the extended organization to foster innovation capability, namely Supply Chain Collaboration (SCC) (also rooted in the Lean philosophy) and Open Innovation (OI). Remarkably, these two concepts are still considered totally different, in the sense that they appear in separate streams of literature and apply different concepts in pursuit of the same purposes. This paper explores the commonalities between the two concepts in order to further our conceptual understanding of how OI can effectively be applied in supply chain networks. Drawing on the available literature on OI, SCC and Lean, the paper concludes with five principles that help firms to contextualize the implementation of OI to the peculiar setting of supply chains. Theoretically, the present paper aims at contributing to the relatively under-researched theme of Supply Chain Innovation. In more practical terms, it provides the OI and SCC communities with workable know-how to seize on and sustain OI initiatives.
Keywords: lean philosophy, open innovation, supply chain collaboration, supply chain management
Procedia PDF Downloads 323
15093 Functional Instruction Set Simulator of a Neural Network IP with Native Brain Float-16 Generator
Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula
Abstract:
A functional model that mimics the functional correctness of a neural network compute accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. The major challenge we faced was the incompatibility of GCC compilers with the BF-16 data type, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, working with large GEMM (General Matrix Multiplication) or SpMM (Sparse Matrix Multiplication) workloads and debugging failures related to data integrity is highly painstaking. In this paper, we address the quality challenge of such a complex neural network accelerator design by proposing a functional-model-based scoreboard, or software model, written in SystemC. The proposed functional model executes the assembly code based on the ISA of the processor IP, decodes all instructions, and executes them as the DUT is expected to. The model provides considerable visibility and debug capability into the DUT by exposing the micro-steps of execution.
Keywords: ISA, neural network, Brain Float-16, DUT
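For orientation, a bfloat16 value keeps float32's sign bit and 8-bit exponent and truncates the 23-bit mantissa to 7 bits, which is why toolchains without BF-16 support need a separate generator. The sketch below is a minimal Python stand-in for such a generator (an illustration, not the authors' implementation); it converts via the float32 bit pattern with round-to-nearest-even:

```python
import struct

def float_to_bf16_bits(x: float) -> int:
    """Convert a Python float to the 16-bit pattern of a bfloat16 value.

    bfloat16 keeps float32's sign and 8-bit exponent and truncates the
    mantissa from 23 to 7 bits; rounding here is round-to-nearest-even.
    """
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    # Bias the 16 bits being dropped so that ties round to even.
    rounding_bias = 0x7FFF + ((bits >> 16) & 1)
    return ((bits + rounding_bias) >> 16) & 0xFFFF

def bf16_bits_to_float(b: int) -> float:
    """Widen a bfloat16 bit pattern back to float32 by zero-padding the mantissa."""
    (value,) = struct.unpack("<f", struct.pack("<I", (b & 0xFFFF) << 16))
    return value
```

Widening back to float32 is lossless (zero-padding), which is what makes BF-16 convenient for golden-model comparisons against a hardware DUT.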
Procedia PDF Downloads 94
15092 User-Perceived Quality Factors for Certification Model of Web-Based System
Authors: Jamaiah H. Yahaya, Aziz Deraman, Abdul Razak Hamdan, Yusmadi Yah Jusoh
Abstract:
One of the most essential issues in software products is to maintain their relevance to the dynamics of users' requirements and expectations. Many studies have been carried out on the quality aspects of software products to overcome these problems. Previous software quality assessment models and metrics have been introduced, each with strengths and limitations. In order to enhance assurance and confidence in software products, certification models have been introduced and developed. From our previous experience in certification exercises and case studies, conducted in collaboration with several agencies in Malaysia, the requirement for a user-based software certification approach was identified. The emergence of social network applications, new development approaches such as the agile method, and other varieties of software in the market have led to the domination of users over the software. As software becomes more accessible to the public through internet applications, users are becoming more critical of the quality of the services provided by the software. There are several categories of users in web-based systems, with different interests and perspectives. The classifications and metrics were identified through a brainstorming approach that included researchers, users and experts in this area. This new paradigm in software quality assessment is the main focus of our research. This paper discusses the classifications of users in web-based software system assessment and their associated factors and metrics for quality measurement. The quality model is derived based on the IEEE structure and the FCM model. These developments are beneficial and valuable for overcoming the constraints on, and improving the application of, the software certification model in future.
Keywords: software certification model, user centric approach, software quality factors, metrics and measurements, web-based system
Procedia PDF Downloads 405
15091 Health Status Monitoring of COVID-19 Patients through Blood Tests and Naïve Bayes
Authors: Carlos Arias-Alcaide, Cristina Soguero-Ruiz, Paloma Santos-Álvarez, Adrián García-Romero, Inmaculada Mora-Jiménez
Abstract:
Analysing clinical data with computers in a way that has an impact on practitioners’ workflow is a challenge nowadays. This paper provides a first approach to monitoring the health status of COVID-19 patients through the use of some biomarkers (blood tests) and the simplest Naïve Bayes classifier. Data from two Spanish hospitals were considered, showing the potential of our approach to estimate reasonable posterior probabilities even some days before the event.
Keywords: Bayesian model, blood biomarkers, classification, health tracing, machine learning, posterior probability
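As a rough illustration of the classifier the abstract names (not the authors' implementation; the biomarkers, class parameters and priors below are invented for the example), a Gaussian Naïve Bayes posterior over two blood features can be computed as:

```python
import math

def gaussian_pdf(x, mean, std):
    """Density of N(mean, std^2) at x."""
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

def naive_bayes_posterior(sample, class_params, priors):
    """Posterior P(class | sample) under the Naive Bayes independence assumption.

    class_params: {class: [(mean, std), ...]}, one (mean, std) per biomarker.
    """
    joint = {}
    for cls, params in class_params.items():
        likelihood = priors[cls]
        for x, (mean, std) in zip(sample, params):
            likelihood *= gaussian_pdf(x, mean, std)
        joint[cls] = likelihood
    total = sum(joint.values())
    return {cls: p / total for cls, p in joint.items()}

# Hypothetical example: features = (CRP in mg/L, lymphocytes in 10^9/L)
class_params = {"stable": [(10.0, 5.0), (2.0, 0.5)],
                "at_risk": [(80.0, 30.0), (0.8, 0.3)]}
priors = {"stable": 0.7, "at_risk": 0.3}
posterior = naive_bayes_posterior([90.0, 0.7], class_params, priors)
```

For this hypothetical sample (elevated CRP, low lymphocyte count), nearly all posterior mass lands on the at-risk class, which is the kind of early signal the abstract describes.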
Procedia PDF Downloads 233
15090 Image Processing Approach for Detection of Three-Dimensional Tree-Rings from X-Ray Computed Tomography
Authors: Jorge Martinez-Garcia, Ingrid Stelzner, Joerg Stelzner, Damian Gwerder, Philipp Schuetz
Abstract:
Tree-ring analysis is an important part of quality assessment and of the dating of (archaeological) wood samples. It provides quantitative data about the whole anatomical ring structure, which can be used, for example, to measure the impact of a fluctuating environment on tree growth, for the dendrochronological analysis of archaeological wooden artefacts, and to estimate the mechanical properties of the wood. Despite advances in computer vision and edge recognition algorithms, the detection and counting of annual rings are still limited to 2D datasets and performed in most cases manually, which is a time-consuming, tedious task that depends strongly on the operator’s experience. This work presents an image processing approach to detect the whole 3D tree-ring structure directly from X-ray computed tomography imaging data. The approach relies on a modified Canny edge detection algorithm, which captures fully connected tree-ring edges throughout the measured image stack, and is validated on X-ray computed tomography data taken from six wood species.
Keywords: ring recognition, edge detection, X-ray computed tomography, dendrochronology
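The core idea, that a ring boundary shows up as an intensity edge in the density data, can be reduced to a toy one-dimensional version. The sketch below is an illustration only, far simpler than the modified 3D Canny algorithm the abstract describes: it counts rings along a radial density profile by detecting rising threshold crossings (latewood being denser than earlywood):

```python
def count_rings(profile, threshold=None):
    """Count annual rings along a radial density profile.

    Each ring boundary appears as a rising crossing of the density
    threshold. A toy 1-D stand-in for edge detection on a CT slice.
    """
    if threshold is None:
        # Default threshold: mean density of the profile.
        threshold = sum(profile) / len(profile)
    crossings = 0
    for prev, cur in zip(profile, profile[1:]):
        if prev < threshold <= cur:
            crossings += 1
    return crossings
```

In the real 3D setting, the equivalent operation is gradient-magnitude thresholding with hysteresis across the whole image stack, so that ring surfaces stay connected from slice to slice.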
Procedia PDF Downloads 221
15089 Inclusive Cities Decision Matrix Based on a Multidimensional Approach for Sustainable Smart Cities
Authors: Madhurima S. Waghmare, Shaleen Singhal
Abstract:
The concepts of smartness, inclusion and sustainability are multidisciplinary and fuzzy, rooted in economic and social development theories and policies that are reflected in the spatial development of cities. It is a challenge to convert these concepts from aspirations into transforming actions. There is a dearth of assessment and planning tools to support city planners and administrators in developing smart, inclusive, and sustainable cities. To address this gap, this study develops an inclusive-cities decision matrix based on an exploratory approach and using mixed methods. The matrix is soundly based on a review of multidisciplinary urban-sector literature and is refined and finalized based on inputs from experts and insights from case studies. The application of the decision matrix to the case-study cities in India suggests that contemporary planning tools for cities need to be multidisciplinary and flexible in order to respond to the unique needs of diverse contexts. The paper suggests that a multidimensional and inclusive approach to city planning can play an important role in building sustainable smart cities.
Keywords: inclusive-cities decision matrix, smart cities in India, city planning tools, sustainable cities
Procedia PDF Downloads 156
15088 Light Harvesting Titanium Nanocatalyst for Remediation of Methyl Orange
Authors: Brajesh Kumar, Luis Cumbal
Abstract:
An eco-friendly Citrus paradisi peel extract mediated synthesis of TiO2 nanoparticles under sonication is reported. UV-vis, transmission electron microscopy, dynamic light scattering and X-ray analyses are performed to characterize the formation of the TiO2 nanoparticles. The particles are almost spherical in shape, with sizes of 60–140 nm, and the XRD peak at 2θ = 25.363° confirms the characteristic facets of the anatase form. The synthesized nanocatalyst is highly active in the decomposition of methyl orange (64 mg/L) in sunlight (~73%) over 2.5 hours.
Keywords: eco-friendly, TiO2 nanoparticles, Citrus paradisi, TEM
Procedia PDF Downloads 525
15087 A Data Mining Approach for Analysing and Predicting the Bank's Asset Liability Management Based on Basel III Norms
Authors: Nidhin Dani Abraham, T. K. Sri Shilpa
Abstract:
Asset liability management is an important aspect of the banking business. Moreover, today’s banking is governed by Basel III, which strictly regulates counterparty default. This paper focuses on the prediction and analysis of counterparty default risk, a type of risk that occurs when customers fail to repay the amount owed to the lender (a bank or any financial institution). The paper proposes an approach to reducing the counterparty risk occurring in financial institutions using an appropriate data mining technique, and thus predicts the occurrence of NPAs (non-performing assets). It also helps in asset building and in restructuring asset quality. Liability management is very important for carrying out banking business. To understand and analyze the depth of a bank's liabilities, a suitable technique is required; for that, a data mining technique is used to predict the dormant behaviour of various deposit customers. Various models are implemented, and the results for savings deposit customers are analyzed. All these data are cleaned using a data cleansing approach applied to the bank's data warehouse.
Keywords: data mining, asset liability management, Basel III, banking
Procedia PDF Downloads 553
15086 Analyzing Water Waves in Underground Pumped Storage Reservoirs: A Combined 3D Numerical and Experimental Approach
Authors: Elena Pummer, Holger Schuettrumpf
Abstract:
To date, underground pumped storage plants, an outstanding alternative to classical pumped storage plants, do not exist. They are needed to ensure the required balance between the production and demand of energy. As short- to medium-term storage, pumped storage plants have been operated economically over a long period of time, but their expansion is locally limited; the reasons are, in particular, the required topography and extensive human land use. Through the use of underground reservoirs instead of surface lakes, expansion options could be increased. While fulfilling the same functions, underground reservoirs give rise to specific hydrodynamic processes that shape their design and must be accounted for in the planning process of such systems. A combined 3D numerical and experimental approach leads to currently unknown results about the occurring wave types and their behaviour in dependence on different design and operating criteria. For the 3D numerical simulations, OpenFOAM was used and combined with an experimental approach in the laboratory of the Institute of Hydraulic Engineering and Water Resources Management at RWTH Aachen University, Germany. Using the finite-volume method and an explicit time discretization, a RANS simulation (k-ε) was run. Convergence analyses for different time discretizations, different meshes, etc., and clear comparisons between both approaches lead to the result that the numerical and experimental models can be combined and used as a hybrid model. Undular bores, partly with secondary waves, and breaking bores occurred in the underground reservoir. Different water levels and discharges change the global effects, defined as the time-dependent average of the water level, as well as the local processes, defined as the single, local hydrodynamic processes (water waves).
Design criteria such as branches, directional changes, changes in cross-section or bottom slope, as well as changes in roughness, have a great effect on the local processes; the global effects remain unaffected. Design calculations for underground pumped storage plants were developed on the basis of existing formulae and the results of the hybrid approach. Using the design calculations, reservoir heights as well as oscillation periods can be determined, leading to knowledge of the construction and operation possibilities of the plants. Consequently, future plants can be hydraulically optimized by applying the design calculations to the local boundary conditions.
Keywords: energy storage, experimental approach, hybrid approach, undular and breaking bores, 3D numerical approach
Procedia PDF Downloads 213
15085 Right-Wing Narratives Associated with Cognitive Predictors of Radicalization: Direct User Engagement Drives Radicalization
Authors: Julius Brejohn Calvert
Abstract:
This study aimed to investigate the ecological nature of extremism online. The construction of a far-right ecosystem was successful using a sample of posts, each with separate narrative domains. Most of the content expressed anti-Black racism and pro-white sentiments. Many posts expressed an overt disdain for the recent progress made regarding the United States and the United Kingdom’s expansion of civil liberties to people of color (POC). Of special note, several anti-LGBT posts targeted the ongoing political grievances expressed by the transgender community. Overall, the current study is able to demonstrate that direct measures of user engagement, such as shares and reactions, can be used to predict the effect of a post’s radicalization capabilities, although single posts do not operate on the cognitive processes of radicalization alone. In this analysis, the data support a theoretical framework in which individual posts have a higher radicalization capability based on the amount of user engagement (both indirect and direct) they receive.
Keywords: cognitive psychology, cognitive radicalization, extremism online, domestic extremism, political science, political psychology
Procedia PDF Downloads 71
15084 Anthraquinone Labelled DNA for Direct Detection and Discrimination of Closely Related DNA Targets
Authors: Sarah A. Goodchild, Rachel Gao, Philip N. Bartlett
Abstract:
A novel detection approach using immobilized DNA probes labeled with anthraquinone (AQ) as an electrochemically active reporter moiety has been successfully developed as a new, simple, reliable method for the detection of DNA. This method represents a step forward in DNA detection, as it can discriminate between multiple nucleotide polymorphisms within target DNA strands without the need for any additional reagents, reporters or processes such as melting of the DNA strands. The detection approach utilizes single-stranded DNA probes immobilized on gold surfaces, labeled at the distal terminus with AQ. The effective immobilization has been monitored using techniques such as AC impedance and Raman spectroscopy. Simple voltammetry techniques (differential pulse voltammetry, cyclic voltammetry) are then used to monitor the reduction potential of the AQ before and after the addition of the complementary strand of target DNA. A reliable relationship between the shift in reduction potential and the number of base-pair mismatches has been established and can be used to discriminate between DNA from highly related pathogenic organisms of clinical importance. This indicates that the approach may have great potential to be exploited in biosensor kits for the detection and diagnosis of pathogenic organisms in point-of-care devices.
Keywords: anthraquinone, discrimination, DNA detection, electrochemical biosensor
Procedia PDF Downloads 393
15083 Adaptive Multiple Transforms Hardware Architecture for Versatile Video Coding
Authors: T. Damak, S. Houidi, M. A. Ben Ayed, N. Masmoudi
Abstract:
The Versatile Video Coding (VVC) standard is currently under development by the Joint Video Exploration Team (JVET). An Adaptive Multiple Transforms (AMT) approach has been announced, based on different transform modules that provide efficient coding. However, the AMT solution raises several issues, especially regarding the complexity of the selected set of transforms. This can be an important issue, particularly for future industrial adoption. This paper proposes an efficient hardware implementation of the most used transform in the AMT approach: the DCT-II. The developed circuit is adapted to different block sizes and can reach a minimum frequency of 192 MHz, allowing an optimized execution time.
Keywords: adaptive multiple transforms, AMT, DCT II, hardware, transform, versatile video coding, VVC
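For reference, the DCT-II that such a circuit computes is, in its orthonormal form, X[k] = c(k) Σₙ x[n] cos(π(n + ½)k/N), with c(0) = √(1/N) and c(k) = √(2/N) for k > 0. A straightforward software golden model (an illustration, not the paper's hardware architecture) can be written as:

```python
import math

def dct_ii(block):
    """Orthonormal 1-D DCT-II (reference/golden model).

    X[k] = c(k) * sum_n x[n] * cos(pi * (n + 0.5) * k / N),
    with c(0) = sqrt(1/N) and c(k) = sqrt(2/N) for k > 0.
    """
    n_pts = len(block)
    out = []
    for k in range(n_pts):
        scale = math.sqrt((1 if k == 0 else 2) / n_pts)
        coeff = sum(x * math.cos(math.pi * (n + 0.5) * k / n_pts)
                    for n, x in enumerate(block))
        out.append(scale * coeff)
    return out
```

A fixed-point hardware DUT would typically be verified against such a floating-point reference within a tolerance; for a constant block only the DC coefficient is nonzero, and the orthonormal transform preserves signal energy.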
Procedia PDF Downloads 147
15082 Leveraging Unannotated Data to Improve Question Answering for French Contract Analysis
Authors: Touila Ahmed, Elie Louis, Hamza Gharbi
Abstract:
State-of-the-art question answering models have recently shown impressive performance, especially in a zero-shot setting. This approach is particularly useful when confronted with a highly diverse domain such as the legal field, in which it is increasingly difficult to have a dataset covering every notion and concept. In this work, we propose a flexible generative question answering approach to contract analysis, as well as a weakly supervised procedure to leverage unannotated data and boost our models’ performance in general, and their zero-shot performance in particular.
Keywords: question answering, contract analysis, zero-shot, natural language processing, generative models, self-supervision
Procedia PDF Downloads 194
15081 Dynamic Contrast-Enhanced Breast MRI Examinations: Clinical Use and Technical Challenges
Authors: Janet Wing-Chong Wai, Alex Chiu-Wing Lee, Hailey Hoi-Ching Tsang, Jeffrey Chiu, Kwok-Wing Tang
Abstract:
Background: Mammography has limited sensitivity and specificity, though it is the primary imaging technique for the detection of early breast cancer. Ultrasound imaging and contrast-enhanced MRI are useful adjunct tools to mammography. The advantage of breast MRI is its high sensitivity for invasive breast cancer. Therefore, indications for and use of breast magnetic resonance imaging have increased over the past decade. Objectives: 1. Case demonstrations of different indications for breast MR imaging. 2. To review the common artifacts and pitfalls in breast MR imaging. Materials and Methods: This is a retrospective study including all patients who underwent a dynamic contrast-enhanced breast MRI examination in our centre from Jan 2011 to Dec 2017. The clinical data and radiological images were retrieved from the EPR (electronic patient record), RIS (Radiology Information System) and PACS (Picture Archiving and Communication System). Results and Discussion: Cases include (1) screening of the contralateral breast in a patient with a new breast malignancy; (2) breast augmentation with free injection of unknown foreign materials; (3) a finding of axillary adenopathy with an unknown site of primary malignancy; (4) neo-adjuvant chemotherapy: before, during, and after chemotherapy to evaluate treatment response and the extent of residual disease prior to operation. Relevant images will be included and illustrated in the presentation. As with other types of MR imaging, there are different artifacts and pitfalls that can potentially limit interpretation of the images. Because of the coils and software specific to breast MR imaging, there are some other technical considerations unique to MR imaging of the breast. Case demonstration images will be available in the presentation. Conclusion: Breast MR imaging is a highly sensitive and reasonably specific method for the detection of breast cancer.
Adherence to appropriate clinical indications and technical optimization is crucial for achieving satisfactory images for interpretation.
Keywords: MRI, breast, clinical, cancer
Procedia PDF Downloads 241
15080 Nationalization of the Social Life in Argentina: Accumulation of Capital, State Intervention, Labor Market, and System of Rights in the Last Decades
Authors: Mauro Cristeche
Abstract:
This work begins with a very simple question: how does the State spend? Argentina is witnessing a process of growing nationalization of social life, so it is necessary to seek explanations for the phenomenon in the specific dynamics of the capitalist mode of production in Argentina and its transformations over the last decades. The new question then is: what happened in Argentina that could explain this phenomenon? Since the seventies, capital growth in Argentina has faced deep competitive problems. Until that moment, agrarian wealth had worked as a compensation mechanism, but it began to find its limits. In the meantime, some important demographic and structural changes had happened. The strategy of the capitalist class came to be to seek in the cheapness of the labor force the main source of compensation for its weakness. As a result, a tendency towards the worsening of living conditions and the fragmentation of the working class started to develop, manifested in unemployment, underemployment, and, as a highlighted fact, the fall in the purchasing power of wages. As a consequence, it is suggested that the role of the State became stronger and public expenditure increased, as a historical trend, because the State has to intervene to face the contradictions and constant growth problems posed by the development of capitalism in Argentina. On the one hand, the State has to guarantee the process of buying the cheapened workforce and, at the same time, the process of reproduction of the working class. On the other hand, it has to help reproduce the individual capitals but needs to ‘attack’ them in different ways. This is why the State is said to act as the general political representative of the national portion of the total social capital. What will be studied is the dynamic of the intervention of the Argentine State in the context of the particular national process of capital growth over the last decades.
This paper aims to show the main general causes that could explain the phenomenon of the nationalization of social life, and how it has impacted the living conditions of the working class and the system of rights.
Keywords: Argentina, nationalization, public policies, rights, state
Procedia PDF Downloads 136
15079 Integration of Big Data to Predict Transportation for Smart Cities
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Intelligent transportation systems are essential to building smarter cities. Machine-learning-based transportation prediction is a highly promising approach because it makes invisible aspects visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology. Among urban transportation systems, this research focuses on the bus system. The research problem is that existing headway models cannot respond to dynamic transportation conditions; thus, bus delays often occur. To overcome this problem, a prediction model is presented to find patterns of bus delay by means of machine learning applied to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the result. The prototype model is composed of real-time bus data. The data are gathered through public data portals and real-time Application Program Interfaces (APIs) provided by the government. These data are fundamental resources for organizing interval-pattern models of bus operations together with traffic environment factors (road speeds, station conditions, weather, and real-time operating information of the buses). The prototype model is designed with a machine learning tool (RapidMiner Studio), and tests for bus delay prediction were conducted. This research presents experiments to increase the prediction accuracy for bus headway by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Therefore, based on this analysis method, this research represents an effective use of machine learning and urban big data to understand urban dynamics.
Keywords: big data, machine learning, smart city, social cost, transportation network
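As a toy illustration of the prediction step (the paper itself uses RapidMiner Studio; the features and figures below are invented for the example), a bus delay can be estimated from similar past situations with a simple k-nearest-neighbour rule over context features such as road speed and rainfall:

```python
def predict_delay(history, query, k=3):
    """Predict bus delay (minutes) from context features via k-nearest neighbours.

    history: list of (features, delay) pairs, e.g. features = (road_speed_kmh, rain_mm).
    A simple stand-in for the machine-learning step described above.
    """
    def dist(a, b):
        # Euclidean distance in feature space.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(history, key=lambda rec: dist(rec[0], query))[:k]
    # Average the delays of the k most similar past situations.
    return sum(delay for _, delay in nearest) / len(nearest)

# Hypothetical records: fast dry traffic -> small delays, slow rainy traffic -> large delays.
history = [((50, 0), 1.0), ((48, 0), 1.5),
           ((20, 10), 9.0), ((18, 12), 11.0), ((22, 8), 10.0)]
prediction = predict_delay(history, (19, 11))
```

A query resembling the slow, rainy records is predicted to have a delay near theirs, which is exactly the "pattern of bus delay" behaviour a flexible headway model exploits.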
Procedia PDF Downloads 260
15078 An Approach for Ensuring Data Flow in Freight Delivery and Management Systems
Authors: Aurelija Burinskienė, Dalė Dzemydienė, Arūnas Miliauskas
Abstract:
This research aims at developing an approach for more effective freight delivery and transportation process management. Road congestion and the identification of its causes are important, as are context information recognition and management. Measuring many parameters during the transportation period and properly controlling driver work have become a problem. The number of vehicles passing a given point per unit of time can be evaluated in some situations. The collected data are mainly used to establish new trips. The flow of data is more complex in urban areas; here, the movement of freight is reported in detail, including information at street level. In congestion cases, when traffic density is extremely high and traffic speed is incredibly low, data transmission reaches its peak. Different data sets are generated, depending on the type of freight delivery network. There are three types of network: long-distance delivery networks, last-mile delivery networks and mode-based delivery networks; the last includes different modes, in particular railways and other networks. When freight delivery is switched from one of the above-stated network types to another, more data may be included for reporting purposes, and vice versa. In this case, a significant amount of these data is used for control operations, and the problem requires an integrated methodological approach. The paper presents an approach to providing e-services for drivers by including an assessment of the multi-component infrastructure needed for the delivery of freight according to network type. The construction of such a methodology is required to evaluate data flow conditions and overloads, and to minimize the time gaps in data reporting.
The results obtained show the potential of the proposed methodological approach to support management and decision-making processes by incorporating network specifics and helping to minimize overloads in data reporting.
Keywords: transportation networks, freight delivery, data flow, monitoring, e-services
Procedia PDF Downloads 126
15077 ANA Negative but FANA Positive Patients with Clinical Symptoms of Rheumatic Disease: The Suggestion for Clinicians
Authors: Abdolreza Esmaeilzadeh, Mehri Mirzaei
Abstract:
Objective: Rheumatic disease is a chronic disease that causes pain, stiffness, swelling and limited motion and function of many joints. RA is the most common form of autoimmune arthritis, affecting more than 1.3 million Americans; of these, about 75% are women. Materials and Methods: This study was motivated by misconceptions about the ANA test, which is frequently performed with solid-phase methods such as ELISA. The experiment was conducted on 430 patients with clinical symptoms suggestive of rheumatic disease, tested simultaneously by means of ANA and FANA. Results: 36 patients (8.37%), despite a positive ANA, demonstrated negative results via the indirect immunofluorescence assay (IIFA) (false positives). 116 patients (27%) demonstrated negative ANA results by means of the ELISA technique, although they had positive IIFA results. Conclusion: Other advantages of IIFA are antibody titration and specific pattern detection, with the capability of distinguishing positive dsDNA results. Given these restrictions and the false-negative cases in patients, the IIFA test is highly recommended for the diagnosis of these diseases.
Keywords: autoimmune disease, IIFA, EIA, rheumatic disease
Procedia PDF Downloads 499
15076 Human Action Retrieval System Using Features Weight Updating Based Relevance Feedback Approach
Authors: Munaf Rashid
Abstract:
For content-based human action retrieval systems, search accuracy is often inferior for two reasons: 1) global information pertaining to videos is ignored entirely, and only low-level motion descriptors are considered significant features for matching the similarity between query and database videos; and 2) there is a semantic gap between high-level user concepts and low-level visual features. Hence, in this paper we propose a method that addresses these two issues, and in doing so the paper contributes in two ways. Firstly, we introduce a method that uses both global and local information in one framework for the action retrieval task. Secondly, to narrow the semantic gap, the user's concept is brought in through a features weight updating (FWU) Relevance Feedback (RF) approach. We use statistical characteristics to dynamically update the weights of the feature descriptors, so that after every RF iteration the feature space is modified accordingly. For testing and validation, two human action recognition datasets were utilized, namely Weizmann and UCF. Results show that the proposed approach performs well even in the presence of a number of visual challenges. Keywords: relevance feedback (RF), action retrieval, semantic gap, feature descriptor, codebook
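The abstract does not give the exact update rule, but a common statistics-based weight update in relevance-feedback retrieval assigns each feature dimension a weight inversely proportional to its standard deviation over the relevant examples (a feature that is stable among relevant items is treated as more discriminative). A minimal sketch along those lines, with the inverse-std rule and all names assumed rather than taken from the paper:

```python
import numpy as np

def update_weights(relevant_feats, eps=1e-6):
    """Inverse-standard-deviation weight update (assumed rule).

    relevant_feats: (n_relevant, n_dims) array of descriptors the
    user marked as relevant in the current RF iteration.
    """
    sigma = relevant_feats.std(axis=0)
    w = 1.0 / (sigma + eps)          # stable dimensions get larger weights
    return w / w.sum()               # normalise so weights sum to 1

def weighted_distance(query, candidate, w):
    # Weighted Euclidean distance used to re-rank after each RF iteration.
    return np.sqrt(np.sum(w * (query - candidate) ** 2))

# Toy iteration: dimension 0 is consistent among relevant items while
# dimension 1 is noisy, so dimension 0 should dominate the ranking.
relevant = np.array([[1.0, 0.2], [1.1, 0.9], [0.9, 0.1]])
w = update_weights(relevant)
print(w)  # weight of dim 0 > weight of dim 1
```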
Procedia PDF Downloads 475
15075 Displacement Based Design of a Dual Structural System
Authors: Romel Cordova Shedan
Abstract:
Traditional seismic design follows the methodology of Forced Based Design (FBD). Displacement Based Design (DBD) is a seismic design methodology that accepts structural damage, so that the structure reaches a controlled failure mechanism before collapse. It is easier to quantify the damage of a structure through displacements than through forces; therefore, for a structure to achieve its inelastic design displacement with good ductility, it must sustain damage. The first part of this investigation covers the differences between the DBD and FBD methodologies, together with some advantages of DBD. The second part presents a case study of a dual 5-storey building, regular in plan and elevation, located in a seismic zone where the design acceleration on firm soil is 45% of the acceleration of gravity. Both methodologies are applied to the case study to compare displacements, shear forces and overturning moments. In the third part, a Dynamic Time History Analysis (DTHA) is performed to compare the displacements with those of the DBD and FBD methodologies. Three accelerograms were used, with their acceleration magnitudes scaled to be compatible with the design spectrum. Then, following ASCE 41-13 guidelines, plastic hinges were assigned to the structure. Finally, the results of both methodologies for the case study are compared. It is important to note that the seismic performance level of the building is greater for DBD than for the FBD method, because the DBD drifts are on the order of 2.0% to 2.5%, compared with FBD drifts of 0.7%; therefore, the displacements of DBD are greater than those of the FBD method. The shear forces of DBD also turn out greater than those of the FBD methodology. These strengths of the DBD method ensure that the structure achieves its design inelastic displacements, because they were obtained through a displacement spectrum reduction factor that depends on the damping and ductility of the dual system. The displacements for the case study under DBD are also greater than those under FBD and DTHA, which confirms that the seismic performance level of the building is greater for DBD than for the FBD method. Keywords: displacement-based design, displacement spectrum reduction factor, dynamic time history analysis, forced based design
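The abstract does not reproduce the reduction-factor expression itself; one widely used DDBD form (assumed here, following Priestley-style relations for concrete structures, and possibly different from the paper's own) ties the factor to equivalent viscous damping, which in turn grows with displacement ductility:

```python
import math

def equivalent_viscous_damping(mu, elastic=0.05):
    """Equivalent viscous damping for a concrete structure (a Priestley-style
    DDBD relation, assumed here; the paper may use a different expression)."""
    return elastic + 0.565 * (mu - 1.0) / (mu * math.pi)

def spectrum_reduction_factor(xi):
    """Displacement spectrum reduction factor, EC8-style form:
    R_xi = sqrt(0.07 / (0.02 + xi)); equals 1 at 5% elastic damping."""
    return math.sqrt(0.07 / (0.02 + xi))

mu = 4.0                                  # assumed displacement ductility
xi = equivalent_viscous_damping(mu)       # ~0.18 for mu = 4
r = spectrum_reduction_factor(xi)
print(f"xi = {xi:.3f}, R_xi = {r:.2f}")   # R_xi < 1 scales down the elastic spectrum
```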
Procedia PDF Downloads 229
15074 An Approach to Improve Pre University Students' Responsible Environmental Behaviour through Science Writing Heuristic in Malaysia
Authors: Sheila Shamuganathan, Mageswary Karpudewan
Abstract:
This study investigated the effectiveness of green chemistry integrated with the Science Writing Heuristic (SWH) in enhancing matriculation students’ responsible environmental behaviour. For this purpose, 207 matriculation students were randomly assigned to an experimental (N=118) or a control (N=89) group. In the experimental group the chemistry concepts were taught using the instructional approach of green chemistry integrated with SWH, while in the control group the same content was taught using green chemistry alone. The data were analysed using ANCOVA, and the quantitative findings reveal a significant change in responsible environmental behaviour, F(1, 204) = 32.13 (ηp² = 0.14), favouring the experimental group. The responses from interviews with the experimental group further strengthen the quantitative data and indicate a significant improvement in responsible environmental behaviour. The outcome of the study suggests that green chemistry integrated with SWH could be an alternative approach for improving students’ responsible environmental behaviour towards the environment. Keywords: science writing heuristic, green chemistry, pro environmental behaviour, laboratory
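As a consistency check on the reported statistics (not part of the original analysis), partial eta squared for a single ANCOVA effect can be recovered from the F value and its degrees of freedom, and it matches the reported 0.14:

```python
# Recover partial eta squared from the reported F statistic:
# eta_p^2 = (F * df1) / (F * df1 + df2) for a single effect.
F, df1, df2 = 32.13, 1, 204

eta_p_sq = (F * df1) / (F * df1 + df2)
print(f"partial eta^2 = {eta_p_sq:.2f}")  # 0.14, as reported
```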
Procedia PDF Downloads 317
15073 The Effect of Manure Loaded Biochar on Soil Microbial Communities
Authors: T. Weber, D. MacKenzie
Abstract:
This paper describes the use of an advanced simulation environment for electronic systems (microcontroller, operational amplifiers, and FPGA). The simulation was used for the behaviour of non-linear dynamic systems, with the required observer structure operating in a parallel real-time simulation based on a state-space representation. The proposed model also covers electrodynamic effects, including ionising effects and the eddy-current distribution. With the script and the proposed method, it is possible to calculate the spatial distribution of the electromagnetic fields in such systems in real time; the spatial temperature distribution may also be used for further purposes. With this system, uncertainties and disturbances may be determined. This provides a more precise estimation of the system states and, additionally, an estimation of the ionising disturbances that arise from radiation effects in space systems. The results have also shown that a system can be developed specifically for the real-time calculation (estimation) of the radiation effects alone. Electronic systems can be damaged by impacts with charged-particle flux in space or in a radiation environment. A Total Ionising Dose (TID) of 1 Gy and Single-Event Transient (SET)-free operation up to 50 MeV·cm²/mg may assure certain functions. Single-Event Latch-up (SEL) results from the placement of several transistors in the shared substrate of an integrated circuit, where ionising radiation can activate an additional parasitic thyristor; this short circuit between semiconductor elements can destroy the device in the absence of protection and countermeasures. Single-Event Burnout (SEB), on the other hand, increases the current between the drain and source of a MOSFET and destroys the component in a short time. A Single-Event Gate Rupture (SEGR) can likewise destroy a dielectric of a semiconductor.
In order to be able to react to these processes, the presence of ionising radiation and dose must be calculated within a short time. For this purpose, sensors may be used for a realistic evaluation of the diffusion and ionising effects on the test system. A Peltier element is used to evaluate the dynamic temperature increase (dT/dt), from which a measure of the ionisation processes, and thus of radiation, is derived. In addition, a piezo element may be used to record highly dynamic vibrations and oscillations arising from impacts of charged-particle flux. All available sensors are also used to calibrate the spatial distributions: from the measured values and the known locations of the sensors, the entire distribution in space can be calculated retroactively, or more accurately. With this information, the type of ionisation and its direct effect on the systems can be determined, and preventive processes, up to shutdown, can be activated. The results show possibilities for performing faster and more qualitative simulations independently of the space system and radiation environment. The paper additionally gives an overview of the diffusion effects and their mechanisms. Keywords: cattle, biochar, manure, microbial activity
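As a rough illustration of the Peltier-based detection idea, and with all numbers, names and the threshold purely assumed placeholders rather than values from the paper, the dT/dt criterion can be sketched as a finite-difference check on a temperature trace:

```python
import numpy as np

def dtemp_rate(samples, dt):
    """Finite-difference estimate of dT/dt from a Peltier temperature trace."""
    return np.diff(samples) / dt

def ionisation_alarm(samples, dt, threshold):
    """Flag time steps whose heating rate exceeds a calibration threshold.
    The threshold, and the mapping from dT/dt to dose, are assumed here."""
    return dtemp_rate(samples, dt) > threshold

temps = np.array([20.0, 20.01, 20.02, 20.50, 21.10])  # sudden rise at step 3
alarms = ionisation_alarm(temps, dt=0.1, threshold=1.0)  # K/s
print(alarms)  # only the fast-rise steps are flagged
```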
Procedia PDF Downloads 103
15072 The Optimization of Immobilization Conditions for Biohydrogen Production from Palm Industry Wastewater
Authors: A. W. Zularisam, Sveta Thakur, Lakhveer Singh, Mimi Sakinah Abdul Munaim
Abstract:
Clostridium sp. LS2 was immobilised by entrapment in polyethylene glycol (PEG) gel beads to improve the rate of biohydrogen production from palm oil mill effluent (POME). We sought to explore and optimise the hydrogen production capability of the immobilised cells by studying the conditions for cell immobilisation, including PEG concentration, cell loading and curing time, as well as the effects of temperature and of K2HPO4 (500–2000 mg/L), NiCl2 (0.1–5.0 mg/L), FeCl2 (100–400 mg/L) and MgSO4 (50–200 mg/L) concentrations on the hydrogen production rate. The results showed that by optimising the PEG concentration (10% w/v), initial biomass (2.2 g dry weight), curing time (80 min) and temperature (37 °C), as well as the concentrations of K2HPO4 (2000 mg/L), NiCl2 (1 mg/L), FeCl2 (300 mg/L) and MgSO4 (100 mg/L), a maximum hydrogen production rate of 7.3 L/L-POME/day and a yield of 0.31 L H2/g chemical oxygen demand were obtained during continuous operation. We believe that this process may potentially be scaled up for sustained, large-scale hydrogen production. Keywords: hydrogen, polyethylene glycol, immobilised cell, fermentation, palm oil mill effluent
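One figure not stated in the abstract can be derived from the two reported numbers: dividing the volumetric hydrogen production rate by the yield gives the implied chemical oxygen demand (COD) conversion rate of the reactor:

```python
# Derived figure (not stated in the abstract): combining the volumetric
# production rate with the yield gives the implied COD conversion rate.
h2_rate = 7.3     # L H2 per L POME per day (reported maximum)
h2_yield = 0.31   # L H2 per g COD (reported yield)

cod_conversion = h2_rate / h2_yield   # g COD per L POME per day
print(f"{cod_conversion:.1f} g COD/L/day")  # ~23.5
```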
Procedia PDF Downloads 272
15071 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)
Authors: Gule Teri
Abstract:
The effective validation of analytical methods is a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to account fully for inherent variation within datasets, which may produce inconsistent outcomes. This loss of validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients or impurities, putting at risk the reliability of the results and, subsequently, the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach measures variability explicitly with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. The method provides a systematic, statistically grounded validation technique that improves the reliability of results and offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this method, pharmaceutical manufacturers can substantially advance their validation processes and thereby improve the overall quality and safety of their products. This paper details the development, application and advantages of the tolerance-based DoE approach and demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification. The paper also discusses the potential implications and future applications of this method in enhancing pharmaceutical manufacturing practices and outcomes. Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing
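The abstract does not specify the tolerance-referenced metric; one common way to measure variability against design margins (assumed here purely for illustration, with made-up HPLC replicate data) is a capability-index-style ratio of the tolerance width to the method's spread:

```python
import statistics

def tolerance_capability(measurements, lsl, usl):
    """Tolerance-referenced precision measure (an assumed illustration;
    the paper's own metric is not given in the abstract).
    Analogous to a process capability index: Cp = (USL - LSL) / (6*sigma)."""
    sigma = statistics.stdev(measurements)   # sample standard deviation
    return (usl - lsl) / (6.0 * sigma)

# Hypothetical replicate HPLC assay results (% label claim)
# checked against 95-105% specification limits.
replicates = [99.2, 100.1, 99.8, 100.4, 99.5, 100.0]
cp = tolerance_capability(replicates, lsl=95.0, usl=105.0)
print(f"Cp = {cp:.1f}")  # Cp >= 1.33 is a common acceptance threshold
```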
Procedia PDF Downloads 80
15070 Integrating Data Mining within a Strategic Knowledge Management Framework: A Platform for Sustainable Competitive Advantage within the Australian Minerals and Metals Mining Sector
Authors: Sanaz Moayer, Fang Huang, Scott Gardner
Abstract:
In today's highly leveraged business world, an organisation's success depends on how it manages and organises its traditional and intangible assets. In the knowledge-based economy, knowledge as a valuable asset gives enduring capability to firms competing in rapidly shifting global markets. It can be argued that the ability to create unique knowledge assets by configuring ICT and human capabilities will be a defining factor for international competitive advantage in the mid-21st century. The concept of KM is recognised in the strategy literature, and increasingly by senior decision-makers (particularly in large firms, which can achieve scalable benefits), as an important vehicle for stimulating innovation and organisational performance in the knowledge economy. This thinking has been evident in professional services and other knowledge-intensive industries for over a decade. It highlights the importance of social capital and the value of the intellectual capital embedded in social and professional networks, complementing the traditional focus on the creation of intellectual property assets. Despite the growing interest in KM within professional services, there has been limited discussion of multinational resource-based industries such as mining and petroleum, where the focus has been principally on global portfolio optimisation with economies of scale, process efficiencies and cost reduction. The Australian minerals and metals mining industry, although traditionally viewed as capital intensive, employs a significant number of knowledge workers, notably engineers, geologists, highly skilled technicians, and legal, finance, accounting, ICT and contracts specialists working in projects or functions, representing potential knowledge silos within the organisation. This silo effect arguably inhibits knowledge sharing and retention by disaggregating corporate memory, with increased operational and project continuity risk.
It may also limit the potential for process, product and service innovation. In this paper, the strategic application of knowledge management incorporating contemporary ICT platforms and data mining practices is explored as an important enabler for knowledge discovery, reduction of risk and retention of corporate knowledge in resource-based industries. With reference to the relevant strategy, management and information systems literature, this paper highlights possible connections (currently undergoing empirical testing) between a Strategic Knowledge Management (SKM) framework incorporating supportive Data Mining (DM) practices and competitive advantage for multinational firms operating within the Australian resource sector. We also propose, based on a review of the relevant literature, that more effective management of soft- and hard-systems knowledge is crucial for major Australian firms in all sectors seeking to improve organisational performance through the human and technological capability captured in organisational networks. Keywords: competitive advantage, data mining, mining organisation, strategic knowledge management
Procedia PDF Downloads 415
15069 A Fuzzy Decision Making Approach for Supplier Selection in Healthcare Industry
Authors: Zeynep Sener, Mehtap Dursun
Abstract:
Supplier evaluation and selection is one of the most important components of an effective supply chain management system. Due to the expanding competition in healthcare, selecting the right medical device suppliers offers great potential for increasing quality while decreasing costs. This paper proposes a fuzzy decision making approach for medical supplier selection. A real-world medical device supplier selection problem is presented to illustrate the application of the proposed decision methodology. Keywords: fuzzy decision making, fuzzy multiple objective programming, medical supply chain, supplier selection
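The abstract does not detail the fuzzy model; a minimal sketch of fuzzy supplier scoring, with the triangular fuzzy ratings, illustrative criteria weights and centroid defuzzification all assumed rather than taken from the paper, might look like this:

```python
# Minimal fuzzy supplier-scoring sketch (illustrative only; the paper's
# fuzzy multiple objective programming model is not given in the abstract).
# Triangular fuzzy numbers (l, m, u) encode linguistic ratings.

RATINGS = {"poor": (0.0, 0.1, 0.3), "fair": (0.3, 0.5, 0.7),
           "good": (0.5, 0.7, 0.9), "excellent": (0.7, 0.9, 1.0)}

def fuzzy_score(ratings, weights):
    """Weighted sum of triangular fuzzy ratings, component-wise."""
    l = sum(w * RATINGS[r][0] for r, w in zip(ratings, weights))
    m = sum(w * RATINGS[r][1] for r, w in zip(ratings, weights))
    u = sum(w * RATINGS[r][2] for r, w in zip(ratings, weights))
    return (l, m, u)

def defuzzify(tri):
    """Centroid of a triangular fuzzy number."""
    return sum(tri) / 3.0

# Two hypothetical device suppliers rated on quality, cost, delivery.
weights = [0.5, 0.3, 0.2]
supplier_a = defuzzify(fuzzy_score(["good", "excellent", "fair"], weights))
supplier_b = defuzzify(fuzzy_score(["fair", "good", "good"], weights))
print(supplier_a > supplier_b)  # supplier A ranks higher here
```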
Procedia PDF Downloads 451
15068 Environmental Radioactivity Analysis by a Sequential Approach
Authors: G. Medkour Ishak-Boushaki, A. Taibi, M. Allab
Abstract:
Quantitative environmental radioactivity measurements are needed to determine the level of exposure of a population to ionising radiation and to assess the associated risks. Gamma spectrometry remains a very powerful tool for analysing the radionuclides present in an environmental sample, but the basic problem in such measurements is the low rate of detected events. Using large environmental samples could help to get around this difficulty but, unfortunately, new issues are then raised by gamma-ray attenuation and self-absorption. Recently, a new method has been suggested for detecting and identifying, without quantification and in a short time, the gamma rays of a low-count source. This method does not require a pulse-height spectrum acquisition, as is usually adopted in gamma spectrometry measurements. It is based on a chronological record of each detected photon, through simultaneous measurement of its energy ε and its arrival time τ at the detector, the parameter pair [ε, τ] defining an event mode sequence (EMS). The EMS series are analysed sequentially by a Bayesian approach to detect the presence of a given radioactive source. The main objective of the present work is to test the applicability of this sequential approach to the detection of radioactive environmental materials. Moreover, for appropriate health oversight of the public and of the workers concerned, the analysis has been extended to obtain a reliable quantification of the radionuclides present in environmental samples. As an illustration, we consider the problem of detecting and quantifying 238U. A Monte Carlo simulated experiment is carried out, consisting of the detection, by a Ge(Hp) semiconductor junction, of the 63 keV gamma rays emitted by 234Th (a progeny of 238U). The generated EMS series are analysed by Bayesian inference.
The application of the sequential Bayesian approach to environmental radioactivity analysis offers the possibility of reducing the measurement time without requiring large environmental samples, and consequently avoids the associated inconveniences. The work is still in progress. Keywords: Bayesian approach, event mode sequence, gamma spectrometry, Monte Carlo method
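The abstract does not give the update equations; a minimal sketch of a sequential Bayesian test on EMS arrival times, with the rates, the two-hypothesis setup and the exponential inter-arrival model all assumed for illustration, is:

```python
import math

def sequential_posterior(arrival_gaps, bkg_rate, src_rate, prior=0.5):
    """Sequential Bayesian update of P(source present) from the time gaps
    between detected photons in the energy window of interest (e.g. 63 keV).
    Rates are in counts/s; exponential inter-arrival times are assumed.
    H0: background only (rate bkg_rate); H1: background + source.
    """
    p = prior
    for tau in arrival_gaps:
        l1 = (bkg_rate + src_rate) * math.exp(-(bkg_rate + src_rate) * tau)
        l0 = bkg_rate * math.exp(-bkg_rate * tau)
        p = p * l1 / (p * l1 + (1.0 - p) * l0)   # Bayes' rule, one event at a time
    return p

# Gaps much shorter than the background alone would produce push the
# posterior towards "source present" after only a handful of events.
gaps = [0.5, 0.4, 0.6, 0.3, 0.5]          # seconds between detected photons
p = sequential_posterior(gaps, bkg_rate=0.5, src_rate=2.0)
print(f"P(source) = {p:.3f}")             # high posterior for these short gaps
```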
Procedia PDF Downloads 495
15067 Sustainable Manufacturing Industries and Energy-Water Nexus Approach
Authors: Shahbaz Abbas, Lin Han Chiang Hsieh
Abstract:
Significant population growth and climate change have contributed to the depletion of natural resources and threaten their sustainability in the future. Manufacturing industries have a substantial impact on every country's economy, but the sustainability of industrial resources is challenging, and policymakers have been developing possible solutions to manage the sustainability of industrial resources such as raw materials, energy, water and the industrial supply chain. To address these challenges, the nexus approach is one of the optimisation and modelling techniques used in recent sustainable environmental research. The interactions between the nexus components acknowledge that all components depend upon each other and are interrelated; therefore, their sustainability is also associated with one another. In addition, the nexus concept provides not only resource sustainability: environmental sustainability can also be achieved through the nexus approach, by utilising industrial waste as a resource for industrial processes. Based on the energy-water nexus, this study has developed a resource-energy-water nexus for the sugar industry in order to understand the interactions between sugarcane, energy and water on the way to a sustainable sugar industry. In particular, the focus of the research is the Taiwanese sugar industry; however, the same approach can be adopted worldwide to optimise the sustainability of sugar industries. It is concluded that there are significant interactions between sugarcane, energy consumption and water consumption in the sugar industry that can help manage the scarcity of resources in the future. The interactions between sugarcane and energy also provide a mechanism for reusing sugar-industry waste as a source of energy, thereby supporting both industrial and environmental sustainability.
The desired outcomes of the nexus can be achieved with modifications to the policy and regulations of the Taiwanese industrial sector. Keywords: energy-water nexus, environmental sustainability, industrial sustainability, natural resource management
Procedia PDF Downloads 125