Search results for: Automated Rack Supported Warehouse
2587 The Predictability of Three Implants to Support a Fixed Prosthesis in the Edentulous Mandible
Authors: M. Hirani, M. Devine, O. Obisesan, C. Bryant
Abstract:
Introduction: The use of four or more implants to support a fixed prosthesis in the edentulous mandible is well documented, with high levels of clinical success recorded. Nevertheless, a three-implant-supported fixed prosthesis offers a potentially more cost-effective method of oral rehabilitation in the lower arch, an important consideration given that edentulism is most prevalent in low-income subpopulations. This study aimed to evaluate the implant and prosthetic survival rates, changes in marginal bone level, and patient satisfaction associated with a three-implant-supported fixed prosthesis for rehabilitation of the edentulous mandible over a follow-up period of at least one year. Methods: A comprehensive literature search was performed to identify studies that met the selection criteria. The information extracted included the study design and population, participant demographics, observation period, loading protocol, and the number of implants placed, together with the required outcome measures. Mean values and standard deviations (SD) were calculated using SPSS® (IBM Corporation, New York, USA), and the level of statistical significance across all comparative studies was set at P < 0.05. Results: The eligible studies included a total of 1968 implants placed in 652 patients. The subjects ranged in age from 33 to 89 years, with a mean of 63.2 years. The mean cumulative implant and prosthetic survival rates were 95.5% and 96.2%, respectively, over a mean follow-up period of 3.25 years. The mean marginal bone loss recorded was 1.04 mm, and high patient satisfaction rates were reported across the studies. Conclusion: Current evidence suggests that a three-implant-supported fixed prosthesis for the edentulous mandible is a successful treatment strategy, presenting high implant and prosthetic survival rates over the short-to-medium term.
Further well-designed controlled clinical trials are required to evaluate longer-term outcomes, with supplemental data correlating implant dimensions and prosthetic design.
Keywords: implants, mandible, fixed, prosthesis
Procedia PDF Downloads 131
2586 2D Convolutional Networks for Automatic Segmentation of Knee Cartilage in 3D MRI
Authors: Ananya Ananya, Karthik Rao
Abstract:
Accurate segmentation of knee cartilage in 3D magnetic resonance (MR) images for quantitative assessment of volume is crucial for studying and diagnosing osteoarthritis (OA) of the knee, one of the major causes of disability in elderly people. Radiologists generally perform this task in a slice-by-slice manner, taking 15-20 minutes per 3D image, which leads to high inter- and intra-observer variability. Hence, automatic methods for knee cartilage segmentation are desirable and are an active field of research. This paper presents the design and experimental evaluation of fully automated, 2D convolutional neural network based methods for knee cartilage segmentation in 3D MRI. The architectures are validated on 40 test images and 60 training images from the SKI10 dataset. The proposed methods segment 2D slices one by one, which are then combined to give the segmentation of the whole 3D image. The proposed methods are modified versions of U-net and dilated convolutions, consisting of a single step that segments the given image into 5 labels: background, femoral cartilage, tibial cartilage, femoral bone, and tibial bone; the cartilages are the primary components of interest. U-net consists of a contracting path and an expanding path, to capture context and localization respectively. Dilated convolutions lead to an exponential expansion of the receptive field with only a linear increase in the number of parameters. A combination of modified U-net and dilated convolutions has also been explored. These architectures segment one 3D image in 8-10 seconds, giving average volumetric Dice similarity coefficients (DSC) of 0.950-0.962 for femoral cartilage and 0.951-0.966 for tibial cartilage, with manual segmentation as the reference.
Keywords: convolutional neural networks, dilated convolutions, 3 dimensional, fully automated, knee cartilage, MRI, segmentation, U-net
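The claimed exponential receptive-field growth can be checked with a short calculation. The sketch below is illustrative only (not the authors' implementation): it computes the receptive field of a stack of 3x3 dilated convolutions whose dilation rates double at each layer, the standard dilated-convolution schedule.

```python
def receptive_field(kernel_size, dilations):
    """Receptive field of stacked dilated convolutions (stride 1).

    Each layer adds (kernel_size - 1) * dilation pixels of context.
    """
    rf = 1
    for d in dilations:
        rf += (kernel_size - 1) * d
    return rf

# With doubling dilation rates, 7 layers of 3x3 convolutions already
# cover a 255-pixel-wide context, while parameters grow only linearly.
for n in range(1, 8):
    dilations = [2 ** i for i in range(n)]  # 1, 2, 4, ...
    print(n, receptive_field(3, dilations))
```

With constant dilation 1, the same 7 layers would only reach a 15-pixel receptive field, which is the contrast the abstract alludes to.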
Procedia PDF Downloads 261
2585 Cost-Optimized Extra-Lateral Transshipments
Authors: Dilupa Nakandala, Henry Lau
Abstract:
Ever-increasing demand for cost efficiency and customer satisfaction through reliable delivery has made it a mandate for logistics practitioners to continually improve inventory management processes. With cost optimization as the objective, this study considers an extended scenario in which sourcing from the same echelon of the supply chain, known as lateral transshipment, which is instantaneous but more expensive than purchasing from regular suppliers, is used by warehouses not only to reactively fulfill urgent outstanding retailer demand that could not be met from stock on hand but also to preventively reduce back-order cost. Such extra lateral transshipments, as preventive responses, are intended to meet the expected demand during the supplier lead time in a periodic-review ordering policy setting. We develop decision rules to assist logistics practitioners in making a cost-optimized selection between back-ordering and combined reactive and proactive lateral transshipment options. A method for determining the optimal quantity of extra lateral transshipment is developed, considering the trade-off between purchasing, holding, and back-order cost components.
Keywords: lateral transshipment, warehouse inventory management, cost optimization, preventive transshipment
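The purchasing/holding/back-order trade-off can be illustrated with a newsvendor-style calculation. The sketch below is not the authors' model; it assumes a discrete lead-time demand distribution and hypothetical unit costs, and picks the extra transshipment quantity that minimizes expected total cost by enumeration:

```python
def expected_cost(q, demand_pmf, unit_cost, holding, backorder):
    """Expected cost of transshipping q extra units against lead-time demand.

    demand_pmf: dict mapping demand level -> probability.
    Leftover units incur holding cost; unmet demand incurs back-order cost.
    """
    cost = q * unit_cost
    for d, p in demand_pmf.items():
        cost += p * (holding * max(q - d, 0) + backorder * max(d - q, 0))
    return cost

def best_quantity(demand_pmf, unit_cost, holding, backorder):
    """Cost-minimizing extra quantity, found by enumerating 0..max demand."""
    candidates = range(max(demand_pmf) + 1)
    return min(candidates,
               key=lambda q: expected_cost(q, demand_pmf, unit_cost,
                                           holding, backorder))

# Hypothetical numbers: demand of 0-3 units during the lead time,
# with a back-order far more expensive than a preventive transshipment.
pmf = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}
print(best_quantity(pmf, unit_cost=2.0, holding=1.0, backorder=10.0))  # 2
```

When the back-order penalty dominates, the optimum shifts toward covering most of the expected lead-time demand preventively, which is the intuition behind the decision rules described above.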
Procedia PDF Downloads 616
2584 Innovative Technologies for Aeration and Feeding of Fish in Aquaculture with Minimal Impact on the Environment
Authors: Vasile Caunii, Andreea D. Serban, Mihaela Ivancia
Abstract:
The paper presents a new approach, in terms of the circular economy, to technologies for feeding and aeration of water basins and accumulations for fish farming and aquaculture. Because fish is and will remain one of the main foods on the planet, the use of bio-eco-technologies is a priority for all producers. The technologies proposed in the paper aim to reduce the operating costs of ponds and water accumulations by a substantial percentage, using non-polluting technologies with minimal impact on the environment. The paper proposes two innovative, fully automated intelligent systems that share a common, completely eco-friendly platform. One system is intended to aerate the water of the fish pond, and the second is intended to feed the fish by dispersing an optimal amount of fodder, depending on population size, age, and habits. Both systems use a floating platform and regenerative energy sources, are equipped with intelligent and innovative subsystems, and, in addition to fully automated operation, significantly reduce the costs of aerating water accumulations (natural or artificial) and feeding fish. The intelligent feeding system additionally optimizes the amount of food, thus reducing operating costs and preventing water pollution and the development of bacteria and microorganisms. The advantages of the systems are: increased fish production yield; green installations with zero pollutant emissions; placement anywhere on the water surface, depending on the user's needs; autonomous or remotely controlled operation; accurate fault reporting to the operator in case of component failure, significantly reducing maintenance costs; and transmission of data about the water's physical and chemical parameters.
Keywords: bio-eco-technologies, economy, environment, fish
Procedia PDF Downloads 150
2583 Application of Knowledge Discovery in Database Techniques in Cost Overruns of Construction Projects
Authors: Mai Ghazal, Ahmed Hammad
Abstract:
Cost overruns in construction projects are a worldwide challenge, since cost performance is one of the main measures of success along with schedule performance. To overcome this problem, studies have been conducted to investigate the factors behind cost overruns, and projects' historical data have been analyzed to extract new and useful knowledge. This research studies and analyzes the effect of some factors causing cost overruns, using historical data from completed construction projects. These factors are then used to estimate the probability of cost overrun occurrence and to predict its percentage for future projects. First, an intensive literature review was conducted to identify the factors that cause cost overruns in construction projects, followed by a review of previous research on data mining approaches to cost overruns. Second, a data warehouse was designed that organizations can use to store their future data in a well-organized way so that it can be easily analyzed later. Third, twelve quantitative factors whose data are frequently available in construction projects were selected as the analyzed factors and suggested predictors for the proposed model.
Keywords: construction management, construction projects, cost overrun, cost performance, data mining, data warehousing, knowledge discovery, knowledge management
Procedia PDF Downloads 370
2582 Effect of Leaks in Solid Oxide Electrolysis Cells Tested for Durability under Co-Electrolysis Conditions
Authors: Megha Rao, Søren H. Jensen, Xiufu Sun, Anke Hagen, Mogens B. Mogensen
Abstract:
Solid oxide electrolysis cells have immense potential for converting CO2 and H2O into syngas during co-electrolysis operation. The produced syngas can be further converted into hydrocarbons. This kind of technology is called power-to-gas or power-to-liquid. To produce hydrocarbons via this route, durability of the cells is still a challenge, which needs to be further investigated in order to improve the cells. In this work, various nickel-yttria stabilized zirconia (Ni-YSZ) fuel-electrode-supported or YSZ electrolyte-supported cells, with a cerium gadolinium oxide (CGO) barrier layer and an oxygen electrode, are investigated for durability under co-electrolysis conditions, in both galvanostatic and potentiostatic operation. While changing the gas on the oxygen electrode and keeping the fuel electrode gas composition constant, a change in the gas concentration arc was observed by impedance spectroscopy. Measurements of open circuit potential revealed the presence of leaks in the setup. It is speculated that the change in concentration impedance may be related to the leaks. Furthermore, the cells were also tested under pressurized conditions to find the interplay between the leak rate and the pressure. Mathematical modeling together with electrochemical and microscopy analyses is presented.
Keywords: co-electrolysis, durability, leaks, gas concentration arc
Procedia PDF Downloads 148
2581 Robust Segmentation of Salient Features in Automatic Breast Ultrasound (ABUS) Images
Authors: Lamees Nasser, Yago Diez, Robert Martí, Joan Martí, Ibrahim Sadek
Abstract:
Automated 3D breast ultrasound (ABUS) screening is a novel modality in medical imaging because of the characteristics it shares with other ultrasound modalities, in addition to the three orthogonal planes (i.e., axial, sagittal, and coronal) that are useful in the analysis of tumors. In the literature, few automatic approaches exist for typical tasks such as segmentation or registration. In this work, we deal with two problems concerning ABUS images: nipple and rib detection. The nipple and ribs are the most visible and salient features in ABUS images. Determining the nipple position plays a key role in some applications, for example, evaluation of registration results or lesion follow-up. We present a nipple detection algorithm based on the color and shape of the nipple, as well as an automatic approach to detect the ribs. In fact, rib detection is considered one of the main stages in chest wall segmentation. This approach consists of four steps. First, images are normalized in order to minimize the intensity variability for a given set of regions within the same image or a set of images. Second, the normalized images are smoothed using an anisotropic diffusion filter. Next, the ribs are detected in each slice by analyzing the eigenvalues of the 3D Hessian matrix. Finally, a breast mask and a probability map of regions detected as ribs are used to remove false positives (FP). Qualitative and quantitative evaluation was performed on a total of 22 cases. Across all cases, the average and standard deviation of the root mean square error (RMSE) between manually annotated points placed on the rib surface and detected points on rib borders are 15.1188 mm and 14.7184 mm, respectively.
Keywords: Automated 3D Breast Ultrasound, Eigenvalues of Hessian matrix, Nipple detection, Rib detection
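The Hessian-eigenvalue step can be illustrated in two dimensions. The sketch below is illustrative only (the paper analyzes the full 3D Hessian): it estimates the Hessian at a pixel with central finite differences and returns its eigenvalues in closed form; a rib-like, elongated bright structure produces one strongly negative eigenvalue and one near zero.

```python
import math

def hessian_eigenvalues(img, y, x):
    """Eigenvalues of the 2x2 Hessian at pixel (y, x), smallest first.

    img is a 2D list of lists; second derivatives are estimated
    with central finite differences on the 3x3 neighbourhood.
    """
    dyy = img[y + 1][x] - 2 * img[y][x] + img[y - 1][x]
    dxx = img[y][x + 1] - 2 * img[y][x] + img[y][x - 1]
    dxy = (img[y + 1][x + 1] - img[y + 1][x - 1]
           - img[y - 1][x + 1] + img[y - 1][x - 1]) / 4.0
    mean = (dxx + dyy) / 2.0
    # Closed-form eigenvalues of a symmetric 2x2 matrix.
    delta = math.sqrt(((dxx - dyy) / 2.0) ** 2 + dxy ** 2)
    return mean - delta, mean + delta

# A bright horizontal line: strong negative curvature across it (dyy),
# near-zero curvature along it (dxx) - the tube-like signature.
img = [[0, 0, 0],
       [9, 9, 9],
       [0, 0, 0]]
print(hessian_eigenvalues(img, 1, 1))  # (-18.0, 0.0)
```

In practice the derivatives are taken at a Gaussian smoothing scale matched to the rib diameter, which is the role played by the anisotropic diffusion step above.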
Procedia PDF Downloads 330
2580 STML: Service Type-Checking Markup Language for Services of Web Components
Authors: Saqib Rasool, Adnan N. Mian
Abstract:
Web components are introduced as the latest HTML5 standard for writing modular web interfaces, ensuring maintainability through the isolated scope of each web component. Reusability can also be achieved by sharing plug-and-play web components that can be used off the shelf by other developers. A web component encapsulates all the required HTML, CSS, and JavaScript code as a standalone package which must be imported to integrate the web component into an existing web interface. This is then followed by integration of the web component with web services for dynamically populating its content. Since web components are reusable off-the-shelf components, they must be equipped with some mechanism for ensuring their proper integration with web services. The consistency of a service's behavior can be verified through type checking, one of the popular ways of improving code quality in many programming languages. However, HTML does not provide type checking, as it is a markup language and not a programming language. The contribution of this work is a new extension of HTML called Service Type-checking Markup Language (STML), which adds type-checking support to HTML for JSON-based REST services. STML can be used to define the expected data types of responses from JSON-based REST services, which are then used for populating the content of a web component's HTML elements. Although JSON has five data types, viz. string, number, boolean, object, and array, STML supports only string, number, and boolean, because both object and array are treated as strings when populated into HTML elements. To define the data type of any HTML element, the developer just needs to add the custom STML attribute st-string, st-number, or st-boolean for string, number, or boolean, respectively.
All these STML annotations are written by the developer of a web component, and they enable other developers to use automated type checking to ensure the proper integration of their REST services with the same web component. Two utilities have been written for developers who use STML-based web components. The first provides automated type checking during the development phase: it uses the browser console to show an error description if an integrated web service does not return a response with the expected data type. The second is a Gulp-based command-line utility that removes the STML attributes before deployment, ensuring the delivery of STML-free web pages in the production environment. Both utilities have been tested for type checking of REST services through STML-based web components, and the results confirm the feasibility of evaluating service behavior through HTML alone. Currently, STML is designed for automated type checking of integrated REST services, but it could be extended into a complete HTML-only service testing suite, transforming STML from a Service Type-checking Markup Language into a Service Testing Markup Language.
Keywords: REST, STML, type checking, web component
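The core check STML describes, comparing a JSON response field against an element's declared st-* attribute, can be sketched in a few lines. The snippet below is an illustrative Python model of that rule (the actual utilities are browser and Gulp JavaScript tools); the attribute names st-string, st-number, and st-boolean are taken from the abstract, everything else is an assumption.

```python
import json

# Expected Python types for each STML attribute.
STML_TYPES = {
    "st-string": str,
    "st-number": (int, float),
    "st-boolean": bool,
}

def type_check(element_attrs, response_text, field):
    """Return True if the JSON field matches the element's st-* attribute."""
    declared = [a for a in element_attrs if a in STML_TYPES]
    if not declared:
        return True  # element declares no expected type
    value = json.loads(response_text)[field]
    # bool is a subclass of int in Python, so reject it for st-number.
    if declared[0] == "st-number" and isinstance(value, bool):
        return False
    return isinstance(value, STML_TYPES[declared[0]])

print(type_check(["st-number"], '{"price": 9.99}', "price"))    # True
print(type_check(["st-number"], '{"price": "9.99"}', "price"))  # False
```

A development-time utility along these lines would log the failing field and element to the console rather than return a boolean, as described above.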
Procedia PDF Downloads 254
2579 Improving Performance of K₂CO₃ Sorbent Using Core/Shell Alumina-Based Supports in a Multicycle CO₂ Capture Process
Authors: S. Toufigh Bararpour, Amir H. Soleimanisalim, Davood Karami, Nader Mahinpey
Abstract:
The continued increase in the atmospheric concentration of CO2 is expected to have great impacts on the climate. In order to reduce CO2 emissions to the atmosphere, an efficient and cost-effective technique is required. Using regenerable solid sorbents, especially K2CO3, is a promising method for low-temperature CO2 capture. Pure K2CO3 is a deliquescent substance that requires modification before it can be used in cyclic operation. For this purpose, various types of additives and supports have been used to improve the structure of K2CO3. However, the hydrophilicity and reactivity of the support materials with K2CO3 have a negative effect on the CO2 capture capacity of the sorbents. In this research, two kinds of alumina supports (γ-alumina and boehmite) were used. In order to decrease the supports' hydrophilicity and reactivity with K2CO3, nonreactive additives such as titania, zirconia, and silica were incorporated into their structures. These materials provide a shell around the alumina that protects it from undesirable reactions and improves its properties. K2CO3-based core/shell-supported sorbents were fabricated in two preparation steps: the sol-gel method was applied for shelling the supports, and the shelled supports were then impregnated with K2CO3. The physicochemical properties of the sorbents were determined using SEM and BET analyses, and their CO2 capture capacity was quantified using a thermogravimetric analyzer. It was shown that the type of shell material had an important effect on the water adsorption capacity of the sorbents. Supported K2CO3 modified by a titania shell showed the lowest hydrophilicity among the prepared samples. Based on the obtained results, incorporating nonreactive additives in boehmite had an outstanding impact on the CO2 capture performance of the sorbent, and incorporation of titania into the boehmite-supported K2CO3 enhanced its CO2 capture capacity significantly.
Therefore, further study of this novel fabrication technique is highly recommended. In the second phase of this research project, the CO2 capture performance of the sorbents in fixed and fluidized bed reactors will be investigated.
Keywords: CO₂ capture, core/shell support, K₂CO₃, post-combustion
Procedia PDF Downloads 150
2578 The Effect of CPU Location in Total Immersion of Microelectronics
Authors: A. Almaneea, N. Kapur, J. L. Summers, H. M. Thompson
Abstract:
Meeting the growth in demand for digital services such as social media, telecommunications, and business and cloud services requires large-scale data centres, which has led to an increase in their end-use energy demand. Generally, over 30% of data centre power is consumed by the necessary cooling overhead; energy use can therefore be reduced by improving cooling efficiency. Air and liquid can both be used as cooling media for the data centre. Traditional data centre cooling systems use air; however, liquid is recognised as a promising method that can handle more densely packed data centres. Liquid cooling can be classified into three methods: rack heat exchanger, on-chip heat exchanger, and full immersion of the microelectronics. This study quantifies the improvements in heat transfer specifically for the case of immersed microelectronics by varying the CPU and heat sink location. Immersion of the server is achieved by filling the gap between the microelectronics and a water jacket with a dielectric liquid, which convects the heat from the CPU to the water jacket on the opposite side. Heat transfer is governed by two physical mechanisms: natural convection in the fixed enclosure filled with dielectric liquid, and forced convection in the water that is pumped through the water jacket. The model in this study is validated against published numerical and experimental work and shows good agreement. The results show that the heat transfer performance and Nusselt number (Nu) are improved by 89% by placing the CPU and heat sink on the bottom of the microelectronics enclosure.
Keywords: CPU location, data centre cooling, heat sink in enclosures, immersed microelectronics, turbulent natural convection in enclosures
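Natural convection in such an enclosure is conventionally characterized by the Rayleigh number, Ra = g·β·ΔT·L³/(ν·α), which together with the Prandtl number determines the Nusselt number through empirical correlations. The short sketch below simply evaluates Ra for hypothetical dielectric-liquid properties; it is a textbook calculation, not the authors' model.

```python
def rayleigh(g, beta, dT, L, nu, alpha):
    """Rayleigh number for natural convection over a length scale L.

    g: gravity (m/s^2), beta: thermal expansion coefficient (1/K),
    dT: temperature difference (K), L: characteristic length (m),
    nu: kinematic viscosity (m^2/s), alpha: thermal diffusivity (m^2/s).
    """
    return g * beta * dT * L ** 3 / (nu * alpha)

# Hypothetical dielectric-liquid properties for illustration only:
# a 2 cm gap with a 40 K temperature difference.
ra = rayleigh(g=9.81, beta=1.3e-3, dT=40.0, L=0.02, nu=1.0e-6, alpha=7.0e-8)
print(f"Ra = {ra:.3g}")
```

Note the cubic dependence on the gap width L: small changes in enclosure geometry (such as moving the CPU and heat sink) shift Ra, and hence Nu, substantially.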
Procedia PDF Downloads 272
2577 Case Study: Hybrid Mechanically Stabilized Earth Wall System Built on Basal Reinforced Raft
Authors: S. Kaymakçı, D. Gündoğdu, H. Özçelik
Abstract:
The truck park of a warehouse for a supermarket chain was to be constructed on poor ground. Rather than using a piled foundation, the client was convinced that ground improvement using a reinforced foundation raft, also known as "basal reinforcement", would work. The retaining structures supporting the truck park area were designed as a hybrid structure made up of the Terramesh® Wall System and MacGrid™ high-strength geogrids. The total wall surface area is nearly 2740 sq. m, reaching a maximum height of 13.00 meters. The area is located in the first-degree seismic zone of Turkey, and the design seismic acceleration is high. The design of the walls was carried out using the pseudo-static (limit equilibrium) method, taking into consideration different loading conditions according to Eurocode 7. For each standard approach, stability analyses under seismic conditions were performed. The paper presents the detailed design of the reinforced soil structure, the basal reinforcement, and the construction methods; the advantages of using such a system for the project are discussed.
Keywords: basal reinforcement, geogrid, reinforced soil raft, reinforced soil wall, soil reinforcement
Procedia PDF Downloads 303
2576 External Store Safe Separation Evaluation Process Implementing CFD and MIL-HDBK-1763
Authors: Thien Bach Nguyen, Nhu-Van Nguyen, Phi-Minh Nguyen, Minh Hien Dao
Abstract:
An external store safe-separation evaluation process implementing CFD and MIL-HDBK-1763 is proposed to support the evaluation of external store safe separation and its compliance with the criteria of MIL-HDBK-1763, with extensive use of CFD. The safe-separation criteria are researched across various standards and handbooks, such as MIL-HDBK-1763, MIL-HDBK-244A, AGARD-AG-202, and AGARD-AG-300, to acquire appropriate, tailored values and limits for typical applications of external carriages and fighter aircraft. CFD and 6DOF simulations in ANSYS 2023 R1 are used extensively for verification and validation of the moving unstructured meshes and solvers, by calibrating the position, aerodynamic forces, and moments of existing air-to-ground missile models. The verified CFD and 6DOF separation simulation process is then applied to investigate typical munition separation phenomena and compliance with the tailored requirements of MIL-HDBK-1763. The prediction of munition trajectory parameters after separation, under aircraft aerodynamic interference and with the specified rack unit considered, is provided and checked against the tailored requirements to support the safe-separation evaluation of improved and new external store munitions before flight testing. The proposed process demonstrates effectiveness and reliability in building understanding of complicated store separation and in reducing flight test sorties during improved and new munition development projects, by using CFD extensively and tailoring existing standards.
Keywords: external store separation, MIL-HDBK-1763, CFD, moving meshes, flight test data, munition
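The trajectory-prediction step can be illustrated, in drastically simplified form, by a point-mass drop integration. The sketch below is a toy 2D model (gravity plus drag opposing the velocity vector, explicit Euler stepping), not the CFD-coupled 6DOF solver the process actually relies on; all numbers are hypothetical.

```python
def drop_trajectory(v0, mass, cd_area, rho=1.225, g=9.81, dt=0.001, t_end=1.0):
    """Point-mass store drop: returns the (x, z) path after release.

    v0: initial forward speed (m/s); z is measured down-positive-negative
    (negative z = below the release point). cd_area: Cd times reference
    area (m^2); drag magnitude is 0.5*rho*cd_area*speed^2.
    """
    x, z = 0.0, 0.0
    vx, vz = v0, 0.0
    path = [(x, z)]
    for _ in range(int(t_end / dt)):
        speed = (vx * vx + vz * vz) ** 0.5
        k = 0.5 * rho * cd_area * speed / mass  # drag accel per unit velocity
        vx += -k * vx * dt
        vz += (-g - k * vz) * dt
        x += vx * dt
        z += vz * dt
        path.append((x, z))
    return path

# Hypothetical 250 kg store released at 200 m/s: after 1 s it sits
# roughly g*t^2/2 ~ 4.9 m below the carriage, slightly behind the
# no-drag position because drag bleeds off forward speed.
path = drop_trajectory(v0=200.0, mass=250.0, cd_area=0.1)
print(path[-1])
```

A real separation analysis replaces the constant drag coefficient with time-varying forces and moments interpolated from the CFD flowfield around the aircraft and rack, which is exactly what the 6DOF coupling described above provides.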
Procedia PDF Downloads 24
2575 A Data Mining Approach for Analysing and Predicting the Bank's Asset Liability Management Based on Basel III Norms
Authors: Nidhin Dani Abraham, T. K. Sri Shilpa
Abstract:
Asset liability management is an important aspect of the banking business. Moreover, today's banking is based on BASEL III, which strictly regulates counterparty default. This paper focuses on the prediction and analysis of counterparty default risk, the type of risk that occurs when customers fail to repay the amount owed to the lender (a bank or other financial institution). The paper proposes an approach to reducing the counterparty risk occurring in financial institutions using an appropriate data mining technique, and thus predicts the occurrence of non-performing assets (NPA). It also helps in asset building and restructuring quality. Liability management is very important to carrying out the banking business, and a suitable technique is required to know and analyze the depth of a bank's liability. For that purpose, a data mining technique is used to predict the dormancy behaviour of various deposit customers. Various models are implemented, and the results for savings bank deposit customers are analyzed. All these data are cleaned using a data cleansing approach from the bank data warehouse.
Keywords: data mining, asset liability management, BASEL III, banking
Procedia PDF Downloads 553
2574 Design of Mobile Teaching for Students Collaborative Learning in Distance Higher Education
Authors: Lisbeth Amhag
Abstract:
The aim of the study is to describe and analyze the design of mobile teaching for students' collaborative learning in distance higher education, with a focus on mobile technologies such as online webinars (web-based seminars or conferencing) accessed via laptops, smartphones, or tablets. These multimedia tools can provide face-to-face interactions, recorded flipped-classroom videos, and parallel chat communications. The data collection consists of interviews with 22 students and observations of online face-to-face webinars, as well as two surveys. Theoretically, the study joins the research tradition of Computer Supported Collaborative Learning (CSCL), as well as Computer Self-Efficacy (CSE), concerned with individuals' media and information literacy. An important conclusion from the study is that mobile interactions increased student-centered learning. As the students came to appreciate the working methods, they became more engaged and motivated. The use of mobile technology among students also contributes to increased flexibility between space and place, as well as media and information literacy.
Keywords: computer self-efficacy, computer supported collaborative learning, distance and open learning, educational design and technologies, media and information literacy, mobile learning
Procedia PDF Downloads 358
2573 Positive Psychology and the Social Emotional Ability Instrument (SEAI)
Authors: Victor William Harris
Abstract:
This research is a validation study of the Social Emotional Ability Inventory (SEAI), a multi-dimensional self-report instrument informed by positive psychology, emotional intelligence, social intelligence, and sociocultural learning theory. Designed for use in tandem with the Social Emotional Development (SEAD) theoretical model, the SEAI provides diagnostic-level guidance for professionals and individuals interested in investigating, identifying, and understanding social-emotional strengths, as well as remediating specific social competency deficiencies. The SEAI was shown to be psychometrically sound, exhibited strong internal reliability, and supported the a priori hypotheses of the SEAD. Additionally, confirmatory factor analysis provided evidence of goodness of fit and of convergent and divergent validity, and supported a theoretical model that reflected SEAD expectations. The SEAI and SEAD hold potentially far-reaching and important practical implications for theoretical guidance and diagnostic-level measurement of social-emotional competency across a wide range of domains. Strategies that researchers, practitioners, educators, and individuals might use to deploy the SEAI to improve quality-of-life outcomes are discussed.
Keywords: emotion, emotional ability, positive psychology-social emotional ability, social emotional ability, social emotional ability instrument
Procedia PDF Downloads 256
2572 Immature Palm Tree Detection Using Morphological Filter for Palm Counting with High Resolution Satellite Image
Authors: Nur Nadhirah Rusyda Rosnan, Nursuhaili Najwa Masrol, Nurul Fatiha MD Nor, Mohammad Zafrullah Mohammad Salim, Sim Choon Cheak
Abstract:
Accurate inventories of oil palm planted areas are crucial for plantation management, as they impact the overall economy and production of oil. One of the technological advancements in the oil palm industry is semi-automated palm counting, which is replacing conventional manual palm counting via digitized aerial imagery. Most of the semi-automated palm counting methods developed so far have been limited to mature palms, whose canopy size is well represented in satellite images; immature palms were often left out, since their canopies are barely visible from satellite images. In this paper, an approach using a morphological filter and high-resolution satellite images is proposed to detect immature palm trees, making it possible to count them. The method begins by applying an erosion filter with an appropriate window size of 3 m to the high-resolution satellite image. The eroded image is further segmented using watershed segmentation to delineate immature palm tree regions. Local minimum detection is then used, based on the hypothesis that immature oil palm trees are located at local minima of the grayscale image within an oil palm field. The detection points generated from the local minima are displaced to the center of each immature oil palm region and thinned, leaving one detection point per tree. The performance of the proposed method was evaluated on three subsets with slopes ranging from 0 to 20° and different planting designs, i.e., straight and terrace. The proposed method achieved more than 90% accuracy when compared with the ground truth, with an overall F-measure score of up to 0.91.
Keywords: immature palm count, oil palm, precision agriculture, remote sensing
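The erosion and local-minimum steps can be sketched on a toy grayscale grid. The snippet below only illustrates those two operations (the paper additionally uses watershed segmentation and point thinning, omitted here); dark cells stand in for immature palm canopies on a bright field background.

```python
def erode(img, size=3):
    """Grayscale erosion: each pixel becomes the minimum of its window."""
    h, w, r = len(img), len(img[0]), size // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = min(img[j][i]
                            for j in range(max(0, y - r), min(h, y + r + 1))
                            for i in range(max(0, x - r), min(w, x + r + 1)))
    return out

def local_minima(img):
    """Strict local minima over the 8-neighbourhood (border excluded)."""
    h, w = len(img), len(img[0])
    points = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = [img[y + dy][x + dx]
                          for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                          if (dy, dx) != (0, 0)]
            if img[y][x] < min(neighbours):
                points.append((y, x))
    return points

# Toy image: two dark spots (value 2) on a bright background (value 9),
# each detected as one candidate tree location.
img = [[9, 9, 9, 9, 9],
       [9, 2, 9, 9, 9],
       [9, 9, 9, 2, 9],
       [9, 9, 9, 9, 9]]
print(local_minima(img))  # [(1, 1), (2, 3)]
```

On real imagery, the erosion consolidates each faint canopy into a compact dark region before the minima are sought, which is why it precedes the watershed and minimum-detection stages described above.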
Procedia PDF Downloads 76
2571 Comfort Sensor Using Fuzzy Logic and Arduino
Authors: Samuel John, S. Sharanya
Abstract:
Automation has become an important part of our life. It has been used to control home entertainment systems, change the ambience of rooms for different events, etc. One of the main parameters to control in a smart home is atmospheric comfort, which mainly includes temperature and relative humidity. In homes, the desired temperature of different rooms varies from 20 °C to 25 °C, and the desired relative humidity is around 50%; however, both vary widely. Hence, automated measurement of these parameters to ensure comfort assumes significance. To achieve this, a fuzzy logic controller on an Arduino was developed using MATLAB. Arduino is an open-source hardware platform built around a 28-pin ATmega328 chip, with 14 digital input/output pins and an inbuilt ADC. It runs on 5 V and 3.3 V power, supported by an on-board voltage regulator. Some of the digital pins on the Arduino provide PWM (pulse width modulation) signals, which can be used in different applications. The Arduino platform provides an integrated development environment with support for the C, C++, and Java programming languages. In the present work, a soft sensor was introduced into the system that can indirectly measure temperature and humidity and process these measurements to ensure comfort. The Sugeno method (whose output variables are functions or singletons/constants, making it more suitable for implementation on microcontrollers) was used in the soft sensor in MATLAB and then interfaced to the Arduino, which in turn is interfaced to the DHT11 temperature-humidity sensor. The DHT11 acts as the sensing element in this system: internally, a capacitive humidity sensor and a thermistor measure the relative humidity and temperature of the surroundings and provide a digital signal on the data pin. The comfort sensor developed was able to measure temperature and relative humidity correctly.
The comfort percentage was calculated, and the room temperature was controlled accordingly. The system was placed in different rooms of the house to verify that it adapts the comfort values to the temperature and relative humidity of each environment. Compared to existing comfort control sensors, this system was found to provide an accurate comfort percentage. Depending on the comfort percentage, the air conditioners and coolers in the room were controlled. The main highlight of the project is its cost efficiency.
Keywords: arduino, DHT11, soft sensor, sugeno
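The Sugeno inference described above can be illustrated in a few lines. The following is a minimal zero-order Sugeno sketch: the triangular membership functions and the four rule constants are chosen purely for illustration, since the abstract does not publish the paper's actual rule base or MATLAB implementation.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def comfort_percent(temp_c, rh):
    """Zero-order Sugeno inference: rule weights are minima of antecedent
    memberships; the output is the weighted average of rule constants."""
    # Membership in "comfortable" for temperature (20-25 C) and humidity (~50%)
    t_ok = tri(temp_c, 18.0, 22.5, 27.0)
    h_ok = tri(rh, 30.0, 50.0, 70.0)
    t_bad = 1.0 - t_ok
    h_bad = 1.0 - h_ok
    rules = [
        (min(t_ok, h_ok), 100.0),   # both comfortable -> fully comfortable
        (min(t_ok, h_bad), 60.0),   # temperature ok, humidity off
        (min(t_bad, h_ok), 50.0),   # humidity ok, temperature off
        (min(t_bad, h_bad), 0.0),   # both off -> uncomfortable
    ]
    num = sum(w * z for w, z in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

On a microcontroller the same weighted-average defuzzification is cheap, which is the suitability argument the abstract makes for Sugeno-type inference.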
Procedia PDF Downloads 312
2570 A Philosophical Investigation into African Conceptions of Personhood in the Fourth Industrial Revolution
Authors: Sanelisiwe Ndlovu
Abstract:
Cities have become testbeds for automation and for experimenting with artificial intelligence (AI) in managing urban services and public spaces. Smart Cities and AI systems are changing most human experiences, from health and education to personal relations. For instance, in healthcare, social robots are being implemented as tools to assist patients; similarly, in education, social robots are being used as tutors or co-learners to promote cognitive and affective outcomes. With that general picture in mind, one can ask a further question about Smart Cities and artificial agents and their moral standing in the African context of personhood. There is a wealth of literature on the topic of personhood; however, there is an absence of literature on African personhood in highly automated environments. Personhood in African philosophy is defined by the role one can and should play in the community. In today's technologically advanced world, however, the risk is that machines become more capable of accomplishing tasks that humans would otherwise perform. Further, on many African communitarian accounts, personhood and moral standing are associated with active relationality with the community, yet in the Smart City human closeness is gradually diminishing; for instance, humans already engage and identify with robotic entities, sometimes even romantically. The primary aim of this study is to investigate how African conceptions of personhood and community interact in a highly automated environment such as the Smart City. Accordingly, the significance of this study lies in presenting a rarely discussed African perspective that emphasizes the necessity and importance of relationality in handling Smart Cities and AI ethically. The proposed approach can thus be seen as the sub-Saharan African contribution to personhood and the growing AI debates, one which takes the reality of the interconnectedness of society seriously.
It will also open up new opportunities to tackle old problems and to use existing resources to confront new problems in the Fourth Industrial Revolution.
Keywords: smart city, artificial intelligence, personhood, community
Procedia PDF Downloads 202
2569 Evaluation of Automated Analyzers of Polycyclic Aromatic Hydrocarbons and Black Carbon in a Coke Oven Plant by Comparison with Analytical Methods
Authors: L. Angiuli, L. Trizio, R. Giua, A. Digilio, M. Tutino, P. Dambruoso, F. Mazzone, C. M. Placentino
Abstract:
In the winter of 2014, a series of measurements was performed to evaluate the behavior of real-time PAH and black carbon analyzers in a coke oven plant located in Taranto, a city in Southern Italy. Data were collected both inside and outside the plant at air quality monitoring sites, and simultaneous measurements of PM2.5 and PM1 were performed. Particle-bound PAHs were measured by two methods: (1) aerosol photoionization using an Ecochem PAS 2000 analyzer, and (2) PM2.5 and PM1 collection on quartz filters followed by gas chromatography/mass spectrometry (GC/MS) analysis. Black carbon was determined both in real time by a Magee Aethalometer AE22 analyzer and by a semi-continuous Sunset Lab EC/OC instrument. The detected PM2.5 and PM1 levels were higher inside than outside the plant, while real-time PAH values were higher outside than inside. As regards PAHs, inside the plant the PAS 2000 gave concentrations not significantly different from those determined on the filters during low-pollution days, but at increasing concentrations the automated instrument underestimated PAH levels. At the external site, the PAS 2000 real-time concentrations were steadily higher than those on the filters. Likewise, real-time black carbon values were consistently lower than the EC concentrations obtained by the Sunset EC/OC instrument at the inner site, while outside the plant the real-time values were comparable to the Sunset EC values. The results show that, in a coke plant, real-time analyzers of PAHs and black carbon in the factory configuration provide only qualitative information, with limited accuracy and a tendency to underestimate concentrations. A site-specific calibration is needed for these instruments before their installation in highly polluted sites.
Keywords: black carbon, coke oven plant, PAH, PAS, aethalometer
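The site-specific calibration recommended above typically amounts to regressing the real-time readings against the filter-based reference values. A minimal ordinary-least-squares sketch, with hypothetical paired values standing in for the study's actual data:

```python
def linear_calibration(readings, references):
    """Fit references ~ slope * readings + intercept by ordinary least squares."""
    n = len(readings)
    mx = sum(readings) / n
    my = sum(references) / n
    sxx = sum((x - mx) ** 2 for x in readings)
    sxy = sum((x - mx) * (y - my) for x, y in zip(readings, references))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical paired values: real-time analyzer vs. filter/GC-MS reference (ng/m3)
raw = [2.0, 4.0, 6.0, 8.0]
ref = [3.1, 7.0, 11.1, 14.8]
slope, intercept = linear_calibration(raw, ref)
corrected = [slope * x + intercept for x in raw]
```

A slope above 1 here models the underestimation reported inside the plant; once fitted at a given site, the correction is applied to subsequent real-time readings.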
Procedia PDF Downloads 344
2568 Phenomena-Based Approach for Automated Generation of Process Options and Process Models
Authors: Parminder Kaur Heer, Alexei Lapkin
Abstract:
Due to the global challenges of increased competition and demand for more sustainable products and processes, there is rising pressure on industry to develop innovative processes. Through Process Intensification (PI), existing and new processes may attain higher efficiency. However, very few PI options are generally considered, because processes are typically analysed at the unit operation level, which limits the search space for potential process options. PI performed at more detailed levels of a process can increase the size of the search space. The levels at which PI can be achieved are the unit operation, functional, and phenomena levels. Physical/chemical phenomena form the lowest level of aggregation and are thus expected to give the highest impact, because all intensification options can be described in terms of their enhancement. The objective of the current work is therefore the generation of numerous phenomena-based process alternatives and the development of their corresponding computer-aided models. The methodology comprises: (a) automated generation of process options, and (b) automated generation of process models. The process under investigation is disintegrated into functions, viz. reaction, separation, etc., and these functions are further broken down into the phenomena required to perform them; e.g., separation may be performed via vapour-liquid or liquid-liquid equilibrium. A list of phenomena for the process is formed, and new phenomena which can overcome the difficulties or drawbacks of the current process, or can enhance its effectiveness, are added to the list. For instance, a catalyst separation issue can be handled by using solid catalysts; the corresponding phenomena are identified and added. The phenomena are then combined to generate all possible combinations. However, not all combinations make sense, and hence screening is carried out to discard the combinations that are meaningless.
For example, phase change phenomena require the co-presence of energy transfer phenomena. Feasible combinations of phenomena are then assigned to the functions they execute. A combination may accomplish a single function or multiple functions, i.e. it might perform reaction alone or reaction with separation. The combinations are then allotted to the functions needed for the process. This creates a series of options for carrying out each function, and combining these options across the different functions of the process generates a superstructure of process options. These process options, each defined by a list of phenomena per function, are passed to the model generation algorithm in the form of binaries (1, 0). The algorithm gathers the active phenomena and couples them to generate the model. A series of models is generated for the functions, which are combined to obtain the process model. The most promising process options are then chosen subject to a performance criterion, for example product purity, or via a multi-objective Pareto optimisation. The methodology was applied to a two-step process, and the best route was determined based on the higher product yield. The current methodology can identify, produce, and evaluate process intensification options from which the optimal process can be determined, and it can be applied to any chemical or biochemical process because of its generic nature.
Keywords: phenomena, process intensification, process models, process options
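The combine-then-screen step described above can be sketched as follows. The phenomena list, the single feasibility rule (phase change requires energy transfer, as in the example), and the binary encoding are deliberately small stand-ins for the paper's much richer knowledge base:

```python
from itertools import combinations

phenomena = ["mixing", "reaction", "phase_change", "energy_transfer", "vl_equilibrium"]

def feasible(combo):
    """Screening rule from the text: phase-change phenomena require
    the co-presence of energy-transfer phenomena."""
    if "phase_change" in combo and "energy_transfer" not in combo:
        return False
    return True

def generate_options(phenomena, max_size=3):
    """Enumerate all phenomena combinations up to max_size, then screen them."""
    options = []
    for r in range(1, max_size + 1):
        for combo in combinations(phenomena, r):
            if feasible(combo):
                options.append(combo)
    return options

def to_binary(combo, phenomena):
    """Encode a combination as the (1, 0) vector passed to model generation."""
    return [1 if p in combo else 0 for p in phenomena]

options = generate_options(phenomena)
```

Each surviving binary vector then activates the corresponding phenomena models, which are coupled into a function model and finally into the process model.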
Procedia PDF Downloads 232
2567 Optimized Electron Diffraction Detection and Data Acquisition in Diffraction Tomography: A Complete Solution by Gatan
Authors: Saleh Gorji, Sahil Gulati, Ana Pakzad
Abstract:
Continuous electron diffraction tomography, also known as microcrystal electron diffraction (MicroED) or three-dimensional electron diffraction (3DED), is a powerful technique which, in combination with cryo-electron microscopy (cryo-EM), can provide atomic-scale 3D information about the crystal structure and composition of different classes of crystalline materials such as proteins, peptides, and small molecules. Unlike the well-established X-ray crystallography method, 3DED does not require large single crystals and can collect accurate electron diffraction data from crystals as small as 50 - 100 nm. This is a critical advantage, as growing the larger crystals required by X-ray crystallography methods is often very difficult, time-consuming, and expensive. In most cases, specimens studied by the 3DED method are electron-beam sensitive, which means there is a limit on the maximum electron dose one can use to collect the data required for a high-resolution structure determination. Collecting data using a conventional scintillator-based, fiber-coupled camera therefore brings additional challenges: the noise inherent in electron-to-photon conversion in the scintillator and in the transfer of light through the fibers to the sensor results in a poor signal-to-noise ratio and requires relatively high, often specimen-damaging electron dose rates, especially for protein crystals. As in other cryo-EM techniques, damage to the specimen can be mitigated if a direct detection camera is used, which provides a high signal-to-noise ratio at low electron doses. In this work, we have used two classes of such detectors from Gatan, namely the K3® camera (a monolithic active pixel sensor) and Stela™ (which utilizes DECTRIS hybrid-pixel technology), to address this problem.
The K3 is an electron counting detector optimized for low-dose applications (such as structural biology cryo-EM), while Stela is a counting electron detector optimized for diffraction applications, with high speed and high dynamic range. Lastly, the data collection workflow, including crystal screening, microscope optics setup (for imaging and diffraction), stage height adjustment at each crystal position, and tomogram acquisition, can be another challenge of the 3DED technique. Traditionally, this has all been done manually, or in a partly automated fashion using open-source software and scripting, requiring long hours on the microscope (extra cost) and extensive user interaction with the system. We have recently introduced Latitude® D in DigitalMicrograph® software, which is compatible with all pre- and post-energy-filter Gatan cameras and enables 3DED data acquisition in an automated and optimized fashion. Higher-quality 3DED data enable structure determination with higher confidence, while automated workflows allow these experiments to be completed considerably faster than before. Using multiple examples, this work will demonstrate how direct detection electron counting cameras enhance 3DED results (from 3 Å to better than 1 Å) for protein and small molecule structure determination. We will also show how Latitude D software facilitates collecting such data in an integrated and fully automated user interface.
Keywords: continuous electron diffraction tomography, direct detection, diffraction, Latitude D, DigitalMicrograph, proteins, small molecules
Procedia PDF Downloads 107
2566 Eosinopenia: Marker for Early Diagnosis of Enteric Fever
Authors: Swati Kapoor, Rajeev Upreti, Monica Mahajan, Abhaya Indrayan, Dinesh Srivastava
Abstract:
Enteric fever is caused by the gram-negative bacilli Salmonella Typhi and Paratyphi. It is associated with high morbidity and mortality worldwide, and timely initiation of treatment is crucial for the prevention of complications. Cultures of body fluids are diagnostic but not always conclusive or practically feasible in most centers; moreover, waiting for culture results delays treatment initiation. Serological tests lack diagnostic value. Blood counts, however, can offer a promising diagnostic option. A retrospective study of the relevance of leucopenia and eosinopenia was conducted on 203 culture-proven enteric fever patients and 159 culture-proven non-enteric fever patients in a tertiary care hospital in New Delhi. Patient details were retrieved from the electronic medical records section of the hospital. Absolute eosinopenia was defined as an absolute eosinophil count (AEC) of less than 40/mm³ (normal range: 40-400/mm³) measured on an LH-750 Beckman Coulter automated analyzer. Leucopenia was defined as a total leucocyte count (TLC) of less than 4 × 10⁹/l. Blood cultures were done using the BacT/ALERT FA Plus automated blood culture system before the first antibiotic dose was given. Case and control groups were compared using Pearson's chi-square test. It was observed that an AEC of 0-19/mm³ was a significant finding (p < 0.001) in enteric fever patients, whereas leucopenia was not (p = 0.096). Using Receiver Operating Characteristic (ROC) curves, it was observed that patients with both AEC < 14/mm³ and TLC < 8 × 10⁹/l had a 95.6% chance of being diagnosed with enteric fever and only a 4.4% chance of being diagnosed as non-enteric fever. This result was highly significant, with p < 0.001.
This association of AEC and TLC found in the enteric fever patients of this study is very useful and can support early initiation of treatment in clinically suspected enteric fever patients.
Keywords: absolute eosinopenia, absolute eosinophil count, enteric fever, leucopenia, total leucocyte count
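The ROC-derived decision rule reported above can be stated compactly. The function name below is illustrative; the cutoffs are the ones from the abstract (AEC < 14/mm³ and TLC < 8 × 10⁹/l):

```python
def enteric_fever_suspected(aec, tlc):
    """Apply the study's ROC-derived cutoffs.

    aec: absolute eosinophil count per mm^3
    tlc: total leucocyte count in units of 10^9/l
    Returns True when both cutoffs are met; the study reports a 95.6%
    chance of an enteric fever diagnosis in that case.
    """
    return aec < 14 and tlc < 8.0
```

Such a rule is cheap to compute from a routine automated blood count, which is the practical appeal the abstract argues for.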
Procedia PDF Downloads 172
2565 The Legality of the Individual Education Plan from the Teachers’ Perspective in Saudi Arabia
Authors: Sohil I. Alqazlan
Abstract:
Introduction and Objectives: The individual education plan (IEP) is the cornerstone of education for students with special educational needs (SEN). The Saudi government supports students' right to have an IEP, and their education is one of the primary goals of the Ministry of Education (MoE). However, practice does not yet reflect this substantial government investment: for example, some SEN students do not have an IEP, and poor communication has been found between IEP teams and students' families. This study therefore investigated perspectives and understandings of the IEP from the viewpoint of SEN teachers in the Saudi context. Methods: The study utilised a qualitative design based on in-depth semi-structured interviews with 8 SEN teachers in schools in Riyadh (the capital city of Saudi Arabia). The interview findings were analysed using a thematic analysis approach. Results and Conclusion: The legal status of the IEP in Saudi Arabia was the main area on which study participants were questioned. As interpreted from the responses of the SEN teachers, the IEP lacks the required legality with respect to its implementation in Saudi Arabia: all teachers agreed that the IEP is not considered a legal document in the Kingdom, and as a result they did not use it for all their students with SEN. Such findings may affect teaching quality and school outcomes, as all SEN students must be supported individually depending on their needs.
Keywords: individual education plan, special education, IEP, teachers
Procedia PDF Downloads 171
2564 Transformer-Driven Multi-Category Classification for an Automated Academic Strand Recommendation Framework
Authors: Ma Cecilia Siva
Abstract:
This study introduces a Bidirectional Encoder Representations from Transformers (BERT)-based machine learning model aimed at improving educational counseling by automating the process of recommending academic strands for students. The framework is designed to streamline and enhance the strand selection process by analyzing students' profiles and suggesting suitable academic paths based on their interests, strengths, and goals. Data were gathered from a sample of 200 grade 10 students, including personal essays and survey responses relevant to strand alignment. After thorough preprocessing, the text data were tokenized, label-encoded, and input into a fine-tuned BERT model set up for multi-label classification. The model was optimized for balanced accuracy and computational efficiency, featuring a multi-category classification layer with sigmoid activation for independent strand predictions. Performance metrics showed an F1 score of 88%, with precision at 80% and recall at 100%, indicating a well-balanced model that provides reliable recommendations while reducing irrelevant strand suggestions. To facilitate practical use, the final deployment phase created a recommendation framework that processes new student data through the trained model and generates personalized academic strand suggestions. This automated recommendation system presents a scalable solution for academic guidance, potentially enhancing student satisfaction and alignment with educational objectives. The study's findings indicate that expanding the data set, integrating additional features, and refining the model iteratively could improve the framework's accuracy and broaden its applicability in various educational contexts.
Keywords: tokenized, sigmoid activation, transformer, multi-category classification
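The choice of sigmoid activation over softmax is what makes the strand predictions independent: several strands can exceed the threshold at once instead of competing for a single winner. A minimal sketch of that output layer, with hypothetical strand labels and logits (the actual model is a fine-tuned BERT classifier, not reproduced here):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict_strands(logits, strand_names, threshold=0.5):
    """Multi-label head: each strand gets an independent sigmoid probability,
    so several strands may be recommended at once (a softmax layer would
    instead force exactly one winner)."""
    probs = {name: sigmoid(z) for name, z in zip(strand_names, logits)}
    return [name for name, p in probs.items() if p >= threshold]

# Hypothetical strand labels and classifier logits for one student profile.
strands = ["STEM", "HUMSS", "ABM", "TVL"]
recommended = predict_strands([2.1, -1.3, 0.4, -2.0], strands)
```

As a consistency check on the reported metrics, precision 0.80 and recall 1.00 give F1 = 2(0.8)(1.0)/1.8 ≈ 0.89, matching the quoted 88%.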
Procedia PDF Downloads 9
2563 An Automated Magnetic Dispersive Solid-Phase Extraction Method for Detection of Cocaine in Human Urine
Authors: Feiyu Yang, Chunfang Ni, Rong Wang, Yun Zou, Wenbin Liu, Chenggong Zhang, Fenjin Sun, Chun Wang
Abstract:
Cocaine is the most frequently used illegal drug globally, with an annual prevalence of cocaine use ranging from 0.3% to 0.4% of the adult population aged 15-64 years. The growing consumption of cocaine and associated drug crime are a great concern; urine testing has therefore become an important noninvasive sampling method, as cocaine and its metabolites (COCs) are usually present in urine at high concentrations and with relatively long detection windows. However, direct analysis of urine samples is not feasible, because the complex urine matrix often causes low sensitivity and selectivity in the determination. On the other hand, the presence of low doses of analytes in urine makes an extraction and pretreatment step important before determination; especially in group drug-taking cases, the pretreatment step becomes even more tedious and time-consuming. Developing a sensitive, rapid and high-throughput method for detection of COCs in the human body is therefore indispensable for law enforcement officers, treatment specialists and health officials. In this work, a new automated magnetic dispersive solid-phase extraction (MDSPE) sampling method followed by high performance liquid chromatography-mass spectrometry (HPLC-MS) was developed for quantitative enrichment of COCs from human urine, using prepared magnetic nanoparticles as adsorbents. The nanoparticles were prepared by silanizing magnetic Fe3O4 nanoparticles and modifying them with divinylbenzene and vinylpyrrolidone, which confer the ability to specifically adsorb COCs. This kind of magnetic particle simplifies the pretreatment steps through electromagnetically controlled extraction, achieving full automation. The proposed device significantly improved sample preparation efficiency, processing 32 samples in one batch within 40 min.
The preparation procedure for the magnetic nanoparticles was optimized, and the particles were characterized by scanning electron microscopy, vibrating sample magnetometry and infrared spectroscopy. Several analytical parameters were studied, including the amount of particles, adsorption time, elution solvent, and extraction and desorption kinetics, and the proposed method was validated. The limits of detection for cocaine and its metabolites were 0.09-1.1 ng·mL⁻¹, with recoveries ranging from 75.1% to 105.7%. Compared to traditional sampling methods, this method is time-saving and environmentally friendly. The results confirm that the proposed automated method is a highly effective approach for trace analysis of cocaine and its metabolites in human urine.
Keywords: automatic magnetic dispersive solid-phase extraction, cocaine detection, magnetic nanoparticles, urine sample testing
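The recoveries of 75.1-105.7% quoted above follow the usual spike-recovery calculation: the concentration found in a spiked sample expressed as a percentage of the amount added. A one-line sketch with hypothetical spiked-sample values:

```python
def percent_recovery(measured, spiked):
    """Spike recovery: concentration found as a percentage of the amount added."""
    return 100.0 * measured / spiked

# Hypothetical spiked urine sample: 10.0 ng/mL cocaine added, 9.2 ng/mL found.
recovery = percent_recovery(9.2, 10.0)
```

A recovery near 100% indicates that the extraction neither loses analyte nor suffers matrix enhancement.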
Procedia PDF Downloads 204
2562 Radio Frequency Identification Chips in Colour Preference Tracking
Authors: A. Ballard
Abstract:
The ability to track goods and products en route in the delivery system, in the warehouse, and on the shop floor is a huge advantage to shippers and retailers, and the emergence of radio frequency identification (RFID) technology has recently enabled this better than ever before. However, a significant problem exists in that RFID technology depends on the quality of the information stored for each tagged product. Because of the profusion of names for colours, it is very difficult to ensure that stored colour values are recognised by all users who view the product visually. This paper reports the findings of a study in which 50 consumers and 50 logistics workers were shown colour swatches and asked to choose the name of the colour from a multiple-choice list. They were then asked to match consumer products, including toasters, jumpers, and toothbrushes, with the identifying inventory information available for each one. The findings show that the ability to match colours was significantly stronger with the colour swatches than with the consumer products, and that while logistics professionals made correct identifications more frequently than the consumers, their results were still unsatisfactorily low. Based on these findings, a proposed universal model of colour identification numbers has been developed.
Keywords: consumer preferences, supply chain logistics, radio frequency identification, RFID, colour preference
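One way a universal model of colour identification numbers could work is to resolve free-text colour names from inventory records to canonical numeric IDs via nearest-neighbour matching in RGB space. The table, aliases, and IDs below are purely illustrative; the abstract does not publish the proposed model itself:

```python
# Canonical colour table: ID -> (name, RGB). IDs and values are illustrative.
CANONICAL = {
    1: ("red", (255, 0, 0)),
    2: ("green", (0, 128, 0)),
    3: ("blue", (0, 0, 255)),
    4: ("yellow", (255, 255, 0)),
}

# Free-text names as they might appear in RFID inventory records.
ALIASES = {
    "crimson": (220, 20, 60),
    "navy": (0, 0, 128),
    "lime": (0, 255, 0),
}

def colour_id(name):
    """Resolve a colour name to a canonical ID by nearest squared RGB distance."""
    rgb = ALIASES.get(name) or next(
        (c for n, c in CANONICAL.values() if n == name), None)
    if rgb is None:
        raise KeyError(name)
    return min(CANONICAL, key=lambda i: sum(
        (a - b) ** 2 for a, b in zip(CANONICAL[i][1], rgb)))
```

Storing the canonical ID on the tag, rather than a free-text name, sidesteps the naming disagreements the study documents.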
Procedia PDF Downloads 121
2561 d-Block Metal Nanoparticles Confined in Triphenylphosphine Oxide Functionalized Core-Crosslinked Micelles for the Application in Biphasic Hydrogenation
Authors: C. Joseph Abou-Fayssal, K. Philippot, R. Poli, E. Manoury, A. Riisager
Abstract:
The use of soluble polymer-supported metal nanoparticles (MNPs) has received significant attention for the ease of catalyst recovery and recycling. Of particular interest are MNPs supported on polymers that are either water-soluble or form stable colloidal dispersions in water, as this combines the advantages of the aqueous biphasic protocol with the catalytic performance of MNPs. The objective is to achieve good confinement of the catalyst in the nanoreactor cores and thus better catalyst recovery, in order to overcome the MNP extraction witnessed previously. Inspired by previous results, we are interested in the design of polymeric nanoreactors functionalized with ligands able to firmly anchor metal nanoparticles, in order to control the activity and selectivity of the resulting nanocatalysts. The nanoreactors are core-crosslinked micelles (CCM) synthesized by reversible addition-fragmentation chain transfer (RAFT) polymerization. Varying the nature of the core-linked functionalities yields differently stabilized metal nanoparticles, whose performance is compared in the catalyzed aqueous biphasic hydrogenation of model substrates. Particular attention is given to catalyst recyclability.
Keywords: biphasic catalysis, metal nanoparticles, polymeric nanoreactors, catalyst recovery, RAFT polymerization
Procedia PDF Downloads 100
2560 Revolutionizing Autonomous Trucking Logistics with Customer Relationship Management Cloud
Authors: Sharda Kumari, Saiman Shetty
Abstract:
Autonomous trucking is just one of the numerous significant shifts impacting fleet management services. The Society of Automotive Engineers (SAE) has defined six levels of vehicle automation that have been adopted internationally, including by the United States Department of Transportation. On public highways in the United States, organizations are testing driverless vehicles with at least Level 4 automation, in which a human is present in the vehicle and can disable automation; this is usually done while the trucks are not engaged in highway driving. Completely driverless vehicles, however, are presently being tested in the state of California. While autonomous trucking can increase safety, decrease trucking costs, offer solutions to trucker shortages, and improve efficiencies, logistics too requires advancements to keep up with trucking innovations. Given that artificial intelligence, machine learning, and automated procedures enable people in other sectors to do their duties with fewer resources, CRM (Customer Relationship Management) can be applied to the autonomous trucking business to provide the same level of efficiency. In a society witnessing significant digital disruption, fleet management is likewise being transformed by technology, and utilizing strategic alliances to enhance core services is an effective technique for capitalizing on innovations and delivering enhanced services. Applying analytics to CRM systems improves cost control of fuel strategy, fleet maintenance, driver behavior, route planning, road safety compliance, and capacity utilization. Integration of autonomous trucks with automated fleet management, yard/terminal management, and customer service is possible, and thus has significant power to redraw the lines between the public and private spheres in autonomous trucking logistics.
Keywords: autonomous vehicles, customer relationship management, customer experience, autonomous trucking, digital transformation
Procedia PDF Downloads 108
2559 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis
Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García
Abstract:
Discourse analysis approaches are among the central research strategies applied throughout the humanities; they focus on the countless forms and ways in which digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and made accessible for discourse analysis approaches; specifically, the significance of multimodality and the modelling of meaning plurality are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis, along with new IT-analysis approaches based on context-oriented embedding representations. The constant optimization of hermeneutical methodology in the use of (semi-)automated processes, and its corresponding epistemological reflection, plays an essential role throughout our research endeavor. Among discourse analyses, the sociology of knowledge approach is characterised by reconstructive, accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg.
As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building a uniform working environment which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and for the integrated processing of multimodal data. This methodological and technical innovation is aligned with the epistemological working methods of grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect on the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues of data protection, we have manually built an initial data corpus in which the relevant actors and discourse positions are analysed by conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic, based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to text material, this corpus contains multimodal sources such as images, video sequences, and apps. In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automatically by programs that propose coding paradigms based on the detected entities and their relationships.
Simultaneously, these programs can be specifically trained by manual coding in a closed reading process and refined according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis
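The interplay of semi-automated code proposal and manual confirmation in a closed reading can be sketched as follows. The code book and the keyword matching are a deliberately reduced stand-in for the project's actual NER, entity-linking, and sentiment pipeline:

```python
# Hypothetical code book: discourse codes keyed by trigger terms.
CODE_BOOK = {
    "data_protection": ["data protection", "privacy", "GDPR"],
    "digital_health": ["e-health", "telemedicine", "health app"],
}

def propose_codes(snippet):
    """Semi-automated step: propose coding paradigms from detected terms."""
    text = snippet.lower()
    return sorted(code for code, terms in CODE_BOOK.items()
                  if any(t.lower() in text for t in terms))

def blended_coding(snippet, manual_codes=()):
    """Closed-reading step: the researcher confirms or extends the proposals."""
    return sorted(set(propose_codes(snippet)) | set(manual_codes))
```

The union of proposed and manually assigned codes mirrors the blended reading process: automated proposals never overwrite researcher judgment, they only pre-structure it.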
Procedia PDF Downloads 226
2558 Innovative Screening Tool Based on Physical Properties of Blood
Authors: Basant Singh Sikarwar, Mukesh Roy, Ayush Goyal, Priya Ranjan
Abstract:
This work combines two bodies of knowledge: the biomedical basis of blood stain formation and the fluid mechanics community's insight that such stain formation depends heavily on physical properties. Moreover, biomedical research shows that different patterns in blood stains are robust indicators of a blood donor's health, or lack thereof. Based on these valuable insights, an innovative screening tool is proposed which can act as an aid in the diagnosis of diseases such as anemia, hyperlipidaemia, tuberculosis, blood cancer, leukemia, malaria, etc., with enhanced confidence in the proposed analysis. To realize this technique, simple, robust and low-cost micro-fluidic devices, a micro-capillary viscometer and a pendant drop tensiometer, are designed and proposed to be fabricated to measure the viscosity, surface tension and wettability of various blood samples. Once prognosis and diagnosis data have been generated, automated linear and nonlinear classifiers are applied to the automated reasoning and presentation of results. A support vector machine (SVM) classifies data in a linear fashion; discriminant analysis and nonlinear embeddings are coupled with nonlinear manifold detection in the data, and decisions are made accordingly. In this way, physical properties can be used, via linear and non-linear classification techniques, for screening of various diseases in humans and cattle. Experiments are carried out to validate the physical property measurement devices. This framework can be further developed into a real-life portable disease screening cum diagnostics tool, and small-scale production of screening cum diagnostic devices is proposed to enable independent tests.
Keywords: blood, physical properties, diagnostic, nonlinear, classifier, device, surface tension, viscosity, wettability
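The linear classification step can be illustrated with a simple perceptron standing in for the paper's SVM: both learn a linear decision boundary w·x + b = 0, though the SVM additionally maximizes the margin. The feature values and class labels below are entirely hypothetical:

```python
def train_perceptron(X, y, epochs=2000):
    """Train a simple linear classifier (perceptron). X: feature vectors,
    y: labels in {-1, +1}. On linearly separable data this converges to a
    separating hyperplane; a stand-in for the paper's SVM."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b) <= 0:
                w = [wj + yi * xj for wj, xj in zip(w, xi)]
                b += yi
                errors += 1
        if errors == 0:  # converged: all training points correctly separated
            break
    return w, b

def classify(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Hypothetical screening features: [relative viscosity, surface tension / 10 mN/m]
X = [[3.5, 5.5], [3.8, 5.6], [2.1, 4.8], [2.4, 4.7]]
y = [1, 1, -1, -1]  # illustrative class labels (e.g., flagged vs. not flagged)
w, b = train_perceptron(X, y)
```

In the proposed tool, the measured viscosity, surface tension, and wettability would form the feature vector, and the trained boundary would separate samples flagged for follow-up from the rest.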
Procedia PDF Downloads 376