Search results for: automated theorem proving
774 Method for Improving ICESat-2 ATL13 Altimetry Data Utility on Rivers
Authors: Yun Chen, Qihang Liu, Catherine Ticehurst, Chandrama Sarker, Fazlul Karim, Dave Penton, Ashmita Sengupta
Abstract:
The application of ICESat-2 altimetry data in river hydrology critically depends on the accuracy of the mean water surface elevation (WSE) at a virtual station (VS), where satellite observations intersect with water. An ICESat-2 track generates multiple VSs as it crosses different water bodies. The difficulties are particularly pronounced in large river basins with many tributaries and meanders, often adjacent to each other. One challenge is to split photon segments along a beam so as to accurately partition them and extract only the true representative water height for each individual element. As far as we can establish, there is no automated procedure to make this distinction. Earlier studies have relied on human intervention or river masks. Both approaches are unsatisfactory where the number of intersections is large and river width/extent changes over time. We describe here an automated approach called “auto-segmentation”. The accuracy of our method was assessed by comparison with river water level observations at 10 different stations on 37 different dates along the Lower Murray River, Australia. The congruence is very high and without detectable bias. In addition, we compared different outlier removal methods for the mean WSE calculation at VSs after the auto-segmentation process. All four outlier removal methods perform almost equally well, with the same R2 value (0.998) and only subtle variations in RMSE (0.181–0.189 m) and MAE (0.130–0.142 m). Overall, the auto-segmentation method developed here is an effective and efficient approach to deriving accurate mean WSE at river VSs. It provides a much better way of facilitating the application of ICESat-2 ATL13 altimetry to rivers compared to previously reported studies.
Therefore, the findings of our study will make a significant contribution towards the retrieval of hydraulic parameters, such as water surface slope along the river, water depth at cross sections, and river channel bathymetry for calculating flow velocity and discharge from remotely sensed imagery at large spatial scales.
Keywords: lidar sensor, virtual station, cross section, mean water surface elevation, beam/track segmentation
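The abstract does not name its four outlier-removal methods, so the sketch below uses one common robust choice, a median/MAD filter (purely an assumption for illustration), to show how a VS mean WSE and the reported agreement statistics (R2, RMSE, MAE) against gauge observations would be computed.

```python
import math
import statistics

def mad_filter(heights, k=3.0):
    """Drop photon heights more than k scaled MADs from the median.

    A median/MAD rule is only one plausible outlier-removal method;
    the paper's four methods are not named in the abstract.
    """
    med = statistics.median(heights)
    mad = statistics.median(abs(h - med) for h in heights)
    scale = 1.4826 * mad  # ~sigma for normally distributed heights
    if scale == 0.0:
        return list(heights)
    return [h for h in heights if abs(h - med) <= k * scale]

def agreement(predicted, observed):
    """R2, RMSE and MAE of VS mean WSEs against gauge water levels."""
    n = len(predicted)
    mean_obs = sum(observed) / n
    ss_res = sum((p - o) ** 2 for p, o in zip(predicted, observed))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    r2 = 1.0 - ss_res / ss_tot
    rmse = math.sqrt(ss_res / n)
    mae = sum(abs(p - o) for p, o in zip(predicted, observed)) / n
    return r2, rmse, mae

# Hypothetical photon heights (m) at one VS; 25.00 is a non-water return.
segment = [10.10, 10.20, 10.15, 10.18, 25.00]
mean_wse = statistics.mean(mad_filter(segment))
```

With the non-water photon removed, the mean of the remaining heights gives the VS mean WSE; the agreement statistics would then be computed across all VS/gauge pairs.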
Procedia PDF Downloads 62
773 A Single-Use Endoscopy System for Identification of Abnormalities in the Distal Oesophagus of Individuals with Chronic Reflux
Authors: Nafiseh Mirabdolhosseini, Jerry Zhou, Vincent Ho
Abstract:
The dramatic global rise in acid reflux has also led to oesophageal adenocarcinoma (OAC) becoming the fastest-growing cancer in developed countries. While gastroscopy with biopsy is used to diagnose OAC patients, this labour-intensive and expensive process is not suitable for population screening. This study aims to design, develop, and implement a minimally invasive system to capture optical data of the distal oesophagus for rapid screening of potential abnormalities. To develop the system and understand user requirements, a user-centric approach was employed utilising co-design strategies. Target user segments were identified, and 38 patients and 14 health providers were interviewed. Next, the technical requirements were developed based on consultations with industry. A minimally invasive optical system was designed and developed with patient comfort in mind. The system consists of a sensing catheter, a controller unit, and an analysis program. The procedure takes only 10 minutes to perform and does not require cleaning afterwards, since the catheter is single-use. A prototype system was evaluated for safety and efficacy in both laboratory and clinical performance. The prototype performed successfully when submerged in simulated gastric fluid, showing no evidence of erosion after 24 hours. The system effectively recorded a video of the mid-distal oesophagus of a healthy volunteer (34-year-old male). The recorded images were used to develop an automated program to identify abnormalities in the distal oesophagus. Further data from a larger clinical study will be used to train the automated program. This system allows for quick visual assessment of the lower oesophagus in primary care settings and can serve as a screening tool for oesophageal adenocarcinoma. In addition, the system can be coupled with 24-hour ambulatory pH monitoring to better correlate oesophageal physiological changes with reflux symptoms.
It can also provide additional information on lower oesophageal sphincter functions, such as opening times and bolus retention.
Keywords: endoscopy, MedTech, oesophageal adenocarcinoma, optical system, screening tool
Procedia PDF Downloads 88
772 An Equivalence between a Harmonic Form and a Closed Co-Closed Differential Form in L^Q and Non-L^Q Spaces
Abstract:
An equivalence between a harmonic form and a closed co-closed form is established on a complete non-compact manifold. This equivalence has been generalized for a differential k-form ω from L^q spaces to non-L^q spaces when q = 2, in the context of p-balanced growth where p = 2. In particular, for a simple differential k-form on a complete non-compact manifold, the equivalence has been verified with the scope of q extended from finite q-energy in L^q spaces to infinite q-energy in non-L^q spaces under 2-balanced growth. The generalized Hadamard theorem, the Cauchy-Schwarz inequality, and calculus techniques including integration by parts and convergent series have been applied as estimation techniques to evaluate growth rates for a differential form. In particular, energy growth rates, as indicated by an appropriate power range in a selected test function, lead to a balance between a harmonic differential form and a closed co-closed differential form. The research ideas and computational methods in this paper could provide an innovative way to study the broadening of L^q spaces to non-L^q spaces with a wide variety of infinite energy growth for a differential form.
Keywords: closed forms, co-closed forms, harmonic forms, L^q spaces, p-balanced growth, simple differential k-forms
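For reference, the standard Hodge-theoretic definitions behind the equivalence read as follows (a sketch only; the precise p-balanced growth hypotheses are those of the paper and are not restated here):

```latex
% A differential k-form \omega on a Riemannian manifold is harmonic when
% it is annihilated by the Hodge Laplacian:
\Delta\omega = (d\delta + \delta d)\,\omega = 0 .
% It is closed and co-closed when
d\omega = 0 \quad\text{and}\quad \delta\omega = 0 .
% Closed and co-closed always implies harmonic. On a compact manifold the
% converse follows from integration by parts:
\int_M \langle \Delta\omega, \omega\rangle
   = \int_M \bigl( |d\omega|^2 + |\delta\omega|^2 \bigr) .
% On a complete non-compact manifold the boundary terms must be
% controlled, which is where the L^q (finite q-energy) or p-balanced
% growth conditions of the paper enter.
```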
Procedia PDF Downloads 451
771 The KAPSARC Energy Policy Database: Introducing a Quantified Library of China's Energy Policies
Authors: Philipp Galkin
Abstract:
Government policy is a critical factor in the understanding of energy markets. Nevertheless, it is rarely approached systematically from a research perspective. Gaining a precise understanding of what policies exist, their intended outcomes, geographical extent, duration, evolution, etc. would enable the research community to answer a variety of questions that, for now, are either oversimplified or ignored. Policy, on its surface, also seems a rather unstructured and qualitative undertaking. There may be quantitative components, but incorporating the concept of policy analysis into quantitative analysis remains a challenge. The KAPSARC Energy Policy Database (KEPD) is intended to address these two limitations of energy policy research. Our approach is to represent policies within a quantitative library of the specific policy measures contained within a set of legal documents. Each of these measures is recorded in the database as a single entry characterized by a set of qualitative and quantitative attributes. Initially, we focused on the major laws at the national level that regulate coal in China. However, KAPSARC is engaged in various efforts to apply this methodology to other energy policy domains. To ensure scalability and sustainability of our project, we are exploring semantic processing using automated computer algorithms. Automated coding can provide more convenient input data for human coders and serve as a quality control option. Our initial findings suggest that the methodology utilized in KEPD could be applied to any set of energy policies. It also provides a convenient tool to facilitate understanding in the energy policy realm, enabling the researcher to quickly identify, summarize, and digest policy documents and specific policy measures. The KEPD captures a wide range of information about each individual policy contained within a single policy document.
This enables a variety of analyses, such as structural comparison of policy documents, tracing policy evolution, stakeholder analysis, and exploring interdependencies of policies and their attributes with exogenous datasets using statistical tools. The usability and broad range of research implications suggest a need for the continued expansion of the KEPD to encompass a larger scope of policy documents across geographies and energy sectors.
Keywords: China, energy policy, policy analysis, policy database
Procedia PDF Downloads 323
770 Mesoporous Nanocomposites for Sustained Release Applications
Authors: Daniela Istrati, Alina Morosan, Maria Stanca, Bogdan Purcareanu, Adrian Fudulu, Laura Olariu, Alice Buteica, Ion Mindrila, Rodica Cristescu, Dan Eduard Mihaiescu
Abstract:
Our present work concerns the synthesis, characterization, and applications of new nanocomposite materials based on mesoporous silica nanocomposite systems. The nanocomposite support was obtained using a specific step-by-step multilayer structure build-up synthetic route, characterized by XRD (X-Ray Diffraction), TEM (Transmission Electron Microscopy), FT-IR (Fourier Transform-Infrared Spectrometry), and BET (Brunauer–Emmett–Teller method), and loaded with Salvia officinalis plant extract obtained by a hydro-alcoholic extraction route. The sustained release of the target compounds was studied by a modified LC method, showing slow release profiles, as expected for a support with a high specific surface area. The obtained results were further correlated with the in vitro / in vivo behavior of the nanocomposite material, recommending the mesoporous silica nanocomposites as good candidates for biomedical applications. Acknowledgements: This study has been funded by the Research Project PN-III-P2-2.1-PTE-2016-0160, 49-PTE / 2016 (PROZECHIMED) and Project Number PN-III-P4-ID-PCE-2016-0884 / 2017.
Keywords: biomedical, mesoporous, nanocomposites, natural products, sustained release
Procedia PDF Downloads 218
769 The Use of Artificial Intelligence in Diagnosis of Mastitis in Cows
Authors: Djeddi Khaled, Houssou Hind, Miloudi Abdellatif, Rabah Siham
Abstract:
In the field of veterinary medicine, there is a growing application of artificial intelligence (AI) for diagnosing bovine mastitis, a prevalent inflammatory disease in dairy cattle. AI technologies, such as automated milking systems, have streamlined the assessment of key metrics crucial for managing cow health during milking and identifying prevalent diseases, including mastitis. These automated milking systems empower farmers to implement automatic mastitis detection by analyzing indicators like milk yield, electrical conductivity, fat, protein, lactose, blood content in the milk, and milk flow rate. Furthermore, reports highlight the integration of somatic cell count (SCC), thermal infrared thermography, and diverse systems utilizing statistical models and machine learning techniques, including artificial neural networks, to enhance the overall efficiency and accuracy of mastitis detection. According to a review of 15 publications, machine learning technology can predict the risk and detect mastitis in cattle with an accuracy ranging from 87.62% to 98.10% and sensitivity and specificity ranging from 84.62% to 99.4% and 81.25% to 98.8%, respectively. Additionally, machine learning algorithms and microarray meta-analysis are utilized to identify mastitis genes in dairy cattle, providing insights into the underlying functional modules of mastitis disease. Moreover, AI applications can assist in developing predictive models that anticipate the likelihood of mastitis outbreaks based on factors such as environmental conditions, herd management practices, and animal health history. This proactive approach supports farmers in implementing preventive measures and optimizing herd health. By harnessing the power of artificial intelligence, the diagnosis of bovine mastitis can be significantly improved, enabling more effective management strategies and ultimately enhancing the health and productivity of dairy cattle. 
The integration of artificial intelligence presents valuable opportunities for the precise and early detection of mastitis, providing substantial benefits to the dairy industry.
Keywords: artificial intelligence, automatic milking system, cattle, machine learning, mastitis
Procedia PDF Downloads 65
768 The Hawza Al-’Ilmiyya and Its Role in Preserving the Shia Identity through Jurisprudence
Authors: Raied Khayou
Abstract:
The Hawza Al-'Ilmiyya is a network of religious seminaries in the Shia branch of Islam. This research focuses mainly on the oldest school, located in Najaf, Iraq, because its core curriculum and main characteristics have remained unchanged since the fourth century of Islam. Relying on a thorough literature review of Arabic and English publications, and on interviews with current and former students of the seminary, the current research outlines the factors showing how this seminary was crucial in keeping the Shia religious identity intact despite sometimes gruesome interference and persecution. Several factors helped the seminary preserve its central importance. First, rooted in their theology, Shia Muslims believe that the Hawza Al-'Ilmiyya and its graduates carry a sacred authority. Second, the financial independence of the seminary helped keep it free from any governmental or political meddling. Third, its unique teaching method, its matchless openness to new students, and its flexible curriculum made it attractive to many students interested in learning more about Shia theology and jurisprudence. The Hawza Al-'Ilmiyya has the exclusive right to train clerics who hold the religious authority of Shia Islamic jurisprudence, and the seminary's success in staying independent throughout history has kept Shia Islamic theology independent as well.
Keywords: Hawza Al-'Ilmiyya, religious seminary, Shia Muslim education, Islamic jurisprudence
Procedia PDF Downloads 101
767 Satellite Connectivity for Sustainable Mobility
Authors: Roberta Mugellesi Dow
Abstract:
As the climate crisis becomes unignorable, it is imperative that new services are developed addressing not only the needs of customers but also their impact on the environment. The Telecommunication and Integrated Application (TIA) Directorate of ESA is supporting the green transition, with particular attention to sustainable mobility. “Accelerating the shift to sustainable and smart mobility” is at the core of the European Green Deal strategy, which seeks a 90% reduction in related emissions by 2050. Transforming the way that people and goods move is essential to increasing mobility while decreasing environmental impact, and transport must be considered holistically to produce a shared vision of green intermodal mobility. The use of space technologies, integrated with terrestrial technologies, is an enabler of smarter traffic management and increased transport efficiency for automated and connected multimodal mobility. Satellite connectivity, including future 5G networks, and digital technologies such as digital twins, AI, machine learning, and cloud-based applications are key enablers of sustainable mobility. SatCom is essential to ensure that connectivity is ubiquitously available, even in remote and rural areas or in case of a failure, through the convergence of terrestrial and SatCom connectivity networks. This is especially crucial when there are risks of network failures or cyber-attacks targeting terrestrial communication; SatCom ensures communication network robustness and resilience. The combination of terrestrial and satellite communication networks is making possible intelligent and ubiquitous V2X systems and PNT services with significantly enhanced reliability and security, hyper-fast wireless access, and much more seamless communication coverage. SatNav is essential in providing accurate tracking and tracing capabilities for automated vehicles and in guiding them to target locations.
SatNav can also enable location-based services like car-sharing applications, parking assistance, and fare payment. In addition to GNSS receivers, wireless connections, radar, lidar, and other installed sensors can enable automated vehicles to monitor their surroundings, to ‘talk to each other’ and with infrastructure in real time, and to respond to changes instantaneously. SatEO can be used to provide the maps required for traffic management, as well as to evaluate conditions on the ground, assess changes, and provide key data for monitoring and forecasting air pollution and other important parameters. Earth Observation-derived data are used to provide meteorological information, such as wind speed and direction, humidity, and others, that must be fed into models contributing to traffic management services. The paper will provide examples of services and applications that have been developed with the aim of identifying innovative solutions and new business models enabled by new digital technologies, engaging the space and non-space ecosystems together to deliver value and provide innovative, greener solutions in the mobility sector. Examples include connected autonomous vehicles, electric vehicles, green logistics, and others. Relevant technologies include hybrid SatCom and 5G providing ubiquitous coverage, IoT integration with non-space technologies, navigation and PNT technology, and other space data.
Keywords: sustainability, connectivity, mobility, satellites
Procedia PDF Downloads 133
766 Torture and Turkey: Legal Situation Related to Torture in Turkey and the Issue of Impunity of Torture
Authors: Zeynep Üskül Engin
Abstract:
Looking at the world's history, one can easily see that the most drastic evil comes to humans from their own kind. Humanity, proving that Hobbes was actually right, finally agreed on taking some necessary measures after the destructive effects of the great World Wars. Surely after this, human rights have been more commonly set down in written form, and now the priority among the values and goals of a democratic society is to protect its individuals. For this reason, the right to life is held to be valuable, and all existing forms of torture and inhuman and degrading treatment have been banned. Turkey, having signed the international human rights instruments, has aimed to eliminate torture by changing its laws and regulations to a certain extent. Observing Turkey's experience, it is fair to say that during certain periods systematic torture has been applied. The urge to enter the European Union and verdicts against Turkey have led to considerable progress in human rights. Besides, changes in law and comprehensive training for police, judges, medical and prison staff have resulted in positive improvement on this issue. Certainly, this legal progress does not mean the total elimination of the practice of torture; however, those who commit this crime now stand trial and face severe punishments. In this article, Turkey, with a notorious reputation in the international arena, is examined through its policy towards torture and the defects in its practice.
Keywords: torture, human rights, impunity of torture, sociology
Procedia PDF Downloads 463
765 Augmenting Classroom Reality
Authors: Kerrin Burnell
Abstract:
In a world of increasingly technology-dependent students, the English language classroom should ideally keep up with developments to keep students as engaged as possible. Unfortunately, as is the case in Oman, funding is not always adequate to ensure students have the most up-to-date technology, and most institutions are still reliant on paper-based textbooks. In order to try and bridge the gap between the technology available (smartphones) and textbooks, augmented reality (AR) technology can be utilized to enhance classroom, homework, and extracurricular activities. AR involves overlaying media (videos, images, etc.) on top of physical objects (posters, book pages, etc.) and then sharing the media. This case study involved introducing students to a freely available entry-level AR app called Aurasma. Students were asked to augment their English textbooks, word walls, research project posters, and extracurricular posters. Through surveys, interviews, and an analysis of time spent accessing the different media, the appropriateness of the technology for the classroom was determined. Results indicate that the use of AR has positive effects on many aspects of the English classroom. Increased student engagement, total time spent on task, interaction, and motivation were evident, along with a decrease in technology-related anxiety. As it is proving very difficult to get tablets or even laptops into classrooms in Oman, these preliminary results indicate that many positive outcomes will come from introducing students to this innovative technology.
Keywords: augmented reality, classroom technology, classroom innovation, engagement
Procedia PDF Downloads 382
764 Modified Bat Algorithm for Economic Load Dispatch Problem
Authors: Daljinder Singh, J.S.Dhillon, Balraj Singh
Abstract:
According to the no free lunch theorem, a single search technique cannot perform best in all conditions. An optimization method can be an attractive choice for solving an optimization problem when it offers advantages such as robust and reliable performance, global search capability, little information requirement, ease of implementation, parallelism, and no requirement for a differentiable and continuous objective function. In order to synergize exploration and exploitation and to further enhance the performance of the bat algorithm, the paper proposes a modified bat algorithm that adds an additional search procedure based on each bat's previous experience. The proposed algorithm is used for solving the economic load dispatch (ELD) problem. Practical constraints such as valve-point loading, along with the power balance constraint and generator limits, are considered. To take care of the power demand constraint, the variable elimination method is exploited. The proposed algorithm is tested on various ELD problems. The results obtained show that the proposed algorithm is capable of performing better on the majority of the ELD problems considered and is on par with existing algorithms for some of the problems.
Keywords: bat algorithm, economic load dispatch, penalty method, variable elimination method
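The baseline update rules of the bat algorithm can be sketched as below. The paper's modification (a search step using each bat's previous experience) and the ELD-specific constraint handling are not detailed in the abstract, so this sketch minimizes a generic bounded cost function, and every parameter value here is an illustrative assumption.

```python
import random

def bat_minimize(cost, dim, lo, hi, n_bats=20, iters=200, seed=1):
    """Minimal sketch of the standard bat algorithm (Yang, 2010).

    Only the baseline frequency/velocity/position updates and the
    loudness-gated local walk are shown; the paper's extra
    experience-based search step is not reproduced here.
    """
    rng = random.Random(seed)
    f_min, f_max = 0.0, 2.0          # pulse-frequency range
    A, r = 0.9, 0.5                  # loudness and pulse-emission rate
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_bats)]
    v = [[0.0] * dim for _ in range(n_bats)]
    best = min(x, key=cost)[:]
    for _ in range(iters):
        for i in range(n_bats):
            f = f_min + (f_max - f_min) * rng.random()   # frequency tuning
            cand = []
            for d in range(dim):
                v[i][d] += (x[i][d] - best[d]) * f       # velocity update
                cand.append(min(hi, max(lo, x[i][d] + v[i][d])))
            if rng.random() > r:                         # local walk near best
                cand = [min(hi, max(lo, b + 0.01 * (hi - lo) * rng.gauss(0, 1)))
                        for b in best]
            if cost(cand) <= cost(x[i]) and rng.random() < A:
                x[i] = cand                              # greedy, loudness-gated
            if cost(x[i]) < cost(best):
                best = x[i][:]
    return best
```

In an ELD setting, `cost` would be the total fuel cost with valve-point terms, and one generator's output would be eliminated via the power balance equation, as the abstract describes.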
Procedia PDF Downloads 459
763 Digital Forensics Analysis Focusing on the Onion Router Browser Artifacts in Windows 10
Authors: Zainurrasyid Abdullah, Mohamed Fadzlee Sulaiman, Muhammad Fadzlan Zainal, M. Zabri Adil Talib, Aswami Fadillah M. Ariffin
Abstract:
The Onion Router (Tor) browser is a well-known tool widely used by people seeking web anonymity when browsing the internet. Criminals take advantage of this to remain anonymous over the internet. Access to the dark web can be a significant reason for criminals to perform illegal activities while maintaining their anonymity. For a digital forensic analyst, it is crucial to extract the trail of evidence proving that a criminal's computer has used the Tor browser to conduct such illegal activities. By applying digital forensic methodology, several techniques can be performed, including application analysis, memory analysis, and registry analysis. Since Windows 10 is the latest operating system released by Microsoft Corporation, this study uses Windows 10 as the operating system platform running the Tor browser. From the analysis, significant artifacts left by the Tor browser were discovered, such as the execution date, application installation date, and browsing history, which can be used as evidence. Although the Tor browser was designed to achieve anonymity, a trail of evidence can still be found on the Windows 10 platform that can be useful for an investigation.
Keywords: artifacts analysis, digital forensics, forensic analysis, memory analysis, registry analysis, Tor browser, Windows 10
Procedia PDF Downloads 170
762 Examples of RC Design with Eurocode2
Authors: Carla Ferreira, Helena Barros
Abstract:
The paper titled “Design of Reinforced Concrete with Eurocode 2” presents the theory regarding the design of reinforced concrete sections and the development of the tables and abacuses used to verify a concrete section against the ultimate and serviceability limit states. The present paper is a complement to it, showing how to use those tools. Different numerical results are shown, proving the capability of the methodology. When the section of a beam is already chosen, the computer program presents the reinforcing steel at many locations along the structure, and it is the engineer's task to choose a layout available for construction, considering the maximum regular kind of reinforcing bars. There are many computer programs available for this task, but the interest of the present kind of tool lies in the fast and easy way of making the design and choosing the optimal solution. Another application of these design tools is in the definition of the section dimensions, such that when stresses are evaluated, the final design is acceptable. In design offices, these are considered by engineers a very quick and useful way of designing reinforced concrete sections, employing variable-strength concrete and higher steel classes. Examples of nonlinear analyses and redistribution of the bending moment will be considered, according to the Eurocode 2 recommendations, for sections under bending moment and axial forces. Examples of the evaluation of the serviceability limit state will also be presented.
Keywords: design examples, Eurocode 2, reinforced concrete, section design
Procedia PDF Downloads 72
761 A Calibration Device for Force-Torque Sensors
Authors: Nicolay Zarutskiy, Roman Bulkin
Abstract:
The paper reviews the existing methods of force-torque sensor calibration for sensors with one to six components, analyzes their advantages and disadvantages, and establishes the need to introduce a new calibration method. The proposed calibration method and its constructive realization are also described. The method allows automated force-torque sensor calibration both with selected components of the main vector of forces and moments and under complex loading. Thus, the two main advantages of the proposed calibration method are achieved: automation of the calibration process and universality.
Keywords: automation, calibration, calibration device, calibration method, force-torque sensors
Procedia PDF Downloads 646
760 Student Attendance System Applying Reed Solomon ECC
Authors: Mohd Noah A. Rahman, Armandurni Abd Rahman, Afzaal H. Seyal, Md Rizal Md Hendry
Abstract:
The article reports on an automated student attendance system modeled and developed for use at a vocational school. The project focuses on developing an application using a QR code, which relies on Reed-Solomon error correction, displayed on a smartphone and scanned through a webcam. This system enables us to speed up the process of taking attendance and saves valuable teaching time. It is planned to help students avoid the consequences of poor attendance, which would eventually bar them from sitting their final examination as required.
Keywords: QR code, Reed-Solomon, error correction, system design
Procedia PDF Downloads 392
759 Artificial Intelligence Based Method in Identifying Tumour Infiltrating Lymphocytes of Triple Negative Breast Cancer
Authors: Nurkhairul Bariyah Baharun, Afzan Adam, Reena Rahayu Md Zin
Abstract:
The tumor microenvironment (TME) in breast cancer is mainly composed of cancer cells, immune cells, and stromal cells. The interaction between cancer cells and their microenvironment plays an important role in tumor development, progression, and treatment response. The TME in breast cancer includes tumor-infiltrating lymphocytes (TILs) that are implicated in killing tumor cells. TILs can be found in the tumor stroma (sTILs) and within the tumor (iTILs). TILs in triple negative breast cancer (TNBC) have been demonstrated to have prognostic and potentially predictive value. The International Immuno-Oncology Biomarker Working Group (TIL-WG) has developed a guideline focused on the assessment of sTILs using hematoxylin and eosin (H&E)-stained slides. According to the guideline, pathologists use an “eyeballing” method on the H&E-stained slide for sTILs assessment. This method has low precision, poor interobserver reproducibility, and is time-consuming for a comprehensive evaluation, besides counting only sTILs. The TIL-WG has therefore recommended computational assessment of TILs utilizing the guidelines provided, to overcome the limitations of manual assessment and provide highly accurate and reliable TILs detection and classification for reproducible and quantitative measurement. This study was carried out to develop a TNBC digital whole slide image (WSI) dataset from H&E-stained and IHC (CD4+ and CD8+)-stained slides. TNBC cases were retrieved from the database of the Department of Pathology, Hospital Canselor Tuanku Muhriz (HCTM). TNBC cases diagnosed between 2010 and 2021, with no history of other cancer and with available tissue blocks, were included in the study (n=58). Tissue blocks were sectioned at approximately 4 µm for H&E and IHC staining. The H&E staining was performed according to a well-established protocol.
Indirect IHC staining was also performed on the tissue sections using the protocol from the Diagnostic BioSystems PolyVue™ Plus Kit, USA. The slides were stained with rabbit monoclonal CD8 antibody (SP16) and rabbit monoclonal CD4 antibody (EP204). The selected and quality-checked slides were then scanned using a high-resolution whole slide scanner (Pannoramic DESK II DW slide scanner) to digitalize the tissue image at a pixel resolution of 20x magnification. A manual TILs (sTILs and iTILs) assessment was then carried out by two appointed pathologists for manual TILs scoring from the digital WSIs, following the guideline developed by the TIL-WG in 2014, with the result displayed as the percentage of sTILs and iTILs per mm² of stromal and tumour area on the tissue. Following this, we aim to develop an automated digital image scoring framework that incorporates key elements of the manual guidelines (including both sTILs and iTILs), using manually annotated data, for robust and objective quantification of TILs in TNBC. From the study, we have developed a digital dataset of TNBC H&E- and IHC (CD4+ and CD8+)-stained slides. We hope that an automated scoring method can provide quantitative and interpretable TILs scoring that correlates with the manual pathologist-derived sTILs and iTILs scoring and thus has potential prognostic implications.
Keywords: automated quantification, digital pathology, triple negative breast cancer, tumour infiltrating lymphocytes
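Once region and lymphocyte masks are available from upstream segmentation (which the study leaves to the planned framework), the TIL-WG-style score reduces to an area ratio. A minimal sketch, assuming pixel-level 0/1 masks of equal size; the mask format is an assumption for illustration, not the study's actual data structure:

```python
def tils_percentage(region_mask, lymph_mask):
    """Percentage of a region's area occupied by lymphocytes.

    region_mask selects the stromal (for sTILs) or intra-tumoural
    (for iTILs) pixels; lymph_mask marks detected lymphocyte pixels.
    Both are equally sized 2-D grids of 0/1 values.
    """
    region_px = sum(sum(row) for row in region_mask)
    if region_px == 0:
        return 0.0
    lymph_in_region = sum(
        r * l
        for rrow, lrow in zip(region_mask, lymph_mask)
        for r, l in zip(rrow, lrow)
    )
    return 100.0 * lymph_in_region / region_px

# Toy 2x2 example: the whole patch is stroma, one pixel is lymphocyte.
stroma = [[1, 1], [1, 1]]
lymph = [[1, 0], [0, 0]]
stils_score = tils_percentage(stroma, lymph)
```

The same function applied with an intra-tumoural mask would yield the iTILs score; on real WSIs the masks would come from the CD4+/CD8+ IHC-guided annotations.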
Procedia PDF Downloads 116
758 Modeling, Analysis and Control of a Smart Composite Structure
Authors: Nader H. Ghareeb, Mohamed S. Gaith, Sayed M. Soleimani
Abstract:
In modern engineering, weight optimization has priority during the design of structures. However, optimizing the weight can result in lower stiffness and less internal damping, causing the structure to become excessively prone to vibration. To overcome this problem, active or smart materials are implemented. The coupled electromechanical properties of smart materials, used in the form of piezoelectric ceramics in this work, make these materials well suited to being implemented as distributed sensors and actuators to control the structural response. The smart structure proposed in this paper is composed of a cantilevered steel beam, an adhesive or bonding layer, and a piezoelectric actuator. The static deflection of the structure is derived as a function of the piezoelectric voltage, and the outcome is compared to theoretical and experimental results from the literature. The relation between the voltage and the piezoelectric moment at both ends of the actuator is also investigated, and a reduced finite element model of the smart structure is created and verified. Finally, a linear controller is implemented and its ability to attenuate the vibration due to the first natural frequency is demonstrated.
Keywords: active linear control, Lyapunov stability theorem, piezoelectricity, smart structure, static deflection
Procedia PDF Downloads 387
757 Cardenolides from the Egyptian Cultivar: Acokanthera spectabilis Leaves Inducing Apoptosis through Arresting Hepatocellular Carcinoma Growth at G2/M
Authors: Maha Soltan, Amal Z. Hassan, Howaida I. Abd-Alla, Atef G. Hanna
Abstract:
Two known natural cardenolides, acovenoside A and acobioside A, were isolated from the leaves of the Egyptian cultivar Acokanthera spectabilis. It is an ornamental and poisonous plant traditionally used in small amounts against infectious microbes, for killing worms, and for treating some inflammations. We examined the growth inhibition effects of both cardenolides against four types of human cancer cell lines using the Sulphorhodamine B assay. In addition, the clonogenic assay was performed to test the growth-inhibiting power of the isolated compounds. An in vitro mechanistic investigation was further carried out on the hepatocellular carcinoma HepG2 cell line. Microscopic examination, colorimetric ELISA, and flow cytometry were our tools for probing at least part of the anticancer pathway of the tested compounds. Both compounds were able to inhibit the growth of the four human cancer cell lines at less than 100 nM. In addition, they activated the executioner caspase-3, and apoptosis was then induced as a consequence of cell growth arrest at G2/M. Attention must be paid to such bioactive agents, particularly as they act against cancer cells at considerably small concentrations while presenting safe therapeutic margins, as indicated by the literature.
Keywords: anticancer, cardenolides, Caspase-3, apoptosis
Procedia PDF Downloads 147
756 Non-Invasive Assessment of Peripheral Arterial Disease: Automated Ankle Brachial Index Measurement and Pulse Volume Analysis Compared to Ultrasound Duplex Scan
Authors: Jane E. A. Lewis, Paul Williams, Jane H. Davies
Abstract:
Introduction: There is at present a clear and recognized need to optimize the diagnosis of peripheral arterial disease (PAD), particularly in non-specialist settings such as primary care, and this arises from several key facts. Firstly, PAD is a highly prevalent condition. In 2010, it was estimated that globally, PAD affected more than 202 million people, and this prevalence is predicted to escalate further. The disease itself, although frequently asymptomatic, can cause considerable patient suffering, with symptoms such as lower limb pain, ulceration, and gangrene which, in worst-case scenarios, can necessitate limb amputation. A further, and perhaps the most significant, consequence of PAD arises from the fact that it is a manifestation of systemic atherosclerosis and therefore is a powerful predictor of coronary heart disease and cerebrovascular disease. Objective: This cross-sectional study aimed to individually and cumulatively compare the sensitivity and specificity of the (i) ankle brachial index (ABI) and (ii) pulse volume waveform (PVW), recorded by the same automated device, with the presence or absence of peripheral arterial disease (PAD) verified by an ultrasound duplex scan (UDS). Methods: Patients (n = 205) referred for lower limb arterial assessment underwent ABI and PVW measurement using volume plethysmography followed by a UDS. PAD was recorded as present for the ABI if < 0.9 (and noted if > 1.30), for the PVW if graded as 2, 3 or 4, and for the UDS if a hemodynamically significant stenosis > 50% was found. The outcome measure was agreement between the measured ABI and the interpretation of the PVW for PAD diagnosis, using UDS as the reference standard. Results: Sensitivity of ABI was 80%, specificity 91%, and overall accuracy 88%. Cohen’s kappa revealed good agreement between ABI and UDS (k = 0.7, p < .001). PVW sensitivity was 97%, specificity 81%, and overall accuracy 84%, with a good level of agreement between PVW and UDS (k = 0.67, p < .001). 
The combined sensitivity of ABI and PVW was 100%, specificity 76%, and overall accuracy 85% (k = 0.67, p < .001). Conclusions: Combining these two diagnostic modalities within one device provided a highly accurate method of ruling out PAD. Such a device could be utilized within the primary care environment to reduce the number of unnecessary referrals to secondary care, with concomitant cost savings, reduced patient inconvenience, and prioritization of urgent PAD cases.
Keywords: ankle brachial index, peripheral arterial disease, pulse volume waveform, ultrasound duplex scan
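The benefit of combining the two modalities can be illustrated with a small sketch. This is not taken from the study; it assumes the simple "either test positive" decision rule and conditional independence of the two tests, neither of which the abstract states explicitly.

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 confusion table."""
    return tp / (tp + fn), tn / (tn + fp)

def either_positive(se_a, sp_a, se_b, sp_b):
    """'Call PAD if either test is positive': sensitivity rises (a case is
    missed only if BOTH tests miss it) while specificity falls (either test
    can raise a false alarm), assuming the tests err independently."""
    se = 1 - (1 - se_a) * (1 - se_b)   # miss only if both miss
    sp = sp_a * sp_b                   # clear only if both are negative
    return se, sp

# Plugging in the abstract's per-test figures (ABI 80%/91%, PVW 97%/81%)
se, sp = either_positive(0.80, 0.91, 0.97, 0.81)
```

Under these assumptions the rule predicts roughly 99.4% sensitivity and 74% specificity, which is close to the reported combined figures (100% / 76%) and shows why the pairing is a strong rule-out test.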
Procedia PDF Downloads 166
755 Pyramid Binary Pattern for Age Invariant Face Verification
Authors: Saroj Bijarnia, Preety Singh
Abstract:
We propose a simple and effective biometric system for face verification across aging, based on a new variant of texture feature, the Pyramid Binary Pattern. This employs the Local Binary Pattern along with its hierarchical information. Dimension reduction of the generated texture feature vector is done using Principal Component Analysis. A Support Vector Machine is used for classification. Our proposed method achieves an accuracy of 92.24% and can be used in an automated age-invariant face verification system.
Keywords: biometrics, age invariant, verification, support vector machine
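The pipeline above (texture features, then PCA, then SVM) builds on the plain Local Binary Pattern. Below is a minimal numpy sketch of the LBP descriptor and the PCA projection; it is an illustration, not the authors' implementation, and the pyramid variant would repeat the histogram at several image scales and concatenate the results.

```python
import numpy as np

def lbp(img):
    """Basic 8-neighbour Local Binary Pattern: each interior pixel becomes
    an 8-bit code, one bit per neighbour (1 if neighbour >= centre)."""
    c = img[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << bit
    return code

def lbp_histogram(img):
    """Normalised 256-bin histogram of LBP codes: the texture feature vector."""
    h = np.bincount(lbp(img).ravel(), minlength=256).astype(float)
    return h / h.sum()

def pca_project(X, k):
    """Project feature vectors (rows of X) onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T
```

In the paper's pipeline, the reduced vectors would then be fed to an SVM classifier for the verification decision.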
Procedia PDF Downloads 353
754 ESRA: An End-to-End System for Re-identification and Anonymization of Swiss Court Decisions
Authors: Joel Niklaus, Matthias Sturmer
Abstract:
The publication of judicial proceedings is a cornerstone of many democracies. It enables the court system to be held accountable by ensuring that justice is administered in accordance with the laws. Equally important is privacy, as a fundamental human right (Article 12 of the Declaration of Human Rights). Therefore, it is important that the parties (especially minors, victims, or witnesses) involved in these court decisions be anonymized securely. Today, the anonymization of court decisions in Switzerland is performed either manually or semi-automatically using primitive software. While much research has been conducted on anonymization for tabular data, the literature on anonymization for unstructured text documents is thin and virtually non-existent for court decisions. In 2019, it was shown that manual anonymization is not secure enough: in 21 of 25 attempted Swiss federal court decisions related to pharmaceutical companies, the pharmaceuticals and legal parties involved could be manually re-identified. This was achieved by linking the decisions with external databases using regular expressions. An automated re-identification system serves as an automated test for the safety of existing anonymizations and thus promotes the right to privacy. Manual anonymization is very expensive (recurring annual costs of over CHF 20M in Switzerland alone, according to one estimation). Consequently, many Swiss courts only publish a fraction of their decisions. An automated anonymization system reduces these costs substantially, leading to more capacity for publishing court decisions much more comprehensively. For the re-identification system, topic modeling with Latent Dirichlet Allocation is used to cluster over 500K Swiss court decisions into meaningful related categories. 
A comprehensive knowledge base with publicly available data (such as social media, newspapers, government documents, geographical information systems, business registers, online address books, obituary portals, web archives, etc.) is constructed to serve as an information hub for re-identifications. For the actual re-identification, a general-purpose language model is fine-tuned on the respective part of the knowledge base for each category of court decisions separately. The input to the model is the court decision to be re-identified, and the output is a probability distribution over named entities constituting possible re-identifications. For the anonymization system, named entity recognition (NER) is used to recognize the tokens that need to be anonymized. Since the focus lies on Swiss court decisions in German, a corpus of Swiss legal texts will be built for training the NER model. The recognized named entities are replaced by the category determined by the NER model and an identifier to preserve context. This work is part of an ongoing research project conducted by an interdisciplinary research consortium. Both a legal analysis and the implementation of the proposed system design ESRA will be performed within the next three years. This study introduces the system design of ESRA, an end-to-end system for re-identification and anonymization of Swiss court decisions. Firstly, the re-identification system tests the safety of existing anonymizations and thus promotes privacy. Secondly, the anonymization system substantially reduces the costs of manual anonymization of court decisions and thus enables a more comprehensive publication practice.
Keywords: artificial intelligence, courts, legal tech, named entity recognition, natural language processing, privacy, topic modeling
Procedia PDF Downloads 148
753 Simultaneous Quantification of Glycols in New and Recycled Anti-Freeze Liquids by GC-MS
Authors: George Madalin Danila, Mihaiella Cretu, Cristian Puscasu
Abstract:
Glycol-based anti-freeze liquids, commonly composed of ethylene glycol or propylene glycol, have important uses in automotive cooling, but they should be handled with care due to their toxicity; ethylene glycol is highly toxic to humans and animals. A fast, accurate, precise, and robust method was developed for the simultaneous quantification of the 7 most important glycols and their isomers. Glycols were analyzed from diluted sample solutions of coolants using gas chromatography coupled with mass spectrometry in single ion monitoring mode. Results: The method was developed and validated for 7 individual glycols (ethylene glycol, diethylene glycol, triethylene glycol, tetraethylene glycol, propylene glycol, dipropylene glycol, and tripropylene glycol). The limits of detection (1-2 μg/mL) and limit of quantification (10 μg/mL) obtained were appropriate. The present method was applied to the determination of glycols in 10 different anti-freeze liquids commercially available on the Romanian market, proving to be reliable. A method that requires only a two-step dilution of anti-freeze samples combined with direct liquid injection GC-MS was validated for the simultaneous quantification of 7 glycols (and their isomers) in 10 different types of anti-freeze liquids. The results obtained in the validation procedure proved that the GC-MS method is sensitive and precise for the quantification of glycols.
Keywords: glycols, anti-freeze, gas chromatography, mass spectrometry, validation, recycle
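Quantification by GC-MS in SIM mode is typically done against an external calibration curve. The abstract does not give the calibration details, so the following is a generic least-squares sketch with made-up standard levels, not the study's procedure.

```python
import numpy as np

def fit_calibration(conc, response):
    """Least-squares linear calibration curve: response = m * conc + b.
    (Calibration levels and fit details here are illustrative only.)"""
    m, b = np.polyfit(conc, response, 1)
    return m, b

def quantify(response, m, b):
    """Back-calculate the concentration of an unknown from its peak response."""
    return (response - b) / m

# Illustrative synthetic standards (ug/mL) with a linear detector response
conc = np.array([10.0, 50.0, 100.0, 200.0])
resp = 2.0 * conc + 1.0
m, b = fit_calibration(conc, resp)
```

An unknown's peak response is then mapped back through the fitted line; values below the 10 μg/mL limit of quantification mentioned above would be reported as below LOQ rather than as a number.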
Procedia PDF Downloads 66
752 Open Forging of Cylindrical Blanks Subjected to Lateral Instability
Authors: A. H. Elkholy, D. M. Almutairi
Abstract:
The successful and efficient execution of a forging process depends upon the correct analysis of loading and metal flow of blanks. This paper investigates the Upper Bound Technique (UBT) and its application in the analysis of the open forging process when a possibility of blank bulging exists. The UBT is one of the energy-rate minimization methods for the solution of metal forming processes, based on the upper bound theorem. In this regard, the kinematically admissible velocity field is obtained by minimizing the total forging energy rate. A computer program was developed in this research to implement the UBT. The significant advantages of this method are its speed of execution, while maintaining a fairly high degree of accuracy, and its wide prediction capability. The information from this analysis is useful for the design of forging processes and dies. Results for the prediction of forging loads and stresses, metal flow, and surface profiles, with assured benefits in terms of press selection and blank preform design, are outlined in some detail. The obtained predictions are ready for comparison with both laboratory and industrial results.
Keywords: forging, upper bound technique, metal forming, forging energy, forging die/platen
Procedia PDF Downloads 293
751 Laban Movement Analysis Using Kinect
Authors: Bernstein Ran, Shafir Tal, Tsachor Rachelle, Studd Karen, Schuster Assaf
Abstract:
Laban Movement Analysis (LMA), developed in the dance community over the past seventy years, is an effective method for observing, describing, notating, and interpreting human movement to enhance communication and expression in everyday and professional life. Many applications that use motion capture data might be significantly leveraged if Laban qualities could be recognized automatically. This paper presents an automated method for recognizing Laban qualities from motion capture skeletal recordings, demonstrated on the output of Microsoft’s Kinect V2 sensor.
Keywords: Laban movement analysis, multitask learning, Kinect sensor, machine learning
Procedia PDF Downloads 342
750 Transforming Breast Density Measurement with Artificial Intelligence: Population-Level Insights from BreastScreen NSW
Authors: Douglas Dunn, Richard Walton, Matthew Warner-Smith, Chirag Mistry, Kan Ren, David Roder
Abstract:
Introduction: Breast density is a risk factor for breast cancer, both because increased fibroglandular tissue can harbor malignancy and because it masks lesions on mammography. Therefore, evaluation of breast density measurement is useful for risk stratification at an individual and population level. This study investigates the performance of Lunit INSIGHT MMG for automated breast density measurement. We analyze the reliability of Lunit compared to breast radiologists, explore density variations across the BreastScreen NSW population, and examine the impact of breast implants on density measurements. Methods: 15,518 mammograms were utilized for a comparative analysis of intra- and inter-reader reliability between Lunit INSIGHT MMG and breast radiologists. Subsequently, Lunit was used to evaluate 624,113 mammograms for the investigation of density variations according to age and country of birth, providing insights into diverse population subgroups. Finally, we compared breast density in 4,047 clients with implants to clients without implants, controlling for age and country of birth. Results: The weighted kappa coefficient for inter-reader agreement between Lunit and breast radiologists was 0.72 (95%CI 0.71-0.73). The highest breast densities were seen in women with a North-East Asian background, whilst those of Aboriginal background had the lowest density. Across all backgrounds, density was shown to decrease with age, though at different rates according to country of birth. Clients with implants had higher density relative to the age-matched no-implant strata. Conclusion: Lunit INSIGHT MMG demonstrates reasonable inter- and intra-observer reliability for automated breast density measurement. The scale of this study is significantly larger than any previous study assessing breast density, owing to the ability to process large volumes of data using AI. As a result, it provides valuable insights into population-level density variations. 
Our findings highlight the influence of age, birth country, and breast implants on density, emphasizing the need for personalized risk assessment and screening approaches. The large-scale and diverse nature of this study enhances the generalisability of our results, offering valuable information for breast cancer screening programs internationally.
Keywords: breast cancer, screening, breast density, artificial intelligence, mammography
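The agreement statistic reported above can be reproduced with a short sketch of Cohen's weighted kappa. The linear weighting scheme and the coding of density as four ordinal categories (e.g. the BI-RADS classes a-d, coded 0..3) are assumptions for illustration; the abstract does not state them.

```python
import numpy as np

def weighted_kappa(r1, r2, k, weights="linear"):
    """Cohen's weighted kappa for two raters over k ordinal categories.
    1.0 means perfect agreement; 0.0 means chance-level agreement."""
    O = np.zeros((k, k))
    for a, b in zip(r1, r2):
        O[a, b] += 1
    O /= O.sum()                                  # observed joint proportions
    E = np.outer(O.sum(axis=1), O.sum(axis=0))    # expected under independence
    i, j = np.indices((k, k))
    W = np.abs(i - j) / (k - 1)                   # linear disagreement weights
    if weights == "quadratic":
        W = W ** 2
    return 1.0 - (W * O).sum() / (W * E).sum()
```

With weights, near-miss disagreements (adjacent density categories) are penalized less than two-category jumps, which suits an ordinal scale like breast density.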
Procedia PDF Downloads 4
749 Money Laundering Risk Assessment in the Banking Institutions: An Experimental Approach
Authors: Yusarina Mat-Isa, Zuraidah Mohd-Sanusi, Mohd-Nizal Haniff, Paul A. Barnes
Abstract:
In view of the fact that money laundering has become a prominent concern for banking institutions, it is an obligation for banking institutions to adopt a risk-based approach as an integral component of accepted anti-money laundering policies. In doing so, those involved with banking operations are the most critical group of personnel, as these are the people who deal with the day-to-day operations of the banking institutions and are obligated to form a judgement on the level of impending risk. This requirement extends to all relevant banking institution staff, such as tellers and customer account representatives, who must identify suspicious customers and escalate them to the relevant authorities. Banking institution staff, however, face enormous challenges in identifying and distinguishing money launderers from other legitimate customers seeking genuine banking transactions. Banking institution staff are mostly educated and trained with the business objective of serving customers in mind and are not trained to be “detectives with a detective’s power of observation”. Despite increasing awareness as well as trainings conducted for banking institution staff, their competency in assessing money laundering risk is still insufficient. Several gaps have prompted this study, including the lack of behavioural perspectives in the assessment of money laundering risk in banking institutions. Utilizing an experimental approach, respondents are randomly assigned within a controlled setting to manipulated situations, upon which the judgement of the respondents is solicited based on various observations related to the situations. The study suggests that it is imperative that informed judgement is exercised in arriving at the decision to proceed with the banking services required by the customers. Judgement forms a basis of opinion for banking institution staff to decide if a customer poses a money laundering risk. 
Failure to exercise good judgement could result in losses and the absorption of unnecessary risk into the banking institution. Although banking institutions have a choice of automated solutions for assessing money laundering risk, the human factor in assessing the risk is indispensable. Individual staff in the banking institutions are the first line of defence, responsible for screening the impending risk of any customer soliciting banking services. Ultimately, automated solutions are not a substitute for individual involvement in money laundering risk assessment, as human judgement is inimitable.
Keywords: banking institutions, experimental approach, money laundering, risk assessment
Procedia PDF Downloads 267
748 Automated Localization of Palpebral Conjunctiva and Hemoglobin Determination Using Smart Phone Camera
Authors: Faraz Tahir, M. Usman Akram, Albab Ahmad Khan, Mujahid Abbass, Ahmad Tariq, Nuzhat Qaiser
Abstract:
The objective of this study was to evaluate the degree of anemia from a picture of the palpebral conjunctiva taken with a smartphone camera. We first localized the region of interest in the image, extracted certain features from that region of interest, and trained an SVM classifier on those features; as a result, our system classifies the image in real time according to its hemoglobin level. The proposed system achieved an accuracy of 70%. We trained our classifier on a locally gathered dataset of 30 patients.
Keywords: anemia, palpebral conjunctiva, SVM, smartphone
Procedia PDF Downloads 506
747 Enhancing Patch Time Series Transformer with Wavelet Transform for Improved Stock Prediction
Authors: Cheng-yu Hsieh, Bo Zhang, Ahmed Hambaba
Abstract:
Stock market prediction has long been an area of interest for both expert analysts and investors, driven by its complexity and the noisy, volatile conditions it operates under. This research examines the efficacy of combining the Patch Time Series Transformer (PatchTST) with wavelet transforms, specifically focusing on Haar and Daubechies wavelets, in forecasting the adjusted closing price of the S&P 500 index for the following day. By comparing the performance of the augmented PatchTST models with traditional predictive models such as Recurrent Neural Networks (RNNs), Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and Transformers, this study highlights significant enhancements in prediction accuracy. The integration of the Daubechies wavelet with PatchTST notably excels, surpassing other configurations and conventional models in terms of Mean Absolute Error (MAE) and Mean Squared Error (MSE). The success of the PatchTST model paired with Daubechies wavelet is attributed to its superior capability in extracting detailed signal information and eliminating irrelevant noise, thus proving to be an effective approach for financial time series forecasting.
Keywords: deep learning, financial forecasting, stock market prediction, patch time series transformer, wavelet transform
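The wavelet preprocessing step can be sketched independently of the PatchTST model itself. A single level of the Haar transform splits a price series into a smooth trend (approximation) and a high-frequency detail signal; this is a minimal illustration under that assumption, not the authors' pipeline, which also uses Daubechies wavelets.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform (even-length input):
    pairwise averages give the trend, pairwise differences give the detail."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation (smooth trend)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail (high-frequency noise)
    return a, d

def haar_idwt(a, d):
    """Inverse transform: exact reconstruction of the original series."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x
```

A denoising variant would shrink or zero the detail coefficients before reconstruction, so the forecaster sees a smoother series; the transform is exactly invertible, so nothing is lost when the details are kept.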
Procedia PDF Downloads 50
746 Treating Voxels as Words: Word-to-Vector Methods for fMRI Meta-Analyses
Authors: Matthew Baucum
Abstract:
With the increasing popularity of fMRI as an experimental method, psychology and neuroscience can greatly benefit from advanced techniques for summarizing and synthesizing large amounts of data from brain imaging studies. One promising avenue is automated meta-analyses, in which natural language processing methods are used to identify the brain regions consistently associated with certain semantic concepts (e.g. “social”, “reward”) across large corpora of studies. This study builds on this approach by demonstrating how, in fMRI meta-analyses, individual voxels can be treated as vectors in a semantic space and evaluated for their “proximity” to terms of interest. In this technique, a low-dimensional semantic space is built from brain imaging study texts, allowing words in each text to be represented as vectors (where words that frequently appear together are near each other in the semantic space). Consequently, each voxel in a brain mask can be represented as a normalized vector sum of all of the words in the studies that showed activation in that voxel. The entire brain mask can then be visualized in terms of each voxel’s proximity to a given term of interest (e.g., “vision”, “decision making”) or collection of terms (e.g., “theory of mind”, “social”, “agent”), as measured by the cosine similarity between the voxel’s vector and the term vector (or the average of multiple term vectors). Analysis can also proceed in the opposite direction, allowing word cloud visualizations of the nearest semantic neighbors for a given brain region. This approach allows for continuous, fine-grained metrics of voxel-term associations, and relies on state-of-the-art “open vocabulary” methods that go beyond mere word-counts. 
An analysis of over 11,000 neuroimaging studies from an existing meta-analytic fMRI database demonstrates that this technique can be used to recover known neural bases for multiple psychological functions, suggesting this method’s utility for efficient, high-level meta-analyses of localized brain function. While automated text analytic methods are no replacement for deliberate, manual meta-analyses, they seem to show promise for the efficient aggregation of large bodies of scientific knowledge, at least on a relatively general level.
Keywords: FMRI, machine learning, meta-analysis, text analysis
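The voxel-as-vector construction described above can be sketched directly. The toy 3-d embeddings below are invented for illustration; in the study, the semantic space is learned from the corpus of neuroimaging texts.

```python
import numpy as np

def voxel_vector(word_vecs, activating_words):
    """A voxel's embedding: the normalised vector sum of every word in the
    studies that reported activation at that voxel."""
    v = np.sum([word_vecs[w] for w in activating_words], axis=0)
    return v / np.linalg.norm(v)

def cosine(a, b):
    """Cosine similarity between a voxel vector and a term (or mean-term) vector."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 3-d 'semantic space' (illustrative only)
word_vecs = {"reward": np.array([1.0, 0.2, 0.0]),
             "social": np.array([0.0, 1.0, 0.1]),
             "vision": np.array([0.0, 0.1, 1.0])}

# A voxel whose studies mention 'reward' twice and 'social' once
v = voxel_vector(word_vecs, ["reward", "reward", "social"])
```

A collection of terms (e.g. a "theory of mind" concept) would simply be scored against the average of its term vectors, exactly as the abstract describes.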
Procedia PDF Downloads 449
745 Theory and Practice of Wavelets in Signal Processing
Authors: Jalal Karam
Abstract:
The methods of the Fourier, Laplace, and Wavelet Transforms provide transfer functions and relationships between the input and output signals in linear time-invariant systems. This paper shows the equivalence among these three methods, in each case presenting an application of the appropriate transform (Fourier, Laplace, or Wavelet) to the convolution theorem. In addition, it is shown that the same holds for a direct integration method. The biorthogonal wavelets Bior3.5 and Bior3.9 are examined, and the zero distributions of their associated filter polynomials are located. This paper also presents the significance of utilizing wavelets as effective tools in processing speech signals for common multimedia applications in general, and for recognition and compression in particular. Theoretically and practically, wavelets have proved to be effective and competitive. The practical use of the Continuous Wavelet Transform (CWT) in the processing and analysis of speech is then presented, along with explanations of how the human ear can be thought of as a natural wavelet transformer of speech. This generates a variety of approaches for applying the CWT to many paradigms for analysing speech, sound, and music. For perception, the flexibility of implementation of this transform allows the construction of numerous scales, and we include two of them. Results for speech recognition and speech compression are then included.
Keywords: continuous wavelet transform, biorthogonal wavelets, speech perception, recognition and compression
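The convolution theorem the abstract applies can be demonstrated numerically: transform both signals, multiply pointwise, and invert. This is a discrete FFT illustration of the theorem, not the paper's continuous-transform derivations.

```python
import numpy as np

def fft_convolve(x, h):
    """Linear convolution via the convolution theorem: pointwise
    multiplication in the frequency domain, zero-padded to full length
    so circular wrap-around does not corrupt the result."""
    n = len(x) + len(h) - 1
    return np.real(np.fft.ifft(np.fft.fft(x, n) * np.fft.fft(h, n)))

x = np.array([1.0, 2.0, 3.0, 4.0])
h = np.array([0.5, 0.5])            # simple two-tap moving-average filter
y = fft_convolve(x, h)
```

The result matches direct time-domain convolution, which is the discrete statement of the equivalence the paper establishes for the Fourier, Laplace, and Wavelet settings.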
Procedia PDF Downloads 416