Search results for: evidence based nursing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30636

23946 Local Differential Privacy-Based Data-Sharing Scheme for Smart Utilities

Authors: Veniamin Boiarkin, Bruno Bogaz Zarpelão, Muttukrishnan Rajarajan

Abstract:

The manufacturing sector is a vital component of most economies, which makes it a frequent target of cyberattacks on organisations, and disruption of operations may lead to significant economic consequences. Adversaries aim to disrupt the production processes of manufacturing companies, gain financial advantages, and steal intellectual property by getting unauthorised access to sensitive data. Access to sensitive data helps organisations to enhance their production and management processes. However, the majority of existing data-sharing mechanisms are either susceptible to different cyber attacks or impose a heavy computational overhead. In this paper, a privacy-preserving data-sharing scheme for smart utilities is proposed. First, a customer privacy adjustment mechanism is proposed to make sure that end-users have control over their privacy, as required by the latest government regulations, such as the General Data Protection Regulation. Secondly, a local differential privacy-based mechanism is proposed to ensure the privacy of end-users by hiding real data according to the end-user preferences. The proposed scheme may be applied to different industrial control systems; in this study, it is validated for energy utility use cases consisting of smart, intelligent devices. The results show that the proposed scheme can guarantee the required level of privacy at the cost of an expected relative error in utility.
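
To make the mechanism concrete, the sketch below shows how a single local differential privacy perturbation step might look for a smart-meter reading, using the Laplace mechanism with a user-chosen epsilon. It is an illustrative sketch only; the parameter names and value ranges are assumptions, not the scheme actually proposed in the paper.

```python
import numpy as np

def ldp_perturb(reading, epsilon, value_range=10.0):
    """Perturb one smart-meter reading with the Laplace mechanism.

    value_range plays the role of the sensitivity; epsilon is the privacy
    budget chosen by the end-user in the privacy adjustment step, so a
    smaller epsilon hides the real reading more aggressively.
    """
    noise = np.random.laplace(loc=0.0, scale=value_range / epsilon)
    return reading + noise

# A utility aggregating many perturbed readings still recovers the mean with
# a bounded relative error, which is the utility/privacy trade-off above.
rng = np.random.default_rng(0)
true_readings = rng.uniform(2.0, 6.0, size=10_000)       # kWh per interval (hypothetical)
reported = np.array([ldp_perturb(r, epsilon=1.0) for r in true_readings])
rel_error = abs(reported.mean() - true_readings.mean()) / true_readings.mean()
print(f"relative error of the aggregate: {rel_error:.2%}")
```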

Keywords: data-sharing, local differential privacy, manufacturing, privacy-preserving mechanism, smart utility

Procedia PDF Downloads 70
23945 A Static Android Malware Detection Based on Actual Used Permissions Combination and API Calls

Authors: Xiaoqing Wang, Junfeng Wang, Xiaolan Zhu

Abstract:

The Android operating system has been widely adopted by application developers because of its openness and broad compatibility, which has greatly enriched the range of available applications. However, it has become a target for malware authors due to the lack of strict security supervision mechanisms, which has led to the rapid growth of malware and brings serious safety hazards to users. Therefore, it is critical to detect Android malware effectively. Generally, the permissions declared in the AndroidManifest.xml reflect the function and behavior of an application to a large extent. Since the current Android system places no restriction on the number of permissions an application can request, developers tend to request more permissions than actually needed in order to ensure the successful running of the application, which results in the abuse of permissions. However, some traditional detection methods only consider the requested permissions and ignore whether they are actually used, which leads to incorrect identification of some malware. Therefore, a machine learning detection method based on the combination of actually used permissions and API calls is put forward in this paper. Several experiments were conducted to evaluate the methodology. The results show that it can detect unknown malware effectively with a higher true positive rate and accuracy while maintaining a low false positive rate. The AdaboostM1 (J48) classification algorithm combined with an information gain feature selection algorithm gave the best detection result, achieving an accuracy of 99.8%, a true positive rate of 99.6% and a false positive rate of 0.
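
As a rough illustration of this kind of pipeline, the sketch below trains a boosted decision-tree classifier on binary permission/API-call features with mutual-information (information-gain style) feature selection. The feature matrix and labels are synthetic, and scikit-learn's AdaBoost over decision stumps is used as a stand-in for AdaBoostM1 (J48), so this is not the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split

# Hypothetical binary feature matrix: each column marks one "actually used
# permission" or "sensitive API call" per app; label 1 means malware.
rng = np.random.default_rng(42)
X = rng.integers(0, 2, size=(1000, 50))
y = ((X[:, 0] & X[:, 3]) | X[:, 7]).astype(int)        # toy ground-truth rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Information-gain style feature selection (mutual information), then boosted
# decision stumps as a stand-in for AdaBoostM1 with J48 base learners.
selector = SelectKBest(mutual_info_classif, k=20).fit(X_tr, y_tr)
clf = AdaBoostClassifier(n_estimators=100, random_state=0)
clf.fit(selector.transform(X_tr), y_tr)

pred = clf.predict(selector.transform(X_te))
tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
print("accuracy:", accuracy_score(y_te, pred),
      "TPR:", round(tp / (tp + fn), 3), "FPR:", round(fp / (fp + tn), 3))
```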

Keywords: android, API Calls, machine learning, permissions combination

Procedia PDF Downloads 323
23944 Targeting the EphA2 Receptor Tyrosine Kinases in Melanoma Cancer, both in Humans and Dogs

Authors: Shabnam Abdi, Behzad Toosi

Abstract:

Background: Melanoma is the most lethal type of malignant skin cancer in humans and dogs since it spreads rapidly throughout the body. Despite significant advances in treatment, cancer at an advanced stage has a poor prognosis. Hence, more effective treatments with fewer side effects are needed to enhance outcomes. Erythropoietin-producing hepatocellular receptors are the largest family of receptor tyrosine kinases and are divided into two subfamilies, EphA and EphB, both of which play a significant role in disease, especially cancer. Due to their association with proliferation and invasion in many aggressive types of cancer, Eph receptor tyrosine kinases (Eph RTKs) are promising targets for cancer therapy. Because these receptors have not been studied in canine melanoma, we investigated how EphA2 influences the survival and tumorigenicity of melanoma cells. Methods: Expression of EphA2 protein in canine melanoma cell lines and a human melanoma cell line was evaluated by Western blot. Melanoma cells were transduced with lentiviral particles encoding Eph-targeting shRNAs or non-silencing shRNAs (control) to silence the expression of the EphA2 receptor, and silencing was confirmed by Western blotting and immunofluorescence. The effects of the treatment on cellular proliferation and invasion were analyzed by the Resazurin assay and the Matrigel invasion assay, respectively, together with colony formation and tumorsphere assays. Results: Expression of EphA2 was detected in canine and human melanoma cell lines. Moreover, stably silencing EphA2 with specific shRNAs significantly and consistently decreased the expression of EphA2 protein in both human and canine melanoma cells. Proliferation, colony formation, tumorsphere formation and invasion of melanoma cells were significantly decreased in EphA2-silenced cells compared to controls. Conclusion: Our data provide the first functional evidence that the EphA2 receptor plays a critical role in the malignant cellular behavior of melanoma in both humans and dogs.

Keywords: ephA2, targeting, melanoma, human, canine

Procedia PDF Downloads 54
23943 Synthesis and Characterisation of Bio-Based Acetals Derived from Eucalyptus Oil

Authors: Kirstin Burger, Paul Watts, Nicole Vorster

Abstract:

Green chemistry focuses on synthesis routes that have a low negative impact on the environment. This research focuses on synthesizing novel compounds from all-natural Eucalyptus citriodora oil. Eight novel plasticizer compounds are synthesized and optimized using flow chemistry technology. A precursor to one novel compound can be synthesized from the lauric acid present in coconut oil. Key parameters, such as catalyst screening and loading, reaction time, temperature, and residence time using flow chemistry techniques, are investigated. The compounds are characterised using GC-MS, FT-IR, 1H and 13C NMR, and X-ray crystallography. The efficiency of the compounds is compared to two commercial plasticizers, i.e. dibutyl phthalate and Eastman 168. Several PVC-plasticized film formulations are produced using the bio-based novel compounds. Tensile strength, stress at fracture and percentage elongation are tested. The effect of increasing the plasticizer percentage in the film formulations is investigated at 3, 6, 9 and 12%. The diastereoisomers of each compound are separated and formulated into PVC films, and differences in tensile strength are measured. Leaching tests, flexibility, and changes in glass transition temperature for the PVC-plasticized films are recorded. A research objective is to use these novel compounds as a green bio-plasticizer alternative in plastic products for infants. The inhibitory effect of the compounds on six pathogens affecting infants is studied, namely Escherichia coli, Staphylococcus aureus, Shigella sonnei, Pseudomonas putida, Salmonella choleraesuis and Klebsiella oxytoca.

Keywords: bio-based compounds, plasticizer, tensile strength, microbiological inhibition, synthesis

Procedia PDF Downloads 178
23942 Engineering of Reagentless Fluorescence Biosensors Based on Single-Chain Antibody Fragments

Authors: Christian Fercher, Jiaul Islam, Simon R. Corrie

Abstract:

Fluorescence-based immunodiagnostics are an emerging field in biosensor development and exhibit several advantages over traditional detection methods. While various affinity biosensors have been developed to generate a fluorescence signal upon sensing varying concentrations of analytes, reagentless, reversible, and continuous monitoring of complex biological samples remains challenging. Here, we aimed to genetically engineer biosensors based on single-chain antibody fragments (scFv) that are site-specifically labeled with environmentally sensitive fluorescent unnatural amino acids (UAA). A rational design approach resulted in quantifiable analyte-dependent changes in peak fluorescence emission wavelength and enabled antigen detection in vitro. Incorporation of a polarity indicator within the topological neighborhood of the antigen-binding interface generated a titratable wavelength blueshift with nanomolar detection limits. In order to ensure continuous analyte monitoring, scFv candidates with fast binding and dissociation kinetics were selected from a genetic library employing a high-throughput phage display and affinity screening approach. Initial rankings were further refined towards rapid dissociation kinetics using bio-layer interferometry (BLI) and surface plasmon resonance (SPR). The most promising candidates were expressed, purified to homogeneity, and tested for their potential to detect biomarkers in a continuous microfluidic-based assay. Variations of dissociation kinetics within an order of magnitude were achieved without compromising the specificity of the antibody fragments. This approach is generally applicable to numerous antibody/antigen combinations and currently awaits integration in a wide range of assay platforms for one-step protein quantification.

Keywords: antibody engineering, biosensor, phage display, unnatural amino acids

Procedia PDF Downloads 140
23941 Assessment of Students' Skills in Error Detection in SQL Classes Using a Rubric Framework - An Empirical Study

Authors: Dirson Santos De Campos, Deller James Ferreira, Anderson Cavalcante Gonçalves, Uyara Ferreira Silva

Abstract:

Rubrics in learning research provide evaluation criteria and expected performance standards linked to defined student activities and to learning and pedagogical objectives. Despite rubrics being used in education at all levels, academic literature on rubrics as a tool to support research in SQL education is quite rare. A large class of SQL queries is syntactically correct, but certainly not all of them are semantically correct. Detecting and correcting errors is a recurring problem in SQL education. In this paper, we use the Rubric Abstract Framework (RAF), which consists of steps that allow us to map the information needed to measure student performance, guided by didactic objectives defined by the teacher, as long as the domain modeling is contextualized by the rubric. An empirical study was conducted that demonstrates how rubrics can mitigate student difficulties in finding logical errors and ease the teacher's workload in SQL education. Detecting and correcting logical errors is an important skill for students, and researchers have proposed several ways to improve SQL education because skills in this paradigm are crucial in software engineering and computer science. The RAF was instantiated in an empirical study conducted during the COVID-19 pandemic in a database course. The pandemic shifted education from face-to-face to remote delivery, without in-person classes. The lab activities were conducted remotely, which hindered the teaching-learning process and, in particular for this research, the verification of evidence or statements of knowledge, skills, and abilities (KSAs) of students. A wide range of work in academia and industry involves databases. The innovation proposed in this paper is the approach in which rubrics are used to map logical errors in query formulation, with the gains obtained by students verified empirically. The research approach can be used in the post-pandemic period in both classroom and distance learning.
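
As a concrete illustration of the distinction between syntactic and logical (semantic) correctness that the rubric targets, the sketch below runs two syntactically valid queries against a toy schema; the schema, data, and question are hypothetical and not taken from the study.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE enrolment (student_id INTEGER, course TEXT);
    INSERT INTO student VALUES (1, 'Ana'), (2, 'Bruno'), (3, 'Clara');
    INSERT INTO enrolment VALUES (1, 'Databases'), (2, 'Algorithms');
""")

# Intended question: "list students NOT enrolled in any course".
# Syntactically correct but logically wrong: the inner join silently drops
# students with no enrolment, so the filter never sees them.
wrong = conn.execute("""
    SELECT s.name FROM student s JOIN enrolment e ON e.student_id = s.id
    WHERE e.course IS NULL
""").fetchall()

# Logically correct version: an outer join (or NOT EXISTS) keeps them.
right = conn.execute("""
    SELECT s.name FROM student s LEFT JOIN enrolment e ON e.student_id = s.id
    WHERE e.course IS NULL
""").fetchall()

print("wrong:", wrong)   # []          -> misses Clara
print("right:", right)   # [('Clara',)]
```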

Keywords: rubric, logical error, structured query language (SQL), empirical study, SQL education

Procedia PDF Downloads 183
23940 Characterising Rates of Renal Dysfunction and Sarcoidosis in Patients with Elevated Serum Angiotensin-Converting Enzyme

Authors: Fergal Fouhy, Alan O’Keeffe, Sean Costelloe, Michael Clarkson

Abstract:

Background: Sarcoidosis is a systemic, non-infectious disease of unknown aetiology, characterized by non-caseating granulomatous inflammation. The lung is most often affected (90%); however, the condition can affect all organs, including the kidneys. There is limited evidence describing the incidence and characteristics of renal involvement in sarcoidosis. Serum angiotensin-converting enzyme (ACE) is a recognised biomarker used in the diagnosis and monitoring of sarcoidosis. Methods: A single-centre, retrospective cohort study of patients presenting to Cork University Hospital (CUH) in 2015 with first-time elevations of serum ACE was performed. This included an initial database review of ACE and other biochemistry results, followed by a medical chart review to confirm the presence or absence of sarcoidosis and its management. Acute kidney injury (AKI) was staged using the AKIN criteria, and chronic kidney disease (CKD) was staged using the KDIGO criteria. Follow-up was assessed over five years, tracking serum creatinine, serum calcium, and estimated glomerular filtration rates (eGFR). Results: 119 patients were identified as having a first raised serum ACE in 2015: seventy-nine male patients and forty female patients. The mean age of the patients was 47 years. 11% had CKD at baseline. 18% developed an AKI at least once within the next five years. A further 6% developed CKD during this time period. 13% developed hypercalcemia. The patients within the lowest quartile of serum ACE had an incidence of sarcoidosis of 5%. None of this group developed hypercalcemia, 23% developed AKI, and 7% developed CKD. Of the patients with a serum ACE in the highest quartile, almost all had documented diagnoses of sarcoidosis, with an incidence of 96%. 3% of this group developed hypercalcemia, 13% developed AKI and 3% developed CKD. Conclusions: There was an unexpectedly high incidence of AKI in patients who had a raised serum ACE. Not all patients with a raised serum ACE had a confirmed diagnosis of sarcoidosis. There does not appear to be a relationship between increased serum ACE levels and increased incidence of hypercalcaemia, AKI, and CKD. Ideally, all patients should have biopsy-proven sarcoidosis. This is an initial study that should be replicated with larger numbers and across multiple centres.

Keywords: sarcoidosis, acute kidney injury, chronic kidney disease, hypercalcemia

Procedia PDF Downloads 94
23939 Analysis of Citation Rate and Data Reuse for Openly Accessible Biodiversity Datasets on Global Biodiversity Information Facility

Authors: Nushrat Khan, Mike Thelwall, Kayvan Kousha

Abstract:

Making research data openly accessible has been mandated by most funders over the last 5 years, as it promotes reproducibility in science and reduces duplication of effort to collect the same data. There is evidence that articles that publicly share research data have higher citation rates in the biological and social sciences. However, how and whether shared data is being reused is not always evident, as such information is not easily accessible from the majority of research data repositories. This study aims to understand the practice of data citation and how data is being reused over the years, focusing on biodiversity, since research data is frequently reused in this field. Metadata for 38,878 datasets, including citation counts, were collected through the Global Biodiversity Information Facility (GBIF) API for this purpose. GBIF was used as a data source since it provides citation counts for datasets, a feature not commonly available in most repositories. Analysis of dataset types, citation counts, and creation and update times of datasets suggests that citation rates vary across different types of datasets: occurrence datasets, which contain more granular information, have higher citation rates than checklist and metadata-only datasets. Another finding is that biodiversity datasets on GBIF are frequently updated, which is unique to this field. The majority of the datasets from the earliest year, 2007, were updated after 11 years, and no dataset remained un-updated since creation. For each year between 2007 and 2017, we compared the correlations between update time and citation rate for four different types of datasets. While recent datasets do not show any correlation, 3- to 4-year-old datasets show a weak correlation, with more recently updated datasets receiving higher citations. The results suggest that it takes several years to accumulate citations for research datasets. However, this investigation found that when the same datasets are searched on Google Scholar or Scopus, the number of citations is often not the same as on GBIF. Hence, a future aim is to further explore the citation count system adopted by GBIF to evaluate its reliability and whether it can be applied to other fields of study as well.
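
For readers who want to reproduce the harvesting step, the sketch below pages through the public GBIF registry API and keeps a few metadata fields per dataset. The endpoint path and field names (results, endOfRecords, key, type, created, modified) are assumptions based on the public v1 API; the study's actual collection code and its citation-count fields are not reproduced here.

```python
import requests

BASE = "https://api.gbif.org/v1"

def fetch_dataset_pages(limit=50, max_records=200):
    """Page through the GBIF dataset registry (field names are assumptions)."""
    offset, records = 0, []
    while offset < max_records:
        resp = requests.get(f"{BASE}/dataset", params={"limit": limit, "offset": offset})
        resp.raise_for_status()
        page = resp.json()
        for d in page.get("results", []):
            records.append({
                "key": d.get("key"),
                "type": d.get("type"),          # e.g. OCCURRENCE / CHECKLIST / METADATA
                "created": d.get("created"),
                "modified": d.get("modified"),
            })
        if page.get("endOfRecords"):
            break
        offset += limit
    return records

datasets = fetch_dataset_pages()
print(len(datasets), "datasets;",
      sum(d["type"] == "OCCURRENCE" for d in datasets), "occurrence datasets")
```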

Keywords: data citation, data reuse, research data sharing, webometrics

Procedia PDF Downloads 173
23938 LuMee: A Centralized Smart Protector for School Children who are Using Online Education

Authors: Lumindu Dilumka, Ranaweera I. D., Sudusinghe S. P., Sanduni Kanchana A. M. K.

Abstract:

This study was motivated by the challenges experienced by parents and guardians in ensuring the safety of children in cyberspace. In the last two or three years, online education has become very popular all over the world due to the COVID-19 pandemic. Therefore, parents, guardians and teachers must ensure the safety of children in cyberspace. Children are more likely to go astray, as plenty of online programs are waiting to lead them down the wrong track, and children who are engaging in online education can be distracted at any moment. Therefore, parents should keep a close check on their children's online activity. Apart from that, due to their lack of awareness, children are tempted to share sensitive information, which puts them at risk of phishing attacks from outsiders. These problems can be overcome through the proposed web-based system. We use feature extraction, web tracking and analysis mechanisms, image processing and named entity recognition to implement this web-based system.

Keywords: online education, cyber bullying, social media, face recognition, web tracker, privacy data

Procedia PDF Downloads 81
23937 Water Self Sufficient: Creating a Sustainable Water System Based on Urban Harvest Approach in La Serena, Chile

Authors: Zulfikar Dinar Wahidayat Putra

Abstract:

Water scarcity has become a major challenge in arid areas. One such arid area is the city of La Serena in northern Chile, which is the case study of this paper. This paper tries to identify a sustainable water system by using the urban harvest approach as a method to achieve water self-sufficiency for a neighborhood area in La Serena. By using this method, it is possible to create a sustainable water system in the neighborhood that reduces water demand by up to 38% and wastewater production by 94%, even though water self-sufficiency cannot be fully achieved because of the dependency on the drinking water supply from La Serena's water treatment plant.

Keywords: arid area, sustainable water system, urban harvest approach, self-sufficiency

Procedia PDF Downloads 258
23936 Determinants of Carbon-Certified Small-Scale Agroforestry Adoption in Rural Mount Kenya

Authors: Emmanuel Benjamin, Matthias Blum

Abstract:

Purpose – We address smallholder farmers' restricted possibilities to adopt sustainable technologies that have direct and indirect benefits. Smallholders often have little asset endowment due to small farm size and insecure property rights, and therefore experience constraints in adopting agricultural innovation. A program involving payments for ecosystem services (PES) benefits poor smallholder farmers in developing countries in many ways and has been suggested as a means of easing smallholder farmers' financial constraints. PES may also provide an additional mainstay, which can eventually result in more favorable credit contract terms due to the availability of a collateral substitute. The results of this study may help to understand the barriers, motives and incentives for smallholders' participation in PES and help in designing a strategy to foster participation in beneficial programs. Design/methodology/approach – This paper uses a random utility model and a logistic regression approach to investigate factors that influence agroforestry adoption. We investigate non-monetary factors, such as information spillover, that influence the decision to adopt such conservation strategies. We collected original data from non-government-run agroforestry mitigation programs with PES that have been implemented in the Mount Kenya region. Preliminary Findings – We find that the spread of information, existing networks and peer involvement in such programs drive participation. Conversely, participation by smallholders does not seem to be influenced by education, land or asset endowment. Contrary to some existing literature, we found only weak evidence for a positive correlation between the adoption of agroforestry with PES and the age of the smallholder (i.e., one increases with the other) in the Mount Kenya region. Research implications – Poverty alleviation policies for developing countries should target social capital to increase the adoption rate of modern technologies amongst smallholders.
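
A minimal sketch of the kind of adoption model described, assuming a binary adopt/do-not-adopt outcome regressed on household covariates, is shown below. The variable names, data and coefficients are entirely hypothetical and only illustrate the logistic regression step, not the study's estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical household survey: adoption of PES agroforestry (0/1) regressed on
# information spillover, network membership, peers, education, land size and age,
# mirroring the covariates discussed in the abstract.
rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "info_spillover": rng.integers(0, 2, n),
    "network_member": rng.integers(0, 2, n),
    "peers_adopting": rng.poisson(2, n),
    "education_yrs": rng.integers(0, 13, n),
    "land_ha": rng.gamma(2.0, 0.5, n),
    "age": rng.integers(20, 70, n),
})
logit_p = -2 + 1.2 * df.info_spillover + 0.9 * df.network_member + 0.4 * df.peers_adopting
df["adopt"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df.drop(columns="adopt"))
model = sm.Logit(df["adopt"], X).fit(disp=False)
print(model.summary())   # the social-capital covariates carry the signal here
```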

Keywords: agriculture innovation, agroforestry adoption, smallholders, payment for ecosystem services, Sub-Saharan Africa

Procedia PDF Downloads 372
23935 Endocardial Ultrasound Segmentation using Level Set method

Authors: Daoudi Abdelaziz, Mahmoudi Saïd, Chikh Mohamed Amine

Abstract:

This paper presents a fully automatic segmentation method for the left ventricle at end-systole (ES) and end-diastole (ED) in ultrasound images by means of an implicit deformable model (level set) based on the geodesic active contour model. A pre-processing Gaussian smoothing stage is applied to the image, which is essential for a good segmentation. Before the segmentation phase, we automatically locate the area of the left ventricle using a detection approach based on the Hough transform. The result obtained is then used to automate the initialization of the level set model. This initial curve (zero level set) deforms to search for the endocardial border in the image. Finally, a quantitative evaluation was performed on a data set of 15 subjects, with comparison to ground truth (manual segmentation).
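
A hedged sketch of this pipeline using scikit-image is shown below: Gaussian smoothing, a circular Hough transform to localise a roughly circular cavity, and a morphological geodesic active contour initialised from the detected circle. A bundled sample image stands in for the echocardiographic frames, and all parameter values are illustrative assumptions, not those of the paper.

```python
import numpy as np
from skimage import data, feature, filters, segmentation, transform

# Stand-in image (the actual echocardiographic frames are not available here).
image = filters.gaussian(data.coins().astype(float), sigma=2)   # pre-processing smoothing

# 1) Rough localisation of a roughly circular cavity with the Hough transform.
edges = feature.canny(image, sigma=2)
radii = np.arange(20, 40)
hough = transform.hough_circle(edges, radii)
_, cx, cy, r = transform.hough_circle_peaks(hough, radii, total_num_peaks=1)

# 2) Use the detected circle as the zero level set initialisation.
rows, cols = np.indices(image.shape)
init_ls = ((rows - cy[0]) ** 2 + (cols - cx[0]) ** 2 < r[0] ** 2).astype(np.int8)

# 3) Geodesic active contour (morphological variant) driven by an edge map.
gimage = segmentation.inverse_gaussian_gradient(image)
contour = segmentation.morphological_geodesic_active_contour(
    gimage, 100, init_level_set=init_ls, smoothing=2, balloon=1)
print("segmented pixels:", int(contour.sum()))
```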

Keywords: level set method, Hough transform, Gaussian smoothing, left ventricle, ultrasound images

Procedia PDF Downloads 461
23934 Model of Optimal Centroids Approach for Multivariate Data Classification

Authors: Pham Van Nha, Le Cam Binh

Abstract:

Particle swarm optimization (PSO) is a population-based stochastic optimization algorithm. PSO was inspired by the natural behavior of birds and fish during migration and foraging for food. PSO is considered a multidisciplinary optimization model that can be applied to various optimization problems. PSO's ideas are simple and easy to understand, but PSO has mostly been applied to simple model problems. We argue that in order to expand the applicability of PSO to complex problems, PSO should be described more explicitly in the form of a mathematical model. In this paper, we represent PSO as a mathematical model and apply it to multivariate data classification. First, PSO's general mathematical model (MPSO) is analyzed as a universal optimization model. Then, the Model of Optimal Centroids (MOC) is proposed for multivariate data classification. Experiments were conducted on several benchmark data sets to demonstrate the effectiveness of MOC compared with several previously proposed schemes.
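
To make the centroid-search idea concrete, the sketch below runs a plain PSO over candidate centroid positions, minimising the sum of squared distances of each sample to its nearest centroid. It is a generic illustration of PSO-based centroid optimisation under assumed hyperparameters, not the MOC formulation itself.

```python
import numpy as np

def pso_centroids(X, k, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal PSO searching for k centroids that minimise the within-cluster
    sum of squared distances (an 'optimal centroids' objective)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape

    def cost(flat):
        c = flat.reshape(k, d)
        dist = ((X[:, None, :] - c[None, :, :]) ** 2).sum(-1)
        return dist.min(axis=1).sum()

    pos = rng.uniform(X.min(), X.max(), size=(n_particles, k * d))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest.reshape(k, d)

# Toy multivariate data with two obvious groups.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (100, 2)), rng.normal(3, 0.3, (100, 2))])
print(pso_centroids(X, k=2))
```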

Keywords: analysis of optimization, artificial intelligence based optimization, optimization for learning and data analysis, global optimization

Procedia PDF Downloads 203
23933 Rapid Monitoring of Earthquake Damages Using Optical and SAR Data

Authors: Saeid Gharechelou, Ryutaro Tateishi

Abstract:

An earthquake is an inevitable catastrophic natural disaster. Damage to buildings and man-made structures, where most human activities occur, is the major cause of casualties from earthquakes. A comparison of optical and SAR data is presented for the case of the Kathmandu Valley, which was severely shaken by the 2015 Nepal earthquake. Though many existing studies have conducted optical-data-based estimation or suggested the combined use of optical and SAR data for improved accuracy, finding cloud-free optical images when they are urgently needed is not assured. Therefore, this research specializes in developing a SAR-based technique with the target of rapid and accurate geospatial reporting. Considering the limited time available in a post-disaster situation, the method offers quick computation based exclusively on two pairs of pre-seismic and co-seismic single look complex (SLC) images. The pre-seismic, co-seismic and post-seismic InSAR coherence was used to detect changes in the damaged area. In addition, ground truth data from the field were applied to the optical data through random forest classification for detection of the damaged area. The ground truth data collected in the field were used to assess the accuracy of the supervised classification approach. A higher accuracy was obtained from the optical data than from the integrated optical-SAR data. Since the availability of cloud-free images when urgently needed after an earthquake event is not assured, further research on improving SAR-based damage detection is suggested. It is expected that quick reporting of the post-disaster damage situation, quantified by the rapid earthquake assessment, should assist in channelling the rescue and emergency operations and in informing the public about the scale of damage.
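
The core SAR step can be illustrated with a coherence-change map: pixels that were coherent in the pre-seismic pair but lose coherence in the co-seismic pair are flagged as potentially damaged. The sketch below uses synthetic coherence rasters and assumed thresholds, and it omits the interferometric processing needed to derive coherence from the SLC pairs.

```python
import numpy as np

def coherence_change_map(coh_pre, coh_co, drop_threshold=0.3, min_pre=0.5):
    """Flag pixels whose interferometric coherence drops sharply between the
    pre-seismic pair and the co-seismic pair (a simple damage proxy)."""
    coh_pre = np.asarray(coh_pre, dtype=float)
    coh_co = np.asarray(coh_co, dtype=float)
    stable_before = coh_pre >= min_pre          # only trust previously coherent pixels
    return stable_before & ((coh_pre - coh_co) >= drop_threshold)

# Toy 100x100 scene: a 20x20 block loses coherence after the event.
rng = np.random.default_rng(0)
pre = np.clip(rng.normal(0.7, 0.05, (100, 100)), 0, 1)
co = pre.copy()
co[40:60, 40:60] -= 0.5
print("flagged pixels:", int(coherence_change_map(pre, np.clip(co, 0, 1)).sum()))
```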

Keywords: Sentinel-1A data, Landsat-8, earthquake damage, InSAR, rapid damage monitoring, 2015-Nepal earthquake

Procedia PDF Downloads 165
23932 Pilot-Assisted Direct-Current Biased Optical Orthogonal Frequency Division Multiplexing Visible Light Communication System

Authors: Ayad A. Abdulkafi, Shahir F. Nawaf, Mohammed K. Hussein, Ibrahim K. Sileh, Fouad A. Abdulkafi

Abstract:

Visible light communication (VLC) is a new approach to optical wireless communication proposed to relieve the congested radio frequency (RF) spectrum. VLC systems are combined with orthogonal frequency division multiplexing (OFDM) to achieve high-rate transmission and high spectral efficiency. In this paper, we investigate pilot-assisted channel estimation for DC-biased optical OFDM (PACE-DCO-OFDM) systems to reduce the effects of distortion on the transmitted signal. Least-squares (LS) and linear minimum mean-squared error (LMMSE) estimators are implemented in MATLAB/Simulink to enhance the bit-error-rate (BER) performance of PACE-DCO-OFDM. Results show that the DCO-OFDM system based on the PACE scheme achieves better BER performance than the conventional system without pilot-assisted channel estimation. Simulation results show that the proposed PACE-DCO-OFDM based on the LMMSE algorithm can estimate the channel more accurately and achieves better BER performance than the LS-based PACE-DCO-OFDM and the traditional system without PACE. For the same signal-to-noise ratio (SNR) of 25 dB, the achieved BER is about 5×10⁻⁴ for LMMSE-PACE and 4.2×10⁻³ with LS-PACE, while it is about 2×10⁻¹ for the system without the PACE scheme.
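
The difference between the two estimators can be seen in a toy frequency-domain sketch: LS divides the received pilots by the known pilot symbols, while LMMSE additionally smooths the LS estimate using the channel frequency-correlation matrix and the SNR. The channel model, pilot spacing and correlation matrix below are illustrative assumptions, and the sketch is in Python rather than the MATLAB/Simulink implementation used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, L = 64, 8, 4                      # subcarriers, pilots, channel taps (assumed)
pilot_idx = np.arange(0, N, N // P)
snr = 10 ** (25 / 10)                   # 25 dB

# Toy frequency-selective channel: L equal-power complex Gaussian taps.
h = (rng.normal(size=L) + 1j * rng.normal(size=L)) * np.sqrt(1 / (2 * L))
H = np.fft.fft(h, N)
X_p = np.ones(P)                        # known pilot symbols
noise = (rng.normal(size=P) + 1j * rng.normal(size=P)) * np.sqrt(1 / (2 * snr))
Y_p = H[pilot_idx] * X_p + noise

# Least-squares estimate at the pilot subcarriers.
H_ls = Y_p / X_p

# LMMSE smoothing: frequency-correlation matrix from the uniform power-delay profile.
k = pilot_idx.reshape(-1, 1) - pilot_idx.reshape(1, -1)
R = np.exp(-2j * np.pi * k[..., None] * np.arange(L) / N).mean(axis=-1)
H_lmmse = R @ np.linalg.inv(R + np.eye(P) / snr) @ H_ls

mse = lambda est: np.mean(np.abs(est - H[pilot_idx]) ** 2)
print(f"LS MSE: {mse(H_ls):.4f}   LMMSE MSE: {mse(H_lmmse):.4f}")
```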

Keywords: channel estimation, OFDM, pilot-assist, VLC

Procedia PDF Downloads 175
23931 Speech Detection Model Based on Deep Neural Networks Classifier for Speech Emotions Recognition

Authors: A. Shoiynbek, K. Kozhakhmet, P. Menezes, D. Kuanyshbay, D. Bayazitov

Abstract:

Speech emotion recognition (SER) has received increasing research interest in recent years. Most research work has used emotional speech collected under controlled conditions: the recordings are made by actors imitating and artificially producing emotions in front of a microphone. There are four issues related to that approach, namely: (1) the emotions are not natural, which means that machines learn to recognize fake emotions; (2) the emotions are very limited in quantity and poor in their variety of speaking; (3) SER is language dependent; (4) consequently, each time researchers want to start work on SER, they need to find a good emotional database in their language. In this paper, we propose an approach to create an automatic tool for speech emotion extraction based on facial emotion recognition and describe the sequence of actions of the proposed approach. One of the first steps in this sequence is speech detection. The paper gives a detailed description of the speech detection model based on a fully connected deep neural network for the Kazakh and Russian languages. Despite the high results in speech detection for Kazakh and Russian, the described process is suitable for any language. To illustrate the working capacity of the developed model, we have performed an analysis of speech detection and extraction from real tasks.
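
A minimal sketch of a speech-vs-non-speech detector of this kind is shown below: MFCC features feed a small fully connected network. Synthetic harmonic clips and noise clips stand in for the Kazakh/Russian corpus, and scikit-learn's MLPClassifier stands in for the paper's deep network, so the architecture and numbers are assumptions for illustration only.

```python
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

sr = 16000
rng = np.random.default_rng(0)

def mfcc_features(y):
    # 13 MFCCs averaged over time: one fixed-length vector per clip.
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)

def voiced_clip():
    # Harmonic signal as a crude stand-in for voiced speech.
    t = np.arange(sr) / sr
    f0 = rng.uniform(100, 250)
    return sum(np.sin(2 * np.pi * f0 * k * t) / k for k in range(1, 5)) + 0.05 * rng.normal(size=sr)

def noise_clip():
    return 0.3 * rng.normal(size=sr)

X = np.array([mfcc_features(voiced_clip()) for _ in range(60)]
             + [mfcc_features(noise_clip()) for _ in range(60)])
y = np.array([1] * 60 + [0] * 60)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0).fit(X_tr, y_tr)
print("speech-detection accuracy:", clf.score(X_te, y_te))
```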

Keywords: deep neural networks, speech detection, speech emotion recognition, Mel-frequency cepstrum coefficients, collecting speech emotion corpus, collecting speech emotion dataset, Kazakh speech dataset

Procedia PDF Downloads 93
23930 Development of an Aptamer-Molecularly Imprinted Polymer Based Electrochemical Sensor to Detect Pathogenic Bacteria

Authors: Meltem Agar, Maisem Laabei, Hannah Leese, Pedro Estrela

Abstract:

Pathogenic bacteria and the diseases they cause have become a global problem. Their early detection is vital and is only possible if the bacteria causing the disease are detected accurately and rapidly. Great progress has been made in this field with the use of biosensors. Molecularly imprinted polymers have gained broad interest because of their excellent properties compared with natural receptors, such as being stable in a variety of conditions, inexpensive and biocompatible, and having a long shelf life. These properties make molecularly imprinted polymers an attractive candidate for use in biosensors. This study aims to produce an aptamer-molecularly imprinted polymer based electrochemical sensor by combining the properties of molecularly imprinted polymers with the enhanced specificity offered by DNA aptamers. These 'apta-MIP' sensors were used for the detection of Staphylococcus aureus and Escherichia coli. The experimental parameters for the fabrication of the sensor were optimized, and detection of the bacteria was evaluated via electrochemical impedance spectroscopy. Sensitivity and selectivity experiments were conducted. Furthermore, molecularly imprinted polymer only and aptamer only electrochemical sensors were produced separately, and their performance was compared with that of the sensor produced in this study. The aptamer-molecularly imprinted polymer based electrochemical sensor showed good sensitivity and selectivity for the detection of Staphylococcus aureus and Escherichia coli. The performance of the sensor was assessed in buffer solution and tap water.

Keywords: aptamer, electrochemical sensor, staphylococcus aureus, molecularly imprinted polymer

Procedia PDF Downloads 114
23929 Python Implementation for S1000D Applicability Depended Processing Model - SALERNO

Authors: Theresia El Khoury, Georges Badr, Amir Hajjam El Hassani, Stéphane N’Guyen Van Ky

Abstract:

The widespread adoption of machine learning and artificial intelligence across different domains can be attributed to the digitization of data over several decades, resulting in vast amounts of data, types, and structures. Thus, data processing and preparation turn out to be a crucial stage. However, applying these techniques to S1000D standard-based data poses a challenge due to its complexity and the need to preserve logical information. This paper describes SALERNO, an S1000D AppLicability dEpended pRocessiNg mOdel. This Python-based model analyzes and converts XML S1000D-based files into a simpler data format that can be used in machine learning techniques while preserving the different logical relationships in the files. The model parses the files in a given folder, filters them, and extracts the required information to be saved in appropriate data frames and Excel sheets. Its main idea is to group the extracted information by applicability. In addition, it extracts the full text by replacing internal and external references while maintaining the relationships between files, as well as the necessary requirements. The resulting files can then be saved in databases and used in different models. Documents in both English and French were tested, and special characters were decoded. Updates to the technical manuals were taken into consideration as well. The model was tested on different versions of the S1000D standard, and the results demonstrated its ability to effectively handle applicability, requirements, references, and relationships across all files and at different levels.
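
The general shape of such a parsing step might look like the hedged sketch below, which walks a folder of S1000D-style XML files, groups extracted text by applicability, and exports one sheet per applicability group. The element and attribute names are assumptions about the schema, and the folder path is hypothetical, so this is not SALERNO's actual code.

```python
from pathlib import Path
import xml.etree.ElementTree as ET
import pandas as pd

def extract_applicability(folder):
    """Walk a folder of S1000D-style XML data modules, pull text grouped by
    applicability, and return one tidy DataFrame (element names are assumptions
    about the schema, not SALERNO's implementation)."""
    rows = []
    for path in Path(folder).glob("*.xml"):
        try:
            root = ET.parse(path).getroot()
        except ET.ParseError:
            continue                                    # skip malformed files
        # Applicability statement(s) attached to the module, if present.
        applics = [a.attrib.get("id", "default") for a in root.iter("applic")] or ["default"]
        text = " ".join(t.strip() for t in root.itertext() if t.strip())
        for applic in applics:
            rows.append({"file": path.name, "applicability": applic, "text": text})
    return pd.DataFrame(rows)

df = extract_applicability("data_modules")            # hypothetical folder
if not df.empty:
    # Group by applicability, as SALERNO does, and export for later ML use
    # (to_excel requires the openpyxl package).
    for applic, group in df.groupby("applicability"):
        group.to_excel(f"applicability_{applic}.xlsx", index=False)
```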

Keywords: aeronautics, big data, data processing, machine learning, S1000D

Procedia PDF Downloads 141
23928 Quantum Kernel Based Regressor for Prediction of Non-Markovianity of Open Quantum Systems

Authors: Diego Tancara, Raul Coto, Ariel Norambuena, Hoseein T. Dinani, Felipe Fanchini

Abstract:

Quantum machine learning is a growing research field that aims to perform machine learning tasks assisted by a quantum computer. Kernel-based quantum machine learning models are paradigmatic examples in which the kernel involves quantum states, and the Gram matrix is calculated from the overlaps between these states. With the kernel at hand, a regular machine learning model is used for the learning process. In this paper, we investigate quantum support vector machine and quantum kernel ridge models to predict the degree of non-Markovianity of a quantum system. We perform digital quantum simulation of amplitude damping and phase damping channels to create our quantum dataset. We elaborate on different kernel functions to map the data and on kernel circuits to compute the overlap between quantum states. We observe a good performance of the models.
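
The kernel-then-classical-regressor structure can be sketched with plain statevectors: a toy single-qubit feature map, a Gram matrix of state fidelities, and a support vector regressor on the precomputed kernel. The feature map, target values and data below are invented for illustration; the paper's damping channels and non-Markovianity measures are not reproduced.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def feature_state(x):
    """Encode a scalar feature into a single-qubit state via an RY-style rotation
    (a toy feature map, not the paper's circuits)."""
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(X1, X2):
    """Gram matrix of fidelities |<psi_i|psi_j>|^2 between encoded states."""
    S1 = np.array([feature_state(x) for x in X1])
    S2 = np.array([feature_state(x) for x in X2])
    return np.abs(S1 @ S2.T) ** 2

# Toy dataset: feature x -> synthetic "degree of non-Markovianity" target.
X = rng.uniform(0, np.pi, 80)
y = np.sin(X) ** 2 + 0.05 * rng.normal(size=80)

X_tr, X_te, y_tr, y_te = X[:60], X[60:], y[:60], y[60:]
model = SVR(kernel="precomputed").fit(quantum_kernel(X_tr, X_tr), y_tr)
pred = model.predict(quantum_kernel(X_te, X_tr))
print("test MSE:", float(np.mean((pred - y_te) ** 2)))
```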

Keywords: quantum, machine learning, kernel, non-markovianity

Procedia PDF Downloads 172
23927 Prediction of the Dark Matter Distribution and Fraction in Individual Galaxies Based Solely on Their Rotation Curves

Authors: Ramzi Suleiman

Abstract:

Recently, the author proposed an observationally based relativity theory termed information relativity theory (IRT). The theory is simple and is based only on basic principles, with no prior axioms and no free parameters. For the case of a body of mass in uniform rectilinear motion relative to an observer, the theory's transformations uncovered a matter-dark matter duality, which prescribes that the sum of the densities of the body's baryonic matter and dark matter, as measured by the observer, is equal to the body's matter density at rest. It was shown that the theory's transformations were successful in predicting several important phenomena in small particle physics, quantum physics, and cosmology. This paper extends the theory's transformations to the cases of rotating disks and spheres. The resulting transformations for a rotating disk are utilized to derive predictions of the radial distributions of matter and dark matter densities in rotationally supported galaxies based solely on their observed rotation curves. It is also shown that for galaxies with flattening curves, good approximations of the radial distributions of matter and dark matter and of the dark matter fraction can be obtained from one measurable scale radius. A test of the model on five galaxies, chosen randomly from the SPARC database, yielded impressive predictions. The rotation curves of all the investigated galaxies emerged as accurate traces of the predicted radial density distributions of their dark matter. This striking result suggests an intriguing physical explanation of gravity in galaxies, according to which it arises from the proximal drag of the stars and gas in the galaxy by its rotating dark matter web. We conclude by alluding briefly to the application of the proposed model to stellar systems and black holes. This study also hints at the potential of the discovered matter-dark matter duality for fixing the standard model of elementary particles in a natural manner, without the need for hypothesizing about supersymmetric particles.

Keywords: dark matter, galaxies rotation curves, SPARC, rotating disk

Procedia PDF Downloads 74
23926 Mobile Agents-Based Framework for Dynamic Resource Allocation in Cloud Computing

Authors: Safia Rabaaoui, Héla Hachicha, Ezzeddine Zagrouba

Abstract:

Nowadays, cloud computing is becoming the most popular technology for various companies and consumers, who benefit from its increased efficiency, cost optimization, data security, unlimited storage capacity, etc. One of the biggest challenges of cloud computing is resource allocation. Its efficiency directly influences the performance of the whole cloud environment. Finding an effective method to address these critical issues and increase cloud performance is necessary. This paper proposes a mobile agents-based framework for dynamic resource allocation in cloud computing to minimize both the cost of using virtual machines and the makespan. Furthermore, its impact on the best response time and power consumption has been studied. The simulation showed that our method gave better results than the existing approaches it was compared with.

Keywords: cloud computing, multi-agent system, mobile agent, dynamic resource allocation, cost, makespan

Procedia PDF Downloads 96
23925 Debris Flow Mapping Using Geographical Information System Based Model and Geospatial Data in Middle Himalayas

Authors: Anand Malik

Abstract:

The Himalayas, with their high tectonic activity, pose a great threat to human life and property. Climate change is another factor triggering extreme events, with a multi-fold effect on the high mountain glacial environment: rock falls, landslides, debris flows, flash floods and snow avalanches. One such extreme event, a cloudburst along with the breach of the moraine-dammed Chorabri Lake, occurred from June 14 to June 17, 2013, and triggered flooding of the Saraswati and Mandakini rivers in the Kedarnath Valley of Rudraprayag district of Uttarakhand state of India. As a result, a huge volume of fast-moving water created a catastrophe of the century, which resulted in the loss of a large number of human and animal lives and damage to pilgrimage, tourism, agriculture and property. Thus, a comprehensive assessment of debris flow hazards requires GIS-based modeling using numerical methods. The aim of the present study is to analyze and map debris flow movements using geospatial data with Flow-R (developed by a team at IGAR, University of Lausanne). The model is based on combined probabilistic and energetic algorithms for the assessment of the spreading of the flow with maximum runout distances. The ASTER Digital Elevation Model (DEM) with 30 m x 30 m cell size (resolution) is used as the main geospatial data for preparing the runout assessment, while Landsat data is used to analyze land use and land cover change in the study area. The results for the study area show that the model can be applied with great accuracy, as it is very useful in determining debris flow areas. The results are compared with existing available landslide/debris flow maps. ArcGIS software is used to prepare runout susceptibility maps, which can be used in debris flow mitigation and future land use planning.

Keywords: debris flow, geospatial data, GIS based modeling, flow-R

Procedia PDF Downloads 266
23924 Local Community's Response on Post-Disaster and Role of Social Capital towards Recovery Process: A Case Study of Kaminani Community in Bhaktapur Municipality after 2015 Gorkha Nepal Earthquake

Authors: Lata Shakya, Toshio Otsuki, Saori Imoto, Bijaya Krishna Shrestha, Umesh Bahadur Malla

Abstract:

The 2015 Gorkha Nepal earthquake damaged human settlements in 14 districts of Nepal. The historic core areas of the three principal cities, namely Kathmandu, Lalitpur and Bhaktapur, including numerous traditional 'Newari' settlements in the peripheral areas, either collapsed or were severely damaged. Despite the attempts of the Government of Nepal and (international) non-government organisations at disaster risk management, through the preparation of policies and guidelines and the implementation of community-based activities, the recent 'Gorkha' earthquake demonstrated inadequate preparedness, poor implementation of legal instruments, resource constraints, and managerial weakness. However, social capital, through community-based institutions, a self-help attitude, and community bonds, helped greatly not only in rescue and relief operations but also in post-disaster temporary shelter living, thereby exhibiting the resilience of the local community. Conducting a detailed case study of the 'Kaminani' community, with 42 houses in ward no. 16 of Bhaktapur municipality, this paper analyses the local community's response and activities during the Gorkha earthquake in rescue and relief operations as well as in post-disaster work. Leadership, the existence of internal/external aid, and physical and human support are also analyzed. Social resources and networking are also examined through a critical review of the existing community organisations and their activities. The research methodology includes a literature review, a field survey, and interviews with community leaders and residents based on a semi-structured questionnaire. The study reveals that the community carried out its recovery process in four phases: (i) management of emergency evacuation, (ii) construction of community-owned temporary shelters for individuals, (iii) demolition of the upper floors of the damaged houses, and (iv) planning for collaborative housing reconstruction. As territory-based organizations, religion-based agencies and aim-based institutions have existed in the survey area since pre-disaster times, it can be assumed that the community activists, including leaders, are well experienced in creating aim-based groups and managing teamwork to deal with various issues and problems collaboratively. Physical and human support, including partial financial aid from external sources as a result of the community leaders' personal networking, was extended to the community members. Thus, human and social resources and personal and social networks play a crucial role in the recovery process, and to build such social capital, the community should develop this potential from pre-disaster times.

Keywords: Gorkha Nepal earthquake, local community, recovery process, social resource, social network

Procedia PDF Downloads 247
23923 Nanotechnology for Energy Harvesting Applications

Authors: Eiman Nour

Abstract:

The rising interest in power harvesting stems from its potential for enabling self-powered systems based on nanostructures. Using renewable, self-powered sources is necessary for the growth of green electronics and could benefit wireless sensor networks. Ambient mechanical power is among the most plentiful sources for the various power-harvesting device configurations that have been published. In this work, we design and fabricate a paper-based nanogenerator (NG) utilizing piezoelectric zinc oxide (ZnO) nanowires (NWs) grown hydrothermally on a paper substrate. The fabricated NG can harvest ambient mechanical energy from various kinds of human motion, such as handwriting. The fabricated single ZnO NWs/PVDF-TrFE NG was first used as a handwriting-driven NG. The mechanical pressure applied to the paper platform while handwriting is harvested by the NG to deliver electrical energy; depending on the mode of handwriting, a maximum harvested voltage of 4.8 V was obtained.

Keywords: nanostructure, zinc oxide, nanogenerator, energy harvesting

Procedia PDF Downloads 57
23922 The Twin Terminal of Pedestrian Trajectory Based on City Intelligent Model (CIM) 4.0

Authors: Chen Xi, Lao Xuerui, Li Junjie, Jiang Yike, Wang Hanwei, Zeng Zihao

Abstract:

To further promote the development of smart cities, the microscopic 'nerve endings' of the City Intelligent Model (CIM) are extended to be more sensitive. In this paper, we develop a pedestrian trajectory twin terminal based on the CIM and CNN technology. It uses 5G networks, architectural and geoinformatics technologies, and convolutional neural networks combined with deep learning models for human behaviour recognition to provide empirical data such as pedestrian flow data and human behavioural characteristics. These data ultimately form spatial performance evaluation criteria and a spatial performance warning system, making the empirical data accurate and intelligent for prediction and decision making.

Keywords: urban planning, urban governance, CIM, artificial intelligence, convolutional neural network

Procedia PDF Downloads 132
23921 Characterization of Inertial Confinement Fusion Targets Based on Transmission Holographic Mach-Zehnder Interferometer

Authors: B. Zare-Farsani, M. Valieghbal, M. Tarkashvand, A. H. Farahbod

Abstract:

To provide the conditions for nuclear fusion using high-energy, powerful laser beams, a high degree of symmetry and surface uniformity of the spherical capsules is required to reduce Rayleigh-Taylor hydrodynamic instabilities. In this paper, we have used digital microscopic holography based on a Mach-Zehnder interferometer to study the quality of targets for inertial fusion. The interferometric pattern of the target was registered by a CCD camera and analyzed by the Holovision software. The uniformity of the surface and the shell thickness are investigated and measured in the reconstructed image. We measured the shell thickness in different zones and obtained a non-uniformity of 22.82 percent.

Keywords: inertial confinement fusion, mach-zehnder interferometer, digital holographic microscopy, image reconstruction, holovision

Procedia PDF Downloads 301
23920 Assessing the Impact of Low Carbon Technology Integration on Electricity Distribution Networks: Advancing towards Local Area Energy Planning

Authors: Javier Sandoval Bustamante, Pardis Sheikhzadeh, Vijayanarasimha Hindupur Pakka

Abstract:

In the pursuit of achieving net-zero carbon emissions, the integration of low carbon technologies into electricity distribution networks is paramount. This paper delves into the critical assessment of how the integration of low carbon technologies, such as heat pumps, electric vehicle chargers, and photovoltaic systems, impacts the infrastructure and operation of electricity distribution networks. The study employs rigorous methodologies, including power flow analysis and headroom analysis, to evaluate the feasibility and implications of integrating these technologies into existing distribution systems. Furthermore, the research utilizes Local Area Energy Planning (LAEP) methodologies to guide local authorities and distribution network operators in formulating effective plans to meet regional and national decarbonization objectives. Geospatial analysis techniques, coupled with building physics and electric energy systems modeling, are employed to develop geographic datasets aimed at informing the deployment of low carbon technologies at the local level. Drawing upon insights from the Local Energy Net Zero Accelerator (LENZA) project, a comprehensive case study illustrates the practical application of these methodologies in assessing the rollout potential of LCTs. The findings not only shed light on the technical feasibility of integrating low carbon technologies but also provide valuable insights into the broader transition towards a sustainable and electrified energy future. This paper contributes to the advancement of knowledge in power electrical engineering by providing empirical evidence and methodologies to support the integration of low carbon technologies into electricity distribution networks. The insights gained are instrumental for policymakers, utility companies, and stakeholders involved in navigating the complex challenges of energy transition and achieving long-term sustainability goals.
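
As a back-of-the-envelope illustration of the headroom step, the sketch below compares a substation's rating with the projected peak once assumed EV, heat-pump and PV uptake levels are applied. Every number is hypothetical, and unity power factor is assumed so kW and kVA are used interchangeably; it only shows the shape of the calculation, not the project's models.

```python
# A minimal headroom calculation for one secondary substation,
# with entirely hypothetical ratings and after-diversity demand figures.
transformer_rating_kva = 500.0
baseline_peak_kva = 320.0            # measured/estimated peak before LCT uptake

n_households = 200
ev_uptake, ev_adm_kw = 0.30, 1.2                 # uptake share, after-diversity demand per EV
hp_uptake, hp_adm_kw = 0.25, 1.0                 # heat pumps
pv_uptake, pv_peak_coincidence_kw = 0.20, 0.3    # PV contribution at time of peak

added_demand_kw = n_households * (ev_uptake * ev_adm_kw + hp_uptake * hp_adm_kw)
pv_offset_kw = n_households * pv_uptake * pv_peak_coincidence_kw

headroom_kva = transformer_rating_kva - baseline_peak_kva
projected_peak = baseline_peak_kva + added_demand_kw - pv_offset_kw

print(f"headroom today: {headroom_kva:.0f} kVA")
print(f"projected peak with LCTs: {projected_peak:.0f} kVA "
      f"({'OK' if projected_peak <= transformer_rating_kva else 'reinforcement needed'})")
```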

Keywords: energy planning, energy systems, digital twins, power flow analysis, headroom analysis

Procedia PDF Downloads 47
23919 The History of Sambipitu Formation Temperature during the Early Miocene Epoch at Kali Ngalang, Nglipar, Gunung Kidul Regency

Authors: R. Harman Dwi, Ryan Avirsa, P. Abraham Ivan

Abstract:

Understanding temperatures in the past, present, and future is possible through analysis of the abundance of fossil foraminifera. This research was conducted in the Sambipitu Formation, along the Ngalang River, Nglipar, Gunung Kidul Regency. The research method is divided into three stages: 1) literature study, building on previous research; 2) spatial: observation and sampling every 5-10 meters; 3) descriptive: analyzing samples of 10 grams each, washing the samples using 30% peroxide, biostratigraphic analysis, paleotemperature analysis using fossil abundance, diversity analysis using the Simpson diversity index method, and comparison with current temperature data. Two phases are identified: the appearance of Globorotalia menardii and Pulleniatina obliqueculata points to a tropical phase, while the appearance of Globigerinoides ruber and Orbulina universa indicates a subtropical phase. The paleotemperature based on the appearance of Globorotalia menardii, Globigerinoides trilobus, Globigerinoides ruber, Orbulina universa, and Pulleniatina obliqueculata points to a warm water area (average surface water temperature of approximately 25°C).
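
The diversity step uses Simpson's index, which can be computed directly from the species counts in a washed sample, as in the short sketch below; the counts are hypothetical and only illustrate the calculation.

```python
def simpson_diversity(counts):
    """Simpson's diversity index 1 - D, where D = sum n_i(n_i - 1) / (N(N - 1));
    values near 1 indicate a diverse foraminiferal assemblage."""
    counts = list(counts)
    N = sum(counts)
    if N < 2:
        return 0.0
    D = sum(n * (n - 1) for n in counts) / (N * (N - 1))
    return 1 - D

# Hypothetical counts per planktonic species picked from one washed sample.
sample = {"Globorotalia menardii": 18, "Globigerinoides ruber": 32,
          "Orbulina universa": 7, "Pulleniatina obliqueculata": 11}
print(f"Simpson diversity (1 - D): {simpson_diversity(sample.values()):.2f}")
```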

Keywords: abundance, biostratigraphy, Simpson diversity index method, paleotemperature

Procedia PDF Downloads 170
23918 Definition of Service Angle of Android’S Robot Hand by Method of Small Movements of Gripper’S Axis Synthesis by Speed Vector

Authors: Valeriy Nebritov

Abstract:

The paper presents a generalized method for determining the service solid angle based on the assigned gripper axis orientation with a stationary grip center. Motion synthesis in this work is carried out in the space of velocity vectors. As an example, the solid angle of an android robot arm is determined, this angle being formed by the longitudinal axis of the gripper. The method is based on the study of sets of configuration positions defining the end point positions of the unit-radius sphere sweep, which specifies the service solid angle. From this, the spherical curve specifying the shape of the desired solid angle was determined. The results of the research can be used in the development of control systems for autonomous android robots.

Keywords: android robot, control systems, motion synthesis, service angle

Procedia PDF Downloads 191
23917 A Posterior Predictive Model-Based Control Chart for Monitoring Healthcare

Authors: Yi-Fan Lin, Peter P. Howley, Frank A. Tuyl

Abstract:

Quality measurement and reporting systems are used in healthcare internationally. In Australia, the Australian Council on Healthcare Standards records and reports hundreds of clinical indicators (CIs) nationally across the healthcare system. These CIs are measures of performance in the clinical setting, and are used as a screening tool to help assess whether a standard of care is being met. Existing analysis and reporting of these CIs incorporate Bayesian methods to address sampling variation; however, such assessments are retrospective in nature, reporting upon the previous six or twelve months of data. The use of Bayesian methods within statistical process control for monitoring systems is an important pursuit to support more timely decision-making. Our research has developed and assessed a new graphical monitoring tool, similar to a control chart, based on the beta-binomial posterior predictive (BBPP) distribution to facilitate the real-time assessment of health care organizational performance via CIs. The BBPP charts have been compared with the traditional Bernoulli CUSUM (BC) chart by simulation. The more traditional “central” and “highest posterior density” (HPD) interval approaches were each considered to define the limits, and the multiple charts were compared via in-control and out-of-control average run lengths (ARLs), assuming that the parameter representing the underlying CI rate (proportion of cases with an event of interest) required estimation. Preliminary results have identified that the BBPP chart with HPD-based control limits provides better out-of-control run length performance than the central interval-based and BC charts. Further, the BC chart’s performance may be improved by using Bayesian parameter estimation of the underlying CI rate.
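
A minimal sketch of how such predictive limits might be computed is shown below: the Beta posterior for the CI rate is updated with historical counts, and scipy's beta-binomial distribution then gives limits for the next sample. A central credible interval is used here for brevity, whereas the paper's preferred HPD-based limits (and the run-length comparisons) are not reproduced; all counts are hypothetical.

```python
from scipy import stats

def bbpp_limits(x_hist, n_hist, n_next, a0=1.0, b0=1.0, cred=0.99):
    """Control limits for the next sample of size n_next from the beta-binomial
    posterior predictive distribution, after observing x_hist events in n_hist cases
    (a sketch; HPD limits would replace the central interval used here)."""
    a, b = a0 + x_hist, b0 + n_hist - x_hist           # Beta posterior for the CI rate
    pred = stats.betabinom(n_next, a, b)               # posterior predictive for next count
    lo = pred.ppf((1 - cred) / 2)
    hi = pred.ppf(1 - (1 - cred) / 2)                  # central credible interval
    return int(lo), int(hi), pred

# Hypothetical clinical indicator: 38 events in 520 cases last period,
# monitoring the next monthly sample of 60 cases.
lo, hi, pred = bbpp_limits(x_hist=38, n_hist=520, n_next=60)
print(f"signal if the next-month count falls outside [{lo}, {hi}]")
```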

Keywords: average run length (ARL), bernoulli cusum (BC) chart, beta binomial posterior predictive (BBPP) distribution, clinical indicator (CI), healthcare organization (HCO), highest posterior density (HPD) interval

Procedia PDF Downloads 199