Search results for: separately excited synchronous machine
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3699

1179 Performance of Neural Networks vs. Radial Basis Functions When Forming a Metamodel for Residential Buildings

Authors: Philip Symonds, Jon Taylor, Zaid Chalabi, Michael Davies

Abstract:

With the world climate projected to warm and major cities in developing countries becoming increasingly populated and polluted, governments are tasked with addressing overheating and poor air quality in residential buildings. This paper presents the development of an adaptable model of these risks. Simulations are performed using the EnergyPlus building physics software. An accurate metamodel is formed by randomly sampling building input parameters and training on the outputs of EnergyPlus simulations. Metamodels are used to vastly reduce the computation time required when performing optimisation and sensitivity analyses. Neural Networks (NNs) are compared to a Radial Basis Function (RBF) algorithm when forming a metamodel. These techniques were implemented using the PyBrain and scikit-learn Python libraries, respectively. NNs are shown to perform around 15% better than RBFs when estimating overheating and air pollution metrics modelled by EnergyPlus.
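
A minimal sketch of the metamodel comparison described above, on synthetic data rather than EnergyPlus outputs: the paper used PyBrain for the NN and scikit-learn for the RBF, whereas here both models are scikit-learn stand-ins (MLPRegressor for the NN, an RBF-kernel ridge regressor for the RBF), so the library choices, data and scores are illustrative assumptions only.

```python
# Sketch: compare a neural-network metamodel with an RBF-kernel metamodel on
# synthetic samples standing in for EnergyPlus input/output pairs.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.kernel_ridge import KernelRidge
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(size=(2000, 8))            # randomly sampled building input parameters
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=2000)  # mock simulation output

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

nn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
rbf = KernelRidge(kernel="rbf", alpha=1e-3, gamma=1.0)

for name, model in [("NN", nn), ("RBF", rbf)]:
    model.fit(X_tr, y_tr)
    print(name, "R^2:", round(r2_score(y_te, model.predict(X_te)), 3))
```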

Keywords: neural networks, radial basis functions, metamodelling, python machine learning libraries

Procedia PDF Downloads 431
1178 SVID: Structured Vulnerability Intelligence for Building Deliberated Vulnerable Environment

Authors: Wenqing Fan, Yixuan Cheng, Wei Huang

Abstract:

The diversity and complexity of modern IT systems make it almost impossible for internal teams to find vulnerabilities in all software before the software is officially released. The emergence of threat intelligence and vulnerability reporting policies has greatly reduced the burden on software vendors and organizations to find vulnerabilities. However, to prove the existence of a reported vulnerability, it is necessary but difficult for a security incident response team to build a deliberated vulnerable environment from a vulnerability report with limited and incomplete information. This paper presents a structured, standardized, machine-oriented vulnerability intelligence format that can be used to automate the orchestration of a Deliberated Vulnerable Environment (DVE). It highlights the important role of software configuration and proof-of-vulnerability specifications in vulnerability intelligence, and proposes a triad model, called DIR (Dependency Configuration, Installation Configuration, Runtime Configuration), to define software configuration. Finally, a prototype system has been implemented to demonstrate that the orchestration of a DVE can be automated with this intelligence.
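
A hypothetical sketch of how a machine-oriented intelligence record built around the DIR triad might be serialized; the field names, identifiers and values below are illustrative assumptions, not the paper's actual schema.

```python
# Hypothetical vulnerability-intelligence record structured around the DIR
# triad (Dependency / Installation / Runtime configuration). All field names
# and values are placeholders for illustration only.
from dataclasses import dataclass, field, asdict
import json


@dataclass
class DIRConfiguration:
    dependency: dict = field(default_factory=dict)    # required packages and versions
    installation: dict = field(default_factory=dict)  # build flags, install paths
    runtime: dict = field(default_factory=dict)       # services, ports, environment


@dataclass
class VulnerabilityIntelligence:
    vuln_id: str
    software: str
    version: str
    dir_config: DIRConfiguration
    proof_of_vulnerability: str   # e.g. a request or script that triggers the flaw


record = VulnerabilityIntelligence(
    vuln_id="CVE-XXXX-YYYY",
    software="example-web-app",
    version="1.2.3",
    dir_config=DIRConfiguration(
        dependency={"libfoo": ">=2.0,<2.4"},
        installation={"configure_flags": ["--enable-debug"]},
        runtime={"listen_port": 8080, "env": {"APP_MODE": "production"}},
    ),
    proof_of_vulnerability="GET /search?q=<payload>",
)

# Serialise for automated DVE orchestration tooling.
print(json.dumps(asdict(record), indent=2))
```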

Keywords: DIR triad model, DVE, vulnerability intelligence, vulnerability recurrence

Procedia PDF Downloads 106
1177 Scrutinizing the Effective Parameters on Cuttings Movement in Deviated Wells: Experimental Study

Authors: Siyamak Sarafraz, Reza Esmaeil Pour, Saeed Jamshidi, Asghar Molaei Dehkordi

Abstract:

Cutting transport is one of the major problems in directional and extended-reach oil and gas wells. Lack of sufficient attention to this issue may bring troubles such as difficulty running casing, stuck pipe, excessive torque and drag, hole pack-off, bit wear, a decreased rate of penetration (ROP), increased equivalent circulation density (ECD) and logging problems. Since it is practically impossible to directly observe the behavior of deep wells, a test setup was designed to investigate cutting transport phenomena. This experimental work was carried out to scrutinize the behavior of the effective variables in cutting transport. The test setup contained a test section 17 feet long, made of a 3.28-foot-long transparent glass pipe with a 3-inch diameter, a storage tank with 100 litres capacity, a rotating drill pipe made of stainless steel with a 1.25-inch diameter, a pump to circulate drilling fluid, a valve to adjust flow rate, a bit, and a camera to record all events, which were then converted to RGB images via the Image Processing Toolbox. After preparation of the test process, each test was performed separately, and the weights of the output particles were measured and compared with each other. Observation charts were plotted to assess the behavior of viscosity, flow rate and RPM at inclinations of 0°, 30°, 60° and 90°. RPM was explored together with other variables such as flow rate and viscosity at different angles. Also, the effect of different flow rates was investigated in directional conditions. To obtain precise results, captured images were analyzed to find out how the cuttings bed thickens and how particles behave in the annulus. The results of this experimental study demonstrate that drill string rotation helps particles remain in suspension and reduces particle deposition, so cutting movement increased significantly. By raising the fluid velocity, laminar flow converted to turbulent flow in the annulus. Increasing the flow rate in the horizontal section, combined with a lower range of viscosity, is more effective and improved cuttings transport performance.

Keywords: cutting transport, directional drilling, flow rate, hole cleaning, pipe rotation

Procedia PDF Downloads 270
1176 The Geometrical Cosmology: The Projective Cast of the Collective Subjectivity of the Chinese Traditional Architectural Drawings

Authors: Lina Sun

Abstract:

Chinese traditional drawings related to buildings and construction apply a unique geometry, differentiated from western Euclidean geometry, and embrace a collection of special terminologies under the category of tu (the Chinese character for drawing). This paper will, on one side, etymologically analyse the terminologies of Chinese traditional architectural drawing and, on the other side, geometrically deconstruct the composition of tu and locate the visual narrative language of tu in the pictorial tradition. The geometrical analysis will centre on a selected series of Yang-shi-lei tu of the construction of emperors' mausoleums in the Qing Dynasty (1636-1912), and will also draw on earlier architectural drawings and architectural paintings, such as jiehua and paintings on religious and tomb frescoes, as comparison. By doing this, the research will reveal that the terminologies corresponding to different geometrical forms indicate associations between architectural drawing and the philosophy of Chinese cosmology, and that the arrangement of the geometrical forms in the visual picture plane facilitates expressions of the concepts of space and position in the geometrical cosmology. These associations and expressions are the collective intentions of architectural drawing, evolving through thousands of years of unbroken tradition and irrelevant to individual authorship. Moreover, the architectural tu itself, as an entity, not only functions as the representation of buildings but also expresses intentions and strengthens them by using the unique Chinese geometrical language flexibly and intentionally. These collective cosmological spatial intentions and the corresponding geometrical words and languages reveal that Chinese traditional architectural drawing functions as a unique architectural site with subjectivity, which exists in parallel with buildings and expresses intentions and meanings by itself. The methodology and the findings of this research will therefore challenge previous research which treats architectural drawings merely as representations of buildings, and will promote an understanding of the drawings beyond using them as evidence to reconstruct information about buildings. Furthermore, this research situates architectural drawing between the study of Chinese technological tu and that of artistic painting, bridging two academic areas which have usually treated the partial features of architectural drawing separately. Beyond this research, the collective subjectivity of Chinese traditional drawings will facilitate the revealing of the transitional experience from tradition to drawing modernity, where the individual subjective identities and intentions of architects arise. This research will support an understanding of both the ambivalence and the affinity of drawing modernity as it encounters tradition.

Keywords: Chinese traditional architectural drawing (tu), etymology of tu, collective subjectivity of tu, geometrical cosmology in tu, geometry and composition of tu, Yang-shi-lei tu

Procedia PDF Downloads 106
1175 Modelling of Powered Roof Supports Work

Authors: Marcin Michalak

Abstract:

Due to increasing efforts to protect our natural environment, a change in the structure of energy resources can be observed: an increasing fraction of renewable energy sources. In many countries traditional underground coal mining is losing its significance, but there are still countries, such as Poland or Germany, in which coal-based technologies have the greatest share of total energy production. This necessitates efforts to limit the costs and negative effects of underground coal mining. The longwall complex is an essential part of underground coal mining. The safety and effectiveness of the work are strongly dependent on the diagnostic state of the powered roof supports. Building a useful and reliable diagnostic system requires a large amount of data. As the acquisition of data for every possible operating condition is impractical, it is important to be able to generate the required artificial working characteristics. In this paper, a new approach to modelling leg pressure in a single unit of a powered roof support is presented. The model is the result of an analysis of typical working cycles.

Keywords: machine modelling, underground mining, coal mining, structure

Procedia PDF Downloads 348
1174 An Advanced Numerical Tool for the Design of Through-Thickness Reinforced Composites for Electrical Applications

Authors: Bing Zhang, Jingyi Zhang, Mudan Chen

Abstract:

Fibre-reinforced polymer (FRP) composites have been extensively utilised in various industries, e.g., aerospace, renewable energy, automotive, and marine, due to their high specific strength. However, they have relatively low electrical conductivity compared to metals, especially in the out-of-plane direction. Conductive metal strips or meshes are typically employed to protect composites when designing lightweight structures that may be subjected to lightning strikes, such as composite wings. Unfortunately, this approach downplays the lightweight advantages of FRP composites, thereby limiting their potential applications. Extensive studies have been undertaken to improve the electrical conductivity of FRP composites. The authors are amongst the pioneers who use through-thickness reinforcement (TTR) to tailor the electrical conductivity of composites. Compared to conventional approaches using conductive fillers, the through-thickness reinforcement approach has been proven to offer a much larger improvement to the through-thickness conductivity of composites. In this study, an advanced high-fidelity numerical modelling strategy is presented to investigate the effects of through-thickness reinforcement on both the in-plane and out-of-plane electrical conductivities of FRP composites. The critical micro-structural features of through-thickness reinforced composites incorporated in the modelling framework are 1) the fibre waviness formed due to TTR insertion; 2) the resin-rich pockets formed due to resin flow in the curing process following TTR insertion; 3) the fibre crimp, i.e., fibre distortion in the thickness direction of composites caused by TTR insertion forces. In addition, each interlaminar interface is described separately. An IMA/M21 composite laminate with a quasi-isotropic stacking sequence is employed to calibrate and verify the modelling framework. The modelling results agree well with experimental measurements for both the in-plane and out-of-plane conductivities. It has been found that the presence of conductive TTR can increase the out-of-plane conductivity by around one order of magnitude, but there is less improvement in the in-plane conductivity, even at a TTR areal density of 0.1%. This numerical tool provides valuable references as a design tool for through-thickness reinforced composites when exploring their electrical applications. Parametric studies are undertaken using the numerical tool to investigate critical parameters that affect the electrical conductivities of composites, including TTR material, TTR areal density, stacking sequence, and interlaminar conductivity. Suggestions regarding the design of electrically through-thickness reinforced composites are derived from the numerical modelling campaign.

Keywords: composite structures, design, electrical conductivity, numerical modelling, through-thickness reinforcement

Procedia PDF Downloads 72
1173 Algorithm for Path Recognition in-between Tree Rows for Agricultural Wheeled-Mobile Robots

Authors: Anderson Rocha, Pedro Miguel de Figueiredo Dinis Oliveira Gaspar

Abstract:

Machine vision has been widely used in recent years in agriculture, as a tool to promote the automation of processes and increase the levels of productivity. The aim of this work is the development of a path recognition algorithm based on image processing to guide a terrestrial robot in-between tree rows. The proposed algorithm was developed using the software MATLAB, and it uses several image processing operations, such as threshold detection, morphological erosion, histogram equalization and the Hough transform, to find edge lines along tree rows on an image and to create a path to be followed by a mobile robot. To develop the algorithm, a set of images of different types of orchards was used, which made possible the construction of a method capable of identifying paths between trees of different heights and aspects. The algorithm was evaluated using several images with different characteristics of quality and the results showed that the proposed method can successfully detect a path in different types of environments.
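
A rough Python/OpenCV analogue of the MATLAB pipeline described above, applying histogram equalization, thresholding, morphological erosion and the Hough transform; the image file, parameter values and the Canny edge step (added here to feed the Hough transform) are assumptions, not the authors' exact implementation.

```python
# Sketch of a tree-row path detector: equalize, binarise, erode, then find
# line segments with the probabilistic Hough transform and take their mid-line
# as a candidate robot path. File name and parameters are illustrative.
import cv2
import numpy as np

img = cv2.imread("orchard_row.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical orchard image
img = cv2.equalizeHist(img)

# Binarise and erode to suppress small bright artefacts.
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
binary = cv2.erode(binary, np.ones((5, 5), np.uint8), iterations=1)

# Edge detection followed by the probabilistic Hough transform.
edges = cv2.Canny(binary, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=20)

# The mid-point between detected row lines approximates the path centre.
if lines is not None:
    xs = [(x1 + x2) / 2 for x1, y1, x2, y2 in lines[:, 0]]
    print("approximate path centre x-coordinate:", np.mean(xs))
```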

Keywords: agricultural mobile robot, image processing, path recognition, Hough transform

Procedia PDF Downloads 130
1172 'You’re Not Alone': Peer Feedback Practices for Cross-Cultural Writing Classrooms and Centers

Authors: Cassandra Branham, Danielle Farrar

Abstract:

As writing instructors and writing center administrators at a large research university with a significant population of English language learners (ELLs), we are interested in how peer feedback pedagogy can be effectively translated for writing center purposes, as well as how various modes of peer feedback can enrich the learning experiences of L1 and L2 writers in these spaces. Although peer feedback is widely used in classrooms and centers, instructor, student, and researcher opinions vary with respect to its effectiveness. We argue that peer feedback - traditional and digital, synchronous and asynchronous - is an indispensable element for both classrooms and centers and emphasize that it should occur with both L1 and L2 students to further develop an array of reading and writing skills. We also believe that further understanding of the best practices of peer feedback in such cross-cultural spaces, like the classroom and center, can optimize the benefits of peer feedback. After a critical review of the literature, we implemented an embedded tutoring program in our university's writing center in collaboration with its First-Year Composition (FYC) program and Language Institute. The embedded tutoring program matches a graduate writing consultant with L1 and L2 writers enrolled in controlled-matriculation composition courses where ELLs make up at least 50% of each class. Furthermore, this program is informed by what we argue to be some best practices of peer feedback for both classroom and center purposes, including expectation-based training through rubrics, modeling effective feedback, hybridizing traditional and digital modes of feedback, recognizing the significance of the body in composition (what we call writer embodiment), and maximizing digital technologies to exploit extended cognition. After conducting surveys and follow-up interviews with students, instructors, and writing consultants in the embedded tutoring program, we found that not only did students see an increased value in peer feedback, but instructors also saw an improvement in both writing style and critical thinking skills. Our L2 participants noted improvements in language acquisition while our L1 students recognized a broadening of their worldviews. We believe that both L1 and L2 students developed self-efficacy and agency in their identities as writers because they gained confidence in their abilities to offer feedback, as well as in the legitimacy of the feedback they received from peers. We also argue that these best practices situate novice writers as experts, as writers become a valued and integral part of the revision process with their own and their peers' papers. Finally, the use of iPads in embedded tutoring recovered the importance of the body and its senses in writing; the highly sensory feedback from these multi-modal sessions, which offer audio and visual input, underscores the significant role both the body and mind play in compositional practices. After beginning with a brief review of the literature that sparked this research, this paper will discuss the embedded tutoring program in detail, report on the results of the pilot program, and conclude with a discussion of the pedagogical implications that arise from this research for both classroom and center.

Keywords: English language learners, peer feedback, writing center, writing classroom

Procedia PDF Downloads 390
1171 A Semi-supervised Classification Approach for Trend Following Investment Strategy

Authors: Rodrigo Arnaldo Scarpel

Abstract:

Trend following is a widely accepted investment strategy that adopts a rule-based trading mechanism rather than striving to predict market direction or relying on information gathering to decide when to buy and when to sell a stock. Thus, in trend following one must respond to the market's recent and current movements rather than to what will happen. The optimum in a trend following strategy is to catch a bull market at its early stage, ride the trend, and liquidate the position at the first evidence of the subsequent bear market. To apply the trend following strategy one needs to find the trend and identify trade signals. In order to avoid false signals, i.e., to identify fluctuations of short, mid and long terms and to separate noise from real changes in the trend, most academic works rely on moving averages and other technical analysis indicators, such as the moving average convergence divergence (MACD) and the relative strength index (RSI), to uncover intelligible stock trading rules following the trend following philosophy. Recently, some works have applied machine learning techniques for trade rule discovery. In those works, the process of rule construction is based on evolutionary learning, which aims to adapt the rules to the current environment and searches for the globally optimal rules in the search space. In this work, instead of focusing on the usage of machine learning techniques for creating trading rules, a time series trend classification employing a semi-supervised approach was used to identify early both the beginning and the end of upward and downward trends. Such a classification model can be employed to identify trade signals, and the decision-making procedure is that if an up-trend (down-trend) is identified, a buy (sell) signal is generated. Semi-supervised learning is used for model training when only part of the data is labeled, and semi-supervised classification aims to train a classifier from both the labeled and unlabeled data such that it is better than the supervised classifier trained only on the labeled data. To illustrate the proposed approach, daily trade information was employed, including the open, high, low and closing values and volume from January 1, 2000 to December 31, 2022, of the São Paulo Exchange Composite index (IBOVESPA). Through this time period, consistent changes in price, upwards or downwards, were visually identified for assigning labels, leaving the rest of the days (when there is not a consistent change in price) unlabeled. For training the classification model, a pseudo-label semi-supervised learning strategy was used, employing different technical analysis indicators. In this learning strategy, the core is to use unlabeled data to generate pseudo-labels for supervised training. For evaluating the achieved results, the annualized return and excess return, and the Sortino and Sharpe indicators were considered. Over the evaluated time period, the obtained results were very consistent and can be considered promising for generating the intended trading signals.
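
A minimal sketch of pseudo-label (self-training) semi-supervised trend classification on a synthetic price series; the indicator choices, window lengths and labelling rule are illustrative assumptions, not the paper's exact IBOVESPA setup.

```python
# Sketch: label only days with a clearly consistent forward move, leave the
# rest unlabeled (-1), and let a self-training classifier pseudo-label them.
import numpy as np
import pandas as pd
from sklearn.semi_supervised import SelfTrainingClassifier
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
close = pd.Series(100 + np.cumsum(rng.normal(0, 1, 3000)))  # synthetic closing prices

# Technical-analysis style features: fast/slow moving averages and momentum.
feats = pd.DataFrame({
    "ma_fast": close.rolling(10).mean() / close - 1,
    "ma_slow": close.rolling(50).mean() / close - 1,
    "momentum": close.pct_change(20),
}).dropna()

# Label clear up-trends (1) and down-trends (0) over the next 20 days.
fwd = close.shift(-20).loc[feats.index] / close.loc[feats.index] - 1
y = np.full(len(feats), -1)        # -1 marks unlabeled days
y[fwd > 0.05] = 1
y[fwd < -0.05] = 0

model = SelfTrainingClassifier(RandomForestClassifier(n_estimators=200,
                                                      random_state=0))
model.fit(feats.values, y)
print("labeled fraction:", (y != -1).mean().round(3))
print("predicted up-trend share:", model.predict(feats.values).mean().round(3))
```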

Keywords: evolutionary learning, semi-supervised classification, time series data, trading signals generation

Procedia PDF Downloads 70
1170 The Role of Cognitive Control and Social Camouflage Associated with Social Anxiety Autism Spectrum Conditions

Authors: Siqing Guan, Fumiyo Oshima, Eiji Shimizu, Nozomi Tomita, Toru Takahashi, Hiroaki Kumano

Abstract:

Risk factors for social anxiety in autism spectrum conditions involve executive attention, emotion regulation, and thought regulation as processes of cognitive dysregulation. Social camouflaging behaviors, strategies used to mask and/or compensate for autism characteristics during social interactions in autism spectrum conditions, have also been emphasized. However, the role of cognitive dysregulation and social camouflaging in relation to social anxiety in autism spectrum conditions has not been clarified. Whether these factors are specific to social anxiety in autism spectrum conditions or common to social anxiety independent of autism spectrum conditions needs to be clarified. Here, we explored risk factors specific to social anxiety in autism spectrum conditions and general risk factors for social anxiety independent of autism spectrum conditions. From the Japanese participants in early adulthood (age 18-39) of an online survey in Japan, those who exceeded the Japanese-version Autism-Spectrum Quotient cutoff (33 points or more) were assigned to the autism spectrum conditions group (ASC; N=255, mean age=32.08, SD age=5.16), and those who did not exceed the cutoff were assigned to the non-autism spectrum conditions group (Non-ASC; N=255, mean age=31.70, SD age=5.09). Using the Japanese versions of the Social Phobia Scale, the Social Interaction Anxiety Scale, and the Short Fear of Negative Evaluation Scale, a composite score for social anxiety was calculated using a principal component method. We also measured emotional control difficulties using the Difficulties in Emotion Regulation Scale, executive attention using the Effortful Control Scale for Adults, rumination using the Rumination-Reflection Questionnaire, and worry using the Penn State Worry Questionnaire. This study was approved by the Ethics Committee, and there are no conflicts of interest. Multiple regression analysis with the forced entry method was used to predict social anxiety in the ASC and non-ASC groups separately, based on executive attention, emotion dysregulation, worry, rumination, and social camouflage. In the ASC group, emotion dysregulation (β=.277, p<.001), worry (β=.162, p<.05), assimilation (β=.308, p<.001) and masking (β=.275, p<.001) were significant predictors of social anxiety (F(7,247)=45.791, p<.001, R²=.565). In the non-ASC group, emotion dysregulation (β=.171, p<.05), worry (β=.344, p<.001), assimilation (β=.366, p<.001) and executive attention (β=-.132, p<.05) were significant predictors of social anxiety (F(7,207)=47.333, p<.001, R²=.615). The findings suggest that masking is a risk factor for social anxiety specific to autism spectrum conditions, while emotion dysregulation, worry, and assimilation are common risk factors for social anxiety regardless of autism spectrum conditions. In addition, executive attention is a risk factor for social anxiety in the absence of autism spectrum conditions.
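
A minimal sketch of the forced-entry multiple regression run separately per group, here on synthetic placeholder data; the variable names and simulated effect sizes are assumptions for illustration, not the study's dataset.

```python
# Sketch: forced-entry multiple regression (all predictors entered at once)
# predicting a social anxiety composite from questionnaire scores.
# Data below are synthetic placeholders, not the survey responses.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 255  # group size reported in the study
df = pd.DataFrame({
    "emotion_dysregulation": rng.normal(size=n),
    "executive_attention": rng.normal(size=n),
    "worry": rng.normal(size=n),
    "rumination": rng.normal(size=n),
    "masking": rng.normal(size=n),
    "assimilation": rng.normal(size=n),
})
df["social_anxiety"] = (0.3 * df["emotion_dysregulation"] + 0.3 * df["masking"]
                        + 0.2 * df["worry"] + rng.normal(scale=0.5, size=n))

X = sm.add_constant(df.drop(columns="social_anxiety"))  # forced entry: all predictors together
fit = sm.OLS(df["social_anxiety"], X).fit()
print(fit.summary())   # betas, p-values, F-statistic and R^2, as reported per group
```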

Keywords: autism spectrum, cognitive control, social anxiety, social camouflaging

Procedia PDF Downloads 192
1169 Detection of Cardiac Arrhythmia Using Principal Component Analysis and Xgboost Model

Authors: Sujay Kotwale, Ramasubba Reddy M.

Abstract:

Electrocardiogram (ECG) is a non-invasive technique used to study and analyze various heart diseases. Cardiac arrhythmia is a serious heart condition which can lead to the death of the patient when left untreated. Early detection of cardiac arrhythmia would help doctors provide proper treatment of the heart. In the past, various algorithms and machine learning (ML) models were used for early detection of cardiac arrhythmia, but few of them have achieved good results. In order to improve the performance, this paper implements principal component analysis (PCA) along with an XGBoost model. PCA was applied to the raw ECG signals to suppress redundant information and extract significant features. The obtained significant ECG features were fed into the XGBoost model and the performance of the model was evaluated. In order to validate the proposed technique, raw ECG signals obtained from the standard MIT-BIH database were employed for the analysis. The results show that the performance of the proposed method is superior to several state-of-the-art techniques.
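
A minimal sketch of the PCA + XGBoost pipeline on placeholder data standing in for beat-segmented ECG samples; the feature dimensions, class count and component count are assumptions, and no actual MIT-BIH records are loaded here.

```python
# Sketch: reduce raw ECG beat windows with PCA, then classify with XGBoost.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 180))     # placeholder 180-sample ECG beat windows
y = rng.integers(0, 5, size=5000)    # placeholder labels for 5 arrhythmia classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

pca = PCA(n_components=30).fit(X_tr)           # suppress redundancy, keep salient components
clf = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.1)
clf.fit(pca.transform(X_tr), y_tr)

print("accuracy:", accuracy_score(y_te, clf.predict(pca.transform(X_te))))
```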

Keywords: cardiac arrhythmia, electrocardiogram, principal component analysis, XGBoost

Procedia PDF Downloads 98
1168 A NoSQL Based Approach for Real-Time Managing of Robotics's Data

Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir

Abstract:

This paper deals with the continual growth of data, in response to which new data management solutions have emerged: NoSQL databases. They span several areas such as personalization, profile management, real-time big data, content management, catalogs, customer views, mobile applications, the Internet of Things, digital communication and fraud detection. Nowadays, these database management systems are proliferating. These systems store data very well, and with the trend of big data, new storage challenges demand new structures and methods for managing enterprise data. The new intelligent machines in the e-learning sector thrive on more data, so smart machines can learn more and faster. Robotics is the use case on which our tests focus. We implement NoSQL for robotics to wrestle all the data they acquire into a usable form, because with ordinary approaches to robotics data management we face severe limits in managing and finding the exact information in real time. Our proposed approach is demonstrated by experimental studies and a running example used as a use case.
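
A minimal sketch of storing and querying robot sensor readings in MongoDB, one common NoSQL document store; the connection string, database, collection and field names are assumptions, not the system described in the paper.

```python
# Sketch: schema-less storage of heterogeneous robot sensor readings plus a
# "latest reading" query, illustrating the real-time NoSQL usage pattern.
from datetime import datetime, timezone
from pymongo import MongoClient, DESCENDING

client = MongoClient("mongodb://localhost:27017/")   # assumed local MongoDB instance
readings = client["robotics"]["sensor_readings"]
readings.create_index([("robot_id", 1), ("ts", DESCENDING)])

# Schema-less insert: each robot can report whatever fields it has.
readings.insert_one({
    "robot_id": "arm-01",
    "ts": datetime.now(timezone.utc),
    "joint_angles": [0.12, 1.05, -0.33],
    "battery_v": 23.7,
})

# Real-time style query: the most recent reading for a given robot.
latest = readings.find_one({"robot_id": "arm-01"}, sort=[("ts", DESCENDING)])
print(latest)
```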

Keywords: NoSQL databases, database management systems, robotics, big data

Procedia PDF Downloads 333
1167 Online Learning Versus Face to Face Learning: A Sentiment Analysis on General Education Mathematics in the Modern World of University of San Carlos School of Arts and Sciences Students Using Natural Language Processing

Authors: Derek Brandon G. Yu, Clyde Vincent O. Pilapil, Christine F. Peña

Abstract:

College students of Cebu province have been indoors since March 2020, and a challenge encountered is the sudden shift from face-to-face to online learning, compounded by the lack of empirical data on online learning in Higher Education Institutions (HEIs) in the Philippines. Sentiments on face-to-face and online learning were collected from University of San Carlos (USC), School of Arts and Sciences (SAS) students regarding Mathematics in the Modern World (MMW), a General Education (GE) course. Natural Language Processing with machine learning algorithms was used to classify the sentiments of the students. The results of the research study are the themes identified through topic modelling and the overall sentiments of the students in USC SAS.
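
A minimal sketch of the topic-modelling step using scikit-learn's LDA on invented stand-in comments; the example texts, topic count and vocabulary choices are assumptions and do not reproduce the actual survey responses or the study's full NLP pipeline.

```python
# Sketch: bag-of-words + Latent Dirichlet Allocation to surface themes in
# short student comments about online vs face-to-face learning.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

comments = [  # invented placeholder comments
    "online classes made it hard to ask questions in real time",
    "face to face discussion helped me understand the math examples",
    "internet connection problems interrupted the online lectures",
    "the module activities were easier to follow in the classroom",
]

vec = CountVectorizer(stop_words="english")
dtm = vec.fit_transform(comments)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)
terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"topic {k}:", ", ".join(top))
```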

Keywords: natural language processing, online learning, sentiment analysis, topic modelling

Procedia PDF Downloads 225
1166 The Proactive Approach of Digital Forensics Methodology against Targeted Attack Malware

Authors: Mohamed Fadzlee Sulaiman, Mohd Zabri Adil Talib, Aswami Fadillah Mohd Ariffin

Abstract:

Each organization has its own mechanism to build up cyber defense capability to protect its information infrastructure from data breaches and cyber espionage. However, we cannot deny the possibility of failing to detect and stop cyber attacks, especially those targeting credential information and intellectual property (IP). In this paper, we share a modern approach to effective digital forensic methodology in order to identify the artifacts and trace the trails of evidence while mitigating the infection on the target machine(s). The proposed approach suits a digital forensic investigation conducted while resuming business-critical operations after mitigating the infection and minimizing the risk of the identified attack recurring. Therefore, traditional digital forensics methodology has to be improved to be proactive, not only focusing on discovering the root cause and the threat actor but also developing a relevant mitigation plan in order to prevent the same attack.

Keywords: digital forensic, detection, eradication, targeted attack, malware

Procedia PDF Downloads 260
1165 Radio Frequency Heating of Iron-Filled Carbon Nanotubes for Cancer Treatment

Authors: L. Szymanski, S. Wiak, Z. Kolacinski, G. Raniszewski, L. Pietrzak, Z. Staniszewska

Abstract:

There exist more than one hundred different types of cancer, and therefore no single treatment is offered to people struggling with this disease. The character of the treatment proposed to a patient will depend on a variety of factors, such as the type of cancer diagnosed, the advancement of the disease, its location in the body, as well as the personal preferences of the patient. None of the commonly known methods of cancer-fighting is recognised as a perfect cure; however, great advances in this field have been made over the last few decades. Once a patient is diagnosed with cancer, he or she is in need of medical care and professional treatment for upcoming months, and in most cases even years. Among the principal modes of treatment offered by medical centres, one can find radiotherapy, chemotherapy, and surgery. All of them can be applied separately or in combination, and the relative contribution of each is usually determined by a medical specialist in agreement with the patient. In addition to the conventional treatment options, more complementary and alternative therapies are integrated into mainstream care every day. One promising cancer modality is hyperthermia therapy, which is based on exposing body tissues to high temperatures. This treatment is still being investigated and is not widely available in hospitals and oncological centres. There are two kinds of hyperthermia therapies, with direct and indirect heating. The first is not commonly used due to low efficiency and invasiveness, while the second is deeply investigated and a variety of methods have been developed, including ultrasound, infrared sauna, induction heating and magnetic hyperthermia. The aim of this work was to examine the possibility of heating magnetic nanoparticles under the influence of an electromagnetic field for cancer treatment. For this purpose, multiwalled carbon nanotubes used as nanocarriers for iron particles were investigated for their heating properties. The samples were subjected to an alternating electromagnetic field with a frequency range between 110-619 kHz. Moreover, samples with various concentrations of carbon nanotubes were examined. The lowest frequency of 110 kHz and the sample containing 10 wt% of carbon nanotubes were found to produce the most effective heating. A description of hyperthermia therapy aiming at enhancing currently available cancer treatment is also presented in this paper. The most widely applied conventional cancer modalities, such as radiation and chemotherapy, are also described. Methods for overcoming the most common obstacles in conventional cancer modalities, such as invasiveness and lack of selectivity, are presented in the characteristics of magnetic hyperthermia, which explains the increasing interest in this treatment.

Keywords: hyperthermia, carbon nanotubes, cancer colon cells, ligands

Procedia PDF Downloads 255
1164 Exploring Acceptance of Artificial Intelligence Software Solution Amongst Healthcare Personnel: A Case in a Private Medical Centre

Authors: Sandra So, Mohd Roslan Ismail, Safurah Jaafar

Abstract:

The rapid proliferation of data in healthcare has provided an opportune platform for the creation of Artificial Intelligence (AI). AI has brought a paradigm shift for healthcare professionals, promising improvements in delivery and quality. This study aims to determine the perception of healthcare personnel on perceived ease of use, perceived usefulness, and subjective norm toward attitude for artificial intelligence acceptance. A cross-sectional single-institution study of employees' perception of adopting AI in the hospital was conducted. The survey used a questionnaire adapted from the Technology Acceptance Model, and a four-point Likert scale was used. A total of 96 employees, or 75.5% of the total population, responded. This study has shown the significant relationship and the importance of ease of use, perceived usefulness, and subjective norm to the acceptance of AI. The results indicate that the strongest acceptance of AI in practice came mostly from those respondents with the most interaction with patients and clinical management.

Keywords: artificial intelligence, machine learning, perceived ease of use, perceived usefulness, subjective norm

Procedia PDF Downloads 209
1163 Improvement of Thermal Stability in Ethylene Methyl Acrylate Composites for Gasket Application

Authors: Pemika Ketsuwan, Pitt Supaphol, Manit Nithitanakul

Abstract:

A typical use of ethylene methyl acrylate (EMA) gaskets is in the manufacture of optical lenses, and often they deteriorate rapidly due to the high temperatures during the process. The objective of this project is to improve the thermal stability of the EMA copolymer gasket by preparing EMA composites with cellulose and silica. Hydroxypropyl methyl cellulose (HPMC) and carboxymethyl cellulose (CMC) were used in preparing the EMA/cellulose composites, and fumed silica (SiO2) was used in preparing the EMA/silica composites, with different amounts of filler (3, 5, 7, 10, 15 wt.%), using a twin-screw extruder at 160 °C; the test specimens were prepared with an injection molding machine. The morphology and dispersion of fillers in the EMA matrix were investigated by field emission scanning electron microscopy (FESEM). The thermal stability of the composites was determined by thermogravimetric analysis (TGA) and differential scanning calorimetry (DSC). Mechanical properties were evaluated by tensile testing. The developed composites were found to have enhanced thermal and mechanical properties compared to those of the EMA copolymer alone.

Keywords: ethylene methyl acrylate, HPMC, silica, thermal stability

Procedia PDF Downloads 108
1162 Large Neural Networks Learning From Scratch With Very Few Data and Without Explicit Regularization

Authors: Christoph Linse, Thomas Martinetz

Abstract:

Recent findings have shown that Neural Networks generalize even in over-parametrized regimes with zero training error. This is surprising, since it goes completely against traditional machine learning wisdom. In our empirical study, we fortify these findings in the domain of fine-grained image classification. We show that very large Convolutional Neural Networks with millions of weights do learn with only a handful of training samples and without image augmentation, explicit regularization or pretraining. We train the architectures ResNet18, ResNet101 and VGG19 on subsets of the difficult benchmark datasets Caltech101, CUB_200_2011, FGVCAircraft, Flowers102 and StanfordCars with 100 classes and more, perform a comprehensive comparative study and draw implications for the practical application of CNNs. Finally, we show that VGG19 with 140 million weights learns to distinguish airplanes and motorbikes with up to 95% accuracy using only 20 training samples per class.
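
A minimal PyTorch sketch of the training setup described: a randomly initialised ResNet-18 trained from scratch on 20 images per class, with no augmentation, pretraining or explicit regularization. The ImageFolder path is an assumption standing in for one of the benchmark datasets, and the loop assumes each class has at least 20 images.

```python
# Sketch: train a from-scratch CNN on a tiny, 20-images-per-class subset.
import random
from collections import defaultdict
import torch
from torch import nn
from torch.utils.data import DataLoader, Subset
from torchvision import datasets, models, transforms

tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])  # no augmentation
full = datasets.ImageFolder("data/fine_grained_dataset", transform=tfm)  # hypothetical path

# Keep only 20 samples per class.
per_class = defaultdict(list)
for idx, (_, label) in enumerate(full.samples):
    per_class[label].append(idx)
subset_idx = [i for idxs in per_class.values() for i in random.sample(idxs, 20)]
loader = DataLoader(Subset(full, subset_idx), batch_size=16, shuffle=True)

model = models.resnet18(weights=None, num_classes=len(full.classes))  # random init, no pretraining
opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)      # no weight decay
loss_fn = nn.CrossEntropyLoss()

for epoch in range(100):            # plain training until the tiny set is fit
    for x, y in loader:
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
```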

Keywords: convolutional neural networks, fine-grained image classification, generalization, image recognition, over-parameterized, small data sets

Procedia PDF Downloads 70
1161 Design and Manufacture Detection System for Patient's Unwanted Movements during Radiology and CT Scan

Authors: Anita Yaghobi, Homayoun Ebrahimian

Abstract:

One of the important tools that can help orthopedic doctors diagnose diseases is the imaging scan. Imaging techniques can help physicians see different parts of the body, including the bones, muscles, tendons, nerves, and cartilage. During a CT scan, a patient must be in the same position from the start to the end of the radiation treatment. Patient movements are usually monitored by the technologists through closed-circuit television (CCTV) during the scan. If the patient makes a small movement, it is difficult for them to notice. In the present work, a simple patient movement monitoring device is fabricated to monitor patient movement. It uses an electronic sensing device and continuously monitors the patient's position while the CT scan is in progress. The device has been retrospectively tested on 51 patients whose movement and distance were measured. The results show that 25 patients moved 1 cm to 2.5 cm from their initial position during the CT scan. Hence, the device can potentially be used to control and monitor patient movement during CT scans and radiography. In addition, an audible alarm situated at the control panel of the control room is provided with this device to alert the technologists. It is an inexpensive, compact device which can be used with any CT scan machine.

Keywords: CT scan, radiology, X-ray, unwanted movement

Procedia PDF Downloads 448
1160 Application of Harris Hawks Optimization Metaheuristic Algorithm and Random Forest Machine Learning Method for Long-Term Production Scheduling Problem under Uncertainty in Open-Pit Mines

Authors: Kamyar Tolouei, Ehsan Moosavi

Abstract:

In open-pit mines, the long-term production scheduling optimization problem (LTPSOP) is a complicated problem that contains constraints, large datasets, and uncertainties. Uncertainty in the output is caused by several geological, economic, or technical factors. Due to its dimensions and NP-hard nature, it is usually difficult to find an ideal solution to the LTPSOP. The optimal schedule generally restricts the ore, metal, and waste tonnages, average grades, and cash flows of each period. Past decades have witnessed important developments in long-term production scheduling and optimization algorithms, since researchers have become highly cognizant of the issue. In fact, the LTPSOP cannot yet be considered a well-solved problem. Traditional production scheduling methods in open-pit mines apply an estimated orebody model to produce optimal schedules. The smoothing effect of some geostatistical estimation procedures causes most mine schedules and production predictions to be unrealistic and imperfect. With the expansion of simulation procedures, the risks from grade uncertainty in ore reserves can be evaluated and organized through a set of equally probable orebody realizations. In this paper, to synthesize grade uncertainty into the strategic mine schedule, a stochastic integer programming framework is presented for the LTPSOP. The objective function of the model is to maximize the net present value and minimize the risk of deviation from the production targets, considering grade uncertainty, while satisfying all technical constraints and operational requirements. Instead of applying one estimated orebody model as input to optimize the production schedule, a set of equally probable orebody realizations is applied to synthesize grade uncertainty into the strategic mine schedule and to produce a more profitable and risk-based production schedule. A mixture of metaheuristic procedures and mathematical methods paves the way to achieve an appropriate solution. This paper introduces a hybrid model between the augmented Lagrangian relaxation (ALR) method and a metaheuristic algorithm, the Harris Hawks optimization (HHO), to solve the LTPSOP under grade uncertainty conditions. In this study, the HHO is employed to update the Lagrange coefficients. Besides, a machine learning method called Random Forest is applied to estimate the gold grade in the mineral deposit. The Monte Carlo method is used as the simulation method, with 20 realizations. The results show that the proposed approach offers a considerable improvement in comparison with the traditional methods. The outcomes were also compared with the ALR-genetic algorithm and ALR-sub-gradient. To indicate the applicability of the model, a case study on an open-pit gold mining operation is implemented. The framework displays the capability to minimize risk and improve the expected net present value and financial profitability for the LTPSOP. The framework can control geological risk more effectively than the traditional procedure by considering grade uncertainty within the hybrid model framework.
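
A minimal sketch of the Random Forest grade-estimation component on synthetic drillhole-style data; the coordinates, grade model and block centroids are placeholders, not the case-study gold deposit or the full ALR-HHO scheduling framework.

```python
# Sketch: fit a Random Forest regressor from sample coordinates to gold grade,
# then predict grades at block-model centroids fed to the scheduler.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
xyz = rng.uniform(0, 500, size=(1500, 3))                 # placeholder sample coordinates (m)
grade = (0.5 + 0.004 * xyz[:, 2]                          # synthetic gold grade (g/t)
         + 0.3 * np.sin(xyz[:, 0] / 80)
         + rng.lognormal(mean=-1.5, sigma=0.6, size=1500))

rf = RandomForestRegressor(n_estimators=400, min_samples_leaf=5, random_state=0)
print("CV R^2:", cross_val_score(rf, xyz, grade, cv=5).mean().round(3))

rf.fit(xyz, grade)
blocks = rng.uniform(0, 500, size=(10, 3))                # placeholder block centroids
print("estimated block grades:", rf.predict(blocks).round(2))
```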

Keywords: grade uncertainty, metaheuristic algorithms, open-pit mine, production scheduling optimization

Procedia PDF Downloads 89
1159 Digital Preservation: Requirement of 21st Century

Authors: Gaurav Kumar, Shilpa

Abstract:

Digital libraries have been established all over the world to create, maintain and preserve digital materials. This paper focuses on operational digital preservation systems, specifically in educational organizations in India. It considers a broad range of digital objects, including e-journals, technical reports, e-records, project documents, scientific data, etc. This paper describes the main objectives, processes and technological issues involved in the preservation of digital materials. Digital preservation refers to the various methods of keeping digital materials alive for the future. It covers everything from electronic publications on CD-ROM to online databases and collections of experimental data in digital format, and maintains the ability to display, retrieve and use digital collections in the face of rapidly changing technological and organizational infrastructures. This paper exhibits the importance and objectives of digital preservation. The necessities of preservation are the hardware and software technologies needed to interpret digital documents, and various aspects of digital preservation are discussed.

Keywords: preservation, digital preservation, digital dark age, conservation, archive, repository, document, information technology, hardware, software, organization, machine readable format

Procedia PDF Downloads 436
1158 An Interpretable Data-Driven Approach for the Stratification of the Cardiorespiratory Fitness

Authors: D.Mendes, J. Henriques, P. Carvalho, T. Rocha, S. Paredes, R. Cabiddu, R. Trimer, R. Mendes, A. Borghi-Silva, L. Kaminsky, E. Ashley, R. Arena, J. Myers

Abstract:

The exploration of clinically relevant predictive models continues to be an important pursuit. Cardiorespiratory fitness (CRF) conveys vital clinical information, and as such its accurate prediction is of high importance. Therefore, the aim of the current study was to develop a data-driven model, based on computational intelligence techniques and, in particular, clustering approaches, to predict CRF. Two prediction models were implemented and compared: 1) the traditional Wasserman/Hansen equations; and 2) an interpretable clustering approach. Data used for this analysis were from the 'FRIEND - Fitness Registry and the Importance of Exercise: The National Data Base'; in the present study a subset of 10690 apparently healthy individuals was utilized. The accuracy of the models was assessed through the computation of sensitivity, specificity, and geometric mean values. The results show the superiority of the clustering approach in the accurate estimation of CRF (i.e., maximal oxygen consumption).
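
A minimal sketch of the clustering idea behind such an approach, on synthetic data: cluster individuals by simple characteristics and assign each new case the mean CRF of its cluster. The variables, cluster count and simulated relationship are assumptions for illustration only, not the FRIEND registry or the study's exact interpretable-clustering method.

```python
# Sketch: cluster-based CRF estimation - each cluster carries an interpretable
# prediction, the mean measured CRF of its members.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 3000
age = rng.uniform(20, 75, n)
sex = rng.integers(0, 2, n)                 # 0 = female, 1 = male (placeholder coding)
bmi = rng.normal(26, 4, n)
vo2max = 50 - 0.3 * age + 6 * sex - 0.4 * (bmi - 25) + rng.normal(0, 4, n)  # synthetic CRF

X = np.column_stack([age, sex, bmi])
scaler = StandardScaler().fit(X)
km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(scaler.transform(X))

cluster_crf = np.array([vo2max[km.labels_ == k].mean() for k in range(8)])

new_person = scaler.transform([[45, 1, 27]])   # age 45, male, BMI 27
print("predicted CRF (ml/kg/min):", cluster_crf[km.predict(new_person)[0]].round(1))
```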

Keywords: cardiorespiratory fitness, data-driven models, knowledge extraction, machine learning

Procedia PDF Downloads 273
1157 A Broadband Tri-Cantilever Vibration Energy Harvester with Magnetic Oscillator

Authors: Xiaobo Rui, Zhoumo Zeng, Yibo Li

Abstract:

A novel tri-cantilever energy harvester with a magnetic oscillator is presented, which can convert ambient vibration into electrical energy to power low-power devices such as wireless sensor networks. The most common way to harvest vibration energy is based on the use of linear resonant devices such as a cantilever beam, since this structure creates the highest strain for a given force. The highest efficiency is achieved when the resonance frequency of the harvester matches the vibration frequency. The limitation of this structure is the narrow effective bandwidth. To overcome this limitation, this article introduces a broadband tri-cantilever harvester with nonlinear stiffness. This energy harvester consists of three thin cantilever beams arranged vertically, with neodymium (NdFeB) magnets at their free ends and a fixed base at the other end. The three cantilevers have different resonant frequencies, obtained by designing them with different thicknesses. This evidently provides an advantage of multiple resonant frequencies similar to a piezoelectric cantilever array structure. To achieve broadband energy harvesting, magnetic interaction is used to introduce nonlinear system stiffness that tunes the resonant frequency to match the excitation. Since the three cantilever tips are all free and the magnetic force is distance-dependent, the resonant frequencies change in a complex way with the vertical vibration of the free ends. Both a model and an experiment were built. The electromechanically coupled lumped-parameter model is presented, and an electromechanical formulation and analytical expressions for the coupled nonlinear vibration response and voltage response are given. The entire structure was fabricated and mechanically attached to an electromagnetic shaker as a vibrating body via the fixed base, in order to couple the vibrations to the cantilevers. The cantilevers are bonded with piezoelectric macro-fiber composite (MFC) materials (model: M8514P2). The size of the cantilevers is 120×20 mm² and the thicknesses are 1 mm, 0.8 mm and 0.6 mm, respectively. The prototype generator has a measured performance of 160.98 mW effective electrical power and 7.93 V DC output voltage at an excitation level of 10 m/s². A 130% increase in the operating bandwidth is achieved. This device is promising for supporting low-power devices, peer-to-peer wireless nodes, and small-scale wireless sensor networks in ambient vibration environments.

Keywords: tri-cantilever, ambient vibration, energy harvesting, magnetic oscillator

Procedia PDF Downloads 142
1156 Developing Digital Competencies in Aboriginal Students through University-College Partnerships

Authors: W. S. Barber, S. L. King

Abstract:

This paper reports on a pilot project to develop a collaborative partnership between a community college in rural northern Ontario, Canada, and an urban university in the greater Toronto area in Oshawa, Canada. Partner institutions will collaborate to address the learning needs of university applicants whose goal is to attain an undergraduate university BA in Educational Studies and Digital Technology, but who may not live in a geographical location that would facilitate this pathways process. The UOIT BA degree is attained through a 2+2 program, where students with a 2-year college diploma or equivalent can attain a four-year undergraduate degree. The goals of the project are: 1) to expand the BA program to include an additional stream which includes serious educational games, simulations and virtual environments; 2) to develop fully online learning modules (using both synchronous and asynchronous technologies) for use by university applicants who are not otherwise geographically located close to a physical university site; 3) to assess the digital competencies of all students, including members of local, distance and Indigenous communities, using a validated tool developed and tested by UOIT across numerous populations. This tool, the General Technical Competency Use and Scale (GTCU), will provide the collaborating institutions with data that will allow for analyzing how well students are prepared to succeed in fully online learning communities. Philosophically, the UOIT BA program is based on a fully online learning communities model (FOLC) that can be accessed from anywhere in the world through digital learning environments via audio-video conferencing tools such as Adobe Connect. It also follows models of adult learning and mobile learning, and makes a university degree accessible to the increasing demographic of adult learners who may use mobile devices to learn anywhere, anytime. The program is based on key principles of Problem Based Learning, allowing students to build their own understandings through the co-design of the learning environment in collaboration with the instructors and their peers. In this way, this degree allows students to personalize and individualize the learning based on their own culture, background and professional/personal experiences. Using modified flipped classroom strategies, students are able to interrogate video modules in their own time in preparation for one-hour discussions occurring in video conferencing sessions. As a consequence of the program's flexibility, students may continue to work full or part time. All of the partner institutions will co-develop four new modules, administer the GTCU and share data, while creating a new stream of the UOIT BA degree. This will increase accessibility for students to bridge from community colleges to university through a fully digital environment. We aim to work collaboratively with Indigenous elders, community members and distance education instructors to increase opportunities for more students to attain a university education.

Keywords: aboriginal, college, competencies, digital, universities

Procedia PDF Downloads 208
1155 Eco-Friendly Preservative Treated Bamboo Culm: Compressive Strength Analysis

Authors: Perminder JitKaur, Santosh Satya, K. K. Pant, S. N. Naik

Abstract:

Bamboo is extensively used in the construction industry. The low durability of bamboo, due to fungus infestation and termite attack during storage, puts certain constraints on its usage as a modern structural material. Since many chemical formulations for bamboo treatment lead to severely harmful environmental effects, research on eco-friendly preservatives for bamboo treatment has been initiated worldwide. In the present study, an eco-friendly preservative for bamboo treatment has been developed. To validate its application for structural purposes, the effect of the treatment on compressive strength has been investigated. Neem oil (25%) integrated with copper naphthenate (0.3%), diluted with kerosene oil and impregnated into bamboo culms at 2 bar pressure, showed a weight loss of only 3.15% in the soil block analysis method. The results of the compressive strength analysis, using a HEICO Automatic Compression Testing Machine, reveal that the preservative treatment has not altered the structural properties of the bamboo culms. The compressive strength of the control (11.72 N/mm²) and the treated samples (11.71 N/mm²) was found to be comparable.

Keywords: D. strictus, bamboo, neem oil, pressure treatment, compressive strength

Procedia PDF Downloads 391
1154 Hierarchical Tree Long Short-Term Memory for Sentence Representations

Authors: Xiuying Wang, Changliang Li, Bo Xu

Abstract:

A fixed-length feature vector is required for many machine learning algorithms in NLP field. Word embeddings have been very successful at learning lexical information. However, they cannot capture the compositional meaning of sentences, which prevents them from a deeper understanding of language. In this paper, we introduce a novel hierarchical tree long short-term memory (HTLSTM) model that learns vector representations for sentences of arbitrary syntactic type and length. We propose to split one sentence into three hierarchies: short phrase, long phrase and full sentence level. The HTLSTM model gives our algorithm the potential to fully consider the hierarchical information and long-term dependencies of language. We design the experiments on both English and Chinese corpus to evaluate our model on sentiment analysis task. And the results show that our model outperforms several existing state of the art approaches significantly.

Keywords: deep learning, hierarchical tree long short-term memory, sentence representation, sentiment analysis

Procedia PDF Downloads 341
1153 Filmmaking with a Smartphone and National Cinema of Pakistan

Authors: Ahmad Bilal

Abstract:

Digital and convergent media can be helpful in terms of acquiring film production skills and knowledge, and they have also reduced the cost of production, thus allowing filmmakers greater opportunities and access to the medium of film. Both of these dimensions of new and convergent media have been challenging the established cinema of Pakistan, which has traditionally been controlled by the authorities through censorship policies. The use of the smartphone as a movie camera, editing machine, and transmitter can further challenge this control in a postcolonial society. To explore the impact of new and convergent media on the art of filmmaking, a film, 'Sohni Dharti: An untrue story', was produced. It was shot both on a smartphone and on a Digital Single Lens Reflex camera (DSLR), with an almost zero budget, and distributed through Vimeo from Pakistan. This process reveals how the technologies available today, and the increased knowledge of film production that they bring, allow a more inclusive experience of film production and distribution. At the same time, however, it also discloses the limitations that accompany new technologies within the context of a postcolonial society. This paper will investigate the role of technology in bringing filmmaking to the level of pencil and paper.

Keywords: convergent media, filmmaking, smartphone, Pakistan

Procedia PDF Downloads 263
1152 Agile Project Management: A Real Application in a Multi-Project Research and Development Center

Authors: Aysegul Sarac

Abstract:

The aim of this study is to analyze the impacts of integrating agile development principles and practices, in particular on reducing project lead time in a multi-project environment. We analyze the Arçelik Washing Machine R&D Center, in which multiple projects are conducted with shared resources. In the first part of the study, we illustrate the current waterfall model system using a value stream map. We define all activities, starting from the first idea of the project to the customer, and measure the process time and lead time of projects. In the second part of the study, we estimate potential improvements and select a set of these improvements for integrating agile principles. We aim to develop a future state map and analyze the impacts of integrating lean principles on project lead time. The main contribution of this study is that we analyze and integrate agile product development principles in a real multi-project system.

Keywords: agile project management, multi project system, project lead time, product development

Procedia PDF Downloads 286
1151 Efficacy of Deep Learning for Below-Canopy Reconstruction of Satellite and Aerial Sensing Point Clouds through Fractal Tree Symmetry

Authors: Dhanuj M. Gandikota

Abstract:

Sensor-derived three-dimensional (3D) point clouds of trees are invaluable in remote sensing analysis for the accurate measurement of key structural metrics, bio-inventory values, spatial planning/visualization, and ecological modeling. Machine learning (ML) holds the potential in addressing the restrictive tradeoffs in cost, spatial coverage, resolution, and information gain that exist in current point cloud sensing methods. Terrestrial laser scanning (TLS) remains the highest fidelity source of both canopy and below-canopy structural features, but usage is limited in both coverage and cost, requiring manual deployment to map out large, forested areas. While aerial laser scanning (ALS) remains a reliable avenue of LIDAR active remote sensing, ALS is also cost-restrictive in deployment methods. Space-borne photogrammetry from high-resolution satellite constellations is an avenue of passive remote sensing with promising viability in research for the accurate construction of vegetation 3-D point clouds. It provides both the lowest comparative cost and the largest spatial coverage across remote sensing methods. However, both space-borne photogrammetry and ALS demonstrate technical limitations in the capture of valuable below-canopy point cloud data. Looking to minimize these tradeoffs, we explored a class of powerful ML algorithms called Deep Learning (DL) that show promise in recent research on 3-D point cloud reconstruction and interpolation. Our research details the efficacy of applying these DL techniques to reconstruct accurate below-canopy point clouds from space-borne and aerial remote sensing through learned patterns of tree species fractal symmetry properties and the supplementation of locally sourced bio-inventory metrics. From our dataset, consisting of tree point clouds obtained from TLS, we deconstructed the point clouds of each tree into those that would be obtained through ALS and satellite photogrammetry of varying resolutions. We fed this ALS/satellite point cloud dataset, along with the simulated local bio-inventory metrics, into the DL point cloud reconstruction architectures to generate the full 3-D tree point clouds (the truth values are denoted by the full TLS tree point clouds containing the below-canopy information). Point cloud reconstruction accuracy was validated both through the measurement of error from the original TLS point clouds as well as the error of extraction of key structural metrics, such as crown base height, diameter above root crown, and leaf/wood volume. The results of this research additionally demonstrate the supplemental performance gain of using minimum locally sourced bio-inventory metric information as an input in ML systems to reach specified accuracy thresholds of tree point cloud reconstruction. This research provides insight into methods for the rapid, cost-effective, and accurate construction of below-canopy tree 3-D point clouds, as well as the supported potential of ML and DL to learn complex, unmodeled patterns of fractal tree growth symmetry.
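
A minimal sketch of the deconstruction step described above: thinning a dense TLS tree point cloud into an ALS-like subset by keeping only the highest few returns in each horizontal grid cell, so that upper-canopy points dominate and below-canopy detail is lost. The synthetic cloud, grid size and returns-per-cell settings are placeholders, not the study's actual processing parameters.

```python
# Sketch: simulate an ALS-style acquisition from a TLS tree point cloud by
# retaining only the top returns per horizontal grid column.
import numpy as np

rng = np.random.default_rng(0)
tls = rng.uniform([-3, -3, 0], [3, 3, 15], size=(200_000, 3))  # placeholder x, y, z (m)

def to_als_like(points, cell=0.5, returns_per_cell=3):
    """Keep the `returns_per_cell` highest points in each (cell x cell) column."""
    cols = np.floor(points[:, :2] / cell).astype(int)
    keys = cols[:, 0] * 100_000 + cols[:, 1]          # simple per-column key
    keep = []
    for key in np.unique(keys):
        idx = np.where(keys == key)[0]
        top = idx[np.argsort(points[idx, 2])[-returns_per_cell:]]
        keep.extend(top)
    return points[np.array(keep)]

als_like = to_als_like(tls)
print("TLS points:", len(tls), "-> ALS-like points:", len(als_like))
```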

Keywords: deep learning, machine learning, satellite, photogrammetry, aerial laser scanning, terrestrial laser scanning, point cloud, fractal symmetry

Procedia PDF Downloads 85
1150 Amharic Text News Classification Using Supervised Learning

Authors: Misrak Assefa

Abstract:

The Amharic language is the second most widely spoken Semitic language in the world. A large amount of news is published on the web, and searching for useful documents on a specific topic written in the Amharic language is a challenging task. Hence, document categorization is required for managing and filtering important information. In the classification of Amharic text news, there is still a gap in this domain that needs to be addressed. This study attempts to design an automatic Amharic news classifier using a supervised learning mechanism on four classes not previously addressed. To carry out this research, 4,182 news articles were used. The Naive Bayes (NB) and decision tree (J48) algorithms were used to classify the given Amharic dataset. In this paper, k-fold cross-validation is used to estimate the accuracy of the classifiers. The results show that these algorithms are applicable to Amharic news categorization. The best average accuracy results are achieved by the J48 decision tree and naïve Bayes, at 95.2345% and 94.6245%, respectively, using three categories. This research indicates that a typical decision tree algorithm is more applicable to Amharic news categorization.
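
A minimal sketch of the supervised news-classification pipeline using a bag-of-words naive Bayes model in scikit-learn (the study evaluated NB and the J48 decision tree in a k-fold setup); the tiny toy corpus below uses English placeholder snippets and invented categories purely for readability, not the 4,182-article Amharic dataset.

```python
# Sketch: vectorise short news texts and classify them with naive Bayes,
# estimating accuracy with cross-validation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

docs = [
    "the national team won the final match",        # sport
    "the striker scored twice in the league",        # sport
    "parliament passed the new budget bill",         # politics
    "the minister announced an election date",       # politics
    "the central bank adjusted the interest rate",   # business
    "exports grew and inflation slowed this year",   # business
]
labels = ["sport", "sport", "politics", "politics", "business", "business"]

clf = make_pipeline(TfidfVectorizer(), MultinomialNB())
print("2-fold CV accuracy:", cross_val_score(clf, docs, labels, cv=2).mean())

clf.fit(docs, labels)
print(clf.predict(["the coach praised the defenders"]))
```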

Keywords: text categorization, supervised machine learning, naive Bayes, decision tree

Procedia PDF Downloads 179