Search results for: image and signal processing
4855 Reasons for the Selection of Information-Processing Framework and the Philosophy of Mind as a General Account for an Error Analysis and Explanation on Mathematics
Authors: Michael Lousis
Abstract:
This research study is concerned with learners’ errors in Arithmetic and Algebra. The data resulted from a broader international comparative research program called the Kassel Project. However, its conceptualisation differed from and contrasted with that of the main program, which was mostly based on socio-demographic data. The way in which the research study was conducted was not dependent on the researcher’s discretion, but was dictated by the nature of the problem under investigation. This is because the phenomenon of learners’ mathematical errors is due neither to the intentions of learners, nor to institutional processes, rules and norms, nor to the educators’ intentions and goals, but rather to the way certain information is presented to learners and how their cognitive apparatus processes this information. Several approaches to the study of learners’ errors have been developed since the beginning of the 20th century, encompassing different belief systems. These approaches were based on behaviourist theory, on the Piagetian-constructivist research framework, on the perspective that followed the philosophy of science, and on the information-processing paradigm. The researcher of the present study had to disclose the learners’ course of thinking that led them to specific observable actions, and thus to particular errors in specific problems, rather than analysing scripts with the students’ thoughts presented in written form. This, in turn, entailed that the choice of methods had to be appropriate and conducive to seeing and understanding the learners’ errors from the perspective of the participants in the investigation. This fact determined important decisions concerning the selection of an appropriate framework for analysing the mathematical errors and giving explanations. Thus the behaviourist, Piagetian-constructivist, and philosophy-of-science belief systems were rejected, and the information-processing paradigm in conjunction with the philosophy of mind was adopted as a general account for the elaboration of the data. This paper explains why these decisions were appropriate and beneficial for conducting the present study and for establishing the ensuing thesis. Additionally, it explains why the adoption of the information-processing paradigm in conjunction with the philosophy of mind provides a sound and legitimate basis for the development of future studies concerning mathematical error analysis. Keywords: advantages-disadvantages of theoretical prospects, behavioral prospect, critical evaluation of theoretical prospects, error analysis, information-processing paradigm, opting for the appropriate approach, philosophy of science prospect, Piagetian-constructivist research frameworks, review of research in mathematical errors
Procedia PDF Downloads 190
4854 Reconstruction of Visual Stimuli Using Stable Diffusion with Text Conditioning
Authors: ShyamKrishna Kirithivasan, Shreyas Battula, Aditi Soori, Richa Ramesh, Ramamoorthy Srinath
Abstract:
The human brain, among the most complex and mysterious aspects of the body, harbors vast potential for exploration. Unraveling these enigmas, especially within neural perception and cognition, delves into the realm of neural decoding. This work harnesses advancements in generative AI, particularly in visual computing, to elucidate how the brain comprehends visual stimuli observed by humans. The paper endeavors to reconstruct human-perceived visual stimuli using Functional Magnetic Resonance Imaging (fMRI). The fMRI data are processed through pre-trained deep-learning models to recreate the stimuli. A new architecture named LatentNeuroNet is introduced, with the aim of achieving the utmost semantic fidelity in stimulus reconstruction. The approach employs a Latent Diffusion Model (LDM), Stable Diffusion v1.5, emphasizing semantic accuracy and generating superior quality outputs. This addresses the limitations of prior methods, such as GANs, which are known for poor semantic performance and inherent instability. Text conditioning within the LDM's denoising process is handled by extracting text from the brain's ventral visual cortex region. This extracted text is processed through a Bootstrapping Language-Image Pre-training (BLIP) encoder before being injected into the denoising process. In conclusion, an architecture is developed that successfully reconstructs the perceived visual stimuli, and the research provides evidence for identifying the regions of the brain most influential in cognition and perception. Keywords: BLIP, fMRI, latent diffusion model, neural perception.
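A minimal Python sketch of the generative half of such a pipeline is given below: a BLIP caption of a reference image serves as the text condition for Stable Diffusion v1.5 via the Hugging Face transformers and diffusers libraries. The fMRI-to-text decoding stage described in the abstract is not reproduced; the model checkpoints, file names and the use of a plain caption as the condition are illustrative assumptions.

```python
# Minimal sketch: caption a reference image with BLIP, then use the caption
# as the text condition for Stable Diffusion v1.5. Only the generative half of
# the pipeline is shown; decoding text from fMRI is out of scope here.
import torch
from PIL import Image
from transformers import BlipProcessor, BlipForConditionalGeneration
from diffusers import StableDiffusionPipeline

device = "cuda" if torch.cuda.is_available() else "cpu"

# 1) Obtain a text description (here: a BLIP caption of a reference image).
processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
blip = BlipForConditionalGeneration.from_pretrained(
    "Salesforce/blip-image-captioning-base").to(device)
image = Image.open("reference_stimulus.jpg").convert("RGB")  # hypothetical file
inputs = processor(images=image, return_tensors="pt").to(device)
caption = processor.decode(blip.generate(**inputs)[0], skip_special_tokens=True)

# 2) Condition the latent diffusion model on that text.
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5").to(device)
reconstruction = pipe(caption, num_inference_steps=50).images[0]
reconstruction.save("reconstructed_stimulus.png")
```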
Procedia PDF Downloads 69
4853 Comparison of Yb and Tm-Fiber Laser Cutting Processes of Fiber Reinforced Plastics
Authors: Oktay Celenk, Ugur Karanfil, Iskender Demir, Samir Lamrini, Jorg Neumann, Arif Demir
Abstract:
Due to their favourable material characteristics, fiber reinforced plastics are amongst the main topics of all current lightweight construction megatrends. Especially in transportation, in trends ranging from aeronautics over the automotive industry to naval transportation (yachts, cruise liners), the expected economic and environmental impact is huge. In naval transportation, components like yacht bodies, antenna masts, and decorative structures like deck lamps, light houses and pool areas represent cheap and robust solutions. Commercially available laser tools like carbon dioxide gas lasers (CO₂), frequency-tripled solid state UV lasers, and Neodymium-YAG (Nd:YAG) lasers can be used. These tools have emission wavelengths of 10 µm, 0.355 µm, and 1.064 µm, respectively. The scientific goal is first of all the generation of a parameter matrix for laser processing of each used material with a Tm-fiber laser system (wavelength 2 µm). These parameters are the heat affected zone, process gas pressure, work piece feed velocity, intensity, irradiation time, etc. The results are compared with results obtained with well-known material processing lasers, such as Yb-fiber lasers (wavelength 1 µm). Compared to the CO₂ laser, the Tm-laser offers essential advantages for future laser processes like cutting, welding, ablating for repair, and drilling in composite part manufacturing (components of cruise liners, marine pipelines). Some of these are the possibility of beam delivery in a standard fused silica fiber, which enables hand-guided processing; eye safety, which results from the wavelength; and excellent beam quality and brilliance due to the fiber nature. One more feature that is economically important for boat, automotive and military manufacturing projects is that the wavelength of 2 µm is highly absorbed by the plastic matrix, which enables its selective removal for repair procedures. Keywords: Thulium (Tm) fiber laser, laser processing of fiber-reinforced plastics (FRP), composite, heat affected zone
Procedia PDF Downloads 193
4852 Inviscid Steady Flow Simulation Around a Wing Configuration Using MB_CNS
Authors: Muhammad Umar Kiani, Muhammad Shahbaz, Hassan Akbar
Abstract:
Simulation of a high speed inviscid steady ideal air flow around a 2D/axial-symmetry body was carried out by the use of mb_cns code. mb_cns is a program for the time-integration of the Navier-Stokes equations for two-dimensional compressible flows on a multiple-block structured mesh. The flow geometry may be either planar or axisymmetric and multiply-connected domains can be modeled by patching together several blocks. The main simulation code is accompanied by a set of pre and post-processing programs. The pre-processing programs scriptit and mb_prep start with a short script describing the geometry, initial flow state and boundary conditions and produce a discretized version of the initial flow state. The main flow simulation program (or solver as it is sometimes called) is mb_cns. It takes the files prepared by scriptit and mb_prep, integrates the discrete form of the gas flow equations in time and writes the evolved flow data to a set of output files. This output data may consist of the flow state (over the whole domain) at a number of instants in time. After integration in time, the post-processing programs mb_post and mb_cont can be used to reformat the flow state data and produce GIF or postscript plots of flow quantities such as pressure, temperature and Mach number. The current problem is an example of supersonic inviscid flow. The flow domain for the current problem (strake configuration wing) is discretized by a structured grid and a finite-volume approach is used to discretize the conservation equations. The flow field is recorded as cell-average values at cell centers and explicit time stepping is used to update conserved quantities. MUSCL-type interpolation and one of three flux calculation methods (Riemann solver, AUSMDV flux splitting and the Equilibrium Flux Method, EFM) are used to calculate inviscid fluxes across cell faces.Keywords: steady flow simulation, processing programs, simulation code, inviscid flux
Procedia PDF Downloads 429
4851 Improved Signal-To-Noise Ratio by the 3D-Functionalization of Fully Zwitterionic Surface Coatings
Authors: Esther Van Andel, Stefanie C. Lange, Maarten M. J. Smulders, Han Zuilhof
Abstract:
False outcomes of diagnostic tests are a major concern in medical health care. To improve the reliability of surface-based diagnostic tests, it is of crucial importance to diminish background signals that arise from the non-specific binding of biomolecules, a process called fouling. The aim is to create surfaces that repel all biomolecules except the molecule of interest. This can be achieved by incorporating antifouling protein-repellent coatings between the sensor surface and its recognition elements (e.g. antibodies, sugars, aptamers). Zwitterionic polymer brushes are considered excellent antifouling materials; however, to be able to bind the molecule of interest, the polymer brushes have to be functionalized, and so far this was only achieved at the expense of either antifouling or binding capacity. To overcome this limitation, we combined both features into one single monomer: a zwitterionic sulfobetaine, ensuring antifouling capabilities, equipped with a clickable azide moiety which allows for further functionalization. By copolymerizing this monomer together with a standard sulfobetaine, the number of azides (and with that the number of recognition elements) can be tuned depending on the application. First, the clickable azido-monomer was synthesized and characterized, followed by copolymerizing this monomer to yield functionalizable antifouling brushes. The brushes were fully characterized using surface characterization techniques like XPS, contact angle measurements, G-ATR-FTIR and XRR. As a proof of principle, the brushes were subsequently functionalized with biotin via strain-promoted alkyne-azide click reactions, which yielded a fully zwitterionic biotin-containing 3D-functionalized coating. The sensing capacity was evaluated by reflectometry using avidin- and fibrinogen-containing protein solutions. The surfaces showed excellent antifouling properties, as illustrated by the complete absence of non-specific fibrinogen binding, while at the same time clear responses were seen for the specific binding of avidin. A great increase in signal-to-noise ratio was observed, even when the amount of functional groups was lowered to 1%, compared to traditional modification of sulfobetaine brushes that relies on a 2D-approach in which only the top layer can be functionalized. This study was performed on stoichiometric silicon nitride surfaces for future microring resonator based assays; however, this methodology can be transferred to other biosensor platforms which are currently being investigated. The approach presented herein enables a highly efficient strategy for selective binding with retained antifouling properties for improved signal-to-noise ratios in binding assays. The number of recognition units can be adjusted to a specific need, e.g. depending on the size of the analyte to be bound, widening the scope of these functionalizable surface coatings. Keywords: antifouling, signal-to-noise ratio, surface functionalization, zwitterionic polymer brushes
Procedia PDF Downloads 306
4850 Amplifying Sine Unit-Convolutional Neural Network: An Efficient Deep Architecture for Image Classification and Feature Visualizations
Authors: Jamshaid Ul Rahman, Faiza Makhdoom, Dianchen Lu
Abstract:
Activation functions play a decisive role in determining the capacity of Deep Neural Networks (DNNs), as they enable neural networks to capture the inherent nonlinearities present in the data fed to them. Prior research on activation functions primarily focused on the utility of monotonic or non-oscillatory functions, until the Growing Cosine Unit (GCU) broke the taboo for a number of applications. In this paper, a Convolutional Neural Network (CNN) model named ASU-CNN is proposed, which utilizes the recently designed Amplifying Sine Unit (ASU) activation function across its layers. The effect of this non-monotonic and oscillatory function is inspected through feature map visualizations from different convolutional layers. The proposed network is optimized with Adam using a fine-tuned learning rate. The network achieved promising results on both training and testing data for the classification of CIFAR-10. The experimental results affirm the computational feasibility and efficacy of the proposed model for performing tasks related to the field of computer vision. Keywords: amplifying sine unit, activation function, convolutional neural networks, oscillatory activation, image classification, CIFAR-10
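To illustrate how an oscillatory activation can replace ReLU in a small CNN, the sketch below defines a custom PyTorch activation and trains it with Adam, as in the abstract. The functional form x*sin(x) is only an assumed stand-in for the Amplifying Sine Unit; the exact ASU definition should be taken from the paper.

```python
# Sketch of a CIFAR-10-sized CNN with an oscillatory activation in place of ReLU.
# The form x*sin(x) is an assumed stand-in for ASU, not the paper's exact definition.
import torch
import torch.nn as nn

class OscillatoryUnit(nn.Module):
    """Non-monotonic, oscillatory activation (illustrative stand-in for ASU)."""
    def forward(self, x):
        return x * torch.sin(x)

class SmallOscillatoryCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        act = OscillatoryUnit()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), act, nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), act, nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)  # 32x32 input -> 8x8 maps

    def forward(self, x):
        return self.classifier(torch.flatten(self.features(x), 1))

model = SmallOscillatoryCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)   # Adam, as in the abstract
logits = model(torch.randn(4, 3, 32, 32))                   # dummy CIFAR-10-shaped batch
```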
Procedia PDF Downloads 111
4849 Development of a Feedback Control System for a Lab-Scale Biomass Combustion System Using Programmable Logic Controller
Authors: Samuel O. Alamu, Seong W. Lee, Blaise Kalmia, Marc J. Louise Caballes, Xuejun Qian
Abstract:
The application of combustion technologies for thermal conversion of biomass and solid wastes to energy has been a major solution to the effective handling of wastes over a long period of time. Lab-scale biomass combustion systems have been observed to be economically viable and socially acceptable, but major concerns are the environmental impacts of the process and deviation of temperature distribution within the combustion chamber. Both high and low combustion chamber temperature may affect the overall combustion efficiency and gaseous emissions. Therefore, there is an urgent need to develop a control system which measures the deviations of chamber temperature from set target values, sends these deviations (which generates disturbances in the system) in the form of feedback signal (as input), and control operating conditions for correcting the errors. In this research study, major components of the feedback control system were determined, assembled, and tested. In addition, control algorithms were developed to actuate operating conditions (e.g., air velocity, fuel feeding rate) using ladder logic functions embedded in the Programmable Logic Controller (PLC). The developed control algorithm having chamber temperature as a feedback signal is integrated into the lab-scale swirling fluidized bed combustor (SFBC) to investigate the temperature distribution at different heights of the combustion chamber based on various operating conditions. The air blower rates and the fuel feeding rates obtained from automatic control operations were correlated with manual inputs. There was no observable difference in the correlated results, thus indicating that the written PLC program functions were adequate in designing the experimental study of the lab-scale SFBC. The experimental results were analyzed to study the effect of air velocity operating at 222-273 ft/min and fuel feeding rate of 60-90 rpm on the chamber temperature. The developed temperature-based feedback control system was shown to be adequate in controlling the airflow and the fuel feeding rate for the overall biomass combustion process as it helps to minimize the steady-state error.Keywords: air flow, biomass combustion, feedback control signal, fuel feeding, ladder logic, programmable logic controller, temperature
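The feedback idea can be illustrated outside the PLC as a simple proportional loop: the chamber temperature error drives corrections of air velocity and fuel feeding rate, clamped to the operating ranges quoted in the abstract (222-273 ft/min and 60-90 rpm). The set point, gains, and sign conventions in the Python sketch below are assumptions for illustration, not the values programmed in the ladder logic.

```python
# Illustrative proportional feedback loop mirroring the control logic (not ladder code).
# Set point, gains and sign conventions are assumptions for illustration only.
def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def control_step(chamber_temp_c, air_velocity, fuel_feed_rpm,
                 set_point_c=850.0, kp_air=0.5, kp_fuel=0.2):
    """One control cycle: chamber temperature error -> actuator corrections."""
    error = set_point_c - chamber_temp_c          # positive when the chamber runs cold
    # Assumed convention: a hot chamber (negative error) increases air flow for
    # dilution cooling and reduces the fuel feeding rate, and vice versa.
    air_velocity = clamp(air_velocity - kp_air * error, 222.0, 273.0)   # ft/min
    fuel_feed_rpm = clamp(fuel_feed_rpm + kp_fuel * error, 60.0, 90.0)  # rpm
    return air_velocity, fuel_feed_rpm

# Example cycle: a chamber running hot pushes air flow up and fuel feed down.
air, fuel = control_step(chamber_temp_c=900.0, air_velocity=250.0, fuel_feed_rpm=75.0)
print(air, fuel)
```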
Procedia PDF Downloads 129
4848 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem
Authors: Ouafa Amira, Jiangshe Zhang
Abstract:
Clustering is an unsupervised machine learning technique; its aim is to extract the data structures, in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means clustering (FCM) is one of the most well-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where the dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function whose values lie in the interval [0, 1]. In FCM clustering, the membership degree is constrained by the condition that the sum of a data object’s memberships in all clusters must be equal to one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization approaches have been applied to the fuzzy c-means clustering technique; they introduce additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where the optimization problem aims to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because an appropriate membership degree leads to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy. Keywords: clustering, fuzzy c-means, regularization, relative entropy
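A compact NumPy sketch of one common entropy-regularized variant of fuzzy c-means is shown below: memberships follow a closed-form exponential update and centers are membership-weighted means. The regularizer weight and the exact form of the relative entropy term used in the paper may differ; this is illustrative only.

```python
# Sketch of entropy-regularized fuzzy c-means (one common formulation; the exact
# relative entropy regularizer used in the paper may differ).
import numpy as np

def entropy_regularized_fcm(X, n_clusters=3, lam=1.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        # Squared distances from every point to every center: shape (N, C).
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        # Membership update: softmax-like closed form arising from the entropy term.
        u = np.exp(-d2 / lam)
        u /= u.sum(axis=1, keepdims=True)            # each row sums to one
        # Center update: membership-weighted means of the data.
        centers = (u.T @ X) / u.sum(axis=0)[:, None]
    return u, centers

# Toy example with three Gaussian blobs.
X = np.vstack([np.random.randn(50, 2) + c for c in ([0, 0], [5, 5], [0, 5])])
memberships, centers = entropy_regularized_fcm(X, n_clusters=3)
```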
Procedia PDF Downloads 259
4847 Importance of Ethics in Cloud Security
Authors: Pallavi Malhotra
Abstract:
This paper examines the importance of ethics in cloud computing. In the modern society, cloud computing is offering individuals and businesses an unlimited space for storing and processing data or information. Most of the data and information stored in the cloud by various users such as banks, doctors, architects, engineers, lawyers, consulting firms, and financial institutions among others require a high level of confidentiality and safeguard. Cloud computing offers centralized storage and processing of data, and this has immensely contributed to the growth of businesses and improved sharing of information over the internet. However, the accessibility and management of data and servers by a third party raise concerns regarding the privacy of clients’ information and the possible manipulations of the data by third parties. This document suggests the approaches various stakeholders should take to address various ethical issues involving cloud-computing services. Ethical education and training is key to all stakeholders involved in the handling of data and information stored or being processed in the cloud.Keywords: IT ethics, cloud computing technology, cloud privacy and security, ethical education
Procedia PDF Downloads 325
4846 Modeling and Simulation of Fluid Catalytic Cracking Process
Authors: Sungho Kim, Dae Shik Kim, Jong Min Lee
Abstract:
The fluid catalytic cracking (FCC) process is one of the most important processes in the modern refinery industry, and this paper focuses on it. As the FCC process is difficult to model well, due to its nonlinearities and the various interactions between its process variables, rigorous process modeling of the whole FCC plant is needed for control and plant-wide optimization. In this study, a process design for the FCC plant, including the riser reactor, main fractionator, and gas processing unit, was developed. A reactor model was described based on a four-lumped kinetic scheme. The main fractionator, gas processing unit, and other process units were designed to simulate real plant data using a process flow sheet simulator, Aspen PLUS. The custom reactor model was integrated with the process flow sheet simulator to develop an integrated process model. Keywords: fluid catalytic cracking, simulation, plant data, process design
Procedia PDF Downloads 530
4845 Accuracy of a 3D-Printed Polymer Model for Producing Casting Mold
Authors: Ariangelo Hauer Dias Filho, Gustavo Antoniácomi de Carvalho, Benjamim de Melo Carvalho
Abstract:
The work's purpose was to evaluate the possibility of manufacturing casting tools utilizing Fused Filament Fabrication, a 3D printing technique, without any post-processing of the printed part. A Taguchi orthogonal array was used to evaluate the influence of extrusion temperature, bed temperature, layer height, and infill on the dimensional accuracy of the 3D-printed polymer model. A Zeiss T-SCAN CS 3D scanner was used for dimensional evaluation of the printed parts within the limit of ±0.2 mm. The mold capabilities were tested with the printed model to check how it would interact with the green sand. With small adjustments to the 3D model, it was possible to produce rapid tooling for iron casting without the need for post-processing. These results are important for reducing time and cost in the development of such tools. Keywords: additive manufacturing, Taguchi method, rapid tooling, fused filament fabrication, casting mold
Procedia PDF Downloads 144
4844 Optimization of Waste Plastic to Fuel Oil Plants' Deployment Using Mixed Integer Programming
Authors: David Muyise
Abstract:
Mixed Integer Programming (MIP) is an approach that involves the optimization of a range of decision variables in order to minimize or maximize a particular objective function. The main objective of this study was to apply the MIP approach to optimize the deployment of waste plastic to fuel oil processing plants in Uganda. The processing plants are meant to reduce plastic pollution by pyrolyzing the waste plastic into a cleaner fuel that can be used to power diesel/paraffin engines, so as (1) to reduce the negative environmental impacts associated with plastic pollution and (2) to narrow the energy gap by utilizing the fuel oil. A programming model was established and tested in two case study applications: small-scale applications in rural towns and large-scale deployment across major cities in the country. In order to design the supply chain, optimal decisions on the types of waste plastic to be processed, the size, location and number of plants, and the downstream fuel applications were made concurrently, based on the payback period, investor requirements for capital cost, and the production cost of fuel and electricity. The model comprises qualitative data gathered from waste plastic pickers at landfills and from potential investors, and quantitative data obtained from primary research. It was found from the study that a distributed system is suitable for small rural towns, whereas a decentralized system is only suitable for big cities. The small towns of Kalagi, Mukono, Ishaka, and Jinja were found to be ideal locations for the deployment of distributed processing systems, whereas the cities of Kampala, Mbarara, and Gulu were found to be the ideal locations to initially utilize the decentralized pyrolysis technology system. We conclude that the model findings will be most important to investors, engineers, plant developers, and municipalities interested in waste plastic to fuel processing in Uganda and elsewhere in developing economies. Keywords: mixed integer programming, fuel oil plants, optimisation of waste plastics, plastic pollution, pyrolyzing
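To make the modelling approach concrete, the sketch below poses a toy facility-location MIP in Python with PuLP: binary variables decide where plants are opened, continuous variables route plastic from towns to plants, and the objective minimizes fixed plus transport costs under capacity constraints. All costs, capacities and supply figures are invented for illustration and are not the study's data.

```python
# Toy facility-location MIP with PuLP. All numbers are invented for illustration.
import pulp

towns = ["Kalagi", "Mukono", "Ishaka", "Jinja"]
sites = ["Kampala", "Mbarara", "Gulu"]
supply = {"Kalagi": 40, "Mukono": 60, "Ishaka": 30, "Jinja": 50}   # t/day of waste plastic
capacity = {"Kampala": 120, "Mbarara": 80, "Gulu": 80}             # t/day
fixed_cost = {"Kampala": 900, "Mbarara": 650, "Gulu": 600}         # cost per day of an open plant
transport = {(t, s): 1.0 for t in towns for s in sites}            # cost per tonne (placeholder)

prob = pulp.LpProblem("plastic_to_fuel_deployment", pulp.LpMinimize)
open_plant = pulp.LpVariable.dicts("open", sites, cat="Binary")
ship = pulp.LpVariable.dicts("ship", [(t, s) for t in towns for s in sites], lowBound=0)

# Objective: fixed opening costs plus transport costs.
prob += (pulp.lpSum(fixed_cost[s] * open_plant[s] for s in sites)
         + pulp.lpSum(transport[t, s] * ship[t, s] for t in towns for s in sites))

for t in towns:                                    # all collected plastic must be processed
    prob += pulp.lpSum(ship[t, s] for s in sites) == supply[t]
for s in sites:                                    # capacity is available only if the plant opens
    prob += pulp.lpSum(ship[t, s] for t in towns) <= capacity[s] * open_plant[s]

prob.solve()
print([s for s in sites if open_plant[s].value() == 1])
```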
Procedia PDF Downloads 129
4843 Avoiding Gas Hydrate Problems in Qatar Oil and Gas Industry: Environmentally Friendly Solvents for Gas Hydrate Inhibition
Authors: Nabila Mohamed, Santiago Aparicio, Bahman Tohidi, Mert Atilhan
Abstract:
One of Qatar's biggest problems in processing its main natural resource, natural gas, is the frequently occurring blockage of pipelines caused by uncontrolled gas hydrate formation. Several millions of dollars are being spent at process sites to remove such blockages safely by using chemical inhibitors. We aim to establish a national database that addresses the physical conditions which promote Qatari natural gas to form gas hydrates in the pipelines. Moreover, we aim to design and test novel hydrate inhibitors that are suitable for Qatari natural gas and its processing facilities. From these perspectives, we aim to provide more effective and sustainable reservoir utilization and processing of Qatari natural gas. In this work, we present the initial findings of a QNRF-funded project, which deals with the natural gas hydrate formation characteristics of Qatari-type gas by both experimental (PVTx) and computational (molecular simulation) methods. We present data from two fully automated apparatuses: a gas hydrate autoclave and a rocking cell. Hydrate equilibrium curves, including growth/dissociation conditions, are presented for multi-component systems and several gas mixtures that represent Qatari-type natural gas, with and without the presence of well-known kinetic and thermodynamic hydrate inhibitors. Ionic liquids were designed and tested for their inhibition performance, and their DFT and molecular modeling simulation results were also obtained and compared with the experimental results. The results showed significant performance of ionic liquids at concentrations of up to 0.5% by volume, with 2 to 4 °C of inhibition at high pressures. Keywords: gas hydrates, natural gas, ionic liquids, inhibition, thermodynamic inhibitors, kinetic inhibitors
Procedia PDF Downloads 1320
4842 Performance Comparison of Wideband Covariance Matrix Sparse Representation (W-CMSR) with Other Wideband DOA Estimation Methods
Authors: Sandeep Santosh, O. P. Sahu
Abstract:
In this paper, a performance comparison of the wideband covariance matrix sparse representation (W-CMSR) method with other existing wideband Direction of Arrival (DOA) estimation methods has been made. W-CMSR relies less on a priori information about the number of incident signals than ordinary subspace-based methods. Consider the perturbation-free covariance matrix of the wideband array output. The diagonal covariance elements are contaminated by an unknown noise variance. The covariance matrix of the array output is conjugate symmetric, i.e., its upper right triangular elements can be represented by the lower left triangular ones. As the main diagonal elements are contaminated by the unknown noise variance, they are skipped, and the lower left triangular elements are aligned column by column to obtain a measurement vector. Simulation results for W-CMSR are compared with those of other wideband DOA estimation methods such as the coherent signal subspace method (CSSM), Capon, l1-SVD, and JLZA-DOA. W-CMSR separates two signals very clearly, whereas CSSM, Capon, l1-SVD and JLZA-DOA fail to separate the two signals clearly, and a number of pseudo peaks exist in the spectrum of l1-SVD. Keywords: W-CMSR, wideband direction of arrival (DOA), covariance matrix, electrical and computer engineering
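The construction of the measurement vector described above, skipping the noise-contaminated diagonal and stacking the lower-left triangular covariance elements column by column, can be written in a few lines of NumPy, as in the sketch below; the array size and the simulated covariance are placeholders.

```python
# Sketch: build the W-CMSR measurement vector from an array covariance matrix by
# discarding the noise-contaminated main diagonal and stacking the lower-left
# triangular elements column by column. The simulated data are placeholders.
import numpy as np

def measurement_vector(R):
    """Stack the strictly-lower-triangular elements of R column by column."""
    m = R.shape[0]
    cols = [R[j + 1:, j] for j in range(m - 1)]   # below-diagonal part of column j
    return np.concatenate(cols)

# Placeholder wideband array covariance: 8 sensors, 200 snapshots.
snapshots = (np.random.randn(8, 200) + 1j * np.random.randn(8, 200)) / np.sqrt(2)
R = snapshots @ snapshots.conj().T / 200
y = measurement_vector(R)                          # length m*(m-1)/2 = 28
```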
Procedia PDF Downloads 471
4841 Linguistic Analysis of Borderline Personality Disorder: Using Language to Predict Maladaptive Thoughts and Behaviours
Authors: Charlotte Entwistle, Ryan Boyd
Abstract:
Recent developments in information retrieval techniques and natural language processing have allowed for greater exploration of psychological and social processes. Linguistic analysis methods for understanding behaviour have provided useful insights within the field of mental health. One area within mental health that has received little attention though, is borderline personality disorder (BPD). BPD is a common mental health disorder characterised by instability of interpersonal relationships, self-image and affect. It also manifests through maladaptive behaviours, such as impulsivity and self-harm. Examination of language patterns associated with BPD could allow for a greater understanding of the disorder and its links to maladaptive thoughts and behaviours. Language analysis methods could also be used in a predictive way, such as by identifying indicators of BPD or predicting maladaptive thoughts, emotions and behaviours. Additionally, associations that are uncovered between language and maladaptive thoughts and behaviours could then be applied at a more general level. This study explores linguistic characteristics of BPD, and their links to maladaptive thoughts and behaviours, through the analysis of social media data. Data were collected from a large corpus of posts from the publicly available social media platform Reddit, namely, from the ‘r/BPD’ subreddit whereby people identify as having BPD. Data were collected using the Python Reddit API Wrapper and included all users which had posted within the BPD subreddit. All posts were manually inspected to ensure that they were not posted by someone who clearly did not have BPD, such as people posting about a loved one with BPD. These users were then tracked across all other subreddits of which they had posted in and data from these subreddits were also collected. Additionally, data were collected from a random control group of Reddit users. Disorder-relevant behaviours, such as self-harming or aggression-related behaviours, outlined within Reddit posts were coded to by expert raters. All posts and comments were aggregated by user and split by subreddit. Language data were then analysed using the Linguistic Inquiry and Word Count (LIWC) 2015 software. LIWC is a text analysis program that identifies and categorises words based on linguistic and paralinguistic dimensions, psychological constructs and personal concern categories. Statistical analyses of linguistic features could then be conducted. Findings revealed distinct linguistic features associated with BPD, based on Reddit posts, which differentiated these users from a control group. Language patterns were also found to be associated with the occurrence of maladaptive thoughts and behaviours. Thus, this study demonstrates that there are indeed linguistic markers of BPD present on social media. It also implies that language could be predictive of maladaptive thoughts and behaviours associated with BPD. These findings are of importance as they suggest potential for clinical interventions to be provided based on the language of people with BPD to try to reduce the likelihood of maladaptive thoughts and behaviours occurring. For example, by social media tracking or engaging people with BPD in expressive writing therapy. Overall, this study has provided a greater understanding of the disorder and how it manifests through language and behaviour.Keywords: behaviour analysis, borderline personality disorder, natural language processing, social media data
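For readers interested in the data-collection step, a minimal sketch with the Python Reddit API Wrapper (PRAW) is shown below. Credentials and the post limit are placeholders, and the LIWC 2015 analysis itself is omitted since it is a separate commercial tool.

```python
# Minimal PRAW sketch of the data-collection step. Credentials are placeholders,
# the post limit is arbitrary, and the LIWC 2015 analysis itself is not shown.
import praw

reddit = praw.Reddit(client_id="YOUR_ID", client_secret="YOUR_SECRET",
                     user_agent="bpd-language-study")

posts = []
for submission in reddit.subreddit("BPD").new(limit=500):
    if submission.author is None:          # skip deleted accounts
        continue
    posts.append({"user": submission.author.name,
                  "subreddit": "BPD",
                  "text": f"{submission.title} {submission.selftext}"})

# Aggregate text per user (per-subreddit aggregation would follow the same pattern)
# before exporting to LIWC for linguistic feature extraction.
by_user = {}
for p in posts:
    by_user.setdefault(p["user"], []).append(p["text"])
```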
Procedia PDF Downloads 349
4840 Precursors Signatures of Few Major Earthquakes in Italy Using Very Low Frequency Signal of 45.9kHz
Authors: Keshav Prasad Kandel, Balaram Khadka, Karan Bhatta, Basu Dev Ghimire
Abstract:
Earthquakes remain a threatening disaster. Being able to predict earthquakes would certainly help prevent substantial loss of life and property. The Very Low Frequency/Low Frequency (VLF/LF) signal band (3-30 kHz), which is effectively reflected from the D-layer of the ionosphere, can perhaps be established as a tool to predict earthquakes. On May 20 and May 29, 2012, earthquakes of magnitude 6.1 and 5.8, respectively, struck Emilia-Romagna, Italy. More recently, on August 24, 2016, an earthquake of magnitude 6.2 struck Central Italy (42.706° N and 13.223° E) at 1:36 UT. We present the results obtained from the US Navy VLF transmitter's NSY signal of 45.9 kHz, transmitted from Niscemi, in the province of Sicily, Italy, and received at the Kiel Longwave Monitor, Germany, for 2012 and 2016. We analyzed the terminator times, their individual differences and the nighttime fluctuation counts. We also analyzed trends, dispersion and nighttime fluctuations, which gave us possible precursors to these earthquakes. Since perturbations in VLF amplitude could also be due to various other factors like lightning, geomagnetic activities (storms, auroras, etc.) and solar activities (flares, UV flux, etc.), we filtered the possible perturbations due to these agents to ensure that the perturbations seen in the VLF/LF amplitudes were precursors to earthquakes. As our TRGCP path is north-south, the sunrise and sunset times at the transmitter and receiver locations match, making the VLF/LF path conditions smoother and therefore allowing more natural data to be obtained. We found many clear anomalies (as precursors) in the terminator times 5 to 16 days before the earthquakes. Moreover, using the nighttime fluctuation method, we found clear anomalies 5 to 13 days prior to the main earthquakes. This correlates well with the findings of previous authors that ionospheric perturbations are seen a few days to one month before seismic activity. In addition, we observed an unexpected decrease of dispersion for certain anomalies where an increase was expected, which does not fully support our findings. To resolve this problem, we devised a new parameter, the nighttime dispersion. On analysis, this parameter decreases significantly on days of nighttime anomalies, thereby further supporting our precursors. Keywords: D-layer, TRGCP (Transmitter Receiver Great Circle Path), terminator times, VLF/LF
Procedia PDF Downloads 191
4839 Magnetic Survey for the Delineation of Concrete Pillars in Geotechnical Investigation for Site Characterization
Authors: Nuraddeen Usman, Khiruddin Abdullah, Mohd Nawawi, Amin Khalil Ismail
Abstract:
A magnetic survey is carried out in order to locate the remains of construction items, specifically concrete pillars. The conventional Euler deconvolution technique can perform the task, but it requires the use of a fixed structural index (SI), and the construction items are made of materials with different shapes that require different (unknown) SI values. A Euler deconvolution technique that estimates the background, the horizontal coordinates (xo and yo), the depth and the structural index (SI) simultaneously is prepared and used for this task. The synthetic model study carried out indicated that the new methodology can give a good estimate of location and does not depend on magnetic latitude. For the field data, both the total magnetic field and the gradiometer readings were collected simultaneously. The computed vertical derivatives and the gradiometer readings are compared, and they show good correlation, signifying the effectiveness of the method. The filtering is carried out using an automated procedure, the analytic signal and other traditional techniques. The clustered depth solutions coincide with the high amplitudes/values of the analytic signal, and these are the possible target positions of the concrete pillars being sought. The targets under investigation are interpreted to be located at depths between 2.8 and 9.4 meters. A further follow-up survey is recommended, as this marks the preliminary stage of the work. Keywords: concrete pillar, magnetic survey, geotechnical investigation, Euler Deconvolution
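The method rests on Euler's homogeneity equation, (x - x0)∂T/∂x + (y - y0)∂T/∂y + (z - z0)∂T/∂z = N(B - T). A generic sketch of a joint nonlinear least-squares estimate of (x0, y0, z0, N, B) over one data window is given below using SciPy; it illustrates the idea and is not the authors' exact algorithm.

```python
# Generic sketch: jointly estimate the source position (x0, y0, z0), structural index N
# and background B from Euler's homogeneity equation over one data window.
# Illustration only; the authors' exact implementation may differ.
import numpy as np
from scipy.optimize import least_squares

def euler_residuals(params, x, y, z, T, Tx, Ty, Tz):
    """Residuals of (x-x0)Tx + (y-y0)Ty + (z-z0)Tz - N(B - T) for each window point."""
    x0, y0, z0, N, B = params
    return (x - x0) * Tx + (y - y0) * Ty + (z - z0) * Tz - N * (B - T)

def solve_window(x, y, z, T, Tx, Ty, Tz):
    """x, y, z: observation coordinates; T: field; Tx, Ty, Tz: measured/computed gradients."""
    p0 = np.array([x.mean(), y.mean(), 5.0, 1.0, T.mean()])   # crude starting guess
    sol = least_squares(euler_residuals, p0, args=(x, y, z, T, Tx, Ty, Tz))
    return sol.x                                              # x0, y0, z0, N, B
```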
Procedia PDF Downloads 258
4838 FracXpert: Ensemble Machine Learning Approach for Localization and Classification of Bone Fractures in Cricket Athletes
Authors: Madushani Rodrigo, Banuka Athuraliya
Abstract:
In today's world of medical diagnosis and prediction, machine learning stands out as a strong tool, transforming old ways of caring for health. This study analyzes the use of machine learning in the specialized domain of sports medicine, with a focus on the timely and accurate detection of bone fractures in cricket athletes. Failure to identify bone fractures in real time can result in malunion or non-union conditions. To ensure proper treatment and enhance the bone healing process, accurately identifying fracture locations and types is necessary. When interpreting X-ray images, it relies on the expertise and experience of medical professionals in the identification process. Sometimes, radiographic images are of low quality, leading to potential issues. Therefore, it is necessary to have a proper approach to accurately localize and classify fractures in real time. The research has revealed that the optimal approach needs to address the stated problem and employ appropriate radiographic image processing techniques and object detection algorithms. These algorithms should effectively localize and accurately classify all types of fractures with high precision and in a timely manner. In order to overcome the challenges of misidentifying fractures, a distinct model for fracture localization and classification has been implemented. The research also incorporates radiographic image enhancement and preprocessing techniques to overcome the limitations posed by low-quality images. A classification ensemble model has been implemented using ResNet18 and VGG16. In parallel, a fracture segmentation model has been implemented using the enhanced U-Net architecture. Combining the results of these two implemented models, the FracXpert system can accurately localize exact fracture locations along with fracture types from the available 12 different types of fracture patterns, which include avulsion, comminuted, compressed, dislocation, greenstick, hairline, impacted, intraarticular, longitudinal, oblique, pathological, and spiral. This system will generate a confidence score level indicating the degree of confidence in the predicted result. Using ResNet18 and VGG16 architectures, the implemented fracture segmentation model, based on the U-Net architecture, achieved a high accuracy level of 99.94%, demonstrating its precision in identifying fracture locations. Simultaneously, the classification ensemble model achieved an accuracy of 81.0%, showcasing its ability to categorize various fracture patterns, which is instrumental in the fracture treatment process. In conclusion, FracXpert has become a promising ML application in sports medicine, demonstrating its potential to revolutionize fracture detection processes. By leveraging the power of ML algorithms, this study contributes to the advancement of diagnostic capabilities in cricket athlete healthcare, ensuring timely and accurate identification of bone fractures for the best treatment outcomes.Keywords: multiclass classification, object detection, ResNet18, U-Net, VGG16
Procedia PDF Downloads 121
4837 Dematerialized Beings in Katherine Dunn's Geek Love: A Corporeal and Ethical Study under Posthumanities
Authors: Anum Javed
Abstract:
This study identifies the dynamical image of human body that continues its metamorphosis in the virtual field of reality. It calls attention to the ways where humans start co-evolving with other life forms; technology in particular and are striving to establish a realm outside the physical framework of matter. The problem exceeds the area of technological ethics by explicably and explanatorily entering the space of literary texts and criticism. Textual analysis of Geek Love (1989) by Katherine Dunn is adjoined with posthumanist perspectives of Pramod K. Nayar to beget psycho-somatic changes in man’s nature of being. It uncovers the meaning people give to their experiences in this budding social and cultural phenomena of material representation tied up with personal practices and technological innovations. It also observes an ethical, physical and psychological reassessment of man within the context of technological evolutions. The study indicates the elements that have rendered morphological freedom and new materialism in man’s consciousness. Moreover this work is inquisitive of what it means to be a human in this time of accelerating change where surgeries, implants, extensions, cloning and robotics have shaped a new sense of being. It attempts to go beyond individual’s body image and explores how objectifying media and culture have influenced people’s judgement of others on new material grounds. It further argues a decentring of the glorified image of man as an independent entity because of his energetic partnership with intelligent machines and external agents. The history of the future progress of technology is also mentioned. The methodology adopted is posthumanist techno-ethical textual analysis. This work necessitates a negotiating relationship between man and technology in order to achieve harmonic and balanced interconnected existence. The study concludes by recommending a call for an ethical set of codes to be cultivated for the techno-human habituation. Posthumanism ushers a strong need of adopting new ethics within the terminology of neo-materialist humanism.Keywords: corporeality, dematerialism, human ethos, posthumanism
Procedia PDF Downloads 147
4836 Signal Amplification Using Graphene Oxide in Label Free Biosensor for Pathogen Detection
Authors: Agampodi Promoda Perera, Yong Shin, Mi Kyoung Park
Abstract:
The successful detection of pathogenic bacteria in blood provides important information for early detection, diagnosis and the prevention and treatment of infectious diseases. Silicon microring resonators are refractive-index-based optical biosensors that provide highly sensitive, label-free, real-time multiplexed detection of biomolecules. We demonstrate the technique of using GO (graphene oxide) to enhance the signal output of the silicon microring optical sensor. The activated carboxylic groups in GO molecules bind directly to single stranded DNA with an amino modified 5’ end. This conjugation amplifies the shift in resonant wavelength in a real-time manner. We designed a capture probe for strain Staphylococcus aureus of 21 bp and a longer complementary target sequence of 70 bp. The mismatched target sequence we used was of Streptococcus agalactiae of 70 bp. GO is added after the complementary binding of the probe and target. GO conjugates to the unbound single stranded segment of the target and increase the wavelength shift on the silicon microring resonator. Furthermore, our results show that GO could successfully differentiate between the mismatched DNA sequences from the complementary DNA sequence. Therefore, the proposed concept could effectively enhance sensitivity of pathogen detection sensors.Keywords: label free biosensor, pathogenic bacteria, graphene oxide, diagnosis
Procedia PDF Downloads 468
4835 High-Accuracy Satellite Image Analysis and Rapid DSM Extraction for Urban Environment Evaluations (Tripoli-Libya)
Authors: Abdunaser Abduelmula, Maria Luisa M. Bastos, José A. Gonçalves
Abstract:
The modeling of the earth's surface and the evaluation of the urban environment with 3D models is an important research topic. New stereo capabilities of high-resolution optical satellite images, such as the tri-stereo mode of Pleiades, combined with new image matching algorithms, are now available and can be applied in urban area analysis. In addition, photogrammetry software packages have gained new, more efficient matching algorithms, such as SGM, as well as improved filters to deal with shadow areas, and can achieve denser and more precise results. This paper describes a comparison between 3D data extracted from tri-stereo and dual-stereo satellite images, combined with pixel-based matching and the Wallis filter. The aim was to improve the accuracy of 3D models, especially in urban areas, in order to assess whether satellite images are appropriate for a rapid evaluation of urban environments. The results showed that 3D models achieved by Pleiades tri-stereo outperformed, both in terms of accuracy and detail, the result obtained from a GeoEye pair. The assessment was made with reference digital surface models derived from high-resolution aerial photography. This could mean that tri-stereo images can be successfully used for the proposed urban change analyses. Keywords: 3D models, environment, matching, pleiades
Procedia PDF Downloads 330
4834 Nonlinear Analysis in Investigating the Complexity of Neurophysiological Data during Reflex Behavior
Authors: Juliana A. Knocikova
Abstract:
Methods of nonlinear signal analysis are based on finding that random behavior can arise in deterministic nonlinear systems with a few degrees of freedom. Considering the dynamical systems, entropy is usually understood as a rate of information production. Changes in temporal dynamics of physiological data are indicating evolving of system in time, thus a level of new signal pattern generation. During last decades, many algorithms were introduced to assess some patterns of physiological responses to external stimulus. However, the reflex responses are usually characterized by short periods of time. This characteristic represents a great limitation for usual methods of nonlinear analysis. To solve the problems of short recordings, parameter of approximate entropy has been introduced as a measure of system complexity. Low value of this parameter is reflecting regularity and predictability in analyzed time series. On the other side, increasing of this parameter means unpredictability and a random behavior, hence a higher system complexity. Reduced neurophysiological data complexity has been observed repeatedly when analyzing electroneurogram and electromyogram activities during defence reflex responses. Quantitative phrenic neurogram changes are also obvious during severe hypoxia, as well as during airway reflex episodes. Concluding, the approximate entropy parameter serves as a convenient tool for analysis of reflex behavior characterized by short lasting time series.Keywords: approximate entropy, neurophysiological data, nonlinear dynamics, reflex
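Approximate entropy can be computed directly from its definition, ApEn(m, r) = Φm(r) - Φm+1(r), where Φm(r) averages the logarithms of the fractions of length-m template vectors that remain within tolerance r of each other. A short NumPy implementation is sketched below; taking r as 0.2 times the signal's standard deviation is a common convention assumed here, not a value from the study.

```python
# Approximate entropy (ApEn) of a 1-D signal, computed straight from the definition.
# Choosing r = 0.2 * std(signal) is a common convention, assumed here for illustration.
import numpy as np

def approximate_entropy(signal, m=2, r=None):
    x = np.asarray(signal, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * x.std()

    def phi(m):
        # All overlapping templates of length m: shape (n - m + 1, m).
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev (max-abs) distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Fraction of templates within tolerance r of each template (self-matches included).
        c = (dist <= r).mean(axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Example: a regular sine wave is more predictable (lower ApEn) than white noise.
t = np.linspace(0, 10 * np.pi, 500)
print(approximate_entropy(np.sin(t)), approximate_entropy(np.random.randn(500)))
```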
Procedia PDF Downloads 300
4833 Implementation of Congestion Management Strategies on Arterial Roads: Case Study of Geelong
Authors: A. Das, L. Hitihamillage, S. Moridpour
Abstract:
Natural disasters are inevitable. Disasters such as floods, tsunamis and tornadoes can be brutal, harsh and devastating. In Australia, flooding is a major issue experienced in different parts of the country. In such crises, delays in evacuation could decide the life and death of the people living in those regions. Congestion management could become a mammoth task if no steps are taken before such situations arise. In the past, many strategies were utilised to manage congestion in such circumstances, such as converting the road shoulders to extra lanes or changing the road geometry by adding more lanes. However, expansion of roads to resolve congestion problems is not considered a viable option nowadays. The authorities avoid this option for many reasons, such as lack of financial support and land space. They tend to focus their attention on optimising the current resources they possess and use traffic signals to overcome congestion problems. A traffic signal management strategy was considered a viable option to alleviate congestion problems in the City of Geelong, Victoria. An arterial road with signalised intersections is considered in this paper, and the traffic data required for modelling were collected from VicRoads. The traffic signalling software SIDRA was used to model the roads with the information gathered from VicRoads. In this paper, various signal parameters are utilised to assess and improve the corridor performance to achieve the best possible Level of Service (LOS) for the arterial road. Keywords: congestion, constraints, management, LOS
Procedia PDF Downloads 398
4832 Internet Memes: A Mirror of Culture and Society
Authors: Alexandra-Monica Toma
Abstract:
As the internet has become a ruling force in society, computer-mediated communication has enriched its methods of conveying meaning by combining linguistic means with visual means of expressivity. One of the elements of cyberspace is what we call a meme, a succinct, visually engaging tool used to communicate ideas or emotions, usually in a funny or ironic manner. Coined by Richard Dawkins in the late 1970s to refer to cultural genes, this term now denominates a special type of vernacular language used to share content on the internet. This research aims to analyse the basic mechanism that stands at the basis of meme creation as a blend of innovation and imitation and will approach some of the most widely used image macros remixed to generate new content, while also pointing out success strategies. Moreover, this paper discusses whether memes can transcend the light-hearted and playful mood they mirror and become biting and sharp cultural comments. The study also uses the concept of multimodality and stresses how text interacts with image, discussing three types of relations between the two: symmetry, amplification, and contradiction. We will furthermore show that memes are cultural artifacts and virtual tropes highly dependent on context and societal issues by using a corpus of memes created in relation to the COVID-19 pandemic. Keywords: context, computer-mediated communication, memes, multimodality
Procedia PDF Downloads 184
4831 Topic-to-Essay Generation with Event Element Constraints
Authors: Yufen Qin
Abstract:
Topic-to-essay generation is a challenging task in natural language processing, which aims to generate novel, diverse, and topic-related text based on user input. Previous research has overlooked the generation of articles under the constraints of event elements, resulting in issues such as incomplete event elements and logical inconsistencies in the generated results. To fill this gap, this paper proposes an event-constrained approach for topic-to-essay generation that enforces the completeness of event elements during the generation process. Additionally, a language model is employed to verify the logical consistency of the generated results. Experimental results demonstrate that the proposed model achieves a better BLEU-2 score and performs better than the baseline in terms of subjective evaluation on a real dataset, indicating its capability to generate higher-quality topic-related text. Keywords: event element, language model, natural language processing, topic-to-essay generation.
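For reference, the BLEU-2 score mentioned above weights unigram and bigram precision equally; a short NLTK sketch is shown below with made-up reference and generated sentences rather than examples from the paper's dataset.

```python
# BLEU-2: equal weights on unigram and bigram precision (NLTK). The sentences
# below are made-up placeholders, not examples from the paper's dataset.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = "the storm destroyed the village and rescuers arrived at dawn".split()
generated = "the storm destroyed a village and rescuers arrived at dawn".split()

bleu2 = sentence_bleu([reference], generated, weights=(0.5, 0.5),
                      smoothing_function=SmoothingFunction().method1)
print(f"BLEU-2 = {bleu2:.3f}")
```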
Procedia PDF Downloads 236
4830 Optimizing Pediatric Pneumonia Diagnosis with Lightweight MobileNetV2 and VAE-GAN Techniques in Chest X-Ray Analysis
Authors: Shriya Shukla, Lachin Fernando
Abstract:
Pneumonia, a leading cause of mortality in young children globally, presents significant diagnostic challenges, particularly in resource-limited settings. This study presents an approach to diagnosing pediatric pneumonia using Chest X-Ray (CXR) images, employing a lightweight MobileNetV2 model enhanced with synthetic data augmentation. Addressing the challenge of dataset scarcity and imbalance, the study used a Variational Autoencoder-Generative Adversarial Network (VAE-GAN) to generate synthetic CXR images, improving the representation of normal cases in the pediatric dataset. This approach not only addresses the issues of data imbalance and scarcity prevalent in medical imaging but also provides a more accessible and reliable diagnostic tool for early pneumonia detection. The augmented data improved the model’s accuracy and generalization, achieving an overall accuracy of 95% in pneumonia detection. These findings highlight the efficacy of the MobileNetV2 model, offering a computationally efficient yet robust solution well-suited for resource-constrained environments such as mobile health applications. This study demonstrates the potential of synthetic data augmentation in enhancing medical image analysis for critical conditions like pediatric pneumonia.Keywords: pneumonia, MobileNetV2, image classification, GAN, VAE, deep learning
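One possible form of the classification backbone is sketched below: a pretrained MobileNetV2 from torchvision with its final layer replaced by a two-class head (normal vs. pneumonia). The VAE-GAN used for synthetic CXR augmentation is a separate model and is not reproduced here; the framework choice, input handling and hyperparameters are assumptions, not the paper's exact setup.

```python
# Sketch: pretrained MobileNetV2 with a two-class head for normal vs. pneumonia CXR.
# The VAE-GAN augmentation stage is not shown; hyperparameters are placeholders.
import torch
import torch.nn as nn
from torchvision import models   # weights API requires torchvision >= 0.13

model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
model.classifier[1] = nn.Linear(model.last_channel, 2)   # normal / pneumonia

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on a dummy batch of 224x224 CXRs replicated to
# three channels, since the pretrained backbone expects RGB input.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```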
Procedia PDF Downloads 126
4829 A Study on the Improvement of Mobile Device Call Buzz Noise Caused by Audio Frequency Ground Bounce
Authors: Jangje Park, So Young Kim
Abstract:
The market demand for audio quality in mobile devices continues to increase, and audible buzz noise generated in time division communication is a chronic problem that goes against the market demand. In the case of time division type communication, the RF Power Amplifier (RF PA) is driven at the audio frequency cycle, and it makes various influences on the audio signal. In this paper, we measured the ground bounce noise generated by the peak current flowing through the ground network in the RF PA with the audio frequency; it was confirmed that the noise is the cause of the audible buzz noise during a call. In addition, a grounding method of the microphone device that can improve the buzzing noise was proposed. Considering that the level of the audio signal generated by the microphone device is -38dBV based on 94dB Sound Pressure Level (SPL), even ground bounce noise of several hundred uV will fall within the range of audible noise if it is induced by the audio amplifier. Through the grounding method of the microphone device proposed in this paper, it was confirmed that the audible buzz noise power density at the RF PA driving frequency was improved by more than 5dB under the conditions of the Printed Circuit Board (PCB) used in the experiment. A fundamental improvement method was presented regarding the buzzing noise during a mobile phone call.Keywords: audio frequency, buzz noise, ground bounce, microphone grounding
Procedia PDF Downloads 136
4828 The Effects of Physical Activity and Serotonin on Depression, Anxiety, Body Image and Mental Health
Authors: Sh. Khoshemehry, M. E. Bahram, M. J. Pourvaghar
Abstract:
Sport has found a special place as an influential phenomenon in all societies of the contemporary world. The relationship between physical activity and exercise and different sciences has provided new fields of human study. The range of issues related to exercise and physical education is such that it requires specialized sciences and special studies. In this article, the psychological and social aspects of exercise have been investigated for children and adults, and the findings can apply to people in different age groups. Exercise and regular physical movement have a great impact on the mental and social health of the individual in addition to body health. They affect the individual's adaptability in society and his or her personality. Exercise affects the treatment of conditions such as depression, anxiety and stress, as well as body image and memory. Exercise is a safe haven for young people to achieve optimum human development in its shelter. The effects of sensorimotor skills on mental actions and mental development are such that many psychologists and sports science experts believe these activities should be included in training programs in the first place. Familiarity of students and scholars with different programs and methods of sensorimotor activities not only supports their mental actions but also increases mental health and vitality and enhances self-confidence. Keywords: anxiety, mental health, physical activity, serotonin
Procedia PDF Downloads 207
4827 The Processing of Implicit Stereotypes in Everyday Scene Perception
Authors: Magali Mari, Fabrice Clement
Abstract:
The present study investigated the influence of implicit stereotypes on adults’ visual information processing, using an eye-tracking device. Implicit stereotyping is an automatic and implicit process; it happens relatively quickly, outside of awareness. In the presence of a member of a social group, a set of expectations about the characteristics of this social group appears automatically in people’s minds. The study aimed to shed light on the cognitive processes involved in stereotyping and to further investigate the use of eye movements to measure implicit stereotypes. With an eye-tracking device, the eye movements of participants were analyzed, while they viewed everyday scenes depicting women and men in congruent or incongruent gender role activities (e.g., a woman ironing or a man ironing). The settings of these scenes had to be analyzed to infer the character’s role. Also, participants completed an implicit association test that combined the concept of gender with attributes of occupation (home/work), while measuring reaction times to assess participants’ implicit stereotypes about gender. The results showed that implicit stereotypes do influence people’s visual attention; within a fraction of a second, the number of returns, between stereotypical and counter-stereotypical scenes, differed significantly, meaning that participants interpreted the scene itself as a whole before identifying the character. They predicted that, in such a situation, the character was supposed to be a woman or a man. Also, the study showed that eye movements could be used as a fast and reliable supplement for traditional implicit association tests to measure implicit stereotypes. Altogether, this research provides further understanding of implicit stereotypes processing as well as a natural method to study implicit stereotypes.Keywords: eye-tracking, implicit stereotypes, social cognition, visual attention
Procedia PDF Downloads 159
4826 Integrated Model for Enhancing Data Security Processing Time in Cloud Computing
Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali
Abstract:
Cloud computing is an important and promising field in the recent decade. Cloud computing allows sharing resources, services and information among the people of the whole world. Although the advantages of using clouds are great, there are many risks in a cloud. Data security is the most important and critical problem of cloud computing. In this research, a new security model for cloud computing is proposed to ensure a secure communication system, hide information from other users and save the user's time. In this proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For the user authentication process, a simple username and password scheme is used; the password is protected with SHA-2 as a one-way hash. The proposed system shows an improvement in the processing time of uploading and downloading files on the cloud in secure form. Keywords: cloud computing, data security, SAAS, PAAS, IAAS, Blowfish
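The two primitives named in the model can be combined in a few lines of Python: the sketch below uses PyCryptodome for Blowfish in CBC mode and hashlib's SHA-256 (a member of the SHA-2 family) for an integrity digest. The key size, cipher mode and padding choices are illustrative assumptions rather than the paper's exact parameters.

```python
# Sketch: Blowfish (CBC) for confidentiality plus SHA-256 (a SHA-2 hash) for integrity.
# Requires PyCryptodome. Key size, mode and padding choices are illustrative assumptions.
import hashlib
from Crypto.Cipher import Blowfish
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

key = get_random_bytes(16)                        # 128-bit Blowfish key
data = b"file contents to be stored in the cloud"

digest = hashlib.sha256(data).hexdigest()         # integrity check value kept alongside

cipher = Blowfish.new(key, Blowfish.MODE_CBC)
ciphertext = cipher.iv + cipher.encrypt(pad(data, Blowfish.block_size))

# Download/verification side: decrypt, then recompute and compare the digest.
iv, body = ciphertext[:8], ciphertext[8:]
plain = unpad(Blowfish.new(key, Blowfish.MODE_CBC, iv).decrypt(body), Blowfish.block_size)
assert hashlib.sha256(plain).hexdigest() == digest
```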
Procedia PDF Downloads 359