Search results for: radiation processing
3546 Sentiment Analysis of Chinese Microblog Comments: Comparison between Support Vector Machine and Long Short-Term Memory
Authors: Xu Jiaqiao
Abstract:
Text sentiment analysis is an important branch of natural language processing. This technology is widely used in public opinion analysis and web browsing recommendations. At present, the mainstream sentiment analysis methods fall into three categories: those based on a sentiment dictionary, those based on traditional machine learning, and those based on deep learning. This paper analyzes and compares the advantages and disadvantages of the SVM method of traditional machine learning and the Long Short-Term Memory (LSTM) method of deep learning in the field of Chinese sentiment analysis, using Chinese comments on Sina Microblog as the data set. Firstly, this paper classifies and adds labels to the original comment dataset obtained by the web crawler, and then uses Jieba word segmentation to segment the original dataset and remove stop words. After that, this paper extracts text feature vectors and builds document word vectors to facilitate the training of the models. Finally, the SVM and LSTM models are trained respectively. The resulting accuracy of the LSTM model is 85.80%, while the accuracy of the SVM is 91.07%. At the same time, the LSTM model needs only 2.57 seconds to run, whereas the SVM model needs 6.06 seconds. Therefore, this paper concludes that, compared with the SVM model, the LSTM model is less accurate but faster in processing speed.
Keywords: sentiment analysis, support vector machine, long short-term memory, Chinese microblog comments
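As a rough, illustrative sketch of the pipeline described above (not the authors' code: the CSV file, column names and stop-word list are placeholders, and pandas, Jieba, scikit-learn and Keras are assumed to be available), Jieba segmentation can feed both a TF-IDF + SVM baseline and a small LSTM classifier:

```python
import jieba
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

df = pd.read_csv("weibo_comments.csv")          # placeholder file with 'text' and 'label' columns
stop = set(open("stopwords.txt", encoding="utf-8").read().split())
df["tokens"] = df["text"].apply(lambda t: [w for w in jieba.lcut(t) if w not in stop])
X_tr, X_te, y_tr, y_te = train_test_split(df["tokens"], df["label"], test_size=0.2)

# SVM on TF-IDF document vectors built from the pre-segmented tokens
tfidf = TfidfVectorizer(analyzer=lambda toks: toks)
svm = LinearSVC().fit(tfidf.fit_transform(X_tr), y_tr)
print("SVM accuracy:", accuracy_score(y_te, svm.predict(tfidf.transform(X_te))))

# LSTM on padded word-index sequences
tok = Tokenizer()
tok.fit_on_texts(X_tr)
pad = lambda s: pad_sequences(tok.texts_to_sequences(s), maxlen=50)
lstm = Sequential([Embedding(len(tok.word_index) + 1, 64), LSTM(64), Dense(1, activation="sigmoid")])
lstm.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
lstm.fit(pad(X_tr), y_tr.to_numpy(), epochs=3, batch_size=64, verbose=0)
print("LSTM accuracy:", lstm.evaluate(pad(X_te), y_te.to_numpy(), verbose=0)[1])
```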
Procedia PDF Downloads 98
3545 Waveguiding in an InAs Quantum Dots Nanomaterial for Scintillation Applications
Authors: Katherine Dropiewski, Michael Yakimov, Vadim Tokranov, Allan Minns, Pavel Murat, Serge Oktyabrsky
Abstract:
InAs Quantum Dots (QDs) in a GaAs matrix are a well-documented luminescent material with high light yield, as well as thermal and ionizing radiation tolerance due to quantum confinement. These benefits can be leveraged for high-efficiency, room temperature scintillation detectors. The proposed scintillator is composed of InAs QDs acting as luminescence centers in a GaAs stopping medium, which also acts as a waveguide. This system has appealing potential properties, including high light yield (~240,000 photons/MeV) and fast capture of photoelectrons (2-5 ps), orders of magnitude better than currently used inorganic scintillators, such as LYSO or BaF2. The high refractive index of the GaAs matrix (n = 3.4) ensures light emitted by the QDs is waveguided and can be collected by an integrated photodiode (PD). Scintillation structures were grown using Molecular Beam Epitaxy (MBE) and consist of thick GaAs waveguiding layers with embedded sheets of modulation p-type doped InAs QDs. An AlAs sacrificial layer is grown between the waveguide and the GaAs substrate for epitaxial lift-off to separate the scintillator film and transfer it to a low-index substrate for waveguiding measurements. One consideration when using a relatively low-density material like GaAs (~5.32 g/cm³) as a stopping medium is the matrix thickness in the dimension of radiation collection. Therefore, luminescence properties of very thick (4-20 microns) waveguides with up to 100 QD layers were studied. The optimization of the medium included QD shape, density, doping, and AlGaAs barriers at the waveguide surfaces to prevent non-radiative recombination. To characterize the efficiency of QD luminescence, temperature-dependent photoluminescence (PL) (77-450 K) was measured and fitted using a kinetic model. The PL intensity degrades by only 40% at RT, with an activation energy for electron escape from the QDs to the barrier of ~60 meV. Attenuation within the waveguide (WG) is a limiting factor for the lateral size of a scintillation detector, so PL spectroscopy in the waveguiding configuration was studied. Spectra were measured while the laser (630 nm) excitation point was scanned away from the collecting fiber coupled to the edge of the WG. The QD ground state PL peak at 1.04 eV (1190 nm) was inhomogeneously broadened with a FWHM of 28 meV (33 nm) and showed a distinct red-shift due to self-absorption in the QDs. Attenuation stabilized at about 3 cm⁻¹ after the light had traveled over 1 mm through the WG. Finally, a scintillator sample was used to test detection and evaluate timing characteristics using 5.5 MeV alpha particles. With a 2D waveguide and a small area of integrated PD, the collected charge averaged 8.4 × 10⁴ electrons, corresponding to a collection efficiency of about 7%. The scintillation response had 80 ps noise-limited time resolution and a QD decay time of 0.6 ns. The data confirm the unique properties of this scintillation detector, which can potentially be much faster than any currently used inorganic scintillator.
Keywords: GaAs, InAs, molecular beam epitaxy, quantum dots, III-V semiconductor
Procedia PDF Downloads 257
3544 Determination of the Thermally Comfortable Air Temperature with Consideration of Individual Clothing and Activity as Preparation for a New Smart Home Heating System
Authors: Alexander Peikos, Carole Binsfeld
Abstract:
The aim of this paper is to determine a thermally comfortable air temperature in an automated living room. This calculated temperature should serve as input for a user-specific and dynamic heating control in such a living space. In addition to the usual physical factors (air temperature, humidity, air velocity, and radiation temperature), individual clothing and activity should be taken into account. The calculation of such a temperature is based on different methods and indices which are usually used for the evaluation of thermal comfort. The thermal insulation of the worn clothing is determined with a Radio Frequency Identification system. The activity performed is taken into account only indirectly, through the generated heart rate. All these methods are ultimately very well suited for use in temperature regulation in an automated home but still require further research and extensive evaluation.
Keywords: smart home, thermal comfort, predicted mean vote, radio frequency identification
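A minimal sketch of how RFID-detected garments and a heart-rate-derived activity level could feed such a calculation is shown below; the garment clo values, the heart-rate-to-met mapping and the simple comfort rule are illustrative assumptions only, standing in for the PMV-type indices the paper actually relies on:

```python
# illustrative garment insulation values in clo (assumed, ISO-9920-style table entries)
GARMENT_CLO = {"t_shirt": 0.09, "long_sleeve_shirt": 0.25, "trousers": 0.25,
               "sweater": 0.36, "jacket": 0.44, "socks": 0.02}

def clothing_insulation(detected_tags):
    """Total clothing insulation as the sum of the clo values of RFID-detected garments."""
    return sum(GARMENT_CLO.get(tag, 0.0) for tag in detected_tags)

def metabolic_rate_from_heart_rate(hr_bpm, hr_rest=65.0):
    """Very rough linear mapping from heart rate to metabolic rate in met (assumption)."""
    return max(1.0, 1.0 + 0.02 * (hr_bpm - hr_rest))

def comfortable_air_temperature(clo, met, base=25.5):
    """Placeholder rule: more insulation or activity lowers the target air temperature;
    a real controller would evaluate the PMV equation with all six comfort inputs."""
    return base - 3.0 * (clo - 0.5) - 3.0 * (met - 1.0)

tags = ["long_sleeve_shirt", "trousers", "sweater", "socks"]
print(comfortable_air_temperature(clothing_insulation(tags),
                                  metabolic_rate_from_heart_rate(85)))
```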
Procedia PDF Downloads 162
3543 Development of a Non-Dispersive Infrared Multi Gas Analyzer for a TMS
Authors: T. V. Dinh, I. Y. Choi, J. W. Ahn, Y. H. Oh, G. Bo, J. Y. Lee, J. C. Kim
Abstract:
A Non-Dispersive Infrared (NDIR) multi-gas analyzer has been developed to monitor the emission of carbon monoxide (CO) and sulfur dioxide (SO2) from various industries. The NDIR technique for gas measurement is based on wavelength-selective absorption in the infrared spectrum to detect particular gases. NDIR analyzers have been widely applied in Tele-Monitoring Systems (TMS). The advantages of the NDIR analyzer are its low energy consumption and cost compared with other spectroscopic methods. However, zero/span drift and interference are urgent issues to be solved. In this work, a multi-pathway technique based on an optical White cell was employed to improve the sensitivity of the analyzer. A pyroelectric detector was used to detect the infrared radiation. The analytical range of the analyzer was 0-200 ppm. The instrument response time was < 2 min. The detection limits of CO and SO2 were < 4 ppm and < 6 ppm, respectively. The zero and span drift over 24 h was less than 3%. The linearity of the analyzer was within 2.5% of reference values. The precision and accuracy of both the CO and SO2 channels were < 2.5% relative standard deviation. In general, the analyzer performed well. However, the detection limit and 24 h drift should be improved for the instrument to be more competitive.
Keywords: analyzer, CEMS, monitoring, NDIR, TMS
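The concentration retrieval behind each NDIR channel follows the Beer-Lambert law, with the White cell's multiple passes simply multiplying the effective optical path length; a minimal sketch (the absorption coefficient, path length and pass count below are illustrative assumptions, not calibration data from this instrument):

```python
import numpy as np

def ndir_concentration(I, I0, alpha, L_single, n_passes):
    """Beer-Lambert retrieval: I = I0 * exp(-alpha * L * c), so
    c = ln(I0 / I) / (alpha * L), with L the total multi-pass path length."""
    L = L_single * n_passes
    return np.log(I0 / I) / (alpha * L)

# illustrative numbers only: 0.25 m base cell, 20 passes in the White cell,
# alpha chosen so that ~100 ppm of CO gives about 1.5% absorption
alpha = 3.0e-5          # 1/(m*ppm), assumed effective absorption coefficient
I0, I = 1.000, 0.985    # reference and measured detector signals
print("CO concentration ~ %.1f ppm" % ndir_concentration(I, I0, alpha, 0.25, 20))
```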
Procedia PDF Downloads 261
3542 Profiling Risky Code Using Machine Learning
Authors: Zunaira Zaman, David Bohannon
Abstract:
This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as increased confidence scores, tuning of false positives and negatives, and automated feedback. The initial approach, using natural language processing techniques to extract features, achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and to generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine-learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite was used to predict specific vulnerabilities such as OS-Command Injection, cryptographic, and Cross-Site Scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS-Command Injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques and CNN models with ensemble modelling techniques did not generalize well on unseen data and faced overfitting issues. However, predicting vulnerabilities in source code using machine learning poses challenges such as the high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.
Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties
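As a greatly simplified, hypothetical illustration of the idea (real Code2Vec learns from leaf-to-leaf AST paths with an attention mechanism, and the snippets below are toy Python rather than the Java/C++ corpora used in the study), path-like AST contexts can be vectorized and fed to an ordinary classifier:

```python
import ast
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

def root_path_contexts(source):
    """Greatly simplified stand-in for Code2Vec path-contexts: each identifier
    leaf is paired with the chain of AST node types from the module root."""
    contexts = []
    def walk(node, path):
        path = path + [type(node).__name__]
        if isinstance(node, ast.Name):
            contexts.append(node.id + "|" + ">".join(path))
        for child in ast.iter_child_nodes(node):
            walk(child, path)
    walk(ast.parse(source), [])
    return " ".join(contexts)

# toy labeled snippets: 1 = risky (shell command built from input), 0 = benign
snippets = [("import os\nos.system('ls ' + user_input)", 1),
            ("def add(a, b):\n    return a + b", 0),
            ("import subprocess\nsubprocess.call(cmd + arg, shell=True)", 1),
            ("total = sum(values)\nprint(total)", 0)]
X_text = [root_path_contexts(src) for src, _ in snippets]
y = [label for _, label in snippets]

vec = CountVectorizer(token_pattern=r"\S+")
clf = LogisticRegression().fit(vec.fit_transform(X_text), y)
probe = root_path_contexts("os.system('rm -rf ' + path)")
print(clf.predict(vec.transform([probe])))   # expected: flagged as risky
```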
Procedia PDF Downloads 111
3541 Extracting the Coupled Dynamics in Thin-Walled Beams from Numerical Data Bases
Authors: Mohammad A. Bani-Khaled
Abstract:
In this work, we use the Discrete Proper Orthogonal Decomposition transform to characterize the properties of coupled dynamics in thin-walled beams by exploiting numerical simulations obtained from finite element simulations. The outcomes of the analysis will improve our understanding of the linear and nonlinear coupled behavior of thin-walled beam structures. Thin-walled beams have widespread usage in modern engineering applications, in both large-scale structures (aeronautical structures) and nano-structures (nano-tubes). Therefore, detailed knowledge of the properties of coupled vibrations and buckling in these structures is of great interest to the research community. Due to the geometric complexity of the overall structure, and in particular of the cross-sections, it is necessary to involve computational mechanics to numerically simulate the dynamics. In using numerical computational techniques, it is not necessary to oversimplify a model in order to solve the equations of motion. Computational dynamics methods produce databases of controlled resolution in time and space. These numerical databases contain information on the properties of the coupled dynamics. In order to extract the system dynamic properties and the strength of coupling among the various fields of the motion, processing techniques are required. The time Proper Orthogonal Decomposition transform is a powerful tool for processing such databases. It will be used to study the coupled dynamics of thin-walled basic structures. These structures are ideal to form a basis for a systematic study of coupled dynamics in structures of complex geometry.
Keywords: coupled dynamics, geometric complexity, proper orthogonal decomposition (POD), thin walled beams
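A compact way to see what the discrete POD does with such a numerical database is the snapshot-SVD formulation sketched below (generic code with synthetic data, not the authors' finite-element output): columns of the snapshot matrix are time instants, the left singular vectors are the POD modes, and the squared singular values rank how much of the coupled motion each mode captures.

```python
import numpy as np

# snapshot matrix: each column is the nodal field (displacements, rotations, warping...)
# at one time instant; random low-rank data stands in for the exported FE time history
n_dof, n_snap = 600, 200
rng = np.random.default_rng(1)
U_field = rng.standard_normal((n_dof, 3)) @ rng.standard_normal((3, n_snap))  # rank-3 toy data
U_field += 0.01 * rng.standard_normal((n_dof, n_snap))                        # small noise

X = U_field - U_field.mean(axis=1, keepdims=True)    # subtract the temporal mean
Phi, s, Vt = np.linalg.svd(X, full_matrices=False)   # Phi: POD modes, s: singular values

energy = s**2 / np.sum(s**2)                          # fraction of "energy" per mode
print("leading modal energies:", np.round(energy[:5], 3))
a = Phi[:, :3].T @ X                                   # time coefficients of the first 3 modes
```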
Procedia PDF Downloads 421
3540 Synthesis of Bismuth-Hyaluronic Acid Nanoparticles Containing Melittin Coated with Chitosan for Treating Eye Cancer Cells with Radiotherapy
Authors: Akbar Esmaeili, Fateme Dadashi
Abstract:
Bismuth can enhance the effect of radiation and thereby reduce the dose required in radiotherapy. Hyaluronic acid, on the other hand, plays a role in healing damaged cells, and melittin has been used to destroy cancer cells. This research aims to destroy eye cancer cells and accelerate the recovery of damaged healthy cells during treatment. In this research, the nanoparticles were prepared using the sol-gel method. Through the optimization process that was carried out, we obtained the optimal values of the desired variables for the manufacture of the nanoparticles. The advantages of this approach are a reduction in the amount of drug used and, as a result, fewer side effects during treatment; the use of melittin as an anti-eye-cancer drug; the presence of hyaluronic acid to accelerate the recovery of cells; and the coating of the bismuth nanoparticle with chitosan to increase the half-life of the nanoparticle and prevent its adhesion.
Keywords: synthesis, nanoparticles, coated, cancer
Procedia PDF Downloads 73
3539 Automatic Classification of Lung Diseases from CT Images
Authors: Abobaker Mohammed Qasem Farhan, Shangming Yang, Mohammed Al-Nehari
Abstract:
Pneumonia is a kind of lung disease that creates congestion in the chest. Severe congestion from such pneumonic conditions can lead to loss of life. Pneumonic lung disease is caused by viral pneumonia, bacterial pneumonia, or COVID-19-induced pneumonia. The early prediction and classification of such lung diseases help to reduce the mortality rate. We propose an automatic Computer-Aided Diagnosis (CAD) system in this paper using a deep learning approach. The proposed CAD system takes raw computerized tomography (CT) scans of the patient's chest as input and automatically predicts the disease classification. We designed the Hybrid Deep Learning Algorithm (HDLA) to improve accuracy and reduce processing requirements. The raw CT scans are pre-processed first to enhance their quality for further analysis. We then apply a hybrid model that consists of automatic feature extraction and classification. We propose a robust 2D Convolutional Neural Network (CNN) model to extract features automatically from the pre-processed CT images. This CNN model ensures feature learning with extremely effective 1D feature extraction for each input CT image. The outcome of the 2D CNN model is then normalized using the Min-Max technique. The second step of the proposed hybrid model concerns training and classification using different classifiers. The simulation outcomes using a publicly available dataset prove the robustness and efficiency of the proposed model compared to state-of-the-art algorithms.
Keywords: CT scan, COVID-19, deep learning, image processing, lung disease classification
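The normalization step mentioned above is the standard Min-Max rescaling of each extracted feature to [0, 1] before the classifier stage; a small sketch with a stand-in feature matrix (not real CT-derived CNN features) is:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

# stand-in for the 1D feature vectors produced by the 2D CNN (one row per CT scan)
rng = np.random.default_rng(0)
features = rng.normal(loc=5.0, scale=3.0, size=(120, 64))
labels = rng.integers(0, 3, size=120)        # toy labels: 0 viral, 1 bacterial, 2 COVID-19

scaler = MinMaxScaler()                       # x' = (x - min) / (max - min), per feature
X = scaler.fit_transform(features)
clf = SVC().fit(X[:100], labels[:100])        # any downstream classifier can follow
print("held-out accuracy on toy data:", clf.score(X[100:], labels[100:]))
```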
Procedia PDF Downloads 163
3538 MXene-Based Self-Sensing of Damage in Fiber Composites
Authors: Latha Nataraj, Todd Henry, Micheal Wallock, Asha Hall, Christine Hatter, Babak Anasori, Yury Gogotsi
Abstract:
Multifunctional composites with enhanced strength and toughness for superior damage tolerance are essential for advanced aerospace and military applications. Detection of structural changes prior to visible damage may be achieved by incorporating fillers with tunable properties, such as two-dimensional (2D) nanomaterials with high aspect ratios and more surface-active sites. While 2D graphene, with its large surface area, good mechanical properties, and high electrical conductivity, seems ideal as a filler, its single-atomic thickness can lead to bending and rolling during processing, requiring post-processing to bond to polymer matrices. An emerging family of 2D transition metal carbides and nitrides, MXenes, has attracted much attention since their discovery in 2011. Metallic electronic conductivity and good mechanical properties, even with increased polymer content, coupled with hydrophilicity, make MXenes a good candidate as a filler material in polymer composites and exceptional as multifunctional damage indicators in composites. Here, we systematically study MXene (Ti₃C₂) coatings on glass fibers for fiber-reinforced polymer composites for self-sensing, using microscopy and micromechanical testing. Further testing is in progress through the investigation of local variations in optical, acoustic, and thermal properties within the damage sites in response to strain caused by mechanical loading.
Keywords: damage sensing, fiber composites, MXene, self-sensing
Procedia PDF Downloads 123
3537 Mobile Augmented Reality for Collaboration in Operation
Authors: Chong-Yang Qiao
Abstract:
Mobile augmented reality (MAR) tracks targets in the surroundings and aids operators through interactive visualization of data and procedures, making equipment and systems easier to understand. Operators remotely communicate and coordinate with each other for continuous tasks and for information and data exchange between the control room and the work-site. In routine work, distributed control system (DCS) monitoring and work-site manipulation require operators to interact in real time. The critical question is how to improve the user experience of cooperative work by applying augmented reality in the traditional industrial field. The purpose of this exploratory study is to find a cognitive model for multiple-task performance with MAR. In particular, the focus is on the comparison between different tasks and environment factors which influence information processing. Three experiments used interface and interaction designs with start-up, maintenance, and stop procedures embedded in the mobile application. With time demands and human errors as evaluation criteria, and with analysis of the mental processes and behavioral actions during the multiple tasks, heuristic evaluation was used to assess operator performance under different situational factors and to record the information processing involved in recognition, interpretation, judgment, and reasoning. The research will identify the functional properties of MAR and constrain the development of the cognitive model. Conclusions can be drawn suggesting that MAR is easy to use and useful for operators in remote collaborative work.
Keywords: mobile augmented reality, remote collaboration, user experience, cognition model
Procedia PDF Downloads 198
3536 Motion of a Dust Grain Type Particle in Binary Stellar Systems
Authors: Rajib Mia, Badam Singh Kushvah
Abstract:
In this paper, we use the photogravitational version of the restricted three-body problem (RTBP) in binary systems. In the photogravitational RTBP, an infinitesimal particle (dust grain) moves under the gravitational attraction and radiation pressure of the two bigger primaries. The third particle does not affect the motion of the two bigger primaries. The zero-velocity curves, zero-velocity surfaces, and their projections on the plane are studied. We have used an existing analytical method to solve the equations of motion. We have obtained the Lagrangian points in some binary stellar systems. It is found that the mass reduction factor affects the Lagrangian points. The linear stability of the Lagrangian points is studied, and it is found that these points are unstable. Moreover, trajectories of the infinitesimal particle at the triangular points are studied.
Keywords: binary systems, Lagrangian points, linear stability, photogravitational RTBP, trajectories
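In the dimensionless rotating frame, the photogravitational RTBP modifies the classical effective potential only through mass-reduction factors q1 and q2 multiplying the two gravitational terms; the sketch below (with illustrative values of the mass ratio and of q1, q2) locates the collinear equilibrium between the primaries by a root search on dΩ/dx along the x-axis and evaluates the corresponding zero-velocity (Jacobi) constant.

```python
import numpy as np
from scipy.optimize import brentq

mu = 0.3           # mass ratio of the binary (illustrative)
q1, q2 = 0.9, 1.0  # mass-reduction factors (1 = no radiation pressure)

def omega_x(x):
    """d(Omega)/dx along the x-axis (y = 0) for
    Omega = (x^2 + y^2)/2 + q1*(1-mu)/r1 + q2*mu/r2."""
    r1, r2 = abs(x + mu), abs(x - 1.0 + mu)
    return (x - q1 * (1.0 - mu) * (x + mu) / r1**3
              - q2 * mu * (x - 1.0 + mu) / r2**3)

# the collinear (L1-type) point lies between the primaries at x = -mu and x = 1 - mu
xL1 = brentq(omega_x, -mu + 1e-6, 1.0 - mu - 1e-6)

def jacobi_constant(x, y):
    r1 = np.hypot(x + mu, y)
    r2 = np.hypot(x - 1.0 + mu, y)
    return (x**2 + y**2) + 2.0 * q1 * (1.0 - mu) / r1 + 2.0 * q2 * mu / r2

print("L1-type point at x = %.6f, C_J = %.6f" % (xL1, jacobi_constant(xL1, 0.0)))
```

Varying q1 and q2 in this sketch shifts the equilibrium location, which is the qualitative effect of the mass reduction factor reported in the abstract.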
Procedia PDF Downloads 259
3535 A 3D Bioprinting System for Engineering Cell-Embedded Hydrogels by Digital Light Processing
Authors: Jimmy Jiun-Ming Su, Yuan-Min Lin
Abstract:
Bioprinting has been applied to produce 3D cellular constructs for tissue engineering. Microextrusion printing is the most commonly used method. However, printing low-viscosity bioink is a challenge for this method. Herein, we developed a new 3D printing system to fabricate cell-laden hydrogels via a DLP-based projector. The bioprinter is assembled from affordable equipment, including a stepper motor, a screw, an LED-based DLP projector, and open-source computer hardware and software. The system can use low-viscosity, photo-polymerized bioink to fabricate 3D tissue mimics in a layer-by-layer manner. In this study, we used gelatin methacrylate (GelMA) as the bioink for stem cell encapsulation. In order to reinforce the printed construct, surface-modified hydroxyapatite was added to the bioink. We demonstrated that the silanization of hydroxyapatite could improve the crosslinking at the interface between hydroxyapatite and GelMA. The results showed that the incorporation of silanized hydroxyapatite into the bioink had an enhancing effect on the mechanical properties of the printed hydrogel; in addition, the hydrogel had low cytotoxicity and promoted the differentiation of embedded human bone marrow stem cells (hBMSCs) and retinal pigment epithelium (RPE) cells. Moreover, this bioprinting system has the ability to generate microchannels inside the engineered tissues to facilitate the diffusion of nutrients. We believe this 3D bioprinting system has the potential to fabricate various tissues for clinical applications and regenerative medicine in the future.
Keywords: bioprinting, cell encapsulation, digital light processing, GelMA hydrogel
Procedia PDF Downloads 185
3534 Development of Mesoporous Gel Based Nonwoven Structure for Thermal Barrier Application
Authors: R. P. Naik, A. K. Rakshit
Abstract:
In recent years, with the rapid development of science and technology, people have placed increasing requirements on clothing to perform new functions, which creates opportunities for the further development and incorporation of new technologies along with novel materials. In this context, textiles act as media of fast heat absorption or fast heat radiation as far as the comfort of textile articles is concerned. The main bottlenecks that prevent textile materials from succeeding as thermal insulation materials can be enumerated as follows. Firstly, a high loft or bulkiness of material is needed to provide a predetermined amount of insulation by ensuring sufficient trapping of air. Secondly, the insulation depends on forced convection; such convective heat loss cannot be prevented by the textile material. Thirdly, the textile alone cannot reach a thermal conductivity lower than that of air (0.025 W/m·K). Perhaps nano-fibers can do so, but mass production and cost-effectiveness remain a problem. Finally, such high-loft materials for thermal insulation become heavy and difficult to manage, especially when they have to be carried on the body. The proposed work aims at developing lightweight, effective thermal insulation textiles in combination with nanoporous silica gel, which provides the fundamental basis for the optimization of material properties to achieve good performance of the clothing system. This flexible nonwoven silica-gel composite fabric, with the gel in an intact monolith, was successfully developed by reinforcing SiO₂ gel in a thermally bonded nonwoven fabric via sol-gel processing. The ambient pressure drying method was chosen for silica gel preparation for cost-effective manufacturing. The structure of the formed nonwoven/SiO₂-gel composites was analyzed, and the transfer properties were measured. The effects of structure and fibre on the thermal properties of the SiO₂-gel composites were evaluated. The samples were then tested against untreated samples of the same GSM in order to study the effect of the SiO₂-gel application on various properties of the nonwoven fabric. The nonwoven fabric composites reinforced with aerogel, which showed an intact monolith structure, were also analyzed for their surface structure, the functional groups present, and their microscopic appearance. The developed product reveals a significant reduction in pore size and air permeability compared with the conventional nonwoven fabric. The composite made from polyester fibre with lower GSM shows the lowest thermal conductivity. The results obtained were statistically analyzed using STATISTICA-6 software for their level of significance. Univariate tests of significance for various parameters were performed, giving the p-values for analyzing the significance level, and the regression summaries for the dependent variables were also studied to obtain the correlation coefficients.
Keywords: silica-gel, heat insulation, nonwoven fabric, thermal barrier clothing
Procedia PDF Downloads 114
3533 Investigation of the Catalyst's Effect on Nickel Sulfide Thin Films
Authors: Randa Slatnia
Abstract:
In this study, a nanostructured stable phase was identified in films elaborated from nickel nitrate hexahydrate and thiourea compounds. After the preparation of the solution (a stirred mixture with methanol as the solvent), eight layers of this solution were deposited on a glass substrate and annealed at 300 °C for energy applications. The annealed sample was analyzed by grazing incidence X-ray diffraction (GID) with a Bruker D8 Advance diffractometer using Cu Kα1 radiation at 40 kV and 40 mA (1600 W) and by scanning electron microscopy (Thermo Fisher environmental SEM). The XRD-GID analysis of the prepared sample showed the formation of an identified stable NiS2 phase, and the XRD-GID pattern of the sample elaborated with eight layers of the prepared solution and annealed shows wide, characteristic peaks of NiS2 with a cubic structure (ICDD card no. PDF 01-078-4702). The morphology of the NiS2 thin films confirmed by the XRD-GID analysis was investigated by ESEM, which showed a surface with a uniform and homogeneous nanostructure distribution.
Keywords: nickel sulfide, thin films, XRD, ESEM
Procedia PDF Downloads 89
3532 The Photon-Drag Effect in Cylindrical Quantum Wire with a Parabolic Potential
Authors: Hoang Van Ngoc, Nguyen Thu Huong, Nguyen Quang Bau
Abstract:
Using the quantum kinetic equation for electrons interacting with acoustic phonons, the density of the constant current associated with the drag of charge carriers in a cylindrical quantum wire by a linearly polarized electromagnetic wave, a DC electric field, and a laser radiation field is calculated. The density of the constant current is studied as a function of the frequency of the electromagnetic wave, as well as the frequency of the laser field and the basic elements of a quantum wire with a parabolic potential. The analytic expression for the constant current density is numerically evaluated and plotted for specific GaAs/AlGaAs quantum wires to show the dependence of the constant current density on the above parameters. All these results for the quantum wire are compared with those for bulk semiconductors and superlattices to show the differences.
Keywords: the photon-drag effect, the constant current density, quantum wire, parabolic potential
Procedia PDF Downloads 425
3531 Dairy Products on the Algerian Market: Proportion of Imitation and Degree of Processing
Authors: Bentayeb-Ait Lounis Saïda, Cheref Zahia, Cherifi Thizi, Ri Kahina Bahmed, Kahina Hallali Yasmine Abdellaoui, Kenza Adli
Abstract:
Algeria is the leading consumer of dairy products in North Africa. This is a fact. However, the nutritional quality of these products remains unknown. The aim of this study is to characterise the dairy products available on the Algerian market in order to assess whether they constitute a healthy and safe choice. To do this, we collected data on the labelling of 390 dairy products, including cheese, yoghurt, UHT milk and milk drinks, infant formula and dairy creams. We assessed their degree of processing according to the NOVA classification, as well as the proportion of imitation products. The study was carried out between March 2020 and August 2023. The results show that 88% are ultra-processed: 84% for 'cheese', 92% for dairy creams, 92% for 'yoghurt', 100% for infant formula, 92% for margarines and 36% for UHT milk/dairy drinks. As for imitation/analogue dairy products, the study revealed the following proportions: 100% for infant formula, 78% for butter/margarine, 18% for UHT milk/milk-based drinks, 54% for cheese, 2% for camembert and 75% for dairy cream. The harmful effects of consuming ultra-processed products on long-term health are increasingly documented in dozens of publications. The findings of this study sound the alarm about the health risks to which Algerian consumers are exposed. Various scientific, economic and industrial bodies need to be involved in order to safeguard consumer health in both the short and long term. Food awareness and education campaigns should be organised.
Keywords: dairy, UPF, NOVA, yoghurt, cheese
Procedia PDF Downloads 42
3530 Agile Smartphone Porting and App Integration of Signal Processing Algorithms Obtained through Rapid Development
Authors: Marvin Chibuzo Offiah, Susanne Rosenthal, Markus Borschbach
Abstract:
Certain research projects in Computer Science often involve research on existing signal processing algorithms and the development of improvements to them. Research budgets are usually limited, hence there is limited time for implementing the algorithms from scratch. It is therefore common practice to use implementations provided by other researchers as a template. These are most commonly provided in a rapid development, i.e. 4th generation, programming language, usually Matlab. Rapid development is a common method in Computer Science research for quickly implementing and testing newly developed algorithms, and it is also a common task within agile project organization. The growing relevance of mobile devices in the computer market also gives rise to the need to demonstrate the successful executability and performance measurement of these algorithms on a mobile device operating system and processor, particularly on a smartphone. Open mobile systems, such as Android, are most suitable for this task, which is to be performed as efficiently as possible. Furthermore, efficiently implementing an interaction between the algorithm and a graphical user interface (GUI) that runs exclusively on the mobile device is necessary in cases where the project's goal statement also includes such a task. This paper examines different proposed solutions for porting computer algorithms obtained through rapid development into a GUI-based smartphone Android app and evaluates their feasibility. Accordingly, the feasible methods are tested and a short success report is given for each tested method.
Keywords: SMARTNAVI, Smartphone, App, Programming languages, Rapid Development, MATLAB, Octave, C/C++, Java, Android, NDK, SDK, Linux, Ubuntu, Emulation, GUI
Procedia PDF Downloads 480
3529 Redefining Solar Generation Estimation: A Comprehensive Analysis of Real Utility Advanced Metering Infrastructure (AMI) Data from Various Projects in New York
Authors: Haowei Lu, Anaya Aaron
Abstract:
Understanding historical solar generation and forecasting future solar generation from interconnected Distributed Energy Resources (DER) is crucial for utility planning and interconnection studies. The existing methodology, which relies on solar radiation, weather data, and common inverter models, is becoming less accurate. Rapid advancements in DER technologies have resulted in more diverse project sites, deviating from common patterns due to various factors such as DC/AC ratio, solar panel performance, tilt angle, and the presence of DC-coupled battery energy storage systems. In this paper, the authors review 10,000 DER projects within the system and analyze the Advanced Metering Infrastructure (AMI) data for various project types to demonstrate the impact of different parameters. An updated methodology is proposed for redefining historical and future solar generation in distribution feeders.
Keywords: photovoltaic system, solar energy, fluctuations, energy storage, uncertainty
Procedia PDF Downloads 37
3528 An Analytical Systematic Design Approach to Evaluate Ballistic Performance of Armour Grade AA7075 Aluminium Alloy Using Friction Stir Processing
Authors: Lahari Ramya P., Sudhakar I., Madhu V., Madhusudhan Reddy G., Srinivasa Rao E.
Abstract:
The selection of suitable armour materials for defence applications is crucial with respect to increasing the mobility of the systems as well as maintaining safety. Therefore, determining the material with the lowest possible areal density that successfully resists the predefined threat is required in armour design studies. A number of light metals and alloys have come to the forefront, especially as substitutes for armour grade steels. AA5083 aluminium alloy, which meets the military standards imposed by the US army, is the foremost nonferrous alloy to consider for the possible replacement of steel to increase the mobility of armour vehicles and enhance fuel economy. The growing need for AA5083 aluminium alloy paves the way for developing supplementary aluminium alloys that maintain the military standards. AA2xxx, AA6xxx, and AA7xxx aluminium alloys have been identified as potential materials to supplement AA5083 aluminium alloy. Among these aluminium series alloys, the heat-treatable AA7xxx aluminium alloy possesses high strength and can compete with armour grade steels. Earlier investigations revealed that layering of AA7xxx aluminium alloy can prevent spalling of the rear portion of the armour during ballistic impacts. Hence, the present investigation deals with the fabrication of a hard layer (made of boron carbide) on AA7075 aluminium alloy using friction stir processing, with the intention of blunting the projectile in the initial impact, backed by a tough portion (AA7xxx aluminium alloy) to dissipate the residual kinetic energy. An analytical approach has been adopted to describe the ballistic performance against the projectile. Penetration of the projectile into the armour has been resolved by strain energy model analysis. The perforation shearing area, i.e., the interface of projectile and armour, is taken into account for the evaluation of penetration into the armour. The fabricated surface composites (targets) were tested as per the military standard (JIS.0108.01) in a ballistic testing tunnel at the Defence Metallurgical Research Laboratory (DMRL), Hyderabad, under standardized testing conditions. The analytical results were well validated with the experimentally obtained ones.
Keywords: AA7075 aluminium alloy, friction stir processing, boron carbide, ballistic performance, target
Procedia PDF Downloads 333
3527 LES Simulation of a Thermal Plasma Jet with Modeled Anode Arc Attachment Effects
Authors: N. Agon, T. Kavka, J. Vierendeels, M. Hrabovský, G. Van Oost
Abstract:
A plasma jet model was developed with a rigorous method for calculating the thermophysical properties of the gas mixture without mixing rules. A simplified model approach to account for the anode effects was incorporated in this model to allow the validation of the simulations against experimental results. The radial heat transfer was under-predicted by the model because of the limitations of the radiation model, but the calculated evolution of centerline temperature, velocity, and gas composition downstream of the torch exit corresponded well with the measured values. The CFD modeling of thermal plasmas is focused either on the development of the plasma arc or on the flow of the plasma jet outside of the plasma torch. In the former case, the Maxwell equations are coupled with the Navier-Stokes equations to account for the electromagnetic effects which control the movements of the anode arc attachment. In plasma jet simulations, however, the computational domain starts from the exit nozzle of the plasma torch, and the influence of the arc attachment fluctuations on the plasma jet flow field is not included in the calculations. In that case, the thermal plasma flow is described by temperature, velocity, and concentration profiles at the torch exit nozzle, and no electromagnetic effects are taken into account. This simplified approach is widely used in the literature and is generally acceptable for plasma torches with a circular anode inside the torch chamber. The unique DC hybrid water/gas-stabilized plasma torch developed at the Institute of Plasma Physics of the Czech Academy of Sciences, on the other hand, consists of a rotating anode disk located outside of the torch chamber. Neglecting the effects of the anode arc attachment downstream of the torch exit nozzle leads to erroneous predictions of the flow field. With the simplified approach introduced in this model, the Joule heating between the exit nozzle and the anode attachment position of the plasma arc is modeled by a volume heat source, and the jet deflection caused by the anode processes is modeled by a momentum source at the anode surface. Furthermore, radiation effects are included by the net emission coefficient (NEC) method, and diffusion is modeled with the combined diffusion coefficient method. The time-averaged simulation results are compared with numerous experimental measurements. The radial temperature profiles were obtained by spectroscopic measurements at different axial positions downstream of the exit nozzle. The velocity profiles were evaluated from the time-dependent evolution of flow structures recorded by photodiode arrays. The shape of the plasma jet was compared with charge-coupled device (CCD) camera pictures. In the cooler regions, the temperature was measured by an enthalpy probe downstream of the exit nozzle and by thermocouples in the radial direction around the torch nozzle. The model results correspond well with the experimental measurements. The decrease in centerline temperature and velocity is predicted within an acceptable range, and the shape of the jet closely resembles the jet structure in the recorded images. The temperatures at the edge of the jet are underestimated due to the absence of radial radiative heat transfer in the model.
Keywords: anode arc attachment, CFD modeling, experimental comparison, thermal plasma jet
Procedia PDF Downloads 369
3526 Optimization of Extraction Conditions and Characteristics of Scale Collagen from Sardine: Sardina pilchardus
Authors: F. Bellali, M. Kharroubi, M. Loutfi, N. Bourhim
Abstract:
In Morocco, the fish processing industry is an important source of income and generates a large amount of byproducts, including skins, bones, heads, guts, and scales. These underutilized resources, particularly the scales, contain a large amount of protein and calcium. Scales from Sardina pilchardus resulting from the processing operations have the potential to be used as a raw material for collagen production. Taking into account this strong expectation of the regional fish industry, the upgrading of sardine scales is well justified. In addition, political and societal demands for sustainability and environment-friendly industrial production systems, coupled with the depletion of fish resources, drive this trend forward. Therefore, fish scales used as a potential source from which to isolate collagen have a wide range of applications in the food, cosmetic, and biomedical industries. The main aim of this study is to isolate and characterize acid-soluble collagen from the scales of the sardine, Sardina pilchardus. Experimental design methodology was adopted in the collagen processing for extraction optimization. The first stage of this work is to investigate the optimal conditions for sardine scale deproteinization using response surface methodology (RSM). The second part focuses on the demineralization with HCl solution or EDTA. The last part is to establish the optimum conditions for the isolation of collagen from fish scales by solvent extraction. The basic principle of RSM is to determine model equations that describe the interrelations between the independent variables and the dependent variables.
Keywords: Sardina pilchardus, scales, valorization, collagen extraction, response surface methodology
Procedia PDF Downloads 421
3525 Microbial Dynamics and Sensory Traits of Spanish- and Greek-Style Table Olives (Olea europaea L. cv. Ascolana tenera) Fermented with Sea Fennel (Crithmum maritimum L.)
Authors: Antonietta Maoloni, Federica Cardinali, Vesna Milanović, Andrea Osimani, Ilario Ferrocino, Maria Rita Corvaglia, Luca Cocolin, Lucia Aquilanti
Abstract:
Table olives (Olea europaea L.) are among the most important fermented vegetables all over the world, while sea fennel (Crithmum maritimum L.) is an emerging food crop with interesting nutritional and sensory traits. Both are characterized by the presence of several bioactive compounds with potential beneficial health effects, thus representing two valuable substrates for the manufacture of innovative vegetable-based preserves. Given these premises, the present study was aimed at exploring the co-fermentation of table olives and sea fennel to produce new high-value preserves. Spanish-style and Greek-style processing methods and the use of a multiple-strain starter were explored. The preserves were evaluated for their microbial dynamics and key sensory traits. During the fermentation, a progressive pH reduction was observed. Mesophilic lactobacilli, mesophilic lactococci, and yeasts were the main microbial groups at the end of the fermentation, whereas Enterobacteriaceae decreased during fermentation. An evolution of the microbiota was revealed by metataxonomic analysis, with Lactiplantibacillus plantarum dominating in the late stage of fermentation, irrespective of the processing method and the use of the starter. Greek-style preserves were crunchier and less fibrous than Spanish-style ones and were preferred by the trained panelists.
Keywords: lactic acid bacteria, Lactiplantibacillus plantarum, metataxonomy, panel test, rock samphire
Procedia PDF Downloads 132
3524 Coupled Effect of Pulsed Current and Stress State on Fracture Behavior of Ultrathin Superalloy Sheet
Authors: Shuangxin Wu
Abstract:
Superalloy ultra-thin-walled components occupy a considerable proportion of aero engines and play an increasingly important role in structural weight reduction and performance improvement. To address problems such as high deformation resistance and poor formability at room temperature, a pulsed current can be introduced during processing to improve the plasticity of metallic materials. However, the mechanism by which the pulsed current influences the forming limit of ultrathin superalloy sheet is not clear, and clarifying it is of great significance for determining the material processing window and improving the micro-forming process. The effect of pulsed current on the microstructure evolution of superalloy thin sheets was observed by optical microscopy (OM) and X-ray diffraction topography (XRT), with the pulsed current applied to GH3039 sheet 0.2 mm in thickness under plane-strain and uniaxial tensile states. Compared with the specimen without pulsed current applied at the same temperature, the internal void volume fraction is significantly reduced, reflecting the non-thermal effect of the pulsed current on the growth of micro-pores. ED (electrically deformed) specimens have larger and deeper dimples, but the elongation is not significantly improved because the pulsed current promotes the void coalescence process, resulting in material fracture. The electro-plastic phenomenon is more obvious in the plane-strain state, which is closely related to the effect of stress triaxiality on the void evolution under pulsed current.
Keywords: pulse current, superalloy, ductile fracture, void damage
Procedia PDF Downloads 79
3523 Image Recognition Performance Benchmarking for Edge Computing Using Small Visual Processing Unit
Authors: Kasidis Chomrat, Nopasit Chakpitak, Anukul Tamprasirt, Annop Thananchana
Abstract:
Internet of Things (IoT) devices and edge computing have become some of the biggest developments in innovation and among the most discussed for their potential to improve and disrupt traditional business and industry alike. New challenges, such as the COVID-19 pandemic, have posed a danger to the workforce and to business processes. Along with the drastically changed business landscape left in the aftermath of the global COVID-19 pandemic, and the looming threats of a global energy crisis, global warming, and increasingly heated global politics that risk becoming a new Cold War, emerging technologies such as edge computing and specially designed visual processing units present great opportunities for business. The literature is reviewed on how the Internet of Things and this disruptive wave will affect business, explaining how these new developments affect current businesses and how businesses need to adapt to changes in the market and the world; benchmark tests of newer consumer-marketed devices, such as Internet of Things devices equipped with edge computing hardware, illustrate how efficiency can be increased and the risks posed by current and looming crises reduced. Throughout the paper, we explain the technologies that led to the present situation and why these technologies will be innovations that change traditional practice, through brief introductions to cloud computing, edge computing, and the Internet of Things, and how they will lead into the future.
Keywords: internet of things, edge computing, machine learning, pattern recognition, image classification
Procedia PDF Downloads 160
3522 LEDs Based Indoor Positioning by Distances Derivation from Lambertian Illumination Model
Authors: Yan-Ren Chen, Jenn-Kaie Lain
Abstract:
This paper proposes a novel indoor positioning algorithm based on visible light communications, implemented with light-emitting diode fixtures. In the proposed positioning algorithm, the distances between the light-emitting diode fixtures and the mobile terminal are derived under the assumption of an ideal Lambertian optical radiation model, and the trilateration positioning method is then applied to obtain the coordinates of the mobile terminal. The proposed positioning algorithm obtains distance information directly from the optical signal model; therefore, the statistical distribution of received signal strength at different positions in the interior space does not need to be pre-established. Simulation results have shown that the proposed indoor positioning algorithm can provide accurate estimation of location coordinates.
Keywords: indoor positioning, received signal strength, trilateration, visible light communications
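A minimal sketch of the two steps the abstract describes is given below: invert the ideal Lambertian channel gain for each LED-to-receiver distance, then trilaterate with linear least squares. The LED parameters and room geometry are illustrative assumptions.

```python
import numpy as np

m, A, Pt, h = 1.0, 1e-4, 3.0, 2.5   # Lambertian order, PD area [m^2], Tx power [W], ceiling height [m]

def received_power(d):
    """Ideal Lambertian gain for a downward LED and an upward-facing receiver:
    cos(phi) = cos(psi) = h/d, so Pr = Pt*(m+1)*A/(2*pi) * h**(m+1) / d**(m+3)."""
    return Pt * (m + 1) * A / (2 * np.pi) * h**(m + 1) / d**(m + 3)

def distance_from_power(Pr):
    """Invert the model above for the line-of-sight distance d."""
    return (Pt * (m + 1) * A * h**(m + 1) / (2 * np.pi * Pr))**(1.0 / (m + 3))

# three ceiling LEDs and an unknown terminal on the floor plane
leds = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
true_xy = np.array([1.2, 2.3])
d_true = np.hypot(np.linalg.norm(leds - true_xy, axis=1), h)
d_est = distance_from_power(received_power(d_true))   # distances recovered from RSS

# linear least-squares trilateration: subtract the first range equation from the others
A_mat = 2 * (leds[1:] - leds[0])
b = (d_est[0]**2 - d_est[1:]**2
     + np.sum(leds[1:]**2, axis=1) - np.sum(leds[0]**2))
xy_hat, *_ = np.linalg.lstsq(A_mat, b, rcond=None)
print("estimated position:", xy_hat)
```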
Procedia PDF Downloads 416
3521 Effect of Fermentation Time on Some Functional Properties of Moringa (Moringa oleifera) Seed Flour
Authors: Ocheme B. Ocheme, Omobolanle O. Oloyede, S. James, Eleojo V. Akpa
Abstract:
The effect of fermentation time on some functional properties of Moringa (Moringa oleifera) seed flour was examined. Fermentation, an effective processing method used to improve the nutritional quality of plant foods, tends to affect the characteristics of food components and their behaviour in food systems just like other processing methods; hence the need for this study. Moringa seeds were fermented naturally by soaking them in potable water and allowing them to stand for 12, 24, 48, and 72 hours. At the end of fermentation, the seeds were oven-dried at 60 °C for 12 hours and then milled into flour. Flour obtained from unfermented seeds served as the control, giving a total of five flour samples. The functional properties were analyzed using standard methods. Fermentation significantly (p<0.05) increased the water holding capacity of Moringa seed flour from 0.86 g/g to 2.31 g/g. The highest value was observed after 48 hours of fermentation. The same trend was observed for oil absorption capacity, with values between 0.87 and 1.91 g/g. Flour from unfermented Moringa seeds had a bulk density of 0.60 g/cm³, which was significantly (p<0.05) higher than the bulk densities of flours from seeds fermented for 12, 24, and 48 hours. Fermentation significantly (p<0.05) decreased the dispersibility of Moringa seed flours from 36% to 21, 24, 29, and 20% after 12, 24, 48, and 72 hours of fermentation, respectively. The flours' emulsifying capacities increased significantly (p<0.05) with increasing fermentation time, with values between 50 and 68%. The flour obtained from seeds fermented for 12 hours had a significantly (p<0.05) higher foaming capacity of 16%, while the flours obtained from seeds fermented for 0, 24, and 72 hours had the lowest foaming capacities of 9%. Flours from seeds fermented for 12 and 48 hours had better functional properties than flours from seeds fermented for 24 and 72 hours.
Keywords: fermentation, flour, functional properties, Moringa
Procedia PDF Downloads 695
3520 Solving the Nonlinear Heat Conduction in a Spherical Coordinate with Electrical Simulation
Authors: A. M. Gheitaghy, H. Saffari, G. Q. Zhang
Abstract:
A numerical approach based on the electrical simulation method is proposed to solve a nonlinear transient heat conduction problem with a nonlinear boundary for a spherical body. This problem exhibits strong nonlinearity in both the governing equation, through the temperature-dependent thermal properties, and the boundary condition, through the combined convective and radiative cooling. By analysing the equivalent electrical model using the electrical circuit simulation program HSPICE, transient temperature and heat flux distributions in the sphere can be obtained easily and quickly. The solutions clearly illustrate the effect of the radiation-conduction parameter Nrc, the Biot number, and the linear coefficient of the temperature-dependent conductivity and heat capacity. On comparing the results with corresponding numerical solutions, the accuracy and efficiency of this computational method are found to be good.
Keywords: convective and radiative boundary, electrical simulation method, nonlinear heat conduction, spherical coordinate
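The governing problem that the equivalent electrical (RC-network) model represents can be sketched with a simple one-dimensional spherical finite-volume analogue; the code below is not the HSPICE implementation, and the material data, geometry and boundary values are illustrative assumptions (heat capacity is kept constant here for brevity):

```python
import numpy as np

# geometry and discretization
R, N = 0.05, 50                      # sphere radius [m], number of shells
r = np.linspace(0.0, R, N + 1)       # shell boundaries
rc = 0.5 * (r[:-1] + r[1:])          # shell centers
V = 4.0 / 3.0 * np.pi * (r[1:]**3 - r[:-1]**3)   # shell volumes
A = 4.0 * np.pi * r[1:-1]**2         # interior interface areas

# illustrative material and boundary data (assumed values)
rho, cp0, k0, beta = 7800.0, 500.0, 20.0, 1e-3   # k(T) = k0*(1 + beta*T)
h, eps, sigma = 50.0, 0.8, 5.670e-8              # convection, emissivity, Stefan-Boltzmann
T_inf, T0 = 300.0, 1000.0

k = lambda T: k0 * (1.0 + beta * T)

T = np.full(N, T0)
dt, t_end = 1e-3, 5.0
for _ in range(int(t_end / dt)):
    # conductive fluxes between adjacent shells (arithmetic mean of conductivities)
    ki = 0.5 * (k(T[:-1]) + k(T[1:]))
    q_int = ki * A * (T[:-1] - T[1:]) / np.diff(rc)    # W, positive outward
    dTdt = np.zeros(N)
    dTdt[:-1] -= q_int / (rho * cp0 * V[:-1])
    dTdt[1:] += q_int / (rho * cp0 * V[1:])
    # combined convective + radiative cooling at the outer surface
    Ts = T[-1]
    q_surf = (h * (T_inf - Ts) + eps * sigma * (T_inf**4 - Ts**4)) * 4 * np.pi * R**2
    dTdt[-1] += q_surf / (rho * cp0 * V[-1])
    T += dt * dTdt

print("center/surface temperature after %.1f s: %.1f / %.1f K" % (t_end, T[0], T[-1]))
```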
Procedia PDF Downloads 336
3519 Distributed Cost-Based Scheduling in Cloud Computing Environment
Authors: Rupali, Anil Kumar Jaiswal
Abstract:
Cloud computing can be defined as one of the prominent technologies that lets a user change, configure, and access services online. It can be said that this is a model of computing that helps in saving the cost and time of a user; practically, the use of cloud computing can be found in various fields like education, health, banking, etc. Cloud computing is an internet-dependent technology; thus, it is the major responsibility of Cloud Service Providers (CSPs) to take care of the data stored by users at data centers. Scheduling in a cloud computing environment plays a vital role, as cloud providers need to schedule resources effectively to achieve maximum utilization and user satisfaction. Job scheduling for cloud computing is analyzed in the following work. CloudSim 3.0.3 is utilized to complete and recreate the task calculations and the distributed scheduling methods. This research work discusses job scheduling for a distributed processing environment; by exploring this issue, we find that it works with minimum time and lower cost. In this work, two load balancing techniques have been employed, 'Throttled stack adjustment policy' and 'Active VM load balancing policy', with two brokerage services, 'Advanced Response Time' and 'Reconfigure Dynamically', to evaluate the VM_Cost, DC_Cost, Response Time, and Data Processing Time. The proposed techniques are compared with the Round Robin scheduling policy.
Keywords: physical machines, virtual machines, support for repetition, self-healing, highly scalable programming model
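A minimal, self-contained sketch (not the CloudSim API) contrasting the baseline with a throttled policy is shown below: Round Robin cycles through VMs regardless of their load, while a throttled policy assigns a request only to a VM that is below its concurrency limit and queues it otherwise.

```python
from collections import deque
from itertools import cycle

class RoundRobin:
    def __init__(self, vms):
        self._cycle = cycle(vms)
    def assign(self, request):
        return next(self._cycle)           # next VM, regardless of its current load

class Throttled:
    def __init__(self, vms, limit=2):
        self.load = {vm: 0 for vm in vms}  # active requests per VM
        self.limit = limit
        self.queue = deque()
    def assign(self, request):
        for vm, n in self.load.items():
            if n < self.limit:             # first available (un-throttled) VM
                self.load[vm] += 1
                return vm
        self.queue.append(request)         # all VMs saturated: hold the request
        return None
    def complete(self, vm):
        self.load[vm] -= 1
        if self.queue:
            return self.assign(self.queue.popleft())

vms = ["vm0", "vm1", "vm2"]
rr, th = RoundRobin(vms), Throttled(vms)
print([rr.assign(i) for i in range(5)])
print([th.assign(i) for i in range(8)])    # the last two requests are queued (None)
```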
Procedia PDF Downloads 170
3518 MIMO UWB Antenna for Exploring Body Centric Communication
Authors: Osama Aziz, Hamza Ahmad, Muhibur Rahman
Abstract:
UWB MIMO antenna systems have been suggested as a way to improve the performance of wireless communication systems. However, creating a successful UWB MIMO antenna is a difficult undertaking that calls for resolving a number of design issues, including radiation efficiency, size, and frequency range. This study's primary objective is to create a novel, highly effective, small-sized ultra-wideband (UWB) multiple-input multiple-output (MIMO) antenna and investigate its potential applications in body-centric communication. Two radiating elements, a shared ground plane, circular stubs, and T-shaped isolation elements are used to realize the MIMO antenna. Outstanding multiplexing efficiency, significant peak gain across the entire UWB frequency spectrum, extremely low mutual coupling (S21 = -16 dB), high diversity gain (DG > 9), and low envelope correlation are achieved. The proposed antenna will be one of the promising candidates for body-centric communication.
Keywords: UWB communication, UWB MIMO antennas, body-centric communication, diversity gain
Procedia PDF Downloads 79
3517 A Hebbian Neural Network Model of the Stroop Effect
Authors: Vadim Kulikov
Abstract:
The classical Stroop effect is the phenomenon that it takes more time to name the ink color of a printed word if the word denotes a conflicting color than if it denotes the same color. Over the last 80 years, there have been many variations of the experiment revealing various mechanisms behind semantic, attentional, behavioral, and perceptual processing. The Stroop task is known to exhibit asymmetry. Reading the words out loud is hardly dependent on the ink color, but naming the ink color is significantly influenced by the incongruent words. This asymmetry is reversed if, instead of naming the color, one has to point at a corresponding color patch. Other debated aspects are the notion of automaticity and the question of how much of the effect is due to semantic interference and how much to response-stage interference. Is automaticity a continuous or an all-or-none phenomenon? There are many models and theories in the literature tackling these questions, which will be discussed in the presentation. None of them, however, seems to capture all the findings at once. A computational model is proposed which is based on the philosophical idea developed by the author that the mind operates as a collection of different information processing modalities, such as different sensory and descriptive modalities, which produce emergent phenomena through mutual interaction and coherence. This is the framework theory, where 'framework' attempts to generalize the concepts of modality, perspective, and 'point of view'. The architecture of this computational model consists of blocks of neurons, each block corresponding to one framework. In the simplest case there are four: visual color processing, text reading, speech production, and attention selection modalities. In experiments where button pressing or pointing is required, a corresponding block is added. In the beginning, the weights of the neural connections are mostly set to zero. The network is trained using Hebbian learning to establish connections (corresponding to 'coherence' in framework theory) between these different modalities. The amount of data fed into the network is supposed to mimic the amount of practice a human encounters; in particular, it is assumed that converting written text into spoken words is a more practiced skill than converting visually perceived colors to spoken color-names. After the training, the network performs the Stroop task. The RTs are measured in a canonical way, as these are continuous-time recurrent neural networks (CTRNNs). The above-described aspects of the Stroop phenomenon, along with many others, are replicated. The model is similar to some existing connectionist models but, as will be discussed in the presentation, has many advantages: it predicts more data, and the architecture is simpler and biologically more plausible.
Keywords: connectionism, Hebbian learning, artificial neural networks, philosophy of mind, Stroop
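The Hebbian asymmetry at the heart of the model can be illustrated with a toy sketch (this is not the author's CTRNN implementation; the block sizes, learning rate and amount of "practice" are illustrative assumptions): a heavily trained word-reading pathway and a lightly trained color-naming pathway converge on the same spoken-name units, so on an incongruent trial the word pathway dominates the response drive.

```python
import numpy as np

rng = np.random.default_rng(0)
n_colors = 3                      # e.g. red, green, blue
eye = np.eye(n_colors)

# connection matrices from the two input modalities to the spoken-name units
W_word = np.zeros((n_colors, n_colors))    # text-reading pathway
W_color = np.zeros((n_colors, n_colors))   # ink-color pathway
eta = 0.05

def hebb(W, pre, post):
    """Simple Hebbian outer-product update: strengthen co-active pre/post units."""
    return W + eta * np.outer(post, pre)

# assumed asymmetry in practice: word reading is trained far more often
for _ in range(2000):
    c = rng.integers(n_colors)
    W_word = hebb(W_word, eye[c], eye[c])
for _ in range(200):
    c = rng.integers(n_colors)
    W_color = hebb(W_color, eye[c], eye[c])

# incongruent Stroop trial: the word denotes color 0 but the ink is color 1
word_in, ink_in = eye[0], eye[1]
drive = W_word @ word_in + W_color @ ink_in
print(drive)   # the word pathway dominates, so naming the ink color is slow/error-prone
```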
Procedia PDF Downloads 271