Search results for: multiple layers nonwoven
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6052

5362 Improved Imaging and Tracking Algorithm for Maneuvering Extended UAVs Using High-Resolution ISAR Radar System

Authors: Mohamed Barbary, Mohamed H. Abd El-Azeem

Abstract:

Maneuvering extended object tracking (M-EOT) using high-resolution inverse synthetic aperture radar (ISAR) observations has been gaining momentum recently. This work presents a new robust implementation of the multiple-model (MM) multi-Bernoulli (MB) filter for M-EOT, where the M-EOT's ISAR observations are characterized using a skewed (SK), non-symmetric normal distribution. To cope with possible abrupt changes in the kinematic state, extension, and observation distribution of an extended object when the target maneuvers, a multiple-model technique is presented, based on an MB track-before-detect (TBD) filter supported by an SK sub-random-matrix model (RMM), or sub-ellipses, framework. Simulation results demonstrate the effectiveness of the proposed approach.
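As a sketch of the observation model only, the skewed (non-symmetric) normal distribution can be sampled with Azzalini's construction; the shape parameter `alpha` and the sample size below are illustrative choices, not values from the paper:

```python
import math
import random

def skew_normal_sample(alpha, rng):
    # Azzalini construction: if U1, U2 are iid N(0,1) and
    # delta = alpha / sqrt(1 + alpha^2), then
    # Z = delta*|U1| + sqrt(1 - delta^2)*U2 is skew-normal with shape alpha.
    delta = alpha / math.sqrt(1.0 + alpha * alpha)
    u1, u2 = rng.gauss(0, 1), rng.gauss(0, 1)
    return delta * abs(u1) + math.sqrt(1.0 - delta * delta) * u2

rng = random.Random(42)
alpha = 4.0
samples = [skew_normal_sample(alpha, rng) for _ in range(200_000)]
mean = sum(samples) / len(samples)

# Theoretical mean of the standard skew-normal: delta * sqrt(2/pi)
delta = alpha / math.sqrt(1.0 + alpha * alpha)
expected = delta * math.sqrt(2.0 / math.pi)
```

Positive `alpha` skews the density to the right, which is the kind of asymmetry the SK observation model is meant to capture.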

Keywords: maneuvering extended objects, ISAR, skewed normal distribution, sub-RMM, MM-MB-TBD filter

Procedia PDF Downloads 76
5361 Efficiency Improvement of REV-Method for Calibration of Phased Array Antennas

Authors: Daniel Hristov

Abstract:

The paper describes the principle of operation, simulation, and physical validation of a method for simultaneous acquisition of the gain and phase states of multiple antenna elements and their corresponding feed lines across a phased array antenna (PAA). The derived gain and phase values are used for PAA calibration. The method builds on the Rotating-Element Electric-Field Vector (REV) principle, currently used to estimate the gain and phase state of a single antenna element across an active antenna aperture. A significant reduction in procedure execution time is achieved by simultaneously setting different phase delays on multiple phase shifters, followed by a single power measurement. The initial gain and phase states are then calculated by spectral and correlation analysis of the measured power series.
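The single-element REV principle behind the method can be sketched as follows: stepping one element's phase shifter through N uniform settings modulates the measured power sinusoidally, and the first DFT bin of the power series recovers that element's relative amplitude and phase (the paper's simultaneous variant assigns distinct rotation rates to several shifters so each element lands in its own spectral bin). All values here are illustrative:

```python
import cmath
import math

def rev_estimate(powers):
    """Recover a*|E0| and psi for one element from the power series
    measured while its phase shifter is stepped through N uniform
    settings: P_k = |E0 + a*exp(j*(psi + 2*pi*k/N))|^2."""
    n = len(powers)
    # First DFT bin of the power sequence equals n * a * |E0| * exp(j*psi)
    c1 = sum(p * cmath.exp(-2j * math.pi * k / n) for k, p in enumerate(powers))
    return abs(c1) / n, cmath.phase(c1)

# Simulated measurement: fixed aperture field E0 plus one element (a, psi)
E0, a, psi = 3.0, 0.7, 0.9
N = 16
powers = [abs(E0 + a * cmath.exp(1j * (psi + 2 * math.pi * k / N))) ** 2
          for k in range(N)]
amp, phase = rev_estimate(powers)   # amp ~= a*E0, phase ~= psi
```

The spectral analysis mentioned in the abstract generalizes this: with element m rotated at its own rate q_m, bin q_m of the same power series isolates element m.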

Keywords: antenna, antenna arrays, calibration, phase measurement, power measurement

Procedia PDF Downloads 137
5360 The Study on Enhanced Micro Climate of the Oyster Mushroom Cultivation House with Multi-Layered Shelves by Using Computational Fluid Dynamics Analysis in Winter

Authors: Sunghyoun Lee, Byeongkee Yu, Chanjung Lee, Yeongtaek Lim

Abstract:

Oyster mushrooms are among the ingredients that Koreans prefer. The oyster mushroom cultivation house uses multi-layered shelves in order to increase mushroom production per unit area. However, the growing shelves act as obstacles that hinder the circulation of the interior air, which leads to differences in the cultivation environment between the upper and lower parts of the shelves. Because of these differences, growth varies across the shelf area. It is known that minute air circulation around the mushroom cap facilitates the metabolism of mushrooms and improves their quality. This study used the computational fluid dynamics (CFD) program FLUENT R16 to analyze how to improve the uniformity of the internal environment of the oyster mushroom cultivation house. The analyzed factors were the velocity, temperature, and humidity distributions. To maintain a uniform internal environment, installing a circulation fan at the upper part of the working passage, directed toward the ceiling, proved effective. When all the environmental control equipment (unit cooler, inlet fan, outlet fan, air circulation fan, and humidifier) operated simultaneously, the RMS figures on the growing shelves were: velocity 28.23%, temperature 30.47%, humidity 7.88%. When only the unit cooler and air circulation fan operated, the RMS figures were: velocity 22.28%, temperature 0.87%, humidity 0.82%. Therefore, to maintain a uniform internal environment in the mushroom cultivation house, the overall operating time of the inlet fan, outlet fan, and humidifier should be reduced, and the internal environment should be managed mainly with the unit cooler and air circulation fan.
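The "RMS figure" used to compare the two operating modes is presumably the RMS deviation from the mean, expressed as a percentage of the mean; a minimal sketch under that assumption, with invented shelf readings:

```python
def rms_nonuniformity(values):
    """RMS deviation from the mean as a percentage of the mean --
    one plausible reading of the abstract's 'RMS figure' (assumption)."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return 100.0 * var ** 0.5 / mean

# Invented point readings across the growing shelves
velocity = [0.30, 0.22, 0.18, 0.26, 0.35, 0.20]      # m/s
temperature = [16.1, 16.0, 16.2, 15.9, 16.1, 16.0]   # deg C
```

A lower percentage means a more uniform field over the shelves, which is the comparison the abstract makes between the two equipment configurations.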

Keywords: air circulation fan, computational fluid dynamics, multi-layered shelves cultivation, oyster mushroom cultivation house

Procedia PDF Downloads 206
5359 TopClosure® of Large Abdominal Wall Defect Instead of Staged Hernia Repair as Part of Damage Control Laparotomy

Authors: Andriy Fedorenko

Abstract:

Background: Early closure of the open abdomen is a priority after damage control laparotomy, to prevent retraction of the fascial layers and the hernia formation that would require definitive repair at a later stage. Early closure substantially reduces the complications associated with ventral hernia formation for up to a year after the initial surgery. TopClosure® is an innovative method that employs stress relaxation and mechanical creep for skin stretching. Its use enables the primary closure of large abdominal wall defects and mitigates large ventral hernia formation. Materials and Methods: A 7-year-old girl presented with severe blast injury. She underwent initial laparotomy in a facility within the conflict zone and was transferred in a state of septic shock to our facility for further care. Her abdominal injuries included liver lacerations, multiple perforations of the transverse colon and ileum, and an 8 x 16 cm oblique abdominal wall defect. Further damage control laparotomy was performed with primary suture of the colon and ileum and temporary closure of the abdomen using a Bogota bag. Twelve hours later, negative pressure wound therapy (NPWT) was applied to the abdominal wound after relook laparotomy. Five days later, TopClosure® was applied to the lower part of the wound, incorporating NPWT in the upper wound. Results: The patient suffered a leak from the colonic suture line and required relaparotomy. TopClosure® abdominal closure was achieved after every laparotomy. Conclusion: TopClosure® utilizes the viscoelastic properties of the skin to achieve full closure of the abdominal wall (including the fascia and skin), eliminating the need for prolonged NPWT, skin grafting, and delayed ventral hernia repair surgery.

Keywords: topclosure, abdominal wall defect, hernia, damage control

Procedia PDF Downloads 79
5358 Simulation Analysis of Wavelength/Time/Space Codes Using CSRZ and DPSK-RZ Formats for Fiber-Optic CDMA Systems

Authors: Jaswinder Singh

Abstract:

In this paper, a comparative analysis is carried out to study the performance of wavelength/time/space optical CDMA codes using two well-known formats, CSRZ and DPSK-RZ, in RSoft's OptSIM. The analysis is carried out under a realistic scenario that includes various non-linear effects such as XPM, SPM, SRS, SBS, and FWM. Fiber dispersion and multiple-access interference are also considered. The codes used in this analysis are 3-D wavelength/time/space codes. These are converted into 2-D wavelength-time codes so that the requirement for space couplers and fiber ribbons is eliminated. Under the simulated conditions, CSRZ is found to perform better than DPSK-RZ for fiber-optic CDMA applications.

Keywords: optical CDMA, multiple access interference (MAI), CSRZ, DPSK-RZ

Procedia PDF Downloads 645
5357 A Highly Efficient Broadcast Algorithm for Computer Networks

Authors: Ganesh Nandakumaran, Mehmet Karaata

Abstract:

A wave is a distributed execution, often consisting of a broadcast phase followed by a feedback phase, requiring the participation of all the system processes before a particular event, called the decision, is taken. Wave algorithms with one initiator, such as the 1-wave algorithm, have been shown to be very efficient for broadcasting messages in tree networks. Extensions that broadcast a sequence of waves from a single initiator have been implemented in algorithms such as the m-wave algorithm. However, as the network size increases, having a single initiator adversely affects message delivery times to nodes far from the initiator. As a remedy, broadcast waves can be initiated by multiple initiator nodes distributed across the network, reducing the completion time of broadcasts. Waves initiated by one or more initiator processes form a collection of waves covering the entire network. Global snapshots, distributed broadcast, and various synchronization problems can be solved efficiently using waves with multiple concurrent initiators. In this paper, we propose the first stabilizing multi-wave sequence algorithm implementing waves started by multiple initiator processes, such that every process in the network receives at least one sequence of broadcasts. Being stabilizing, the proposed algorithm can withstand transient faults and does not require initialization. We view a fault as transient if it perturbs the configuration of the system but not its program.
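The broadcast-plus-feedback structure of a single wave on a tree (the PFC/PIF pattern the paper builds on) can be sketched as a recursive traversal; the tree below is illustrative, and the sequential recursion stands in for what is really a concurrent message exchange:

```python
def pif_wave(tree, root):
    """Propagation of Information with Feedback on a tree: the broadcast
    phase pushes the message down from the root; the feedback phase
    returns acknowledgements up, and the root 'decides' once every
    process has acknowledged."""
    order = []          # processes in the order the broadcast reaches them
    acked = set()

    def visit(node, parent):
        order.append(node)                  # broadcast: node receives the message
        for child in tree.get(node, []):
            if child != parent:
                visit(child, node)          # forward the wave to the subtree
        acked.add(node)                     # feedback: whole subtree acknowledged

    visit(root, None)
    decided = acked == set(order)           # decision event at the root
    return order, decided

# Adjacency list of a small tree rooted at 0 (illustrative)
tree = {0: [1, 2], 1: [3, 4], 2: [5]}
order, decided = pif_wave(tree, 0)
```

With multiple initiators, several such waves run concurrently and must be composed so every process is covered, which is the problem the proposed stabilizing algorithm addresses.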

Keywords: distributed computing, multi-node broadcast, propagation of information with feedback and cleaning (PFC), stabilization, wave algorithms

Procedia PDF Downloads 504
5356 Vendor Selection and Supply Quotas Determination by Using Revised Weighting Method and Multi-Objective Programming Methods

Authors: Tunjo Perič, Marin Fatović

Abstract:

In this paper, a new methodology for vendor selection and supply quota determination (VSSQD) is proposed. The VSSQD problem is solved by a model that combines the revised weighting method, for determining the objective function coefficients, with a multiple objective linear programming (MOLP) method based on cooperative game theory. The criteria used for VSSQD are (1) purchase costs and (2) the product quality supplied by individual vendors. The proposed methodology is tested on the example of flour purchasing for a bakery with two decision makers.
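As an illustration of the quota idea only (not the paper's revised-weighting MOLP model), a toy weighted-sum reduction can be sketched: score vendors on normalized cost and quality, then fill the demand greedily up to capacities. All names and numbers are invented:

```python
def supply_quotas(demand, vendors, w_cost, w_quality):
    """Toy single-objective reduction of VSSQD: score each vendor by
    weighted normalized cost (lower better) and quality (higher better),
    then fill the demand from the best-scoring vendor up to capacity.
    The paper's actual model is multi-objective; this is a sketch."""
    max_cost = max(v["cost"] for v in vendors)
    max_q = max(v["quality"] for v in vendors)

    def score(v):  # lower is better
        return w_cost * v["cost"] / max_cost - w_quality * v["quality"] / max_q

    quotas = {v["name"]: 0.0 for v in vendors}
    remaining = demand
    for v in sorted(vendors, key=score):
        take = min(v["capacity"], remaining)
        quotas[v["name"]] = take
        remaining -= take
        if remaining == 0:
            break
    return quotas

# Invented flour vendors for a bakery needing 100 units
vendors = [
    {"name": "mill_A", "cost": 0.42, "quality": 0.90, "capacity": 60},
    {"name": "mill_B", "cost": 0.38, "quality": 0.70, "capacity": 80},
]
quotas = supply_quotas(100, vendors, w_cost=0.5, w_quality=0.5)
```

Here the higher-quality vendor wins on the weighted score, takes its full capacity, and the cheaper vendor covers the remainder.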

Keywords: cooperative game theory, multiple objective linear programming, revised weighting method, vendor selection

Procedia PDF Downloads 358
5355 Bioinformatics Approach to Identify Physicochemical and Structural Properties Associated with Successful Cell-free Protein Synthesis

Authors: Alexander A. Tokmakov

Abstract:

Cell-free protein synthesis is widely used to synthesize recombinant proteins. It allows genome-scale expression of various polypeptides under strictly controlled, uniform conditions. However, only a minor fraction of all proteins can be successfully expressed in the protein synthesis systems currently in use, and the factors determining expression success are poorly understood. At present, a vast volume of data has accumulated in cell-free expression databases, making possible comprehensive bioinformatics analysis and the identification of multiple features associated with successful cell-free expression. Here, we describe an approach for identifying physicochemical and structural properties of amino acid sequences associated with protein solubility and aggregation, and highlight the major correlations obtained using it. The developed method includes: categorical assessment of the protein expression data; calculation and prediction of multiple properties of the expressed amino acid sequences; correlation of the individual properties with the expression scores; and evaluation of the statistical significance of the observed correlations. Using this approach, we revealed a number of statistically significant correlations between calculated and predicted features of protein sequences and their amenability to cell-free expression. Some features, such as protein pI, hydrophobicity, and the presence of signal sequences, relate mostly to protein solubility, whereas others, such as protein length, number of disulfide bonds, and content of secondary structure, affect mainly the expression propensity. We also demonstrated that the amenability of polypeptide sequences to cell-free expression correlates with the presence of multiple sites of post-translational modification. The correlations revealed in this study provide important insights into protein folding and the rationalization of protein production. The developed bioinformatics approach can be of practical use for predicting expression success and optimizing cell-free protein synthesis.
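One step of such a pipeline, computing a sequence property and correlating it with expression scores, can be sketched as follows; the sequences and scores are invented, and GRAVY hydrophobicity stands in for the paper's full feature set:

```python
# Kyte-Doolittle hydropathy scale
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def gravy(seq):
    """Mean Kyte-Doolittle hydropathy (GRAVY) of a protein sequence."""
    return sum(KD[a] for a in seq) / len(seq)

def pearson(xs, ys):
    """Pearson correlation; with a binary y this is the point-biserial
    correlation typically used to relate a feature to success/failure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented toy data: sequences with a binary expression-success score
seqs = ["MKLVTT", "MILLFF", "MDEDSE", "MKQNDE"]
scores = [1, 0, 1, 1]
feature = [gravy(s) for s in seqs]
r = pearson(feature, scores)
```

In the real analysis each of the many calculated features would be correlated with the expression scores in this way, followed by a significance test.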

Keywords: bioinformatics analysis, cell-free protein synthesis, expression success, optimization, recombinant proteins

Procedia PDF Downloads 419
5354 Adaptive Filtering in Subbands for Supervised Source Separation

Authors: Bruna Luisa Ramos Prado Vasques, Mariane Rembold Petraglia, Antonio Petraglia

Abstract:

This paper investigates MIMO (Multiple-Input Multiple-Output) adaptive filtering techniques for supervised source separation in the context of convolutive mixtures. Starting from the observation that the signals of the different mixtures are correlated, an improvement to the NSAF (Normalized Subband Adaptive Filter) algorithm is proposed in order to accelerate its convergence rate. Simulation results with mixtures of speech signals in reverberant environments show the superior performance of the proposed algorithm with respect to the NLMS (Normalized Least-Mean-Square) and conventional NSAF algorithms, considering both convergence speed and the SIR (Signal-to-Interference Ratio) after convergence.
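The normalized adaptive update at the heart of NSAF can be sketched via its full-band baseline, NLMS, on a toy system identification task (the subband decomposition and the paper's cross-mixture improvement are omitted; all signals are synthetic):

```python
import random

def nlms_identify(x, d, taps, mu=0.5, eps=1e-8):
    """Normalized LMS system identification: adapt weights w so the
    filter output tracks the desired signal d. NSAF applies the same
    normalized update independently in each subband."""
    w = [0.0] * taps
    for n in range(taps - 1, len(x)):
        frame = x[n - taps + 1:n + 1][::-1]          # x[n], x[n-1], ...
        y = sum(wi * xi for wi, xi in zip(w, frame))
        e = d[n] - y                                 # a-priori error
        norm = sum(xi * xi for xi in frame) + eps    # input-power normalization
        w = [wi + mu * e * xi / norm for wi, xi in zip(w, frame)]
    return w

rng = random.Random(0)
h = [0.8, -0.3, 0.1]                                 # unknown system (illustrative)
x = [rng.gauss(0, 1) for _ in range(5000)]
d = [sum(h[k] * x[n - k] for k in range(len(h))) if n >= len(h) - 1 else 0.0
     for n in range(len(x))]
w = nlms_identify(x, d, taps=3)                      # w converges towards h
```

The normalization by the input power is what makes the step size robust to the signal level; the subband version additionally whitens the (coloured) speech input, which is where its convergence advantage comes from.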

Keywords: adaptive filtering, multi-rate processing, normalized subband adaptive filter, source separation

Procedia PDF Downloads 435
5353 The Relationship between Representational Conflicts, Generalization, and Encoding Requirements in an Instance Memory Network

Authors: Mathew Wakefield, Matthew Mitchell, Lisa Wise, Christopher McCarthy

Abstract:

The properties of memory representations in artificial neural networks have cognitive implications. Distributed representations, which encode instances as a pattern of activity across layers of nodes, afford memory compression and enforce the selection of a single point in instance space. These encoding schemes also appear to distort the representational space, and they trade away the ability to validate that input information lies within the bounds of past experience. In contrast, a localist representation, which encodes some meaningful information into individual nodes in a network layer, affords less memory compression while retaining the integrity of the representational space, allowing the validity of an input to be determined. The validity (or familiarity) of the input, together with the capacity of localist representations for multiple instance selections, affords a memory sampling approach that dynamically balances the bias-variance trade-off. When the input is familiar, bias may be kept high by referring only to the most similar instances in memory; when the input is less familiar, variance can be increased by referring to more instances, capturing a broader range of features. Using this approach in a localist instance memory network, an experiment demonstrates a relationship between representational conflict, generalization performance, and memorization demand. Relatively small sampling ranges produce the best performance on a classic machine-learning dataset of visual objects. Combining memory validity with conflict detection produces a reliable confidence judgement that can separate responses with high and low error rates. Confidence can also be used to signal the need for supervisory input. Using this judgement, the need for supervised learning as well as memory encoding can be substantially reduced with only a trivial detriment to classification performance.
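The dynamic bias-variance sampling described above can be sketched as a nearest-instance memory whose sample size widens when the query is unfamiliar; the thresholds, sample sizes, and data below are illustrative assumptions, not the paper's network:

```python
def classify(memory, query, k_near=1, k_far=5, familiarity_threshold=1.0):
    """Localist instance-memory sketch: when the query is familiar
    (close to a stored instance), sample few instances (high bias);
    when it is unfamiliar, widen the sample (higher variance)."""
    dists = sorted((sum((a - b) ** 2 for a, b in zip(inst, query)) ** 0.5, label)
                   for inst, label in memory)
    familiar = dists[0][0] <= familiarity_threshold
    k = k_near if familiar else k_far
    votes = {}
    for _, label in dists[:k]:
        votes[label] = votes.get(label, 0) + 1
    label = max(votes, key=votes.get)
    confidence = votes[label] / k       # agreement among the sampled instances
    return label, confidence, familiar

memory = [((0.0, 0.0), "A"), ((0.1, 0.0), "A"), ((5.0, 5.0), "B"),
          ((5.1, 4.9), "B"), ((4.9, 5.2), "B")]
near = classify(memory, (0.05, 0.02))   # familiar query: narrow sample
far = classify(memory, (9.0, 9.0))      # unfamiliar query: wide sample
```

Low agreement among the sampled instances plays the role of the conflict/confidence signal: it can flag responses likely to be wrong and trigger a request for supervision.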

Keywords: artificial neural networks, representation, memory, conflict monitoring, confidence

Procedia PDF Downloads 127
5352 Surgical Outcome of Heavy Silicone Oil in Rhegmatogenous Retinal Detachment

Authors: Pheeraphat Ussadamongkol, Suthasinee Sinawat

Abstract:

Objective: The purpose of this study was to evaluate the anatomical and visual outcomes associated with the use of heavy silicone oil (HSO) during pars plana vitrectomy (PPV) in patients with rhegmatogenous retinal detachment (RRD). Materials and Methods: A total of 66 eyes of 66 patients with RRD who underwent PPV with HSO from 2018 to 2023 were included in this retrospective study. Risk factors for surgical outcomes were also investigated. Results: The mean age of the recruited patients was 55.26 ± 13.05 years. The most common diagnosis was recurrent RRD, in 43 patients (65.15%), and the majority of these patients (81.39%) had a history of multiple vitreoretinal surgeries. Inferior breaks and PVR grade ≥ C were present in 65.15% and 42.42% of cases, respectively. The mean duration of HSO tamponade was 7.77 ± 5.19 months. The retinal attachment rate after surgery was 71.21%, with a final attachment rate of 87.88%. The mean final VA was 1.62 ± 1.11 logMAR, and 54.54% of patients achieved a final visual acuity (VA) of ≥ 6/60. Multivariate analysis revealed that proliferative vitreoretinopathy (PVR) and multiple breaks were significantly associated with retinal redetachment, while good initial VA (≥ 6/60) was associated with a good visual outcome (≥ 6/60). The most common complications were glaucoma (30.3%) and epimacular membrane (7.58%). Conclusion: The use of heavy silicone oil in pars plana vitrectomy for rhegmatogenous retinal detachment yields favorable anatomical and visual outcomes. Factors associated with retinal redetachment are proliferative vitreoretinopathy and multiple breaks. Good initial VA can predict good visual outcomes.

Keywords: rhegmatogenous retinal detachment, heavy silicone oil, surgical outcome, visual outcome, risk factors

Procedia PDF Downloads 7
5351 Stand Alone Multiple Trough Solar Desalination with Heat Storage

Authors: Abderrahmane Diaf, Kamel Benabdellaziz

Abstract:

Remote arid areas of the vast African deserts hold huge subterranean reserves of brackish water waiting for economic development. This work presents design guidelines as well as initial performance data for new autonomous solar desalination equipment that could help local communities produce their own fresh water using solar energy only and, why not, contribute to transforming desert lands into lush gardens. The output of solar distillation equipment is typically low, on the order of 3 l/m2/day on average. This new design, with an integrated, water-based, environmentally friendly solar heat storage system, produced 5 l/m2/day in early spring weather, and its output during summer exceeded 9 l/m2/day.

Keywords: multiple trough distillation, solar desalination, solar distillation with heat storage, water based heat storage system

Procedia PDF Downloads 440
5350 Capacitated Multiple Allocation P-Hub Median Problem on a Cluster Based Network under Congestion

Authors: Çağrı Özgün Kibiroğlu, Zeynep Turgut

Abstract:

This paper considers a hub location problem in which the network service area is partitioned into predetermined zones (represented by given node clusters) and the capacity levels of potential hub nodes are fixed a priori as a hub selection criterion, in order to investigate the effect of congestion on the network. The objective is to design the hub network by locating all required hubs within the node clusters and allocating non-hub nodes to hubs so as to minimize the total cost, which includes the transportation cost, the opening cost of hubs, and a penalty cost for exceeding the capacity level at a hub. A mixed-integer linear programming model is developed by introducing additional constraints into the traditional capacitated multiple-allocation hub location model, and the model is tested empirically.
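For intuition, the cost trade-off in such a model can be sketched by brute force on a toy instance (the paper's formulation is a MILP with cluster and allocation constraints; this enumeration only illustrates the objective with invented data):

```python
from itertools import combinations

def p_hub_median(nodes, cost, demand, cap, p, open_cost, penalty):
    """Tiny brute-force sketch of the capacitated p-hub idea: pick p hubs,
    route each node's demand to its cheapest hub, and charge a penalty
    for any load above a hub's capacity."""
    best = None
    for hubs in combinations(nodes, p):
        load = {h: 0.0 for h in hubs}
        transport = 0.0
        for i in nodes:
            h = min(hubs, key=lambda k: cost[i][k])  # cheapest hub for node i
            transport += demand[i] * cost[i][h]
            load[h] += demand[i]
        total = (transport
                 + sum(open_cost[h] for h in hubs)
                 + sum(penalty * max(0.0, load[h] - cap[h]) for h in hubs))
        if best is None or total < best[0]:
            best = (total, hubs)
    return best

# Two natural clusters {0,1} and {2,3}: cheap within, expensive across
nodes = [0, 1, 2, 3]
cost = [[0, 2, 9, 9], [2, 0, 9, 9], [9, 9, 0, 2], [9, 9, 2, 0]]
demand = {n: 10 for n in nodes}
cap = {n: 25 for n in nodes}
open_cost = {n: 5 for n in nodes}
total, hubs = p_hub_median(nodes, cost, demand, cap, p=2,
                           open_cost=open_cost, penalty=100)
```

With the capacity penalty active, the optimum opens one hub per cluster rather than two hubs in the same cluster, which is the congestion-aware behaviour the model is designed to capture.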

Keywords: hub location problem, p-hub median problem, clustering, congestion

Procedia PDF Downloads 492
5349 Design and Implementation of Smart Watch Textile Antenna for Wi-Fi Bio-Medical Applications in Millimetric Wave Band

Authors: M. G. Ghanem, A. M. M. A. Allam, Diaa E. Fawzy, Mehmet Faruk Cengiz

Abstract:

This paper is devoted to the design and implementation of a smartwatch textile antenna for Wi-Fi biomedical applications in the millimetric wave band. The antenna is implemented on a leather textile-based substrate to be embedded in a smartwatch, enabling the watch to pick up Wi-Fi signals without being connected to a mobile phone through Bluetooth. It operates at 60 GHz, the WiGig (Wireless Gigabit Alliance) band, with a wide bandwidth for higher-rate applications. It could also be placed over stratified layers of body tissue for use in the diagnosis of diseases such as diabetes and cancer. The structure is designed and simulated using the CST Studio Suite program. The wearable patch antenna has an octagonal shape and is implemented on leather, which acts as a flexible substrate with a size of 5.632 x 6.4 x 2 mm3, a relative permittivity of 2.95, and a loss tangent of 0.006. Feeding is carried out using a differential feed (a discrete port in CST). The work compares five configurations: the antenna without a ground plane; the antenna with a ground plane added at the back to increase the gain; the antenna with the substrate dimensions increased to 15 x 30 mm2 to match a real watch size; the antenna with layers of skin and fat added under the ground plane, to study the effect of human body tissues on antenna performance; and, finally, the whole structure bent. The simulated peak realized gain in dB is 5.68, 7.28, 6.15, 3.03, and 4.37 for these five configurations, respectively. The antenna with a ground plane exhibits the highest gain; adding the human tissue layers degrades the gain because of absorption by the body, while bending the structure recovers part of it (4.37 dB versus 3.03 dB).

Keywords: bio medical engineering, millimetric wave, smart watch, textile antennas, Wi-Fi

Procedia PDF Downloads 121
5348 Applying Multiple Kinect on the Development of a Rapid 3D Mannequin Scan Platform

Authors: Shih-Wen Hsiao, Yi-Cheng Tsao

Abstract:

In the fields of reverse engineering and the creative industries, applying a 3D scanning process to obtain the geometric form of objects is a mature and common technique. For instance, organic objects such as faces and non-organic objects such as products can be scanned to acquire geometric information for further application. However, although the data resolution of 3D scanning devices keeps increasing and complementary applications are ever more abundant, the penetration of 3D scanning among the public is still limited by the relatively high price of the devices. On the other hand, Kinect, released by Microsoft, is known for its powerful functions, considerably low price, and complete technology and database support, so related studies can be carried out with Kinect at acceptable cost and data precision. Because Kinect uses an optical mechanism to extract depth information, it is limited by the straight-line path of light. Thus, when a single Kinect is used for 3D scanning, the object must be captured sequentially from various angles to obtain its complete 3D information, and an integration process that combines the 3D data from the different angles by certain algorithms is also required. This sequential scanning takes considerable time, and the complex integration process often encounters technical problems. This paper therefore applies multiple Kinects simultaneously to develop a rapid 3D mannequin scan platform and proposes suggestions on the number and angles of the Kinects. A method of establishing the coordinate system based on the relation between the mannequin and the specifications of Kinect is proposed, and a suggestion for the angles and number of Kinects is described. An experiment applying multiple Kinects to the scanning of a 3D mannequin was constructed with the Microsoft API, and the results show that the scanning time and the technical threshold can be reduced for the fashion and garment design industries.
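The integration step, mapping each sensor's point cloud into a common mannequin frame using known extrinsics, can be sketched for the simplified case of a yaw rotation plus translation per sensor; the two-sensor geometry below is illustrative, not the platform's actual calibration:

```python
import math

def transform(points, yaw_deg, tx, ty, tz):
    """Map a depth sensor's point cloud into the common mannequin frame,
    given the sensor's extrinsics as a yaw about the vertical axis plus a
    translation (a simplification of full 6-DoF calibration)."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    # Rotate about the y (vertical) axis, then translate
    return [(c * x - s * z + tx, y + ty, s * x + c * z + tz)
            for x, y, z in points]

# Two illustrative sensors facing the mannequin from opposite sides,
# each observing the same surface point 2 m in front of itself:
front = [(0.0, 1.0, 2.0)]   # seen by sensor A at z = -2, looking +z
back = [(0.0, 1.0, 2.0)]    # seen by sensor B at z = +2, looking -z
merged = transform(front, 0, 0, 0, -2.0) + transform(back, 180, 0, 0, 2.0)
```

With correct extrinsics, both sensors' observations of the same surface point land at the same world coordinate, so the per-angle clouds simply concatenate into one model, which is what removes the sequential scan-and-register loop.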

Keywords: 3D scan, depth sensor, fashion and garment design, mannequin, multiple Kinect sensor

Procedia PDF Downloads 366
5347 Non-Methane Hydrocarbons Emission during the Photocopying Process

Authors: Kiurski S. Jelena, Aksentijević M. Snežana, Kecić S. Vesna, Oros B. Ivana

Abstract:

The proliferation of electronic equipment in photocopying environments has not only improved work efficiency but also changed indoor air quality. Considering the number of photocopiers employed, indoor air quality might be worse than in general office environments. Determining the contribution of any single type of equipment to indoor air pollution is a complex matter. Non-methane hydrocarbons are known to play an important role in air quality due to their high reactivity. The presence of hazardous pollutants in indoor air was detected in a photocopying shop in Novi Sad, Serbia. Air samples were collected and analyzed for five days during the 8-hr working time, in three time intervals and at three different sampling points. Using a multiple linear regression model in the software package STATISTICA 10, the concentrations of occupational hazards and the microclimate parameters were mutually correlated. Based on the obtained multiple coefficients of determination (0.3751, 0.2389, and 0.1975), a weak positive correlation between the observed variables was determined. Small values of the F statistic indicated that there was no statistically significant relationship between the concentration levels of non-methane hydrocarbons and the microclimate parameters. The results showed that the variables could be represented by the general regression model y = b0 + b1xi1 + b2xi2. The obtained regression equations quantify the agreement between the variations of the variables and thus give more accurate knowledge of their mutual relations.
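The quoted model y = b0 + b1xi1 + b2xi2 is an ordinary two-predictor least-squares fit, which can be sketched by solving the normal equations directly; the data below are invented and noise-free so the known coefficients are recovered exactly:

```python
def fit_two_predictor(xs1, xs2, ys):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 by solving the 3x3
    normal equations (X^T X) b = X^T y with Gaussian elimination."""
    rows = [[1.0, a, b] for a, b in zip(xs1, xs2)]   # design matrix [1, x1, x2]
    A = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    v = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    # Forward elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            v[r] -= f * v[col]
    # Back substitution
    b = [0.0] * 3
    for i in (2, 1, 0):
        b[i] = (v[i] - sum(A[i][j] * b[j] for j in range(i + 1, 3))) / A[i][i]
    return b  # [b0, b1, b2]

# Illustrative data generated from y = 1 + 2*x1 - 3*x2
x1 = [0, 1, 2, 3, 4, 5]
x2 = [1, 0, 2, 1, 3, 2]
y = [1 + 2 * a - 3 * b for a, b in zip(x1, x2)]
b0, b1, b2 = fit_two_predictor(x1, x2, y)
```

In the study, y would be a pollutant concentration and x1, x2 microclimate parameters; the R-squared and F statistics quoted in the abstract then assess how much of the variation such a fitted plane explains.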

Keywords: non-methane hydrocarbons, photocopying process, multiple regression analysis, indoor air quality, pollutant emission

Procedia PDF Downloads 378
5346 Performance Comparison of Joint Diagonalization Structure (JDS) Method and Wideband MUSIC Method

Authors: Sandeep Santosh, O. P. Sahu

Abstract:

We simulate an efficient localization algorithm for multiple wideband, non-stationary sources that exploits both the non-stationarity of the signals and the array geometry. The algorithm is based on the joint diagonalization structure (JDS) of a set of short-time power spectrum matrices taken at different time instants in each frequency bin. JDS can be used for quick and accurate localization of multiple non-stationary sources. The JDS algorithm is a one-stage process: it directly searches for the directions of arrival (DOAs) over the continuous location parameter space. The JDS method requires that the number of sensors be no less than the number of sources. The simulation results show that the JDS method can localize two sources when their angular separation is at least 7 degrees, whereas wideband MUSIC requires a separation of 18 degrees.

Keywords: joint diagonalization structure (JDS), wideband direction of arrival (DOA), wideband MUSIC

Procedia PDF Downloads 468
5345 The Construction of the Semigroup Which Is Chernoff Equivalent to Statistical Mixture of Quantizations for the Case of the Harmonic Oscillator

Authors: Leonid Borisov, Yuri Orlov

Abstract:

We obtain explicit formulas for finitely multiple approximations of the equilibrium density matrix in the case of the harmonic oscillator, using Chernoff's theorem and the notion of a semigroup that is Chernoff equivalent to the averaged semigroup. We also find explicit formulas for the corresponding approximate Wigner functions and average values of the observable. We consider a superposition of τ-quantizations representing a wide class of linear quantizations. We show that the convergence of the approximations of the average values of the observable is not uniform with respect to the Gibbs parameter. Consequently, the approximate expression cannot be represented as the sum of the exact limit and a small deviation uniformly throughout the temperature range with a given order of approximation.

Keywords: Chernoff theorem, Feynman formulas, finitely multiple approximation, harmonic oscillator, Wigner function

Procedia PDF Downloads 439
5344 Secondary Charged Fragments Tracking for On-Line Beam Range Monitoring in Particle Therapy

Authors: G. Traini, G. Battistoni, F. Collamati, E. De Lucia, R. Faccini, C. Mancini-Terracciano, M. Marafini, I. Mattei, S. Muraro, A. Sarti, A. Sciubba, E. Solfaroli Camillocci, M. Toppi, S. M. Valle, C. Voena, V. Patera

Abstract:

In Particle Therapy (PT) treatments, a large number of secondary particles, whose emission points are correlated with the dose released in the crossed tissues, are produced. Measuring the secondary charged fragment component could provide a valid technique for monitoring the beam range during PT treatments, something still missing in clinical practice. Sub-millimetre precision on the beam range measurement is required to significantly optimise the technique and improve the treatment quality. In this contribution, a detector named the Dose Profiler (DP) is presented. It is specifically designed to monitor the beam range on-line by exploiting the secondary charged particles produced in PT carbon ion treatments. In particular, the DP tracks the secondary fragments emitted at large angles with respect to the beam direction (mainly protons), with the aim of reconstructing the spatial coordinates of the fragment emission point by extrapolating the measured track back toward the beam axis. The DP is currently under development within the INSIDE collaboration (Innovative Solutions for In-beam Dosimetry in hadrontherapy). The tracker is made of six layers (20 × 20 cm²) of 500 μm BCF-12 square scintillating fibres coupled to silicon photomultipliers, followed by two 6 mm thick plastic scintillator layers. A system of FPGA-based front-end boards arranged around the detector provides the data acquisition. Detector characterization with cosmic rays is currently under way, and a data-taking campaign with protons will take place in May 2017. The DP design and the performance measured with MIPs and proton beams will be reviewed.
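The emission-point reconstruction described above can be sketched as a straight-line fit to the tracker hits followed by extrapolation to the point of closest approach to the beam axis (taken here as the z axis); the hit positions are illustrative and noise-free:

```python
def fit_line(zs, vs):
    """Least-squares straight line v = a + b*z through tracker hits."""
    n = len(zs)
    mz, mv = sum(zs) / n, sum(vs) / n
    b = (sum((z - mz) * (v - mv) for z, v in zip(zs, vs))
         / sum((z - mz) ** 2 for z in zs))
    return mv - b * mz, b

def emission_z(zs, xs, ys):
    """Extrapolate the fitted fragment track back to the beam axis:
    the emission point is taken at the z of closest approach of the
    track to the axis (minimize (ax+bx*z)^2 + (ay+by*z)^2 over z)."""
    ax, bx = fit_line(zs, xs)
    ay, by = fit_line(zs, ys)
    return -(ax * bx + ay * by) / (bx ** 2 + by ** 2)

# Illustrative hits: fragment emitted on the beam axis at z = 5 cm,
# travelling with direction (0.3, 0.1, 1); tracker planes at z = 20..45 cm
z_emit, dx, dy = 5.0, 0.3, 0.1
planes = [20.0, 25.0, 30.0, 35.0, 40.0, 45.0]
xs = [dx * (z - z_emit) for z in planes]
ys = [dy * (z - z_emit) for z in planes]
z_rec = emission_z(planes, xs, ys)
```

With measurement noise and multiple scattering, the spread of such per-track closest-approach points along the beam axis is what sets the achievable range resolution.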

Keywords: fragmentation, monitoring, particle therapy, tracking

Procedia PDF Downloads 233
5343 Studies on Space-Based Laser Targeting System for the Removal of Orbital Space Debris

Authors: Krima M. Rohela, Raja Sabarinath Sundaralingam

Abstract:

Humans have been launching rockets since the beginning of the space age in the late 1950s. We have come a long way since then, and the success rate of rocket launches has increased considerably. With every successful launch, a large amount of junk, or debris, is released into the upper layers of the atmosphere. Space debris has been a huge concern for a very long time now; it includes the rocket shells released during launch and the parts of defunct satellites. Some of this junk falls towards the Earth and burns up in the atmosphere, but most of it goes into orbit around the Earth and remains there for at least 100 years. This can cause many problems for functioning satellites and may affect future manned missions to space. The main concern with space debris is that, as space activities increase, so does the risk of collisions, which may result in what is known as the Kessler syndrome. Such debris can be removed by a space-based laser targeting system, and this possibility is investigated and discussed here. The first step involves launching a satellite carrying a high-power laser device into space, above the debris belt. The target material is then ablated with a focussed laser beam; this step is highly dependent on the attitude and orientation of the debris with respect to the Earth and the device. The laser beam causes a jet of vapour and plasma to be expelled from the material, so a force is applied in the opposite direction, and, in accordance with Newton's third law of motion, the material moves towards the Earth and is pulled down by gravity, disintegrating in the upper layers of the atmosphere. Larger pieces of debris can be directed towards the oceans. This method of removal of orbital debris will enable safer passage for future crewed missions into space.
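The momentum-transfer argument can be put in numbers using a momentum coupling coefficient Cm (impulse imparted per unit of delivered laser energy); the values below are purely illustrative order-of-magnitude choices, not figures from the text:

```python
def delta_v(coupling, pulse_energy, pulses, mass):
    """Velocity change of a debris fragment from repeated laser ablation:
    each pulse imparts an impulse Cm * E (coupling coefficient times
    delivered pulse energy), per Newton's third law. All inputs here are
    illustrative assumptions."""
    return coupling * pulse_energy * pulses / mass

# Illustrative numbers: Cm ~ 5e-5 N*s/J, 10 kJ delivered per pulse,
# 400 pulses on target, 1 kg fragment
dv = delta_v(coupling=5e-5, pulse_energy=1e4, pulses=400, mass=1.0)  # in m/s
```

A retrograde velocity change of this magnitude lowers the fragment's perigee into the denser atmosphere, where drag completes the deorbit, which is the mechanism the abstract describes.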

Keywords: altitude, Kessler syndrome, laser ablation, Newton’s third law of motion, satellites, space debris

Procedia PDF Downloads 149
5342 Smoking and Alcohol Consumption Predicts Multiple Head and Neck Cancers

Authors: Kim Kennedy, Daren Gibson, Stephanie Flukes, Chandra Diwakarla, Lisa Spalding, Leanne Pilkington, Andrew Redfern

Abstract:

Introduction: It is well known that patients with head and neck cancer (HNC) are at increased risk of subsequent head and neck cancers due to various aetiologies. Aim: We sought to determine the factors contributing to an increased risk of subsequent HNC primaries and to evaluate whether Aboriginal patients are at increased risk. Methods: We performed a retrospective cohort analysis of 320 HNC patients from a single centre in Western Australia, identifying 80 Aboriginal patients and 240 non-Aboriginal patients matched on a 1:3 ratio by site, histology, rurality, and age. We collected patient data, including smoking and alcohol consumption, tumour and treatment data, and data on subsequent HNC primaries. Results: A subsequent HNC primary was seen in 37 patients (11.6%) overall. There was no significant difference in the rate of second primary HNCs between Aboriginal patients (12.5%) and non-Aboriginal patients (11.2%) (p=0.408). Subsequent HNCs were, however, strongly associated with smoking and alcohol consumption: 95% of patients with a second primary were ever-smokers, and 54% had a history of excessive alcohol consumption. The 37 patients with multiple HNC primaries had a total of 57 HNCs, with 29 patients having two primaries, six having three, one having four, and one having six. Of the 57 cancers, 54 were in ever-smokers (94.7%). Only two multiple HNC primaries occurred in never-smoking non-drinkers, and these cases were of unknown aetiology, with HPV/p16 status unknown in both. In the whole study population there were 32 HPV-positive HNCs and 67 p16-positive HNCs, with only two second HNCs in p16-positive cases, giving a rate of 3% in the p16-positive population; this is much lower than the rate of second primaries in the overall population (11.6%), and the rate was highest in the p16-negative population (15.7%).
This suggests that p16-positivity is not a strong risk factor for subsequent primaries and that p16-negativity may in fact be associated with increased risk; however, this finding is limited by the large number of patients without documented p16 status (45.3% overall; unknown for 12% of oropharyngeal and 59.6% of oral cavity primaries). Summary: Subsequent HNC primaries were strongly associated with smoking and alcohol excess. Second and later HNC primaries did not occur at increased rates in Aboriginal patients compared with non-Aboriginal patients, and p16-positivity did not predict increased risk; p16-negativity, however, was associated with an increased risk of subsequent HNCs.

Keywords: head and neck cancer, multiple primaries, Aboriginal, p16 status, smoking, alcohol

Procedia PDF Downloads 69
5341 Comparative Analysis of Single vs. Multiple gRNA on NGN3 Expression Using a Controllable dCas9-VP192 Activator (CRISPRa)

Authors: Nicholas Abdilmasih, Habib Rezanejad

Abstract:

This study investigates the efficiency of gene expression induction by single versus multiple guide RNAs (gRNAs) targeting the NGN3 gene using the CRISPR activation system in HEK293 cells. Our study aimed to contribute to optimizing the use of gRNAs in gene therapy applications, particularly in treating diseases such as diabetes, where precise gene regulation is essential. In the experimental design, HEK293 cells were cultured and, once they reached approximately 70-80% confluence, transfected with specific gRNAs targeting the NGN3 gene promoter. These previously designed gRNAs were incorporated into plasmid clone cassettes and introduced into HEK293 cells by co-transfection with the pCAG-DDdCas9-VP192-EGFP transactivator. Post-transfection, cell viability and fluorescence were monitored to assess transfection efficiency. RNA was extracted, converted to cDNA, and analyzed via qPCR to measure NGN3 expression levels. Results indicated that specific combinations of fewer gRNAs led to higher NGN3 activation than multiple gRNAs, challenging the assumption that more gRNAs produce synergistic gene activation. These findings suggest that optimized gRNA combinations can enhance gene therapy efficiency, potentially leading to more effective treatments for conditions such as diabetes.
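
The qPCR quantification step mentioned above is typically carried out with the standard 2^(-ΔΔCt) method; the sketch below shows that calculation with made-up Ct values (the study's actual data and reference gene are not given here).

```python
# Illustrative 2^(-delta-delta-Ct) relative quantification sketch.
# Ct values below are invented example numbers, not the study's data.

def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    d_ct_treated = ct_target_treated - ct_ref_treated   # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control                 # compare to control condition
    return 2 ** (-dd_ct)

# e.g. NGN3 Ct drops from 30 to 26 while a housekeeping gene stays at 20:
print(fold_change(26, 20, 30, 20))  # -> 16 (16-fold activation)
```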

Keywords: CRISPR activation, Diabetes mellitus, gene therapy, guide RNA, Neurogenin3

Procedia PDF Downloads 24
5340 EQMamba - Method Suggestion for Earthquake Detection and Phase Picking

Authors: Noga Bregman

Abstract:

Accurate and efficient earthquake detection and phase picking are crucial for seismic hazard assessment and emergency response. This study introduces EQMamba, a deep-learning method that combines the strengths of the Earthquake Transformer and the Mamba model for simultaneous earthquake detection and phase picking. EQMamba leverages the computational efficiency of Mamba layers to process longer seismic sequences while maintaining a manageable model size. The proposed architecture integrates convolutional neural networks (CNNs), bidirectional long short-term memory (BiLSTM) networks, and Mamba blocks. The model employs an encoder composed of convolutional layers and max pooling operations, followed by residual CNN blocks for feature extraction. Mamba blocks are applied to the outputs of BiLSTM blocks, efficiently capturing long-range dependencies in seismic data. Separate decoders are used for earthquake detection, P-wave picking, and S-wave picking. We trained and evaluated EQMamba using a subset of the STEAD dataset, a comprehensive collection of labeled seismic waveforms. The model was trained using a weighted combination of binary cross-entropy loss functions for each task, with the Adam optimizer and a scheduled learning rate. Data augmentation techniques were employed to enhance the model's robustness. Performance comparisons were conducted between EQMamba and the EQTransformer over 20 epochs on this modest-sized STEAD subset. Results demonstrate that EQMamba achieves superior performance, with higher F1 scores and faster convergence compared to EQTransformer. EQMamba reached F1 scores of 0.8 by epoch 5 and maintained higher scores throughout training. The model also exhibited more stable validation performance, indicating good generalization capabilities. While both models showed lower accuracy in phase-picking tasks compared to detection, EQMamba's overall performance suggests significant potential for improving seismic data analysis. 
The rapid convergence and superior F1 scores of EQMamba, even on a modest-sized dataset, indicate promising scalability for larger datasets. This study contributes to the field of earthquake engineering by presenting a computationally efficient and accurate method for simultaneous earthquake detection and phase picking. Future work will focus on incorporating Mamba layers into the P and S pickers and further optimizing the architecture for seismic data specifics. The EQMamba method holds the potential for enhancing real-time earthquake monitoring systems and improving our understanding of seismic events.
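
The weighted combination of binary cross-entropy losses described above can be sketched as follows; the task weights and probabilities are illustrative assumptions, not the authors' training configuration.

```python
import math

# Minimal sketch (not the authors' code) of a weighted multi-task loss:
# binary cross-entropy for the detection, P-pick, and S-pick outputs,
# combined with per-task weights. Weights here are assumptions.

def bce(p, y, eps=1e-12):
    """Binary cross-entropy for one predicted probability p and label y."""
    p = min(max(p, eps), 1 - eps)  # clamp to avoid log(0)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def multitask_loss(preds, targets, weights=(0.5, 0.25, 0.25)):
    """preds/targets: (detection, p_pick, s_pick) probabilities / labels."""
    return sum(w * bce(p, y) for w, p, y in zip(weights, preds, targets))

loss = multitask_loss((0.9, 0.8, 0.7), (1, 1, 0))
print(round(loss, 4))  # -> 0.4095
```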

Keywords: earthquake, detection, phase picking, s waves, p waves, transformer, deep learning, seismic waves

Procedia PDF Downloads 52
5339 A Review of the Parameters Used in Gateway Selection Schemes for Internet Connected MANETs

Authors: Zainab S. Mahmood, Aisha H. Hashim, Wan Haslina Hassan, Farhat Anwar

Abstract:

The wide use of Internet-based applications brings many challenges for researchers to guarantee the continuity of the connections needed by mobile hosts and to provide them with reliable Internet access. One solution proposed by the Internet Engineering Task Force (IETF) is to connect the local, multi-hop, infrastructure-less Mobile Ad hoc Network (MANET) to the Internet infrastructure. This connection is made through multi-interface devices known as Internet gateways. Many issues relate to this connection, such as gateway discovery, handoff, address auto-configuration, and selecting the optimum gateway when multiple gateways exist. Many studies have proposed gateway selection schemes based on a single selection criterion or on weighted multiple criteria. This paper reviews some of these schemes, showing the differences, features, challenges, and drawbacks of each.
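
A minimal sketch of the weighted multi-criteria selection idea the review covers: each gateway receives a score from normalized metrics and weights, and the highest-scoring gateway is chosen. The metric names, values, and weights below are illustrative assumptions; actual schemes differ in which criteria they combine and how they normalize them.

```python
# Toy weighted-score gateway selection. Metrics are assumed pre-normalized
# to [0, 1]; lower-is-better metrics (hop count, load) are inverted first.

def score_gateway(metrics, weights):
    """Weighted sum of normalized metrics for one gateway."""
    return sum(weights[k] * metrics[k] for k in weights)

gateways = {
    "GW1": {"residual_bandwidth": 0.9, "inv_hop_count": 0.5, "inv_load": 0.7},
    "GW2": {"residual_bandwidth": 0.6, "inv_hop_count": 1.0, "inv_load": 0.4},
}
weights = {"residual_bandwidth": 0.5, "inv_hop_count": 0.3, "inv_load": 0.2}

best = max(gateways, key=lambda g: score_gateway(gateways[g], weights))
print(best)  # -> GW1
```

A single-criterion scheme is the special case where one weight is 1 and the rest are 0.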

Keywords: Internet Gateway, MANET, mobility, selection criteria

Procedia PDF Downloads 424
5338 Form of Distribution of Traffic Accident and Environment Factors of Road Affecting of Traffic Accident in Dusit District, Only Area Responsible of Samsen Police Station

Authors: Musthaya Patchanee

Abstract:

This research aimed to study the form of traffic accident distribution and the environmental factors of roads that affect traffic accidents in Dusit District, covering only the areas under the responsibility of Samsen Police Station. The data used in this analysis are secondary data on traffic accident cases from 2011. The observed area units are 15 traffic lines under the responsibility of Samsen Police Station. The techniques and methods used are the cartographic method, correlation analysis, and multiple regression analysis. The results on the form of traffic accident distribution show that the Samsen Road area had the most traffic accidents (24.29%), followed by Rachvithi Road (18.10%), Sukhothai Road (15.71%), Rachasrima Road (12.38%), and Amnuaysongkram Road (7.62%). The results also suggest that the scale of accidents has a high positive correlation, statistically significant at the 0.05 level, with the frequency of travel (r=0.857); traffic intersection points (r=0.763) and traffic control equipment (r=0.713) are relevant factors, respectively. The multiple regression analysis shows that travel frequency is the only variable with considerable influence on traffic accidents in the Samsen Police Station area of Dusit District, explaining 73.4% of the variation in traffic accident scale (R² = 0.734). The resulting regression equation is Ŷ = -7.977 + 0.044X₆.
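
The fitted equation Ŷ = -7.977 + 0.044X₆ (X₆ = travel frequency) can be applied directly; the sketch below evaluates it for a hypothetical travel-frequency value, purely to show the arithmetic.

```python
# The reported regression model as a one-liner. The input value is a
# made-up example, not an observation from the study.

def predicted_accidents(travel_frequency):
    """Predicted traffic accident scale from travel frequency (X6)."""
    return -7.977 + 0.044 * travel_frequency

print(round(predicted_accidents(500), 3))  # -> 14.023
```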

Keywords: form of traffic distribution, environmental factors of road, traffic accidents, Dusit district

Procedia PDF Downloads 391
5337 Post-Quantum Resistant Edge Authentication in Large Scale Industrial Internet of Things Environments Using Aggregated Local Knowledge and Consistent Triangulation

Authors: C. P. Autry, A. W. Roscoe, Mykhailo Magal

Abstract:

We discuss the theoretical model underlying 2BPA (two-band peer authentication), a practical alternative to conventional authentication of entities and data in IoT. In essence, this involves assembling a virtual map of authentication assets in the network, typically leading to many paths of confirmation between any pair of entities. This map is continuously updated, confirmed, and evaluated. The value of authentication along multiple disjoint paths becomes very clear, and we require analogues of triangulation to extend authentication along extended paths and deliver it along all possible paths. We discover that if an attacker wants to make an honest node falsely believe she has authenticated another, then the length of the authentication paths is of little importance. This is because optimal attack strategies correspond to minimal cuts in the authentication graph and do not contain multiple edges on the same path. The authentication provided by disjoint paths is normally additive (in entropy).
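
The min-cut/disjoint-path argument above can be made concrete: with unit edge capacities, the maximum flow between two nodes equals the number of edge-disjoint authentication paths, which by max-flow/min-cut is also the number of edges an attacker must compromise. The sketch below computes this on a toy graph (not the 2BPA network itself), using a plain BFS augmenting-path max-flow.

```python
from collections import defaultdict, deque

# Count edge-disjoint paths between s and t in an undirected graph by
# running unit-capacity max-flow (Edmonds-Karp style BFS augmentation).

def edge_disjoint_paths(edges, s, t):
    cap = defaultdict(int)
    adj = defaultdict(set)
    for u, v in edges:                     # undirected unit-capacity edges
        cap[(u, v)] += 1; cap[(v, u)] += 1
        adj[u].add(v); adj[v].add(u)
    flow = 0
    while True:
        parent = {s: None}                 # BFS for an augmenting path
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow                    # no augmenting path left
        v = t
        while parent[v] is not None:       # push one unit along the path
            u = parent[v]
            cap[(u, v)] -= 1; cap[(v, u)] += 1
            v = u
        flow += 1

# Two disjoint confirmation paths between A and D: A-B-D and A-C-D
print(edge_disjoint_paths([("A","B"),("B","D"),("A","C"),("C","D")], "A", "D"))  # -> 2
```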

Keywords: authentication, edge computing, industrial IoT, post-quantum resistance

Procedia PDF Downloads 197
5336 Supplier Risk Management: A Multivariate Statistical Modelling and Portfolio Optimization Based Approach for Supplier Delivery Performance Development

Authors: Jiahui Yang, John Quigley, Lesley Walls

Abstract:

In this paper, the authors develop a stochastic model of investment in supplier delivery performance development from a buyer’s perspective. The authors propose a multivariate model through a Multinomial-Dirichlet distribution within an empirical Bayesian inference framework, representing both the epistemic and aleatory uncertainties in deliveries. A closed-form solution is obtained, and lower and upper bounds for both the optimal investment level and the expected profit under uncertainty are derived. The theoretical properties provide decision makers with useful insights into supplier delivery performance improvement problems where multiple delivery statuses are involved. The authors also extend the model from a single-supplier investment to a supplier portfolio, using a Lagrangian method to obtain a theoretical expression for the optimal investment level and overall expected profit. The model enables a buyer to know how the marginal expected profit/investment level of each supplier changes with respect to the budget and which supplier should be invested in when additional budget is available. An application of this model is illustrated in a simulation study. Overall, the main contribution of this study is to provide an optimal investment decision-making framework for supplier development, taking into account multiple delivery statuses as well as multiple projects.
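
A minimal sketch of the Multinomial-Dirichlet update at the core of such a model: observed delivery outcomes added to a Dirichlet prior give posterior mean probabilities for each delivery status. The prior, status labels, and counts below are made-up examples, not the paper's data.

```python
# Conjugate Dirichlet update for multinomial delivery outcomes.
# Posterior parameters are prior + counts; the posterior mean gives the
# expected probability of each delivery status.

def dirichlet_posterior_mean(prior, counts):
    post = [a + n for a, n in zip(prior, counts)]
    total = sum(post)
    return [a / total for a in post]

prior = [1.0, 1.0, 1.0]   # uninformative prior over 3 statuses (early/on-time/late)
counts = [2, 15, 3]       # observed deliveries per status
print([round(p, 3) for p in dirichlet_posterior_mean(prior, counts)])
```

More deliveries sharpen the posterior, which is how the model separates epistemic uncertainty (reducible by observation) from the aleatory variation in outcomes.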

Keywords: decision making, empirical Bayesian, portfolio optimization, supplier development, supply chain management

Procedia PDF Downloads 288
5335 A Methodology for Automatic Diversification of Document Categories

Authors: Dasom Kim, Chen Liu, Myungsu Lim, Su-Hyeon Jeon, ByeoungKug Jeon, Kee-Young Kwahk, Namgyu Kim

Abstract:

Recently, numerous documents, including unstructured data and text, have been created due to the rapid increase in the use of social media and the Internet. Each document is usually assigned a specific category for the convenience of users. In the past, this categorization was performed manually. Manual categorization, however, not only fails to guarantee accuracy but also requires a large amount of time and incurs huge costs. Many studies have therefore been conducted on the automatic creation of categories. Unfortunately, most of these methods cannot be applied to categorizing complex documents with multiple topics because they assume that each document can be assigned to only one category. To overcome this limitation, some studies have attempted to assign each document to multiple categories, but they are limited in that their learning process requires training on a multi-categorized document set. These methods therefore cannot be applied to the multi-categorization of most documents unless multi-categorized training sets are provided. To remove this requirement of traditional multi-categorization algorithms, we previously proposed a methodology that extends the category of a single-categorized document to multiple categories by analyzing the relationships among categories, topics, and documents. In this paper, we design a survey-based verification scenario for estimating the accuracy of our automatic categorization methodology.
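
The category-extension idea can be sketched in miniature: a document additionally receives any category whose typical topic distribution overlaps its own beyond a threshold. The topics, categories, overlap measure, and threshold below are illustrative assumptions, not the proposed methodology's actual algorithm.

```python
# Toy topic-overlap category extension. Topic distributions are dicts
# mapping topic name -> weight; overlap is the histogram intersection.

def extend_categories(doc_topics, category_topics, threshold=0.3):
    def overlap(a, b):
        return sum(min(a.get(t, 0.0), b.get(t, 0.0)) for t in set(a) | set(b))
    return sorted(c for c, topics in category_topics.items()
                  if overlap(doc_topics, topics) >= threshold)

doc = {"finance": 0.5, "technology": 0.4, "sports": 0.1}
cats = {
    "Economy": {"finance": 0.8, "policy": 0.2},
    "IT":      {"technology": 0.7, "finance": 0.3},
    "Sports":  {"sports": 0.9, "health": 0.1},
}
print(extend_categories(doc, cats))  # -> ['Economy', 'IT']
```

Note that the training signal here is only single-categorized documents plus their topics, which is the point of removing the multi-categorized training-set requirement.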

Keywords: big data analysis, document classification, multi-category, text mining, topic analysis

Procedia PDF Downloads 272
5334 Structural Behavior of Subsoil Depending on Constitutive Model in Calculation Model of Pavement Structure-Subsoil System

Authors: M. Kadela

Abstract:

The load caused by traffic movement should be transferred harmlessly through the road construction as follows: onto the stiff upper layers of the structure (e.g. the abrading and binding asphalt layers), through the layers of the principal and secondary substructure, and onto the subsoil, directly or through an improved subsoil layer. A reliable description of the interaction in the system “road construction – subsoil” should therefore be one of the basic requirements for assessing the internal forces of the structure and its durability. Analyses of road constructions are based on elements of mechanics, which allow computational models to be created, and on experimental results included in the criteria of fatigue life analyses. This approach is a fundamental feature of the commonly used mechanistic methods, which allow arbitrarily complex numerical computational models to be used in evaluating the fatigue life of structures. Considering the work of the system “road construction – subsoil”, it is commonly accepted that, as a result of repetitive loads on the subsoil under the pavement, relatively small deformation grows in the initial phase; this growth then disappears, and the deformation becomes completely reversible. The reliability of the calculation model depends on the appropriate use, for a given type of analysis, of constitutive relationships. Phenomena occurring in the initial stage of the system “road construction – subsoil” are unfortunately difficult to interpret in the modeling process. The classic interpretation of material behavior in the elastic-plastic model (e-p) is that the elastic phase of the work (e) passes into the phase (e-p) as the load increases (or as deformation grows in the damaged structure).
The paper presents the essence of the calibration process of the cooperating subsystem in the calculation model of the system “road construction – subsoil”, created for mechanistic analysis. The calibration process was directed at showing the impact of the applied constitutive models on the deformation and stress response. The proper comparative base for assessing the reliability of the created models should, however, be the actual, monitored system “road construction – subsoil”. The paper also presents the behavior of the subsoil under cyclic load transmitted by the pavement layers. The response of the subsoil to cyclic load is recorded in situ by an observation system (sensors) installed on a testing ground prepared for this purpose, forming part of the test road near Katowice, Poland. Different behavior of the homogeneous subsoil under the pavement is observed in different seasons of the year: the pavement construction works as a flexible structure in summer and as a rigid plate in winter. Although the observed character of the subsoil response is the same regardless of the applied load and area values, this response can be divided into a zone of indirect action of the applied load, extending to a depth of 1.0 m under the pavement, and a zone of small strain, extending to about 2.0 m. This work was supported by the ongoing research project “Stabilization of weak soil by application of layer of foamed concrete used in contact with subsoil” (LIDER/022/537/L-4/NCBR/2013), financed by The National Centre for Research and Development within the LIDER Programme. M. Kadela is with the Department of Building Construction Elements and Building Structures on Mining Areas, Building Research Institute, Silesian Branch, Katowice, Poland (phone: +48 32 730 29 47; fax: +48 32 730 25 22; e-mail: m.kadela@itb.pl).
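
The elastic-plastic (e-p) transition described above can be illustrated with a toy one-dimensional law: stress grows linearly with strain up to the yield stress, after which it is capped (perfect plasticity). The modulus and yield stress below are illustrative values, not calibrated subsoil parameters.

```python
# Toy 1D elastic-perfectly-plastic stress-strain law.
# E and sigma_y are illustrative, not calibrated subsoil parameters.

def stress_1d(strain, youngs_modulus=50e6, yield_stress=100e3):
    """Stress (Pa) for a given strain: elastic below yield, capped above."""
    elastic = youngs_modulus * strain
    if abs(elastic) <= yield_stress:
        return elastic                                     # elastic phase (e)
    return yield_stress if elastic > 0 else -yield_stress  # plastic phase (e-p)

print(round(stress_1d(0.001)))  # -> 50000 (still elastic, 50 kPa)
print(round(stress_1d(0.005)))  # -> 100000 (capped at yield, 100 kPa)
```

Real subsoil constitutive models add hardening, unloading paths, and cyclic effects, which is why calibration against the monitored system matters.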

Keywords: road structure, constitutive model, calculation model, pavement, soil, FEA, response of soil, monitored system

Procedia PDF Downloads 357
5333 Analyze Long-Term Shoreline Change at Yi-Lan Coast, Taiwan Using Multiple Sources

Authors: Geng-Gui Wang, Chia-Hao Chang, Jee-Cheng Wu

Abstract:

A shoreline is the line where a body of water meets the shore. It provides economic and social security to coastal habitations. However, shorelines face multiple threats from both natural processes and man-made effects: disasters, rapid urbanization, industrialization, and sand deposition and erosion, among others. In this study, we analyzed multi-temporal satellite images of the Yilan coast, Taiwan, from 1978 to 2016 using the United States Geological Survey (USGS) Digital Shoreline Analysis System (DSAS), together with weather information (rainfall records and typhoon routes) and data on man-made construction projects, to explore the causes of shoreline change. The results show that the shoreline of the Yilan coast is greatly influenced by typhoons and anthropogenic interventions.
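
One standard DSAS statistic for this kind of analysis is the End Point Rate (EPR): the shoreline displacement along a transect divided by the time elapsed between the oldest and newest shorelines. The sketch below computes it for made-up transect positions and dates, not the Yilan measurements.

```python
from datetime import date

# End Point Rate along one transect: displacement / elapsed years.
# A negative rate indicates erosion (shoreline retreat).

def end_point_rate(pos_old_m, date_old, pos_new_m, date_new):
    """Metres of shoreline movement per year along a transect."""
    years = (date_new - date_old).days / 365.25
    return (pos_new_m - pos_old_m) / years

# Hypothetical transect: shoreline retreats 38 m over 38 years
epr = end_point_rate(120.0, date(1978, 1, 1), 82.0, date(2016, 1, 1))
# epr is approximately -1.0 m/yr (erosion)
```

DSAS computes this per transect along the whole coast, which is how erosion and accretion zones are mapped over the 1978-2016 period.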

Keywords: shoreline change, multi-temporal satellite, digital shoreline analysis system, DSAS, Yi-Lan coast

Procedia PDF Downloads 163