Search results for: multiple layers nonwoven

5314 A Highly Efficient Broadcast Algorithm for Computer Networks

Authors: Ganesh Nandakumaran, Mehmet Karaata

Abstract:

A wave is a distributed execution, often made up of a broadcast phase followed by a feedback phase, requiring the participation of all the system processes before a particular event called the decision is taken. Wave algorithms with one initiator, such as the 1-wave algorithm, have been shown to be very efficient for broadcasting messages in tree networks. Extensions of this algorithm that broadcast a sequence of waves using a single initiator have been implemented in algorithms such as the m-wave algorithm. However, as the network size increases, having a single initiator adversely affects the message delivery times to nodes further away from the initiator. As a remedy, broadcast waves can be initiated by multiple initiator nodes distributed across the network to reduce the completion time of broadcasts. The waves initiated by one or more initiator processes form a collection of waves covering the entire network. Global snapshots, distributed broadcast, and various synchronization problems can be solved efficiently using waves with multiple concurrent initiators. In this paper, we propose the first stabilizing multi-wave sequence algorithm implementing waves started by multiple initiator processes such that every process in the network receives at least one sequence of broadcasts. Being stabilizing, the proposed algorithm can withstand transient faults and does not require initialization. We view a fault as transient if it perturbs the configuration of the system but not its program.
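
For intuition only, the toy sketch below simulates a single propagation-of-information-with-feedback (PFC) wave on a small hard-coded tree; it is not the stabilizing multi-initiator algorithm proposed in the paper, and the tree and payload are invented.

```python
# Toy PIF/PFC wave on a tree: a broadcast phase down from the initiator followed
# by a feedback phase back up, after which the initiator can take the "decision".
# The topology and message are hypothetical; the paper's stabilizing,
# multi-initiator version is not shown here.
tree = {0: [1, 2], 1: [3, 4], 2: [5], 3: [], 4: [], 5: []}   # node -> children

def pfc_wave(node, message, depth=0):
    print("  " * depth + f"node {node} receives broadcast: {message}")
    for child in tree[node]:
        pfc_wave(child, message, depth + 1)                  # broadcast phase
    print("  " * depth + f"node {node} returns feedback")    # feedback phase

pfc_wave(0, "round-1 payload")
print("initiator 0 has feedback from every process -> decision")
```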

Keywords: distributed computing, multi-node broadcast, propagation of information with feedback and cleaning (PFC), stabilization, wave algorithms

Procedia PDF Downloads 503
5313 Modifications in Design of Lap Joint of Fiber Metal Laminates

Authors: Shaher Bano, Samia Fida, Asif Israr

Abstract:

The continuous development and exploitation of materials and designs have turned the attention of the world towards the use of robust composite materials known as fiber metal laminates (FMLs) in many high-performance applications. The hybrid structure of fiber metal laminates makes them a material of choice for applications such as aircraft skin panels, fuselage floorings, door panels, and other load-bearing applications. The synergy between the properties of metals and fiber-reinforced laminates is responsible for their high damage tolerance, as the metal element provides better fatigue and impact properties, while high stiffness and better corrosion properties are inherited from the fiber-reinforced matrix system. FMLs are mostly used as layered structures in different joint configurations such as lap and butt joints, with the layers usually bonded to each other using either mechanical fasteners or adhesive bonds. This research work focuses on the modification of an adhesively bonded joint: a single lap joint of carbon-fiber-based CARALL FML is modified to increase the interlaminar shear strength and avoid delamination. For this purpose, different joint modification techniques, such as the introduction of spews and shoulders to modify the bond shape and the use of nanofillers such as carbon nanotubes as reinforcement in the adhesive material, have been utilized to improve the shear strength of the lap joint of the adhesively bonded FML layers. Both the simulation and experimental results showed that the lap joint with the spew-and-shoulder configuration has better properties due to the stress being distributed over a larger area at the corner of the joint. The introduction of carbon nanotubes also had a positive effect on the shear stress and joint strength, as they act as reinforcement in the adhesive bond material.

Keywords: adhesive joint, Carbon Reinforced Aluminium Laminate (CARALL), fiber metal laminates, spews

Procedia PDF Downloads 296
5312 Vendor Selection and Supply Quotas Determination by Using Revised Weighting Method and Multi-Objective Programming Methods

Authors: Tunjo Perič, Marin Fatović

Abstract:

In this paper a new methodology for vendor selection and supply quotas determination (VSSQD) is proposed. The problem of VSSQD is solved by the model that combines revised weighting method for determining the objective function coefficients, and a multiple objective linear programming (MOLP) method based on the cooperative game theory for VSSQD. The criteria used for VSSQD are: (1) purchase costs and (2) product quality supplied by individual vendors. The proposed methodology is tested on the example of flour purchase for a bakery with two decision makers.
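
As an illustration of the general idea (not the authors' model), the sketch below solves a weighted-sum scalarization of a two-criteria vendor quota problem with SciPy; the costs, quality scores, capacities, demand, and criterion weights are all hypothetical.

```python
# Illustrative sketch only: weighted-sum scalarization of a two-criteria
# vendor quota problem. All data below are invented, and the revised
# weighting and cooperative-game steps of the paper are not reproduced.
import numpy as np
from scipy.optimize import linprog

cost = np.array([2.1, 2.4, 1.9])        # purchase cost per unit, vendors 1..3
quality = np.array([0.80, 0.95, 0.70])  # quality score per unit (to maximize)
capacity = [60, 50, 40]                 # maximum supply per vendor
demand = 100.0
w_cost, w_quality = 0.6, 0.4            # hypothetical criterion weights

# Minimize w_cost*cost - w_quality*quality subject to meeting total demand.
c = w_cost * cost - w_quality * quality
A_eq = np.ones((1, 3))
b_eq = [demand]
bounds = [(0, cap) for cap in capacity]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("supply quotas per vendor:", res.x)
```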

Keywords: cooperative game theory, multiple objective linear programming, revised weighting method, vendor selection

Procedia PDF Downloads 355
5311 Bioinformatics Approach to Identify Physicochemical and Structural Properties Associated with Successful Cell-free Protein Synthesis

Authors: Alexander A. Tokmakov

Abstract:

Cell-free protein synthesis is widely used to synthesize recombinant proteins. It allows genome-scale expression of various polypeptides under strictly controlled, uniform conditions. However, only a minor fraction of all proteins can be successfully expressed in the protein synthesis systems currently in use, and the factors determining expression success are poorly understood. At present, a vast volume of data has accumulated in cell-free expression databases, which makes comprehensive bioinformatics analysis and the identification of multiple features associated with successful cell-free expression possible. Here, we describe an approach aimed at identifying multiple physicochemical and structural properties of amino acid sequences associated with protein solubility and aggregation, and we highlight the major correlations obtained using this approach. The developed method includes: categorical assessment of the protein expression data, calculation and prediction of multiple properties of the expressed amino acid sequences, correlation of the individual properties with the expression scores, and evaluation of the statistical significance of the observed correlations. Using this approach, we revealed a number of statistically significant correlations between calculated and predicted features of protein sequences and their amenability to cell-free expression. It was found that some of the features, such as protein pI, hydrophobicity, and the presence of signal sequences, are mostly related to protein solubility, whereas others, such as protein length, number of disulfide bonds, and content of secondary structure, affect mainly the expression propensity. We also demonstrated that the amenability of polypeptide sequences to cell-free expression correlates with the presence of multiple sites of post-translational modification. The correlations revealed in this study provide a plethora of important insights into protein folding and the rationalization of protein production. The developed bioinformatics approach can be of practical use for predicting expression success and optimizing cell-free protein synthesis.
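
A minimal sketch of the correlation step is shown below, assuming a toy table of sequences with a binary expression outcome; the real pipeline computes far more physicochemical and structural features and uses the actual database scores.

```python
# Minimal sketch: correlate simple, hand-rolled sequence features with a
# binary cell-free expression outcome (1 = expressed). Sequences, outcomes,
# and the feature set are invented for illustration.
from scipy.stats import spearmanr

seqs = ["MKTAYIAKQR", "MSHHWGYGKHNGPEHWHKDFPIAKGERQSPVDIDTHTA",
        "MKKLLPT", "MDDDIAALVVDNGSGMCKAG"]
expressed = [1, 0, 1, 0]

def features(seq):
    hydrophobic = set("AVILMFWC")
    return {
        "length": len(seq),
        "hydrophobic_fraction": sum(aa in hydrophobic for aa in seq) / len(seq),
        "cysteine_count": seq.count("C"),   # crude proxy for disulfide bonds
    }

table = [features(s) for s in seqs]
for name in table[0]:
    rho, p = spearmanr([row[name] for row in table], expressed)
    print(f"{name:22s} rho = {rho:+.2f}  p = {p:.2f}")
```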

Keywords: bioinformatics analysis, cell-free protein synthesis, expression success, optimization, recombinant proteins

Procedia PDF Downloads 415
5310 The Study on Enhanced Micro Climate of the Oyster Mushroom Cultivation House with Multi-Layered Shelves by Using Computational Fluid Dynamics Analysis in Winter

Authors: Sunghyoun Lee, Byeongkee Yu, Chanjung Lee, Yeongtaek Lim

Abstract:

Oyster mushrooms are one of the ingredients that Koreans prefer. The oyster mushroom cultivation house has multiple layers of shelves in order to increase mushroom production per unit area. However, the growing shelves in the house act as obstacles and hinder the circulation of the interior air, which leads to differences in the cultivation environment between the upper and lower parts of the growing shelves. Because of these differences, growth varies across the area of the growing shelves. It is known that minute air circulation around the mushroom cap facilitates the metabolism of mushrooms and improves their quality. This study used the computational fluid dynamics (CFD) program FLUENT R16 to analyze improvements in the uniformity of the internal environment of the oyster mushroom cultivation house. The analyzed factors are the velocity, temperature, and humidity distributions. To maintain the uniformity of the internal environment of the oyster mushroom cultivation house, installing a circulation fan at the upper part of the working passage, directed towards the ceiling, appeared to be effective. When all the environmental control equipment – unit cooler, inlet fan, outlet fan, air circulation fan, and humidifier – operated simultaneously, the RMS figures on the growing shelves were: velocity 28.23%, temperature 30.47%, humidity 7.88%. However, when only the unit cooler and air circulation fan operated, the RMS figures on the growing shelves were: velocity 22.28%, temperature 0.87%, humidity 0.82%. Therefore, to maintain the uniformity of the internal environment of the mushroom cultivation house, the overall operating time of the inlet fan, outlet fan, and humidifier should be reduced, and the internal environment should be managed appropriately with the unit cooler and air circulation fan.
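
The RMS uniformity figures quoted above are presumably root-mean-square deviations from the mean, expressed as a percentage; a short sketch under that assumption, with invented shelf-level air speeds:

```python
# Assumption: the "RMS figure" is the RMS deviation from the mean as a percent
# of the mean (a coefficient-of-variation-like non-uniformity measure).
import numpy as np

def rms_nonuniformity(values):
    """RMS deviation from the mean, as a percent of the mean."""
    v = np.asarray(values, dtype=float)
    return 100.0 * np.sqrt(np.mean((v - v.mean()) ** 2)) / v.mean()

# hypothetical shelf-level air speeds (m/s) sampled from the CFD solution
speeds = [0.32, 0.45, 0.28, 0.51, 0.39]
print(f"velocity non-uniformity: {rms_nonuniformity(speeds):.2f} %")
```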

Keywords: air circulation fan, computational fluid dynamics, multi-layered shelves cultivation, oyster mushroom cultivation house

Procedia PDF Downloads 201
5309 TopClosure® of Large Abdominal Wall Defect Instead of Staged Hernia Repair as Part of Damage Control Laparotomy

Authors: Andriy Fedorenko

Abstract:

Background: Early closure of the open abdomen is a priority after damage control laparotomy to prevent retraction of the fascial layers and avoid hernia formation that would require definitive repair at a later stage. This substantially reduces the complications associated with ventral hernia formation for up to a year after the initial surgery. TopClosure® is an innovative method that employs stress-relaxation and mechanical creep for skin stretching. Its use enables the primary closure of large abdominal wall defects and mitigates large ventral hernia formation. Materials and Methods: A 7-year-old girl presented with a severe blast injury. She underwent initial laparotomy in a facility within the conflict zone and was transferred in a state of septic shock to our facility for further care. Her abdominal injuries included liver lacerations, multiple perforations of the transverse colon and ileum, and an 8 x 16 cm oblique abdominal wall defect. Further damage control laparotomy was performed with primary suture of the colon and ileum and temporary closure of the abdomen using a Bogota bag. Twelve hours later, negative pressure wound therapy (NPWT) was applied to the abdominal wound after a relook laparotomy. Five days later, TopClosure® was applied to the lower part of the wound, incorporating NPWT in the upper wound. Results: The patient suffered a leak from the colonic suture line and required relaparotomy. TopClosure® abdominal closure was achieved after every laparotomy. Conclusion: TopClosure® utilizes the viscoelastic properties of the skin to achieve full closure of the abdominal wall (including the fascia and skin), eliminating the need for prolonged NPWT, skin grafting, and delayed ventral hernia repair surgery.

Keywords: topclosure, abdominal wall defect, hernia, damage control

Procedia PDF Downloads 74
5308 Adaptive Filtering in Subbands for Supervised Source Separation

Authors: Bruna Luisa Ramos Prado Vasques, Mariane Rembold Petraglia, Antonio Petraglia

Abstract:

This paper investigates MIMO (Multiple-Input Multiple-Output) adaptive filtering techniques for the application of supervised source separation in the context of convolutive mixtures. From the observation that there is correlation among the signals of the different mixtures, an improvement in the NSAF (Normalized Subband Adaptive Filter) algorithm is proposed in order to accelerate its convergence rate. Simulation results with mixtures of speech signals in reverberant environments show the superior performance of the proposed algorithm with respect to the performances of the NLMS (Normalized Least-Mean-Square) and conventional NSAF, considering both the convergence speed and SIR (Signal-to-Interference Ratio) after convergence.
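
For reference, a compact sketch of the NLMS baseline (one of the algorithms the proposed method is compared against) in a simple system-identification setting; the filter order, step size, and signals are invented, and the subband (NSAF) extension is not shown.

```python
# NLMS baseline sketch (not the proposed subband algorithm): identify an
# unknown FIR path h from its input x and observed output d. All signals and
# parameters below are synthetic.
import numpy as np

def nlms(x, d, order=16, mu=0.5, eps=1e-6):
    """Normalized LMS: adapt w so that w . [x[n], ..., x[n-order+1]] tracks d[n]."""
    w = np.zeros(order)
    y = np.zeros_like(d)
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]   # newest sample first
        y[n] = w @ u
        e = d[n] - y[n]
        w += mu * e * u / (u @ u + eps)    # power-normalized step
    return w, y

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)              # excitation signal
h = rng.standard_normal(16) * 0.3          # unknown mixing/echo path
d = np.convolve(x, h)[: len(x)]            # observed ("desired") signal
w_hat, _ = nlms(x, d)
print("identification error:", float(np.linalg.norm(w_hat - h)))
```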

Keywords: adaptive filtering, multi-rate processing, normalized subband adaptive filter, source separation

Procedia PDF Downloads 429
5307 Stand Alone Multiple Trough Solar Desalination with Heat Storage

Authors: Abderrahmane Diaf, Kamel Benabdellaziz

Abstract:

Remote arid areas of the vast expanses of the African deserts hold huge subterranean reserves of brackish water resources waiting for economic development. This work presents design guidelines as well as initial performance data for new autonomous solar desalination equipment that could help local communities produce their own fresh water using solar energy only and, why not, contribute to transforming desert lands into lush gardens. The output of solar distillation equipment is typically low, around 3 l/m2/day on average. This new design, with an integrated, water-based, environmentally friendly solar heat storage system, produced 5 l/m2/day in early spring weather, and its output exceeded 9 l/m2/day during summer.

Keywords: multiple trough distillation, solar desalination, solar distillation with heat storage, water based heat storage system

Procedia PDF Downloads 436
5306 Capacitated Multiple Allocation P-Hub Median Problem on a Cluster Based Network under Congestion

Authors: Çağrı Özgün Kibiroğlu, Zeynep Turgut

Abstract:

This paper considers a hub location problem where the network service area is partitioned into predetermined zones (represented by given node clusters) and the capacity levels of potential hub nodes are determined a priori as a hub selection criterion, in order to investigate the effect of congestion on the network. The objective is to design the hub network by determining all required hub locations in the node clusters and allocating non-hub nodes to hubs such that the total cost – including the transportation cost, the cost of opening hubs, and the penalty cost for exceeding the capacity level at hubs – is minimized. A mixed integer linear programming model is developed by introducing additional constraints into the traditional capacitated multiple allocation hub location model, and it is tested empirically.
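
A compact sketch of a plain capacitated multiple allocation p-hub median model is given below for orientation; it uses random data, omits the congestion penalty and cluster-based constraints described in the paper, and all coefficients are hypothetical.

```python
# Illustrative sketch only: a basic capacitated multiple-allocation p-hub median
# model in PuLP on random data. The paper's congestion penalty, hub opening
# costs, and cluster constraints are omitted; every number here is invented.
import itertools
import numpy as np
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum

rng = np.random.default_rng(1)
n, p, alpha = 5, 2, 0.7                    # nodes, hubs to open, inter-hub discount
N = range(n)
pts = rng.random((n, 2))
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2).tolist()  # distances
w_arr = rng.integers(1, 10, (n, n)); np.fill_diagonal(w_arr, 0)
w = w_arr.tolist()                         # origin-destination flows
total_flow = float(w_arr.sum())
cap = 0.6 * total_flow                     # per-hub capacity (hypothetical)

prob = LpProblem("capacitated_multiple_allocation_p_hub", LpMinimize)
z = LpVariable.dicts("z", N, cat=LpBinary)                        # hub open
x = LpVariable.dicts("x", (N, N, N, N), lowBound=0, upBound=1)    # share of i->j via hubs k, m

prob += lpSum(w[i][j] * (d[i][k] + alpha * d[k][m] + d[m][j]) * x[i][j][k][m]
              for i, j, k, m in itertools.product(N, repeat=4))
prob += lpSum(z[k] for k in N) == p
for i, j in itertools.product(N, repeat=2):
    prob += lpSum(x[i][j][k][m] for k, m in itertools.product(N, repeat=2)) == 1
    for k, m in itertools.product(N, repeat=2):
        prob += x[i][j][k][m] <= z[k]
        prob += x[i][j][k][m] <= z[m]
for k in N:   # capacity on the flow collected at hub k
    prob += lpSum(w[i][j] * x[i][j][k][m]
                  for i, j, m in itertools.product(N, repeat=3)) <= cap

prob.solve()
print("open hubs:", [k for k in N if z[k].value() > 0.5])
```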

Keywords: hub location problem, p-hub median problem, clustering, congestion

Procedia PDF Downloads 487
5305 The Relationship between Representational Conflicts, Generalization, and Encoding Requirements in an Instance Memory Network

Authors: Mathew Wakefield, Matthew Mitchell, Lisa Wise, Christopher McCarthy

Abstract:

The properties of memory representations in artificial neural networks have cognitive implications. Distributed representations that encode instances as a pattern of activity across layers of nodes afford memory compression and enforce the selection of a single point in instance space. These encoding schemes also appear to distort the representational space, as well as trading off the ability to validate that input information is within the bounds of past experience. In contrast, a localist representation which encodes some meaningful information into individual nodes in a network layer affords less memory compression while retaining the integrity of the representational space. This allows the validity of an input to be determined. The validity (or familiarity) of input along with the capacity of localist representation for multiple instance selections affords a memory sampling approach that dynamically balances the bias-variance trade-off. When the input is familiar, bias may be high by referring only to the most similar instances in memory. When the input is less familiar, variance can be increased by referring to more instances that capture a broader range of features. Using this approach in a localist instance memory network, an experiment demonstrates a relationship between representational conflict, generalization performance, and memorization demand. Relatively small sampling ranges produce the best performance on a classic machine learning dataset of visual objects. Combining memory validity with conflict detection produces a reliable confidence judgement that can separate responses with high and low error rates. Confidence can also be used to signal the need for supervisory input. Using this judgement, the need for supervised learning as well as memory encoding can be substantially reduced with only a trivial detriment to classification performance.
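
A conceptual sketch of the familiarity-driven sampling idea follows, with hypothetical thresholds: familiar queries consult a few very similar instances (higher bias), unfamiliar queries consult a broader sample (higher variance), and agreement among retrieved labels acts as a confidence signal.

```python
# Conceptual sketch only: the memory content, similarity measure, sampling
# sizes, and familiarity threshold are invented, not the paper's network.
import numpy as np

rng = np.random.default_rng(0)
memory = rng.random((200, 8))                      # stored instance features
labels = rng.integers(0, 3, 200)                   # their class labels

def classify(query, familiar_k=3, unfamiliar_k=25, familiarity_threshold=0.9):
    sims = memory @ query / (np.linalg.norm(memory, axis=1) * np.linalg.norm(query))
    familiarity = sims.max()                       # validity of the input
    k = familiar_k if familiarity >= familiarity_threshold else unfamiliar_k
    votes = labels[np.argsort(sims)[-k:]]          # sample the k most similar instances
    label = np.bincount(votes).argmax()
    confidence = (votes == label).mean()           # high agreement = low conflict
    return label, familiarity, confidence

print(classify(rng.random(8)))
```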

Keywords: artificial neural networks, representation, memory, conflict monitoring, confidence

Procedia PDF Downloads 126
5304 Applying Multiple Kinect on the Development of a Rapid 3D Mannequin Scan Platform

Authors: Shih-Wen Hsiao, Yi-Cheng Tsao

Abstract:

In the fields of reverse engineering and the creative industries, applying a 3D scanning process to obtain the geometric form of objects is a mature and common technique. For instance, organic objects such as faces and non-organic objects such as products can be scanned to acquire geometric information for further application. However, although the data resolution of 3D scanning devices is increasing and complementary applications are becoming more abundant, the penetration of 3D scanning among the public is still limited by the relatively high price of the devices. On the other hand, the Kinect, released by Microsoft, is known for its powerful functions, considerably lower price, and complete technology and database support. Therefore, related studies can be carried out with the Kinect at an acceptable cost and data precision. Because the Kinect uses an optical mechanism to extract depth information, it is limited by the straight-line path of the light; thus, scanning from various angles is required sequentially to obtain complete 3D information about an object when a single Kinect is used. An integration process that combines the 3D data from the different angles by certain algorithms is also required. This sequential scanning process takes considerable time, and the complex integration process often encounters technical problems. Therefore, this paper applies multiple Kinects simultaneously to the development of a rapid 3D mannequin scanning platform and proposes suggestions on the number and angles of the Kinects. A method for establishing the coordinate system based on the relation between the mannequin and the specifications of the Kinect is proposed, and a suggestion for the angles and number of Kinects is described. An experiment applying multiple Kinects to the scanning of a 3D mannequin is constructed using the Microsoft API, and the results show that the time required for scanning and the technical threshold can be reduced in the fashion and garment design industries.

Keywords: 3D scan, depth sensor, fashion and garment design, mannequin, multiple Kinect sensor

Procedia PDF Downloads 363
5303 Non-Methane Hydrocarbons Emission during the Photocopying Process

Authors: Kiurski S. Jelena, Aksentijević M. Snežana, Kecić S. Vesna, Oros B. Ivana

Abstract:

The proliferation of electronic equipment in photocopying environments has not only improved work efficiency but also changed indoor air quality. Considering the number of photocopiers employed, indoor air quality may be worse than in general office environments. Determining the contribution of any type of equipment to indoor air pollution is a complex matter. Non-methane hydrocarbons are known to play an important role in air quality due to their high reactivity. The presence of hazardous pollutants in indoor air was detected in a photocopying shop in Novi Sad, Serbia. Air samples were collected and analyzed for five days, during the 8-hour working time in three time intervals, at three different sampling points. Using a multiple linear regression model in the STATISTICA 10 software package, the concentrations of the occupational hazards and the microclimate parameters were mutually correlated. Based on the obtained multiple coefficients of determination (0.3751, 0.2389, and 0.1975), a weak positive correlation between the observed variables was determined. Small values of the F parameter indicated no statistically significant dependence of the non-methane hydrocarbon concentration levels on the microclimate parameters. The results showed that the variables could be represented by the general regression model y = b0 + b1xi1 + b2xi2. The obtained regression equations make it possible to measure the quantitative agreement between the variations of the variables and thus obtain more accurate knowledge of their mutual relations.
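
The stated general model y = b0 + b1xi1 + b2xi2 can be fitted by ordinary least squares as sketched below; the temperature, humidity, and concentration values are made up, not the measured data.

```python
# Sketch of fitting y = b0 + b1*x1 + b2*x2 by ordinary least squares.
# The readings below are invented placeholders for the study's measurements.
import numpy as np

temperature = np.array([22.1, 23.4, 24.0, 22.8, 25.1, 24.6])     # x1
humidity    = np.array([41.0, 39.5, 44.2, 40.1, 46.3, 43.8])     # x2
nmhc        = np.array([0.52, 0.55, 0.61, 0.50, 0.66, 0.60])     # y, ppm

X = np.column_stack([np.ones_like(temperature), temperature, humidity])
coef, *_ = np.linalg.lstsq(X, nmhc, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((nmhc - pred) ** 2) / np.sum((nmhc - nmhc.mean()) ** 2)
print("b0, b1, b2 =", np.round(coef, 4), " R^2 =", round(r2, 3))
```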

Keywords: non-methane hydrocarbons, photocopying process, multiple regression analysis, indoor air quality, pollutant emission

Procedia PDF Downloads 376
5302 Performance Comparison of Joint Diagonalization Structure (JDS) Method and Wideband MUSIC Method

Authors: Sandeep Santosh, O. P. Sahu

Abstract:

We simulate an efficient algorithm for localizing multiple wideband and non-stationary sources by exploiting both the non-stationarity of the signals and the array geometric information. This algorithm is based on the joint diagonalization structure (JDS) of a set of short-time power spectrum matrices at different time instants in each frequency bin. JDS can be used for quick and accurate localization of multiple non-stationary sources. The JDS algorithm is a one-stage process, i.e., it directly searches for the directions of arrival (DOAs) over the continuous location parameter space. The JDS method requires that the number of sensors is not less than the number of sources. The simulation results show that the JDS method can localize two sources when their angular separation is not less than 7 degrees, whereas wideband MUSIC is only able to localize two sources with a separation of at least 18 degrees.

Keywords: joint diagonalization structure (JDS), wideband direction of arrival (DOA), wideband MUSIC

Procedia PDF Downloads 463
5301 The Construction of the Semigroup Which Is Chernoff Equivalent to Statistical Mixture of Quantizations for the Case of the Harmonic Oscillator

Authors: Leonid Borisov, Yuri Orlov

Abstract:

We obtain explicit formulas for finitely multiple approximations of the equilibrium density matrix in the case of the harmonic oscillator using Chernoff's theorem and the notion of a semigroup which is Chernoff equivalent to the average semigroup. We also find explicit formulas for the corresponding approximate Wigner functions and average values of the observable. We consider a superposition of τ-quantizations representing a wide class of linear quantizations. We show that the convergence of the approximations of the average values of the observable is not uniform with respect to the Gibbs parameter. This does not allow the approximate expression to be represented as the sum of the exact limit and small deviations uniformly throughout the temperature range with a given order of approximation.

Keywords: Chernoff theorem, Feynman formulas, finitely multiple approximation, harmonic oscillator, Wigner function

Procedia PDF Downloads 435
5300 Design and Implementation of Smart Watch Textile Antenna for Wi-Fi Bio-Medical Applications in Millimetric Wave Band

Authors: M. G. Ghanem, A. M. M. A. Allam, Diaa E. Fawzy, Mehmet Faruk Cengiz

Abstract:

This paper is devoted to the design and implementation of a smartwatch textile antenna for Wi-Fi bio-medical applications in the millimetric wave band. The antenna is implemented on a leather textile-based substrate to be embedded in a smartwatch, enabling the watch to pick up Wi-Fi signals without needing to be connected to a mobile phone through Bluetooth. It operates in the 60 GHz or WiGig (Wireless Gigabit Alliance) band, with a wide bandwidth for higher-rate applications. It could also be placed over many stratified layers of body tissue for use in the diagnosis of diseases such as diabetes and cancer. The structure is designed and simulated using the CST Studio Suite program. The wearable patch antenna has an octagonal shape and is implemented on leather, which acts as a flexible substrate with a size of 5.632 x 6.4 x 2 mm3, a relative permittivity of 2.95, and a loss tangent of 0.006. The feeding is carried out using a differential feed (a discrete port in CST). The work provides five antenna implementations: the antenna without a ground plane; the antenna with a ground plane added at the back in order to increase the gain; the antenna with the substrate dimensions increased to 15 x 30 mm2 to resemble a real wristwatch size; the antenna with layers of skin and fat added under the ground plane to study the effect of human body tissues on the antenna performance; and, finally, the whole structure bent. It is found that the antenna achieves a simulated peak realized gain in dB of 5.68, 7.28, 6.15, 3.03, and 4.37 for the antenna without a ground plane, the antenna with a ground plane, the antenna with larger substrate dimensions, the antenna with skin and fat, and the bent structure, respectively. The antenna with a ground plane exhibits high gain, while adding the human tissue layers degrades the gain because of absorption by the body. The bent structure contributes to a higher gain.

Keywords: bio medical engineering, millimetric wave, smart watch, textile antennas, Wi-Fi

Procedia PDF Downloads 116
5299 Smoking and Alcohol Consumption Predicts Multiple Head and Neck Cancers

Authors: Kim Kennedy, Daren Gibson, Stephanie Flukes, Chandra Diwakarla, Lisa Spalding, Leanne Pilkington, Andrew Redfern

Abstract:

Introduction: It is well known that patients with head and neck cancer (HNC) are at increased risk of subsequent head and neck cancers due to various aetiologies. Aim: We sought to determine the factors contributing to an increased risk of subsequent HNC primaries and to evaluate whether Aboriginal patients are at increased risk. Methods: We performed a retrospective cohort analysis of 320 HNC patients from a single centre in Western Australia, identifying 80 Aboriginal patients and 240 non-Aboriginal patients matched on a 1:3 ratio by site, histology, rurality, and age. We collected patient data including smoking and alcohol consumption, tumour and treatment data, and data on subsequent HNC primaries. Results: A subsequent HNC primary was seen in 37 patients (11.6%) overall. There was no significant difference in the rate of second primary HNCs between Aboriginal patients (12.5%) and non-Aboriginal patients (11.2%) (p=0.408). Subsequent HNCs were, however, strongly associated with smoking and alcohol consumption: 95% of patients with a second primary were ever-smokers, and 54% of patients with a second primary had a history of excessive alcohol consumption. In the 37 patients with multiple HNC primaries, there were a total of 57 HNCs, with 29 patients having two primaries, six patients having three, one patient having four, and one having six. Fifty-four of the 57 cancers were in ever-smokers (94.7%). There were only two multiple HNC primaries in a never-smoking, non-drinking patient, and these cases were of unknown aetiology, with HPV/p16 status unknown in both. In the whole study population, there were 32 HPV-positive HNCs and 67 p16-positive HNCs, with only two second HNCs in p16-positive cases, giving a rate of 3% in the p16-positive population, which is much lower than the rate of second primaries seen in the overall population (11.6%); the rate was highest in the p16-negative population (15.7%). This suggests that p16 positivity is not a strong risk factor for subsequent primaries and that, in fact, p16 negativity appears to be associated with increased risk; however, these data are limited by the large number of patients without documented p16 status (45.3% overall, 12% of oropharyngeal primaries, and 59.6% of oral cavity primaries had unknown p16 status). Summary: Subsequent HNC primaries were strongly associated with smoking and alcohol excess. Second and later HNC primaries did not appear to occur at increased rates in Aboriginal patients compared with non-Aboriginal patients, and p16 positivity did not predict increased risk; however, p16 negativity was associated with an increased risk of subsequent HNCs.

Keywords: head and neck cancer, multiple primaries, aboriginal, p16 status, smoking, alcohol

Procedia PDF Downloads 65
5298 Comparative Analysis of Single vs. Multiple gRNA on NGN3 Expression Using a Controllable dCas9-VP192 Activator (CRISPRa)

Authors: Nicholas Abdilmasih, Habib Rezanejad

Abstract:

This study investigates the gene expression induction efficiency of single versus multiple guide RNAs (gRNAs) targeting the NGN3 gene using the CRISPR activation system in HEK293 cells. Our study aims to contribute to optimizing the use of gRNAs in gene therapy applications, particularly in treating diseases like diabetes, where precise gene regulation is essential. In the experimental design, HEK293 cells were cultured and, once they reached approximately 70-80% confluence, were transfected with specific gRNAs targeting the NGN3 gene promoter. The gRNAs targeting the NGN3 promoter, which had been designed previously and incorporated into plasmid clone cassettes, were introduced into HEK293 cells through co-transfection with the pCAG-DDdCas9-VP192-EGFP transactivator. Post-transfection, cell viability and fluorescence were monitored to assess transfection efficiency. RNA was extracted, converted to cDNA, and analyzed via qPCR to measure NGN3 expression levels. The results indicated that specific combinations of fewer gRNAs led to higher NGN3 activation compared with multiple gRNAs, challenging the assumption that more gRNAs result in synergistic gene activation. These findings suggest that optimized gRNA combinations can enhance gene therapy efficiency, potentially leading to more effective treatments for conditions like diabetes.

Keywords: CRISPR activation, Diabetes mellitus, gene therapy, guide RNA, Neurogenin3

Procedia PDF Downloads 15
5297 A Review of the Parameters Used in Gateway Selection Schemes for Internet Connected MANETs

Authors: Zainab S. Mahmood, Aisha H. Hashim, Wan Haslina Hassan, Farhat Anwar

Abstract:

The wide use of Internet-based applications brings many challenges for researchers in guaranteeing the continuity of the connections needed by mobile hosts and providing them with reliable Internet access. One of the solutions proposed by the Internet Engineering Task Force (IETF) is to connect the local, multi-hop, infrastructure-less Mobile Ad hoc Network (MANET) to the Internet infrastructure. This connection is made through multi-interface devices known as Internet gateways. Many issues are related to this connection, such as gateway discovery, handoff, address auto-configuration, and selecting the optimum gateway when multiple gateways exist. Many studies have proposed gateway selection schemes based on a single selection criterion or on weighted multiple criteria. In this research, a review of several of these schemes is presented, highlighting the differences, features, challenges, and drawbacks of each of them.

Keywords: Internet Gateway, MANET, mobility, selection criteria

Procedia PDF Downloads 419
5296 Form of Distribution of Traffic Accident and Environment Factors of Road Affecting of Traffic Accident in Dusit District, Only Area Responsible of Samsen Police Station

Authors: Musthaya Patchanee

Abstract:

This research aimed to study the distribution pattern of traffic accidents and the road environment factors that affect them in Dusit District, specifically the area under the responsibility of Samsen Police Station. The data used in this analysis are secondary data on traffic accident cases from 2011. The observed area units are 15 traffic routes under the responsibility of Samsen Police Station. The techniques and methods used are the cartographic method, correlation analysis, and multiple regression analysis. The results on the distribution of traffic accidents show that the Samsen Road area had the most traffic accidents (24.29%), followed by Rachvithi Road (18.10%), Sukhothai Road (15.71%), Rachasrima Road (12.38%), and Amnuaysongkram Road (7.62%). The results for the area under the responsibility of Samsen Police Station suggest that the number of accidents has a high positive correlation, statistically significant at the 0.05 level, with the frequency of travel (r = 0.857); traffic intersection points (r = 0.763) and traffic control equipment (r = 0.713) are the next most relevant factors. In the multiple regression analysis, travel frequency is the only factor with a considerable influence on traffic accidents in the Samsen Police Station area of Dusit District, explaining 73.4% of the variation in the number of accidents (R² = 0.734). The resulting multiple regression equation is Ŷ = -7.977 + 0.044X6.

Keywords: form of traffic distribution, environmental factors of road, traffic accidents, Dusit district

Procedia PDF Downloads 386
5295 Post-Quantum Resistant Edge Authentication in Large Scale Industrial Internet of Things Environments Using Aggregated Local Knowledge and Consistent Triangulation

Authors: C. P. Autry, A. W. Roscoe, Mykhailo Magal

Abstract:

We discuss the theoretical model underlying 2BPA (two-band peer authentication), a practical alternative to conventional authentication of entities and data in IoT. In essence, this involves assembling a virtual map of authentication assets in the network, typically leading to many paths of confirmation between any pair of entities. This map is continuously updated, confirmed, and evaluated. The value of authentication along multiple disjoint paths becomes very clear, and we require analogues of triangulation to extend authentication along extended paths and deliver it along all possible paths. We discover that if an attacker wants to make an honest node falsely believe she has authenticated another, then the length of the authentication paths is of little importance. This is because optimal attack strategies correspond to minimal cuts in the authentication graph and do not contain multiple edges on the same path. The authentication provided by disjoint paths normally is additive (in entropy).
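
A toy illustration of the graph-theoretic picture, using a made-up authentication graph: by Menger's theorem, the number of edge-disjoint confirmation paths between two entities equals the size of the minimum cut an attacker would have to control.

```python
# Toy example only: the authentication graph below is invented and the paper's
# two-band protocol and entropy accounting are not modelled.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("A", "B"), ("A", "C"), ("A", "D"),
                  ("B", "E"), ("C", "E"), ("D", "E"), ("B", "C")])

# Number of edge-disjoint confirmation paths between A and E, and the smallest
# set of authentication links an attacker must subvert to separate them.
print("disjoint authentication paths A-E:", nx.edge_connectivity(G, "A", "E"))
print("minimum cut an attacker must control:", nx.minimum_edge_cut(G, "A", "E"))
```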

Keywords: authentication, edge computing, industrial IoT, post-quantum resistance

Procedia PDF Downloads 193
5294 Secondary Charged Fragments Tracking for On-Line Beam Range Monitoring in Particle Therapy

Authors: G. Traini, G. Battistoni, F. Collamati, E. De Lucia, R. Faccini, C. Mancini-Terracciano, M. Marafini, I. Mattei, S. Muraro, A. Sarti, A. Sciubba, E. Solfaroli Camillocci, M. Toppi, S. M. Valle, C. Voena, V. Patera

Abstract:

In particle therapy (PT) treatments, a large number of secondary particles are produced whose emission points are correlated with the dose released in the traversed tissues. Measuring the secondary charged fragment component could be a valid technique for monitoring the beam range during PT treatments, which is still missing in clinical practice. Sub-millimetre precision in the beam range measurement is required to significantly optimise the technique and improve the treatment quality. In this contribution, a detector named the Dose Profiler (DP) is presented. It is specifically designed to monitor the beam range on-line by exploiting the secondary charged particles produced in carbon-ion PT treatments. In particular, the DP is designed to track the secondary fragments emitted at large angles with respect to the beam direction (mainly protons), with the aim of reconstructing the spatial coordinates of the fragment emission point by extrapolating the measured track back towards the beam axis. The DP is currently under development within the INSIDE collaboration (Innovative Solutions for In-beam Dosimetry in hadrontherapy). The tracker is made of six layers (20 × 20 cm²) of BCF-12 square scintillating fibres (500 μm) coupled to silicon photomultipliers, followed by two plastic scintillator layers of 6 mm thickness. A system of FPGA-based front-end boards arranged around the detector provides the data acquisition. The detector characterization with cosmic rays is currently under way, and a data-taking campaign with protons will take place in May 2017. The DP design and the performance measured with MIPs and proton beams will be reviewed.

Keywords: fragmentation, monitoring, particle therapy, tracking

Procedia PDF Downloads 230
5293 Supplier Risk Management: A Multivariate Statistical Modelling and Portfolio Optimization Based Approach for Supplier Delivery Performance Development

Authors: Jiahui Yang, John Quigley, Lesley Walls

Abstract:

In this paper, the authors develop a stochastic model of investment in supplier delivery performance development from a buyer's perspective. The authors propose a multivariate model based on a Multinomial-Dirichlet distribution within an empirical Bayesian inference framework, representing both the epistemic and aleatory uncertainties in deliveries. A closed-form solution is obtained, and lower and upper bounds for both the optimal investment level and the expected profit under uncertainty are derived. The theoretical properties provide decision makers with useful insights into supplier delivery performance improvement problems where multiple delivery statuses are involved. The authors also extend the model from a single-supplier investment to a supplier portfolio, using a Lagrangian method to obtain a theoretical expression for the optimal investment level and the overall expected profit. The model enables a buyer to know how the marginal expected profit/investment level of each supplier changes with respect to the budget and which supplier should be invested in when additional budget is available. An application of this model is illustrated in a simulation study. Overall, the main contribution of this study is to provide an optimal investment decision-making framework for supplier development, taking into account multiple delivery statuses as well as multiple projects.
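
The Multinomial-Dirichlet update at the heart of such a model can be sketched as follows, with invented numbers for the delivery statuses, prior pseudo-counts, observed counts, and per-status margins.

```python
# Minimal conjugate-update sketch: delivery statuses (early / on-time / late),
# a Dirichlet prior capturing epistemic uncertainty, and observed counts as
# aleatory evidence. All numbers are hypothetical.
import numpy as np

prior = np.array([2.0, 5.0, 3.0])          # Dirichlet pseudo-counts per status
observed = np.array([4, 18, 8])            # deliveries seen so far

posterior = prior + observed
probs = posterior / posterior.sum()        # posterior mean status probabilities
print("P(early), P(on-time), P(late) =", np.round(probs, 3))

# A hypothetical margin per status gives a posterior expected profit per
# delivery, which a buyer could compare across suppliers and investment levels.
margin = np.array([1.0, 1.2, 0.4])
print("expected profit per delivery =", float(probs @ margin))
```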

Keywords: decision making, empirical bayesian, portfolio optimization, supplier development, supply chain management

Procedia PDF Downloads 284
5292 A Methodology for Automatic Diversification of Document Categories

Authors: Dasom Kim, Chen Liu, Myungsu Lim, Su-Hyeon Jeon, ByeoungKug Jeon, Kee-Young Kwahk, Namgyu Kim

Abstract:

Recently, numerous documents, including unstructured data and text, have been created due to the rapid increase in the use of social media and the Internet. Each document is usually assigned a specific category for the convenience of users. In the past, categorization was performed manually. However, with manual categorization, not only can the accuracy of the categorization not be guaranteed, but the process also requires a large amount of time and entails high costs. Many studies have been conducted on the automatic creation of categories to overcome the limitations of manual categorization. Unfortunately, most of these methods cannot be applied to categorizing complex documents with multiple topics, because they assume that a document can be assigned to only one category. To overcome this limitation, some studies have attempted to assign each document to multiple categories. However, they are also limited in that their learning process requires training on a multi-categorized document set; these methods therefore cannot be applied to the multi-categorization of most documents unless multi-categorized training sets are provided. To remove the requirement for a multi-categorized training set imposed by traditional multi-categorization algorithms, we previously proposed a methodology that can extend the category of a single-categorized document to multiple categories by analyzing the relationships among categories, topics, and documents. In this paper, we design a survey-based verification scenario for estimating the accuracy of our automatic categorization methodology.

Keywords: big data analysis, document classification, multi-category, text mining, topic analysis

Procedia PDF Downloads 269
5291 Analyze Long-Term Shoreline Change at Yi-Lan Coast, Taiwan Using Multiple Sources

Authors: Geng-Gui Wang, Chia-Hao Chang, Jee-Cheng Wu

Abstract:

A shoreline is the line where a body of water and the shore meet. It provides economic and social security to coastal habitations. However, shorelines face multiple threats from both natural processes and man-made effects, such as disasters, rapid urbanization, industrialization, and sand deposition and erosion. In this study, we analyzed multi-temporal satellite images of the Yilan coast, Taiwan, from 1978 to 2016, using the United States Geological Survey (USGS) Digital Shoreline Analysis System (DSAS), weather information (such as rainfall records and typhoon routes), and data on man-made construction projects to explore the causes of shoreline changes. The results show that the shoreline of the Yilan coast is greatly influenced by typhoons and anthropogenic interventions.

Keywords: shoreline change, multi-temporal satellite, digital shoreline analysis system, DSAS, Yi-Lan coast

Procedia PDF Downloads 159
5290 Sexual Risk Behaviours of High School Students in an Urban Town of Cameroon

Authors: Elvis Enowbeyang Tarkang

Abstract:

Background: Since students in high schools in Cameroon fall within the age group hardest hit by HIV/AIDS, it is assumed that these students might be exposed to sexual risk behaviours. Sexual risk behaviours include engaging in unprotected sexual intercourse, early sexual debut, multiple sexual partners, and coerced or forced sex, and these behaviours might predispose youth to HIV transmission. However, little has been explored regarding the sexual risk behaviours of high school learners in Cameroon. This study aimed to examine the sexual risk behaviours of high school students in an urban town of Cameroon. Method: A quantitative cross-sectional design was adopted, using a self-administered questionnaire to collect data from a disproportional stratified simple random sample of 480 (240 male and 240 female) grade 10 to grade 12 students from two participating secondary schools in Limbe, in the Southwest region of Cameroon, in August 2014. Descriptive and chi-square statistics were calculated using the Statistical Package for the Social Sciences (SPSS) version 20 at the 0.05 significance level. Results: The majority of the respondents, 63.4%, reported being sexually active, of whom only 33.2% used condoms consistently. Up to 37% of the sexually active respondents had had multiple sexual partners in the year before the study, while 23% had multiple sexual partners during the study period. The mean age of first sex was 15.4 years. Among Christians, Pentecostals, 17 (58.6%), were more likely to have experienced sexual coercion than non-Pentecostals, 111 (42.2%) (p=0.000). Christians, 41 (10.3%), were more likely to have been forced into first sex than Muslims, 0 (0.0%); and among Christians, Pentecostals, 6 (15.0%), were more likely to have been forced into first sex than non-Pentecostals, 35 (10.9%) (p=0.004). Among Christians, Pentecostals, 16 (66.7%), were more likely to have experienced sex by age 16 than non-Pentecostals, 125 (64.1%) (p=0.000). Students who lived in rented accommodation, 32 (22.7%), were more likely to have had multiple sexual partners than those who lived in their parents' houses, 35 (18.1%) (p=0.000). Males, 36 (16.0%), were more likely to have had multiple concurrent sexual partners than females, 14 (6.0%) (p=0.002). Students who used condoms consistently, 25 (33.3%), were more likely to have a higher perception of the risk of contracting HIV than those who did not use condoms consistently, 38 (29.9%) (p=0.002). Students who lived in their parents' houses, 35 (35.4%), were more likely to use condoms consistently during sex than those who lived in rented accommodation, 31 (29.8%) (p=0.021). Students who passed their examinations, 57 (30.9%), were more likely to have used condoms consistently than those with low academic profiles, 24 (27.9%) (p=0.034). Conclusions and Recommendations: Gender, lack of parental control, religion, academic profile, poverty, place of residence, and perception of the risk of HIV infection were the main factors associated with sexual risk behaviours among students in urban Cameroon. The findings indicate that sexual risk behaviours exist among high school students in the Limbe urban town of Cameroon. There is a need for campaigns and interventions to bring about sexual behaviour change.

Keywords: Cameroon, high school students, HIV/AIDS, Limbe urban town, sexual risk behaviours

Procedia PDF Downloads 326
5289 Studies on Space-Based Laser Targeting System for the Removal of Orbital Space Debris

Authors: Krima M. Rohela, Raja Sabarinath Sundaralingam

Abstract:

Humans have been launching rockets since the beginning of the space age in the late 1950s. We have come a long way since then, and the success rate of rocket launches has increased considerably. With every successful launch, a large amount of junk or debris is released into the upper layers of the atmosphere. Space debris has been a huge concern for a very long time now; it includes the rocket shells released during the launch and parts of defunct satellites. Some of this junk eventually falls towards the Earth and burns up in the atmosphere, but most of it goes into orbit around the Earth and remains there for at least 100 years. This can cause many problems for other functioning satellites and may affect future manned missions to space. The main concern regarding space debris is the increase in space activities, which leads to a risk of collisions if the debris is not taken care of soon. These collisions may result in what is known as the Kessler syndrome. This debris can be removed by a space-based laser targeting system; hence, the matter is investigated and discussed. The first step involves launching a satellite carrying a high-power laser device into space, above the debris belt. The target material is then ablated with a focussed laser beam. This step of the process is highly dependent on the attitude and orientation of the debris with respect to the Earth and the device. The laser beam causes a jet of vapour and plasma to be expelled from the material. Hence, a force is applied in the opposite direction and, in accordance with Newton's third law of motion, this causes the material to move towards the Earth and be pulled down by gravity, where it disintegrates in the upper layers of the atmosphere. The larger pieces of debris can be directed towards the oceans. This method of removing orbital debris will enable safer passage for future human-crewed missions into space.

Keywords: altitude, Kessler syndrome, laser ablation, Newton’s third law of motion, satellites, Space debris

Procedia PDF Downloads 143
5288 Vibration Imaging Method for Vibrating Objects with Translation

Authors: Kohei Shimasaki, Tomoaki Okamura, Idaku Ishii

Abstract:

We propose a vibration imaging method for high frame rate (HFR)-video-based localization of vibrating objects with large translations. When the ratio of the translation speed of a target to its vibration frequency is large, obtaining its frequency response in image intensities becomes difficult because one or no waves are observable at the same pixel. Our method can precisely localize moving objects with vibration by virtually translating multiple image sequences for pixel-level short-time Fourier transform to observe multiple waves at the same pixel. The effectiveness of the proposed method is demonstrated by analyzing several HFR videos of flying insects in real scenarios.
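
The pixel-level frequency analysis can be sketched on a synthetic intensity trace for one (virtually translated) pixel, as below; the frame rate and vibration frequency are invented.

```python
# Sketch of the pixel-level STFT step on a synthetic intensity trace; the HFR
# frame rate, vibration frequency, and noise level are invented, and the
# virtual-translation step that keeps the target at this pixel is assumed done.
import numpy as np
from scipy.signal import stft

fps = 1000.0                                      # HFR camera frame rate (Hz)
t = np.arange(2000) / fps
intensity = (128 + 20 * np.sin(2 * np.pi * 85.0 * t)
             + np.random.default_rng(0).normal(0, 2, t.size))

f, seg_t, Z = stft(intensity - intensity.mean(), fs=fps, nperseg=256)
dominant = f[np.abs(Z).mean(axis=1).argmax()]
print(f"dominant vibration frequency at this pixel: {dominant:.1f} Hz")
```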

Keywords: HFR video analysis, pixel-level vibration source localization, short-time Fourier transform, virtual translation

Procedia PDF Downloads 103
5287 Parallel Random Number Generation for the Modern Supercomputer Architectures

Authors: Roman Snytsar

Abstract:

Pseudo-random numbers are often used in scientific computing, for example in Monte Carlo simulations or quantum-inspired optimization. The requirements for a parallel random number generator running in a modern multi-core vector environment are more stringent than those for sequential random number generators. As well as passing the usual quality tests, the output of a parallel random number generator must be verifiable and reproducible throughout the concurrent execution. We propose a family of vectorized Permuted Congruential Generators. Implementations are available for multiple modern vector computer architectures. Besides demonstrating good single-core performance, the generators scale easily across many processor cores and multiple distributed nodes. We provide performance and parallel speedup analysis and comparisons between the implementations.
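
For reference, the scalar PCG32 (XSH-RR) step is sketched below; a vectorized or distributed variant would presumably run many such generators in lockstep, with one independent odd increment ("stream") per lane or worker, which keeps the output reproducible.

```python
# Reference scalar PCG32 (XSH-RR) step for illustration; the paper's vectorized
# variants are not reproduced here, and the seeds/increments below are arbitrary.
MASK64 = (1 << 64) - 1
MULT = 6364136223846793005

def pcg32(state, inc):
    """Advance one PCG32 stream; return (new_state, 32-bit output)."""
    new_state = (state * MULT + (inc | 1)) & MASK64
    xorshifted = (((state >> 18) ^ state) >> 27) & 0xFFFFFFFF
    rot = state >> 59
    out = ((xorshifted >> rot) | (xorshifted << ((-rot) & 31))) & 0xFFFFFFFF
    return new_state, out

# Two streams started from the same seed stay decorrelated because they use
# different (odd) increments, which is what makes per-worker streams reproducible.
states, incs = [42, 42], [1, 3]
for _ in range(3):
    for i in range(2):
        states[i], value = pcg32(states[i], incs[i])
        print(f"stream {i}: {value:#010x}")
```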

Keywords: pseudo-random numbers, quantum optimization, SIMD, parallel computing

Procedia PDF Downloads 114
5286 EQMamba - Method Suggestion for Earthquake Detection and Phase Picking

Authors: Noga Bregman

Abstract:

Accurate and efficient earthquake detection and phase picking are crucial for seismic hazard assessment and emergency response. This study introduces EQMamba, a deep-learning method that combines the strengths of the Earthquake Transformer and the Mamba model for simultaneous earthquake detection and phase picking. EQMamba leverages the computational efficiency of Mamba layers to process longer seismic sequences while maintaining a manageable model size. The proposed architecture integrates convolutional neural networks (CNNs), bidirectional long short-term memory (BiLSTM) networks, and Mamba blocks. The model employs an encoder composed of convolutional layers and max pooling operations, followed by residual CNN blocks for feature extraction. Mamba blocks are applied to the outputs of BiLSTM blocks, efficiently capturing long-range dependencies in seismic data. Separate decoders are used for earthquake detection, P-wave picking, and S-wave picking. We trained and evaluated EQMamba using a subset of the STEAD dataset, a comprehensive collection of labeled seismic waveforms. The model was trained using a weighted combination of binary cross-entropy loss functions for each task, with the Adam optimizer and a scheduled learning rate. Data augmentation techniques were employed to enhance the model's robustness. Performance comparisons were conducted between EQMamba and the EQTransformer over 20 epochs on this modest-sized STEAD subset. Results demonstrate that EQMamba achieves superior performance, with higher F1 scores and faster convergence compared to EQTransformer. EQMamba reached F1 scores of 0.8 by epoch 5 and maintained higher scores throughout training. The model also exhibited more stable validation performance, indicating good generalization capabilities. While both models showed lower accuracy in phase-picking tasks compared to detection, EQMamba's overall performance suggests significant potential for improving seismic data analysis. The rapid convergence and superior F1 scores of EQMamba, even on a modest-sized dataset, indicate promising scalability for larger datasets. This study contributes to the field of earthquake engineering by presenting a computationally efficient and accurate method for simultaneous earthquake detection and phase picking. Future work will focus on incorporating Mamba layers into the P and S pickers and further optimizing the architecture for seismic data specifics. The EQMamba method holds the potential for enhancing real-time earthquake monitoring systems and improving our understanding of seismic events.
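
A highly simplified architectural sketch in PyTorch is given below to make the data flow concrete; the channel sizes and kernel widths are invented, and the Mamba block is replaced by a plain gated linear layer as a placeholder, so this is not the EQMamba implementation.

```python
# Simplified sketch of a conv encoder + BiLSTM + three decoder heads for
# detection and P/S picking. Channel sizes, kernels, and the placeholder
# "mixer" standing in for the Mamba block are assumptions, not the paper's code.
import torch
import torch.nn as nn

class EQPickerSketch(nn.Module):
    def __init__(self, in_ch=3, hidden=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(in_ch, hidden, kernel_size=11, padding=5), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(hidden, hidden, kernel_size=9, padding=4), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.bilstm = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        # Placeholder for a Mamba/state-space block: a gated linear mixing layer.
        self.mixer = nn.Sequential(nn.Linear(2 * hidden, 2 * hidden), nn.SiLU())
        self.det_head = nn.Conv1d(2 * hidden, 1, kernel_size=1)   # detection
        self.p_head = nn.Conv1d(2 * hidden, 1, kernel_size=1)     # P pick
        self.s_head = nn.Conv1d(2 * hidden, 1, kernel_size=1)     # S pick

    def forward(self, x):                  # x: (batch, 3, samples)
        z = self.encoder(x)                # (batch, hidden, samples/4)
        z, _ = self.bilstm(z.transpose(1, 2))
        z = self.mixer(z).transpose(1, 2)  # back to (batch, 2*hidden, T)
        return (torch.sigmoid(self.det_head(z)),
                torch.sigmoid(self.p_head(z)),
                torch.sigmoid(self.s_head(z)))

waveform = torch.randn(2, 3, 6000)         # two 3-component traces, 6000 samples
det, p, s = EQPickerSketch()(waveform)     # three per-timestep probability tracks
```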

Keywords: earthquake, detection, phase picking, s waves, p waves, transformer, deep learning, seismic waves

Procedia PDF Downloads 39
5285 Indoor Air Pollution of the Flexographic Printing Environment

Authors: Jelena S. Kiurski, Vesna S. Kecić, Snežana M. Aksentijević

Abstract:

The identification and evaluation of organic and inorganic pollutants were performed in a flexographic facility in Novi Sad, Serbia. Air samples were collected and analyzed in situ during 4 hours of working time, at five sampling points, using a mobile gas chromatograph and an ozonometer during the printing of collagen casing. The experimental results showed that the concentrations of isopropyl alcohol, acetone, total volatile organic compounds, and ozone varied over the sampling times. The highest average concentrations, 94.80 ppm for isopropyl alcohol and 102.57 ppm for total volatile organic compounds, were reached 200 minutes after the start of production. The mutual dependences between the target hazardous substances and the microclimate parameters were confirmed using a multiple linear regression model in the software package STATISTICA 10. The multiple coefficients of determination obtained for ozone and acetone (0.507 and 0.589) with the microclimate parameters indicated a moderate correlation between the observed variables. However, a strong positive correlation with the microclimate parameters was obtained for isopropyl alcohol and total volatile organic compounds (0.760 and 0.852). Values of the F parameter higher than F-critical for all examined dependences indicated a statistically significant dependence of the concentration levels of the target pollutants on the microclimate parameters. Given that the microclimate parameters significantly affect the emission of the investigated gases, the application of eco-friendly materials in the production process is a necessity.
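
The regression and F-test step can be reproduced in outline with statsmodels, as sketched below on invented readings mirroring the study's variables (a pollutant concentration versus temperature and humidity).

```python
# Sketch of the multiple-regression / F-test step; the readings are invented
# placeholders, not the measured flexographic-facility data.
import numpy as np
import statsmodels.api as sm

temperature = np.array([24.5, 25.1, 26.0, 25.4, 26.8, 27.2, 26.1, 25.7])
humidity    = np.array([38.0, 37.2, 35.9, 36.5, 34.8, 34.1, 35.5, 36.0])
isopropanol = np.array([61.0, 66.0, 74.0, 70.0, 83.0, 88.0, 78.0, 72.0])   # ppm

X = sm.add_constant(np.column_stack([temperature, humidity]))
fit = sm.OLS(isopropanol, X).fit()
# A large F (small p) indicates a statistically significant dependence of the
# pollutant concentration on the microclimate parameters.
print(f"R^2 = {fit.rsquared:.3f}, F = {fit.fvalue:.2f}, p(F) = {fit.f_pvalue:.4f}")
```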

Keywords: flexographic printing, indoor air, multiple regression analysis, pollution emission

Procedia PDF Downloads 191