Search results for: automatic target recognition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5078

3848 Laser-Hole Boring into Overdense Targets: A Detailed Study on Laser and Target Properties

Authors: Florian Wagner, Christoph Schmidt, Vincent Bagnoud

Abstract:

Understanding the interaction of ultra-intense laser pulses with overcritical targets is of major interest for many applications such as laser-driven ion acceleration, fast ignition in the context of inertial confinement fusion, or high harmonic generation and the creation of attosecond pulses. One particular aspect of this interaction is the shift of the critical surface, where the laser pulse is stopped and the absorption is at maximum, due to the radiation pressure induced by the laser pulse, also referred to as laser hole boring. We investigate laser-hole boring experimentally by measuring the backscattered spectrum, which is Doppler-broadened because of the movement of the reflecting surface. Using the high-power, high-energy laser system PHELIX in Darmstadt, we gathered an extensive set of data for different laser intensities ranging from 10^18 W/cm2 to 10^21 W/cm2, two different levels of the nanosecond temporal contrast (10^6 vs. 10^11), elliptical and linear polarization, and varying target configurations. In this contribution, we discuss how the maximum velocity of the critical surface depends on these parameters. In particular, we show that increasing the temporal contrast decreases the maximum hole-boring velocity by more than a factor of three. Our experimental findings are backed by a basic analytical model based on momentum and mass conservation as well as particle-in-cell simulations. These results are of particular importance for fast ignition since they contribute to a better understanding of the transport of the ignitor pulse into the overdense region.

Keywords: laser-hole boring, interaction of ultra-intense lasers with overcritical targets, fast ignition, relativistic laser matter interaction

Procedia PDF Downloads 405
3847 Monte Carlo Simulation Study on Improving the Flattening Filter-Free Radiotherapy Beam Quality Using Filters from Low-Z Material

Authors: H. M. Alfrihidi, H.A. Albarakaty

Abstract:

Flattening filter-free (FFF) photon beam radiotherapy has increased in the last decade, enabled by advancements in treatment planning systems and radiation delivery techniques like multi-leaf collimators. FFF beams have higher dose rates, which reduces treatment time. On the other hand, FFF beams have a higher surface dose, due to the loss of the beam-hardening effect that the flattening filter (FF) normally provides. The possibility of improving FFF beam quality using filters from low-Z materials such as steel and aluminium (Al) was investigated using Monte Carlo (MC) simulations. The attenuation coefficient of low-Z materials for low-energy photons is higher than that for high-energy photons, which leads to hardening of the FFF beam and, consequently, a reduction in the surface dose. The BEAMnrc user code, based on the Electron Gamma Shower (EGSnrc) MC code, was used to simulate the beam of a 6 MV TrueBeam linac. A phase-space (phsp) file provided by Varian Medical Systems was used as the radiation source in the simulation. This phase-space file was scored just above the jaws at 27.88 cm from the target. The linac from the jaws downward was constructed, and the radiation passing through was simulated and scored at 100 cm from the target. To study the effect of low-Z filters, steel and Al filters with a thickness of 1 cm were added below the jaws, and the phase-space file was scored at 100 cm from the target. For comparison, the FF beam was simulated using a similar setup. The BEAM Data Processor (BEAMdp) was used to analyse the energy spectrum in the phase-space files. Then, the dose distribution resulting from these beams was simulated in a homogeneous water phantom using DOSXYZnrc. The dose profile was evaluated according to the surface dose, the lateral dose distribution, and the percentage depth dose (PDD). The energy spectra of the beams show that the FFF beam is softer than the FF beam. The energy peaks for the FFF and FF beams are 0.525 MeV and 1.52 MeV, respectively. The FFF beam's energy peak becomes 1.1 MeV with a steel filter, while the Al filter does not affect the peak position. The steel and Al filters reduced the surface dose by 5% and 1.7%, respectively. The dose at a depth of 10 cm (D10) rises by around 2% and 0.5% using a steel and an Al filter, respectively. On the other hand, the steel and Al filters reduce the dose rate of the FFF beam by 34% and 14%, respectively. However, their effect on the dose rate is less than that of the tungsten FF, which reduces the dose rate by about 60%. In conclusion, filters made from low-Z material decrease the surface dose and increase the D10 dose, allowing for high-dose delivery to deep tumors with a low skin dose. Although using these filters affects the dose rate, this effect is much lower than that of the FF.
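
The beam-hardening argument can be illustrated with a simple exponential-attenuation calculation over an energy spectrum. The sketch below is only a rough illustration of why a low-Z filter shifts the mean energy upward; the spectrum weights and the attenuation coefficients are placeholder assumptions, not values from the Monte Carlo simulation.

```python
import numpy as np

# Energy grid (MeV) and a toy FFF spectrum weighted toward low energies
# (placeholder values, not the simulated Varian phase-space spectrum).
energy = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 6.0])          # MeV
fluence = np.array([0.30, 0.25, 0.20, 0.15, 0.07, 0.03])     # relative

# Approximate mass attenuation coefficients mu/rho in cm^2/g
# (illustrative numbers only; real values come from NIST tables).
mu_rho_steel = np.array([0.12, 0.084, 0.060, 0.043, 0.033, 0.030])
mu_rho_al    = np.array([0.11, 0.084, 0.061, 0.043, 0.031, 0.027])

rho_steel, rho_al = 7.87, 2.70   # g/cm^3
thickness = 1.0                  # cm, as in the study

def harden(fluence, mu_rho, rho, t):
    """Attenuate each energy bin with Beer-Lambert and renormalise."""
    out = fluence * np.exp(-mu_rho * rho * t)
    return out / out.sum()

for name, mu_rho, rho in [("steel", mu_rho_steel, rho_steel),
                          ("Al", mu_rho_al, rho_al)]:
    filtered = harden(fluence, mu_rho, rho, thickness)
    mean_before = np.average(energy, weights=fluence)
    mean_after = np.average(energy, weights=filtered)
    print(f"{name}: mean energy {mean_before:.2f} -> {mean_after:.2f} MeV")
```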

Keywords: flattening filter free, Monte Carlo, radiotherapy, surface dose

Procedia PDF Downloads 73
3846 Development of a Natural Anti-cancer Formulation Which Can Target Triple Negative Breast Cancer Stem Cells

Authors: Samashi Munaweera

Abstract:

Cancer stem cells (CSCs) are responsible for the initiation, extensive proliferation and metastasis of cancer. CSCs, including breast cancer stem cells (bCSCs), have the capacity to generate heterogeneous populations of cells that are resistant to chemotherapy and radiotherapy. Over-expressed ABCB1 has been reported as a main cause of drug resistance in CSCs through its activity as a drug efflux pump in the cell membrane. The overall efficiency of chemotherapeutic agents might therefore be enhanced by blocking the ABCB1 efflux pump in the CSC membrane. There is an urgent need to search for promising natural drugs which can target CSCs. The anti-cancer properties of Hylocereus undatus against CSCs have not yet been studied. In the present study, the anti-cancer effects of the peel and flesh of H. undatus fruit on bCSCs were evaluated with the aim of developing a marketable anti-cancer nutraceutical formulation. The flesh and peel of H. undatus were freeze-dried and sequentially extracted into four different solvents (hexane, chloroform, ethyl acetate and ethanol). All extracts (eight in total) were dried under reduced pressure, and different concentrations (12.5-400 µg/mL) were applied to bCSCs isolated from a triple-negative chemo-resistant breast cancer phenotype (MDA-MB-231 cells). The anti-proliferative effects of all extracts and paclitaxel (positive control) were determined by a colorimetric assay (WST-1 based). Since the peel-chloroform (IC50 = 54.8 µg/mL) and flesh-ethyl acetate (IC50 = 150.5 µg/mL) extracts exerted a potent anti-proliferative effect at 72 h post-incubation, a combinatorial formulation (CF) was developed, for the first time, with the most active peel-chloroform extract and 20 µg/mL of verapamil (a known ABCB1 drug efflux pump blocker). The anti-proliferative and pro-apoptotic effects of the CF were confirmed by estimating activated caspase-3 and caspase-7 levels and apoptotic morphological features in CF-treated bCSCs compared to untreated bCSCs, bCSCs treated with verapamil (20 µg/mL) alone, and CF-treated normal mammary epithelial cells (MCF-10A). The anti-proliferative effect of the CF on bCSCs (IC50 = 16.4 µg/mL) is greater than that of paclitaxel (19.2 µg/mL) and roughly three-fold greater than that of the peel-chloroform extract alone (IC50 = 54.8 µg/mL), while the CF exerts far weaker effects on normal cells (IC50 > 400 µg/mL). Collectively, the CF can be considered a promising starting point for a nutraceutical formulation that can target CSCs.

Keywords: breast cancer stem cells (bCSCs), Hylocereus undatus, combinatorial formulation (CF), ABCB1 protein, verapamil

Procedia PDF Downloads 28
3845 Highly Accurate Tennis Ball Throwing Machine with Intelligent Control

Authors: Ferenc Kovács, Gábor Hosszú

Abstract:

The paper presents an advanced control system for tennis ball throwing machines to improve their accuracy in terms of ball impact points. A further advantage of the system is a much easier calibration process, involving intelligent automatic adjustment of the stroking parameters according to the ball elasticity, self-calibration, the use of a safety margin for very flat strokes, and the possibility of placing the machine at any position on the half court. The system applies mathematical methods to determine the exact ball trajectories and special approximation processes to reach all points on the targeted half court.
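
As a rough illustration of the trajectory computation such a control system must perform, the sketch below integrates a point-mass ball model with quadratic air drag and returns the horizontal landing distance. All physical constants, the launch parameters and the integration step are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Illustrative constants for a tennis ball (assumptions, not from the paper).
g, m, rho, Cd, r = 9.81, 0.057, 1.2, 0.55, 0.033
A = np.pi * r**2
k = 0.5 * rho * Cd * A / m          # drag factor per unit mass

def landing_point(speed, elev_deg, h0=0.9, dt=1e-3):
    """Integrate a drag-affected trajectory until the ball hits the ground."""
    vx = speed * np.cos(np.radians(elev_deg))
    vz = speed * np.sin(np.radians(elev_deg))
    x, z = 0.0, h0
    while z > 0.0:
        v = np.hypot(vx, vz)
        ax = -k * v * vx
        az = -g - k * v * vz
        vx += ax * dt
        vz += az * dt
        x += vx * dt
        z += vz * dt
    return x                          # horizontal distance to the impact point

# Example: how far does a 20 m/s stroke at 8 degrees elevation land?
print(f"impact at {landing_point(20.0, 8.0):.2f} m")
```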

Keywords: control system, robot programming, robot control, sports equipment, throwing machine

Procedia PDF Downloads 397
3844 A Microwave and Millimeter-Wave Transmit/Receive Switch Subsystem for Communication Systems

Authors: Donghyun Lee, Cam Nguyen

Abstract:

Multi-band systems offer a great deal of benefit in modern communication and radar systems. In particular, multi-band antenna-array radar systems with their extended frequency diversity provide numerous advantages in detection, identification, locating and tracking a wide range of targets, including enhanced detection coverage, accurate target location, reduced survey time and cost, increased resolution, improved reliability and target information. Accurate calibration is a critical issue in antenna array systems. The amplitude and phase errors in multi-band and multi-polarization antenna array transceivers result in inaccurate target detection, deteriorated resolution and reduced reliability. Furthermore, a digital beamformer without RF-domain phase shifting is less immune to unfiltered interference signals, which can lead to receiver saturation in array systems. Therefore, an integrated front-end architecture that can support a calibration function with low insertion loss and a filtering function from the farthest end of an array transceiver is of great interest. We report a dual K/Ka-band T/R/Calibration switch module with a quasi-elliptic dual-bandpass filtering function implemented with a Q-enhanced metamaterial transmission line. A unique dual-band frequency response is incorporated in the reception and calibration path of the proposed switch module utilizing a composite right/left-handed metamaterial transmission line coupled with a Colpitts-style negative-resistance circuit. The fabricated, fully integrated T/R/Calibration switch module in 0.18-μm BiCMOS technology exhibits an insertion loss of 4.9-12.3 dB and an isolation of more than 45 dB in the reception, transmission and calibration modes of operation. In the reception and calibration modes, the dual-band frequency response centered at 24.5 and 35 GHz exhibits out-of-band rejection of more than 30 dB relative to the pass bands below 10.5 GHz and above 59.5 GHz. The rejection between the pass bands reaches more than 50 dB. In all modes of operation, the input 1-dB compression point is between 4 and 11 dBm. Acknowledgement: This paper was made possible by NPRP grant # 6-241-2-102 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the authors.

Keywords: microwaves, millimeter waves, T/R switch, wireless communications

Procedia PDF Downloads 160
3843 Automatic Moderation of Toxic Comments in the Face of Local Language Complexity in Senegal

Authors: Edouard Ngor Sarr, Abel Diatta, Serigne Mor Toure, Ousmane Sall, Lamine Faty

Abstract:

Thanks to Web 2.0, we are witnessing a form of democratization of the spoken word, an exponential increase in the number of users on the web, and, above all, the accumulation of a daily flow of content that is becoming, at times, uncontrollable. Added to this is the rise of a violent social fabric characterised by hateful and racial comments, insults, and other content that contravenes social rules and the platforms' terms of use. Consequently, managing and regulating this mass of new content is proving increasingly difficult, requiring substantial human, technical, and technological resources. Without regulation, and with the complicity of anonymity, this toxic content can pollute discussions and make these online spaces highly conducive to abuse, which very often has serious consequences for certain internet users, ranging from anxiety, depression, or withdrawal to suicide. The toxicity of a comment is defined as anything that is rude, disrespectful, or likely to cause someone to leave a discussion or to take violent action against a person or a community. Two levels of measures are needed to deal with this deleterious situation. The first measures are being taken by governments through draft laws with a dual objective: (i) to punish the perpetrators of these abuses and (ii) to make online platforms accountable for the mistakes made by their users. The second measure comes from the platforms themselves. By assessing the content left by users, they can set up filters to block and/or delete content or decide to suspend the user in question permanently. However, the speed of discussions and the volume of data involved mean that platforms are unable to properly monitor the moderation of content produced by Internet users. That is why they use human moderators, either through recruitment or outsourcing. Moderating comments on the web means assessing and monitoring users' comments on online platforms in order to strike the right balance between protection against abuse and users' freedom of expression. It makes it possible to determine which publications and users are allowed to remain online and which are deleted or suspended, how authorised publications are displayed, and what actions accompany content deletions. In this study, we look at the problem of automatic moderation of toxic comments in the face of local African languages and, more specifically, at social network comments in Senegal. We review the state of the art, highlighting the different approaches, algorithms, and tools for moderating comments. We also study the issues and challenges of moderation in web ecosystems with lesser-known languages, such as local languages.
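
As a rough illustration of what an automatic moderation baseline for such comments can look like, the sketch below trains a character n-gram classifier, an approach often used when the language (or language mix) is under-resourced. It is a generic example with made-up comments and labels, not the system studied in this paper.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus (made-up comments; 1 = toxic, 0 = acceptable).
comments = [
    "you are a complete idiot and nobody wants you here",
    "thank you for sharing, this was very helpful",
    "get out of this country, people like you are trash",
    "I disagree with this article but it raises good points",
]
labels = [1, 0, 1, 0]

# Character n-grams cope better with spelling variation and code-switching
# than word tokens, which matters for under-resourced local languages.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 5)),
    LogisticRegression(max_iter=1000),
)
model.fit(comments, labels)

# Probability that a new comment is toxic.
print(model.predict_proba(["nobody wants your kind here"])[0][1])
```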

Keywords: moderation, local languages, Senegal, toxic comments

Procedia PDF Downloads 2
3842 Metal Binding Phage Clones in a Quest for Heavy Metal Recovery from Water

Authors: Tomasz Łęga, Marta Sosnowska, Mirosława Panasiuk, Lilit Hovhannisyan, Beata Gromadzka, Marcin Olszewski, Sabina Zoledowska, Dawid Nidzworski

Abstract:

Toxic heavy metal ion contamination of industrial wastewater has recently become a significant environmental concern in many regions of the world. Although the majority of heavy metals are naturally occurring elements found on the earth's surface, anthropogenic activities such as mining and smelting, industrial production, and agricultural use of metals and metal-containing compounds are responsible for the majority of environmental contamination and human exposure. The permissible limits (ppm) for heavy metals in food, water and soil are frequently exceeded and considered hazardous to humans, other organisms, and the environment as a whole. Human exposure to highly nickel-polluted environments causes a variety of pathologic effects. In 2008, nickel received the shameful name of "Allergen of the Year" (Gillette, 2008). According to dermatologists, the frequency of nickel allergy is still growing, and it cannot be explained only by fashionable piercing and nickel devices used in medicine (such as coronary stents and endoprostheses). Effective remediation methods for removing heavy metal ions from soil and water are becoming increasingly important. Among others, methods such as chemical precipitation, micro- and nanofiltration, membrane separation, conventional coagulation, electrodialysis, ion exchange, reverse and forward osmosis, photocatalysis and polymer or carbon nanocomposite absorbents have all been investigated so far. The importance of environmentally sustainable industrial production processes and the conservation of dwindling natural resources has highlighted the need for affordable, innovative biosorptive materials capable of recovering specific chemical elements from dilute aqueous solutions. The use of combinatorial phage display techniques for selecting and recognizing material-binding peptides with a selective affinity for any target, particularly inorganic materials, has gained considerable interest in the development of advanced bio- or nano-materials. However, due to the limitations of phage display libraries and the biopanning process, the accuracy of molecular recognition for inorganic materials remains a challenge. This study presents the isolation, identification and characterisation of metal-binding phage clones that preferentially recover nickel.

Keywords: heavy metal recovery, water cleaning, phage display, nickel

Procedia PDF Downloads 99
3841 Automatic Segmentation of Lung Pleura Based on Curvature Analysis

Authors: Sasidhar B., Bhaskar Rao N., Ramesh Babu D. R., Ravi Shankar M.

Abstract:

Segmentation of the lung pleura is a preprocessing step in Computer-Aided Diagnosis (CAD) which helps in reducing false positives in the detection of lung cancer. Existing methods fail to extract lung regions when nodules lie at the pleura of the lungs. In this paper, a new method is proposed which segments lung regions with nodules at the pleura based on curvature analysis and morphological operators. The proposed algorithm was tested on a six-patient dataset consisting of 60 images from the Lung Image Database Consortium (LIDC), and the results are satisfactory, with a 98.3% average overlap measure (AΩ).
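
The reported evaluation metric, the average overlap measure (AΩ), can be computed directly from binary masks. The sketch below shows one common intersection-over-union definition of that overlap; it assumes the segmented and ground-truth lung masks are already available as NumPy arrays and is not tied to the authors' implementation.

```python
import numpy as np

def overlap_measure(segmented: np.ndarray, ground_truth: np.ndarray) -> float:
    """Overlap AΩ = |A ∩ B| / |A ∪ B| for two binary lung masks."""
    seg = segmented.astype(bool)
    gt = ground_truth.astype(bool)
    intersection = np.logical_and(seg, gt).sum()
    union = np.logical_or(seg, gt).sum()
    return intersection / union if union else 1.0

# Toy example with two nearly identical 4x4 masks.
a = np.array([[0, 1, 1, 0], [1, 1, 1, 1], [1, 1, 1, 1], [0, 1, 1, 0]])
b = np.array([[0, 1, 1, 0], [1, 1, 1, 1], [1, 1, 1, 0], [0, 1, 1, 0]])
print(f"AΩ = {overlap_measure(a, b):.3f}")
```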

Keywords: curvature analysis, image segmentation, morphological operators, thresholding

Procedia PDF Downloads 596
3840 Migrants in the West Immersed in Nihilism: Towards a Space for Mutual Recognition and Self-Realization

Authors: Marinete Araujo da Silva Fobister

Abstract:

This presentation aims to discuss how the feeling of 'nostalgia', present in both Westerners and migrants, could shed light on mutual recognition and an exchange of ways of life that could enhance mutual possibilities of self-realization. It seems that this feeling of nostalgia is related to another unfolding of the nihilism of the death of God diagnosed by Nietzsche. Westerners feel on the margins of the values of their own culture, as they experience such values as external to them. At the same time, some groups are claiming the return of the old devalued values. In this scenario, the West has been receiving many migrants from different parts of the world since the second half of the last century. Migrants might be suffering from nostalgia or homesickness for having left their home. It might be that sharing a sense of nostalgia, although with different meanings, can bring together Westerners and migrants. Migrants bring ways of life that might be unknown and inexperienced in the West, and these can shed light on new forms of interpretation and cultivation of one's drives and forces, which could become a source of mutual strength cultivation. Therefore, this mutual feeling of nostalgia could lead to ways of exploring the idea of self-realization in Nietzsche, detaching it from the idea of being mainly individual towards a more trans-individual, cultural claim. Nietzsche argues that nihilism is a European event, here translated as a Western event, which would take 200 years to overcome. To overcome nihilism, a new kind of human would be needed, a creative and strong kind. For Nietzsche, there is no fixed or true self; hence one possibility for achieving self-realization would reside in cultivating one's multiple creative forces. The argument here is that in this recent unfolding of nihilism, translated into the sense of nostalgia, the encounter between the mainstream West, immersed in nihilism, and migrants could create a sense of a shared temporary home, where these different ways of life could inspire each other to create new meanings, indeed contributing to the expansion of one's world view, drives and forces, and thereby fertilizing the soil for the cultivation of self-realization and, consequently, the creation of new values.

Keywords: migration, nihilism, nostalgia, self-realization

Procedia PDF Downloads 200
3839 Using Convolutional Neural Networks to Distinguish Different Sign Language Alphanumerics

Authors: Stephen L. Green, Alexander N. Gorban, Ivan Y. Tyukin

Abstract:

Within the past decade, using Convolutional Neural Networks (CNNs) to create deep learning systems capable of translating sign language into text has been a breakthrough in breaking the communication barrier for deaf-mute people. Conventional research on this subject has been concerned with training the network to recognize the fingerspelling gestures of a given language and produce their corresponding alphanumerics. One of the problems with the current developing technology is that images are scarce, with little variation in the gestures presented to the recognition program, often skewed towards single skin tones and hand sizes, which makes a percentage of the population's fingerspelling harder to detect. Along with this, current gesture detection programs are only trained on one fingerspelling language despite there being one hundred and forty-two known variants so far. All of this presents a limitation for traditional exploitation of current technologies such as CNNs, due to their large number of required parameters. This work presents a technology that aims to resolve this issue by combining a pretrained legacy AI system for a generic object recognition task with a corrector method to uptrain the legacy network. This is a computationally efficient procedure that does not require large volumes of data even when covering a broad range of sign languages such as American Sign Language, British Sign Language and Chinese Sign Language (Pinyin). Implementing recent results on measure concentration, namely the stochastic separation theorem, the AI system is viewed as an operator mapping an input from the set of images u ∈ U to an output in a set of predicted class labels q ∈ Q, where q encodes the alphanumeric represented and the language it comes from. These inputs and outputs, along with the internal variables z ∈ Z, represent the system's current state, which implies a mapping that assigns an element x ∈ ℝⁿ to the triple (u, z, q). As all xᵢ are i.i.d. vectors drawn from a product distribution, over a period of time the AI generates a large set of measurements xᵢ, called S, that are grouped into two categories: the correct predictions M and the incorrect predictions Y. Once the network has made its predictions, a corrector can then be applied by centering S and Y, i.e. subtracting their means. The data are then regularized by applying the Kaiser rule to the resulting eigenmatrix and whitened before being split into pairwise, positively correlated clusters. Each of these clusters produces a unique hyperplane, and if any element x falls outside the region bounded by these hyperplanes, it is reported as an error. As a result of this methodology, a self-correcting recognition process is created that can identify fingerspelling from a variety of sign languages and successfully identify the corresponding alphanumeric and the language the gesture originates from, which no other neural network has been able to replicate.
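
The correction step described above can be sketched in a few lines. The snippet below is a simplified, single-cluster illustration of the idea (centre the measurements, keep the principal components selected by a Kaiser-style rule, whiten, and place a separating hyperplane around the error centroid); the threshold choice and the variance-based component selection are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def fit_corrector(S: np.ndarray, Y: np.ndarray):
    """Fit a single-hyperplane corrector on internal representations.

    S: all measurements (n_samples x n_features); Y: the subset flagged as errors.
    Simplified one-cluster version: centre, keep principal components with
    above-average variance (a Kaiser-style rule), whiten, then place a
    hyperplane halfway towards the error centroid.
    """
    mean = S.mean(axis=0)
    Sc = S - mean
    _, sing, Vt = np.linalg.svd(Sc, full_matrices=False)   # PCA via SVD
    var = sing**2 / (len(S) - 1)
    keep = var > var.mean()
    W = Vt[keep] / np.sqrt(var[keep])[:, None]              # whitening projection
    y_dir = (Y - mean).mean(axis=0) @ W.T                   # error centroid, whitened
    w = y_dir / np.linalg.norm(y_dir)
    theta = 0.5 * np.linalg.norm(y_dir)                     # midpoint threshold (assumption)
    return mean, W, w, theta

def flags_error(x, mean, W, w, theta):
    """True if a new measurement falls on the error side of the hyperplane."""
    return ((x - mean) @ W.T) @ w > theta

# Stand-in internal activations; the first 20 are pretended errors shifted away.
rng = np.random.default_rng(0)
S = rng.normal(size=(500, 64))
Y = S[:20] + 3.0
params = fit_corrector(np.vstack([S, Y]), Y)
print(flags_error(Y[0], *params), flags_error(S[100], *params))
```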

Keywords: convolutional neural networks, deep learning, shallow correctors, sign language

Procedia PDF Downloads 100
3838 Design of a Drift Assist Control System Applied to Remote Control Car

Authors: Sheng-Tse Wu, Wu-Sung Yao

Abstract:

In this paper, a drift assist control system is proposed for remote control (RC) cars to achieve the desired drift angle. A steering servo control scheme is designed to assist drift driving. A gyroscope sensor is included to detect the machine's tail sliding and to achieve better automatic counter-steering, preventing the RC car from spinning. Analysis of tire traction and vehicle dynamics is used to obtain the dynamic track of the RC car. A control gain adjusts the counter-steering amount according to the sensor readings. An illustrated example of a 1:10 RC drift car is given, and the real-time control algorithm is realized on an Arduino Uno.
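
A minimal sketch of the kind of gyroscope-based counter-steering loop described here is shown below. It is written in Python for readability (the paper's real-time version runs on an Arduino Uno), and the gain, target yaw rate and servo limits are illustrative assumptions rather than the authors' tuned values.

```python
import time

# Illustrative gains and limits (assumptions, not the authors' tuned values).
K_COUNTER = 0.8          # proportional gain on yaw-rate error
TARGET_YAW_RATE = 120.0  # deg/s, desired rotation while drifting
SERVO_CENTER, SERVO_RANGE = 90.0, 45.0   # steering servo limits in degrees

def read_gyro_yaw_rate() -> float:
    """Placeholder for the gyroscope driver (e.g. an I2C IMU read)."""
    return 150.0

def counter_steer_step() -> float:
    """One control step: steer against excess yaw rate to stop the spin."""
    error = read_gyro_yaw_rate() - TARGET_YAW_RATE
    command = SERVO_CENTER - K_COUNTER * error
    # Clamp to the mechanical range of the steering servo.
    return max(SERVO_CENTER - SERVO_RANGE, min(SERVO_CENTER + SERVO_RANGE, command))

if __name__ == "__main__":
    for _ in range(5):                # in practice this loop runs continuously
        print(f"servo angle: {counter_steer_step():.1f} deg")
        time.sleep(0.02)              # ~50 Hz control loop
```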

Keywords: drift assist control system, remote control cars, gyroscope, vehicle dynamics

Procedia PDF Downloads 397
3837 Pioneering Conservation of Aquatic Ecosystems under Australian Law

Authors: Gina M. Newton

Abstract:

Australia's Environment Protection and Biodiversity Conservation Act (EPBC Act) is the premier national law under which species and 'ecological communities' (i.e., similar to ecosystems) can be formally recognised and 'listed' as threatened across all jurisdictions. The listing process involves assessment against a range of criteria (similar to the IUCN process) to demonstrate conservation status (i.e., vulnerable, endangered, critically endangered, etc.) based on the best available science. Over the past decade in Australia, there has been a transition from almost solely terrestrial to the first aquatic threatened ecological community (TEC or ecosystem) listings (e.g., River Murray, Macquarie Marshes, Coastal Saltmarsh, Salt-wedge Estuaries). All constitute large areas, with some spanning multiple state jurisdictions. Development of these conservation and listing advices has enabled, for the first time, a more forensic analysis of three key factors across a range of aquatic and coastal ecosystems: (i) the contribution of invasive species to conservation status; (ii) how to demonstrate and attribute decline in 'ecological integrity' to conservation status; and (iii) identification of related priority conservation actions for management. There is increasing global recognition of the disproportionate degree of biodiversity loss within aquatic ecosystems. In Australia, legislative protection at Commonwealth or State level remains one of the strongest conservation measures. Such laws have associated compliance mechanisms for breaches of the protected status. They also trigger the need for environmental impact statements during applications for major developments (which may be denied). However, not all jurisdictions have such laws in place. There remains much opposition to the listing of freshwater systems – for example, the River Murray (Australia's largest river) and the Macquarie Marshes (an internationally significant wetland) were both disallowed by parliament four months after formal listing. This was mainly due to a change of government, dissent from a major industry sector, and a 'loophole' in the law. In Australia, at least in the immediate to medium term, invasive species (aliens, native pests, pathogens, etc.) appear to be the number one biotic threat to the biodiversity and the ecological function and integrity of our aquatic ecosystems. Consequently, this should be considered a current priority for research, conservation, and management actions. Another key outcome from this analysis was the recognition that drawing together multiple lines of evidence to form a 'conservation narrative' is a more useful approach to assigning conservation status. This also helps to address a glaring gap in long-term ecological data sets in Australia, which often precludes a more empirical, data-driven approach. An important lesson also emerged: while conservation must be underpinned by the best available scientific evidence, it remains a 'social and policy' goal rather than a 'scientific' goal. Communication, engagement, and 'politics' necessarily play a significant role in achieving conservation goals and need to be managed and resourced accordingly.

Keywords: aquatic ecosystem conservation, conservation law, ecological integrity, invasive species

Procedia PDF Downloads 132
3836 Community Engagement Policy for Decreasing Childhood Lead Poisoning in Philadelphia

Authors: Hasibe Caballero-Gomez, Richard Pepino

Abstract:

Childhood lead poisoning is an issue that continues to plague major U.S. cities. Lead poisoning has been linked to decreases in academic achievement and IQ at blood lead levels as low as 5 µg/dL. Despite efforts from the Philadelphia Health Department to curtail systemic childhood lead poisoning, children continue to be identified with elevated blood lead levels (EBLLs) above the CDC reference level for diagnosis. This problem disproportionately affects low-income Black communities. At the moment, remediation is costly, and with the current policies in place, comprehensive remediation seems unrealistic. This research investigates community engagement policy and the ways pre-existing resources in target communities can be adjusted to decrease childhood lead poisoning. The study used two methods: content analysis and case studies. The content analysis includes 12 interviews with stakeholders and five published policy recommendations. The case studies focus on Baltimore, Chicago, Rochester, and St. Louis, four cities with significant childhood lead poisoning. Target communities were identified by mapping five factors that indicate a higher risk of lead poisoning. Seven priority zip codes were identified for the model developed in this study. For these urban centers, 28 policy solutions and suggestions were identified, with three being identified at least four times in the content analysis and case studies. These three solutions form an interdependent model that offers increased community awareness of and engagement with the issue, which could potentially improve health and social outcomes for at-risk children.

Keywords: at-risk populations, community engagement, environmental justice, policy translation

Procedia PDF Downloads 120
3835 Toehold Mediated Shape Transition of Nucleic Acid Nanoparticles

Authors: Emil F. Khisamutdinov

Abstract:

Development of functional materials undergoing structural transformations in response to an external stimulus such as environmental changes (pH, temperature, etc.), the presence of particular proteins, or short oligonucleotides is of great interest for a variety of applications ranging from medicine to electronics. The dynamic operations of most nucleic acid (NA) devices, including circuits, nano-machines, and biosensors, rely on networks of NA strand displacement processes in which an external or stimulus strand displaces a target strand from a DNA or RNA duplex. The rate of strand displacement can be greatly increased by the use of "toeholds," single-stranded regions of the target complex to which the invading strand can bind to initiate the reaction, forming additional base pairs that provide a thermodynamic driving force for transformation. Herein, we developed a highly robust nanoparticle shape transition, sequentially transforming DNA polygons from one shape to another using the toehold-mediated DNA strand displacement technique. The shape transformation was confirmed by agarose gel electrophoresis and atomic force microscopy. Furthermore, we demonstrate that our approach is applicable to RNA shape transformation from triangle to square, which can be detected by fluorescence emission from a malachite-green-binding RNA aptamer. Using gel-shift and fluorescence assays, we demonstrated that efficient transformation occurs under isothermal conditions (37°C) and can therefore be implemented within living cells as a reporter system. This work is intended to provide a simple, cost-effective, and straightforward model for the development of biosensors and regulatory devices in nucleic acid nanotechnology.

Keywords: RNA nanotechnology, bionanotechnology, toehold mediated DNA switch, RNA split fluorogenic aptamers

Procedia PDF Downloads 80
3834 JREM: An Approach for Formalising Models in the Requirements Phase with JSON and NoSQL Databases

Authors: Aitana Alonso-Nogueira, Helia Estévez-Fernández, Isaías García

Abstract:

This paper presents an approach to reduce some of the current flaws in the requirements phase of the software development process. It takes the software requirements of an application, performs conceptual modelling of them, and formalizes the model as JSON documents. This formal model is stored in a document-oriented NoSQL database, namely MongoDB, because of its advantages in flexibility and efficiency. In addition, this paper underlines the contributions of the detailed approach and shows some applications and benefits for future work in the field of automatic code generation using model-driven engineering tools.
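
To make the idea concrete, the sketch below formalises one hypothetical requirement as a JSON document and stores and queries it in MongoDB with pymongo. The field names and the database/collection names are illustrative assumptions, not the schema prescribed by the JREM approach.

```python
from pymongo import MongoClient

# Hypothetical requirement formalised as a JSON document; the field names are
# illustrative assumptions, not the schema defined by the JREM approach.
requirement = {
    "id": "REQ-017",
    "type": "functional",
    "actor": "registered user",
    "action": "reset password",
    "preconditions": ["account exists", "e-mail address verified"],
    "acceptance_criteria": [
        "reset link expires after 24 hours",
        "old password is rejected after the change",
    ],
    "traces_to": ["UC-05"],
}

client = MongoClient("mongodb://localhost:27017")      # assumes a local MongoDB
collection = client["jrem"]["requirements"]
collection.insert_one(requirement)

# Flexible querying over the formal model, e.g. all functional requirements
# that trace to use case UC-05.
for doc in collection.find({"type": "functional", "traces_to": "UC-05"}):
    print(doc["id"], doc["action"])
```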

Keywords: conceptual modelling, JSON, NoSQL databases, requirements engineering, software development

Procedia PDF Downloads 378
3833 Label Free Detection of Small Molecules Using Surface-Enhanced Raman Spectroscopy with Gold Nanoparticles Synthesized with Various Capping Agents

Authors: Zahra Khan

Abstract:

Surface-Enhanced Raman Spectroscopy (SERS) has received increased attention in recent years, with a focus on biological and medical applications due to its great sensitivity as well as molecular specificity. In the context of biological samples, there are generally two methodologies for SERS-based applications: label-free detection and the use of SERS tags. The necessity of tagging can make the process slower and limits its use in real-life settings. Label-free detection offers the advantage that it reports direct spectroscopic evidence associated with the target molecule rather than the label. Reproducible, highly monodisperse gold nanoparticles (Au NPs) were synthesized using a relatively facile seed-mediated growth method. Different capping agents (TRIS, citrate, and CTAB) were used during synthesis, and characterization was performed. The particles were then mixed with different analyte solutions before drop-casting onto a glass slide prior to Raman measurements, to see which NPs displayed the highest SERS activity as well as the best stability. A host of different analytes were tested, both non-biomolecules and biomolecules, which were all successfully detected using this method at concentrations as low as 10^-3 M, with salicylic acid reaching a detection limit in the nanomolar range. SERS was also performed on samples with a mixture of analytes present, whereby peaks from both target molecules were distinctly observed. This is a fast and effective way of rapidly testing samples and offers potential applications in the biomedical field as a tool for diagnostic and treatment purposes.

Keywords: gold nanoparticles, label free, seed-mediated growth, SERS

Procedia PDF Downloads 125
3832 Optimizing Campaign Effectiveness: Identifying Target Customers via Recommender Engine

Authors: Nikita Katyal, Shubham Jain

Abstract:

In today's competitive business environment, the success of campaigns relies not only on their creation but also on effectively reaching the right customers. Campaigns often feature products that customers may not have considered or are unaware of, including popular items. This research aims to enhance retailer sales by leveraging an efficient recommender system that reminds targeted customers to purchase their preferred products and suggests additional items they had not initially considered during a campaign. Our focus is on utilizing the recommender system to identify potential customers for a curated set of products selected by the marketing team for a specific campaign. Communicating with all customers can be time-consuming and costly, and irrelevant messages may harm customer loyalty. Therefore, the primary objective is to strategically select the right customers for a campaign, increasing sales and reducing communication costs. This paper provides valuable insights into connecting with the right customer segments to optimize revenue generation for businesses. The analysis shows that high-value customers (those generating the highest revenue) contributed to increases in average basket size, while win-back customers (with low engagement) and about-to-churn customers (those at risk of attrition) improved the effectiveness of marketing contacts by increasing engagement and reducing churn. Targeted, revenue-focused communication also enhanced the quality of the relationship between the customer and the firm, helping to lower churn rates by engaging customers with suitable campaigns. This research provides empirical evidence supporting the theoretical benefits of targeting the right customers for a campaign.
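
The keywords mention ALS (alternating least squares); a minimal sketch of how such a factorisation can be used to pick campaign targets is shown below. It uses plain, unweighted ALS on a dense toy purchase matrix, which is a simplification of implicit-feedback ALS; all matrix sizes, hyperparameters and product indices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy purchase matrix: rows = customers, columns = products (1 = bought before).
R = rng.binomial(1, 0.15, size=(200, 50)).astype(float)
n_factors, n_iters, reg = 8, 15, 0.1

U = rng.normal(scale=0.1, size=(R.shape[0], n_factors))
V = rng.normal(scale=0.1, size=(R.shape[1], n_factors))
eye = reg * np.eye(n_factors)

# Plain alternating least squares on the dense matrix (a simplification of
# implicit-feedback ALS, which would also use per-entry confidence weights).
for _ in range(n_iters):
    U = R @ V @ np.linalg.inv(V.T @ V + eye)
    V = R.T @ U @ np.linalg.inv(U.T @ U + eye)

# Campaign defined by the marketing team: a curated set of product columns.
campaign_products = [3, 17, 42]
scores = (U @ V[campaign_products].T).max(axis=1)     # best affinity per customer

# Exclude customers who already own every campaign product, then pick the top 20.
already_own_all = R[:, campaign_products].all(axis=1)
scores[already_own_all] = -np.inf
target_customers = np.argsort(scores)[::-1][:20]
print(target_customers)
```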

Keywords: recommendation, ALS, marketing campaigns, target customers, churn

Procedia PDF Downloads 7
3831 Effects of Channel Orientation on Heat Transfer in a Rotating Rectangular Channel with Jet Impingement Cooling and Film Coolant Extraction

Authors: Hua Li, Hongwu Deng

Abstract:

The turbine blade's leading edge is usually cooled by jet impingement cooling technology because it carries the heaviest heat load. For a rotating turbine blade, however, the channel orientation (β, the angle between the jet direction and the rotating plane) could play an important role in influencing the flow field and heat transfer. Therefore, in this work, the effects of channel orientation (from 90° to 180°) on heat transfer in a jet impingement cooling channel are experimentally investigated. The investigations are conducted under an isothermal boundary condition. Both the jet-to-target surface distance and the jet-to-jet spacing are three times the jet hole diameter. The jet Reynolds number is 5,000, and the maximum jet rotation number reaches 0.24. The results show that the rotation-induced variations of heat transfer differ with channel orientation. In the cases of 90° ≤ β ≤ 135°, a vortex generated in the low-radius region of the supply channel changes the mass flow rate distribution among the jet holes. Therefore, the heat transfer in the low-radius region decreases with the rotation number, whereas the heat transfer in the high-radius region increases, indicating that a larger temperature gradient in the radial direction could appear in the turbine blade's leading edge. When 135° < β ≤ 180°, however, the heat transfer of the entire stagnant zone decreases with the rotation number. The rotation-induced jet deflection is the primary factor weakening the heat transfer, and the jets cannot reach the target surface at high rotation numbers. For the downstream regions, however, the heat transfer is enhanced by 50%-80% in every channel orientation because the dead zone is broken up by the rotation-induced secondary flow in the impingement channel.

Keywords: heat transfer, jet impingement cooling, channel orientation, high rotation number, isothermal boundary

Procedia PDF Downloads 105
3830 Interactive, Topic-Oriented Search Support by a Centroid-Based Text Categorisation

Authors: Mario Kubek, Herwig Unger

Abstract:

Centroid terms are single words that semantically and topically characterise text documents and so may serve as their very compact representation in automatic text processing. In the present paper, centroids are used to measure the relevance of text documents with respect to a given search query. Thus, a new graph-based paradigm for searching texts in large corpora is proposed and evaluated against keyword-based methods. The first, promising experimental results demonstrate the usefulness of the centroid-based search procedure. It is shown that especially the routing of search queries in interactive and decentralised search systems can be greatly improved by applying this approach. A detailed discussion of further fields of application completes this contribution.
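
One simple way to compute a centroid term is to pick the word with the smallest average distance to all other words in a co-occurrence graph. The sketch below illustrates that idea on a toy three-sentence document with networkx; the sentence-level co-occurrence graph and the unweighted distances are simplifying assumptions, not the exact procedure of the paper.

```python
import itertools
import networkx as nx

# Toy document; in practice the co-occurrence graph is built from a large corpus.
sentences = [
    "solar panels convert sunlight into electricity",
    "electricity from solar panels can charge batteries",
    "batteries store electricity for the night",
]

# Build a word co-occurrence graph: words that share a sentence are connected.
G = nx.Graph()
for s in sentences:
    words = set(s.split())
    G.add_edges_from(itertools.combinations(words, 2))

def centroid_term(graph: nx.Graph) -> str:
    """Return the word with the smallest mean shortest-path distance to all others."""
    best, best_dist = None, float("inf")
    for node in graph.nodes:
        lengths = nx.single_source_shortest_path_length(graph, node)
        mean_dist = sum(lengths.values()) / (len(graph) - 1)
        if mean_dist < best_dist:
            best, best_dist = node, mean_dist
    return best

print(centroid_term(G))   # e.g. "electricity" topically characterises the toy text
```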

Keywords: search algorithm, centroid, query, keyword, co-occurrence, categorisation

Procedia PDF Downloads 282
3829 Analysis of the Best Interest of the Child Principle within a Marriage Law Framework: A Study of South Africa

Authors: Lizelle Ramaccio Calvino

Abstract:

Article 3 of the United Nations Convention on the Rights of the Child states that 'the best interests of the child must be a top priority in all decisions and actions that affect children.' This stance is also echoed in article 20 of the African Charter on the Rights and Welfare of the Child. South Africa, as a signatory of the aforesaid conventions, constitutionalised the best interests of the child in section 28(2) of the Constitution of the Republic of South Africa, 1996. Section 28(2) provides that '[A] child's best interests are of paramount importance in every matter concerning the child.' The 'best interests of the child' principle is consequently applicable in all fields of South African law, including matrimonial law. Two separate but equal Acts regulate civil marriages in South Africa, namely the Marriage Act 25 of 1961 and the Civil Union Act 17 of 2006. Customary marriages are regulated by the Recognition of Customary Marriages Act 120 of 1998. In terms of the Marriage Act and the Recognition of Customary Marriages Act, a minor may (provided he or she obtains the required consent) enter into a marriage. Despite the aforesaid, section 1 of the Civil Union Act categorically prohibits a minor from entering into a civil union. The article will first determine whether the ban on minors entering into a civil union undermines the 'best interests of the child' principle, and if so, whether it is in violation of the Constitution as well as international and regional conventions. In addition, the article will critically analyse whether the application of the Marriage Act and the Civil Union Act (dual Acts) results in disparity within the South African marriage law framework, and if so, whether such discrepancy violates the right of same-sex couples (and in particular same-sex minors) to equality before the law and to have their dignity protected. The article intends, through the application of a qualitative research methodology and by way of a comparative analysis of international and domestic laws, to consider whether a single, well-defined structure such as the Dutch marriage law system would not be an improved alternative to address the existing paradox resulting from the application of an Act that undermines the 'best interests of the child' principle. Ultimately, the article proposes recommendations for matrimonial law reform.

Keywords: best interests of the child, civil marriage, civil union, minor

Procedia PDF Downloads 176
3828 Grain Boundary Detection Based on Superpixel Merges

Authors: Gaokai Liu

Abstract:

The distribution of material grain sizes reflects strength, fracture, corrosion and other properties, and the grain size can be acquired via the grain boundary. In recent years, automatic grain boundary detection has been widely required in place of complex experimental operations. In this paper, an effective solution is applied to acquire the grain boundaries of material images. First, an initial superpixel segmentation result is obtained via a superpixel approach. Then, a region merging method is employed to merge adjacent regions based on certain similarity criteria. The experimental results show that the merging strategy improves the superpixel segmentation result on material datasets.
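
A minimal sketch of the superpixel-plus-merging pipeline, using scikit-image's SLIC superpixels and a region adjacency graph merged on mean colour, is given below. It is a generic illustration on a stock image, not the authors' implementation; the segment count and the merge threshold are illustrative assumptions, and in scikit-image versions before 0.20 the RAG helpers live in skimage.future.graph.

```python
import numpy as np
from skimage import data, graph, segmentation

# Stand-in grayscale micrograph; a real grain image would be loaded with io.imread.
image = data.coins()
rgb = np.dstack([image] * 3)          # the RAG helper below expects a colour image

# Step 1: over-segment into superpixels.
labels = segmentation.slic(rgb, n_segments=400, compactness=10, start_label=1)

# Step 2: merge adjacent superpixels whose mean intensities are similar.
rag = graph.rag_mean_color(rgb, labels)
merged = graph.cut_threshold(labels, rag, thresh=20)

# Step 3: the grain boundaries are the borders between the merged regions.
boundaries = segmentation.find_boundaries(merged, mode="thick")
print(f"{merged.max()} region labels after merging; {boundaries.sum()} boundary pixels")
```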

Keywords: grain boundary detection, image segmentation, material images, region merging

Procedia PDF Downloads 170
3827 FDX1, a Cuproptosis-Related Gene, Identified as a Potential Target for Human Ovarian Aging

Authors: Li-Te Lin, Chia-Jung Li, Kuan-Hao Tsui

Abstract:

Cuproptosis, a newly identified cell death mechanism, has attracted attention for its association with various diseases. However, the genetic interplay between cuproptosis and ovarian aging remains largely unexplored. This study aims to address this gap by analyzing datasets related to ovarian aging and cuproptosis. Spatial transcriptome analyses were conducted in the ovaries of both young and aged female mice to elucidate the role of FDX1. Comprehensive bioinformatics analyses, facilitated by R software, identified FDX1 as a potential cuproptosis-related gene with implications for ovarian aging. Clinical infertility biopsies were examined to validate these findings, showing consistent results in elderly infertile patients. Furthermore, pharmacogenomic analyses of ovarian cell lines explored the intricate association between FDX1 expression levels and sensitivity to specific small molecule drugs. Spatial transcriptome analyses revealed a significant reduction in FDX1 expression in aging ovaries, supported by consistent findings in biopsies from elderly infertile patients. Pharmacogenomic investigations indicated that modulating FDX1 could influence drug responses in ovarian-related therapies. This study pioneers the identification of FDX1 as a cuproptosis-related gene linked to ovarian aging. These findings not only contribute to understanding the mechanisms of ovarian aging but also position FDX1 as a potential diagnostic biomarker and therapeutic target. Further research may establish FDX1's pivotal role in advancing precision medicine and therapies for ovarian-related conditions.

Keywords: cuproptosis, FDX1, ovarian aging, biomarker

Procedia PDF Downloads 41
3826 Preparation of Polymer-Stabilized Magnetic Iron Oxide as Selective Drug Nanocarriers to Human Acute Myeloid Leukemia

Authors: Kheireddine El-Boubbou

Abstract:

Drug delivery targeting human acute myeloid leukemia (AML) with a nanoparticulate chemotherapeutic formulation that can deliver drugs selectively to AML cells is greatly needed. In this work, we report the development of a nanoformulation made of polymer-stabilized multifunctional magnetic iron oxide nanoparticles (PMNPs) loaded with the anticancer drug doxorubicin (Dox) as a promising drug carrier to treat AML. Dox@PMNP conjugates simultaneously exhibited high drug content, maximized fluorescence, and excellent release properties. Nanoparticulate uptake and cell death following addition of Dox@PMNPs were then evaluated in different types of human AML target cells, as well as in normal human cells. While the unloaded MNPs were not toxic to any of the cells, Dox@PMNPs were found to be highly toxic to the different AML cell lines, albeit at different inhibitory concentrations (IC50 values), but showed very little toxicity towards the normal cells. In comparison, free Dox showed significant potency against all the cell lines concurrently, suggesting huge potential for the use of Dox@PMNPs as selective AML anticancer cargos. Live confocal imaging, fluorescence and electron microscopy confirmed that Dox is indeed delivered to the nucleus in relatively short periods of time, causing apoptotic cell death. Importantly, this targeted payload may potentially enhance the effectiveness of the drug in AML patients and may further allow physicians to image leukemic cells exposed to Dox@PMNPs using MRI.

Keywords: magnetic nanoparticles, drug delivery, acute myeloid leukemia, iron oxide, cancer nanotherapy

Procedia PDF Downloads 230
3825 Assessing Acceptability and Preference of Printed Posters on COVID-19 Related Stigma: A Post-Test Study Among HIV-Focused Health Workers in Greater Accra Region of Ghana

Authors: Jerry Fiave, Dacosta Aboagye, Stephen Ayisi-Addo, Mabel Kissiwah Asafo, Felix Osei-Sarpong, Ebenezer Kye-Mensah, Renee Opare-Otoo

Abstract:

Background: Acceptability and preference of social and behaviour change (SBC) materials by target audiences is an important determinant of effective health communication outcomes. In Ghana, however, pre-test and post-test studies on the acceptability and preference of specific SBC materials for specific audiences are rare. The aim of this study was therefore to assess the acceptability and preference of printed posters on COVID-19 related stigma as suitable SBC materials for health workers, to influence behaviours that promote uptake of HIV-focused services. Methods: A total of 218 health workers who provide HIV-focused services were purposively sampled in 16 polyclinics where the posters were distributed in the Greater Accra region of Ghana. Data were collected in March 2021 using an adapted self-administered questionnaire in Google Forms deployed to participants via WhatsApp. The data were imported into SPSS version 27, where chi-square tests and regression analyses were performed to establish association between variables and the strength of that association, respectively. Results: A total of 142 participants (physicians, nurses, midwives, lab scientists, health promoters, disease control officers), made up of 85 (60%) females and 57 (40%) males, responded to the questionnaire, giving a response rate of 65.14%. Only 88 (61.97%) of the respondents were exposed to the posters. The majority of those exposed said the posters were informative [82 (93.18%)], relevant [85 (96.59%)] and attractive [83 (94.32%)]. They [82 (93.20%)] also rated the material as acceptable, with no statistically significant association between category of health worker and acceptability of the posters (χ² = 1.631, df = 5, p = 0.898). However, participants' most preferred channels for material on COVID-19 related stigma were social media [38 (26.76%)], television [33 (23.24%)], SMS [19 (13.38%)], and radio [18 (12.70%)]. Clinical health workers were 4.88 times more likely to prefer online or electronic versions of SBC materials than nonclinical health workers [AOR = 4.88 (95% CI = 0.31-0.98), p = 0.034]. Conclusions: Printed posters on COVID-19 related stigma are acceptable SBC materials for communicating behaviour change messages that target health workers in promoting uptake of HIV-focused services. Posters are, however, not among the most preferred materials for health workers. It is therefore recommended that material assessment studies be conducted to inform the development of acceptable and preferred materials for target audiences.

Keywords: acceptability, AIDS, HIV, posters, preference, SBC, stigma, social and behaviour change communication

Procedia PDF Downloads 103
3824 Estimating Destinations of Bus Passengers Using Smart Card Data

Authors: Hasik Lee, Seung-Young Kho

Abstract:

Nowadays, automatic fare collection (AFC) systems are widely used in many countries. However, smart card data from many cities do not contain alighting information, which is necessary to build OD matrices. Therefore, in order to utilize smart card data, the destinations of passengers should be estimated. In this paper, kernel density estimation was used to forecast the probabilities of alighting stations of bus passengers and was applied to smart card data from Seoul, Korea, which contains both boarding and alighting information. The method was also validated against actual data. In some cases, the stochastic method was more accurate than the deterministic method. Therefore, it is sufficiently accurate to be used to build OD matrices.
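
A small sketch of how kernel density estimation can turn a passenger's historical alighting locations into probabilities over candidate stops is shown below, using SciPy's gaussian_kde. The coordinates and the candidate stop list are made-up assumptions, not Seoul AFC data.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Historical alighting coordinates (km along x/y) for one card holder
# (made-up numbers; real data would come from the AFC records).
past_alightings = np.array([
    [12.1, 4.3], [12.3, 4.1], [11.9, 4.4], [12.2, 4.2],   # near workplace
    [3.0, 9.8], [3.2, 9.6],                               # occasional trips
]).T                                                      # shape (2, n_samples)

kde = gaussian_kde(past_alightings)

# Candidate alighting stops downstream of today's boarding stop.
candidate_stops = np.array([[12.0, 4.3], [7.5, 6.0], [3.1, 9.7]]).T
density = kde(candidate_stops)
probability = density / density.sum()   # normalise over the candidate set

for stop, p in zip(candidate_stops.T, probability):
    print(f"stop at {stop}: P(alight) ~ {p:.2f}")
```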

Keywords: destination estimation, kernel density estimation, smart card data, validation

Procedia PDF Downloads 352
3823 Investigating the Acquisition of English Emotion Terms by Moroccan EFL Learners

Authors: Khalid El Asri

Abstract:

Culture influences the lexicalization of salient concepts in a society. Hence, languages often have different degrees of equivalence regarding lexical items in different fields. The present study focuses on the field of emotions in English and Moroccan Arabic. Findings of a comparative study that involved fifty English emotion terms revealed that Moroccan Arabic has equivalents for some English emotion terms, partial equivalents for others, and no equivalents for the rest. It is hypothesized, then, that emotion terms that have near equivalents in Moroccan Arabic will be easier for EFL learners to acquire, while partially equivalent terms will be difficult to acquire, and those that have no equivalents will be even more difficult to acquire. In order to test these hypotheses, the participants (104 advanced Moroccan EFL learners and 104 native speakers of English) were given two tests: the first is a receptive one in which the participants were asked to choose, among four emotion terms, the term appropriate to fill in the blank for a given situation indicating a certain kind of feeling. The second test is a productive one in which the participants were asked to give the emotion term that best described the feelings of the people in the situations given. The results showed that conceptually equivalent terms do not pose any problems for Moroccan EFL learners, since they can link the concept to an already existing linguistic category, whereas the results concerning the acquisition of partially equivalent terms indicated that this type of emotion term is difficult for Moroccan EFL learners to acquire, because they need to restructure the boundaries of the target linguistic categories by expanding them when the term covers a range of meanings not subsumed in the L1 term. Surprisingly, however, the results concerning the case of non-equivalence revealed that Moroccan EFL learners could internalize the target L2 concepts that have no equivalents in their L1. Thus, it is the category of emotion terms with partial equivalents in the learners' L1 that poses problems for them.

Keywords: acquisition, culture, emotion terms, lexical equivalence

Procedia PDF Downloads 227
3822 Assessment of Image Databases Used for Human Skin Detection Methods

Authors: Saleh Alshehri

Abstract:

Human skin detection is a vital step in many applications, some of which are critical, especially those related to security. This underscores the importance of a high-performance detection algorithm. To validate the accuracy of such algorithms, image databases are usually used. However, the suitability of these image databases is still questionable. It is suggested that suitability can be measured mainly by the span of the color space that the database covers. This research investigates the validity of three well-known image databases.
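
The suggested suitability measure, the span of the colour space covered by a database, can be approximated by counting occupied bins in a quantised RGB cube. The sketch below is a generic illustration with randomly generated "skin" samples; the bin count and the sample range are assumptions, not taken from the evaluated databases.

```python
import numpy as np

def color_space_coverage(skin_pixels: np.ndarray, bins_per_channel: int = 32) -> float:
    """Fraction of quantised RGB bins occupied by the database's skin pixels.

    skin_pixels: (n, 3) uint8 array gathered from all ground-truth skin regions.
    """
    step = 256 // bins_per_channel
    quantised = skin_pixels // step                       # map each channel to a bin
    occupied = {tuple(p) for p in quantised}
    return len(occupied) / bins_per_channel ** 3

# Toy example: random "skin" samples drawn from a narrow colour range.
rng = np.random.default_rng(1)
samples = rng.integers([140, 80, 60], [255, 180, 150], size=(50_000, 3), dtype=np.uint8)
print(f"coverage: {color_space_coverage(samples):.3%}")
```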

Keywords: image databases, image processing, pattern recognition, neural networks

Procedia PDF Downloads 271
3821 Network Pharmacological Evaluation of Holy Basil Bioactive Phytochemicals for Identifying Novel Potential Inhibitors Against Neurodegenerative Disorder

Authors: Bhuvanesh Baniya

Abstract:

Alzheimer's disease is an illness responsible for neuronal cell death, resulting in lifelong cognitive problems. Because its mechanism is unclear, no effective drugs are available for treatment. For a long time, herbal drugs have served as leads in the drug discovery process. Holy basil has been used in the Indian medicinal system (Ayurveda) for decades against several neuronal disorders such as insomnia and memory loss. This study aims to identify active components of holy basil as potential inhibitors for the treatment of Alzheimer's disease. To fulfill this objective, network pharmacology, gene ontology, pharmacokinetics analysis, molecular docking, and molecular dynamics simulation (MDS) studies were performed. A total of 7 active components of holy basil, 12 predicted neurodegenerative targets of holy basil, and 8,063 Alzheimer-related targets were identified from different databases. The network analysis showed that the top ten targets, APP, EGFR, MAPK1, ESR1, HSPA4, PRKCD, MAPK3, ABL1, JUN, and GSK3B, are significant targets related to Alzheimer's disease. On the basis of the gene ontology and topology analysis results, APP was found to be a significant target related to Alzheimer's disease pathways. Further, the molecular docking results showed that various compounds exhibited the best binding affinities, and the top MDS results suggested that these compounds could be used as potential inhibitors of the APP protein and could be useful for the treatment of Alzheimer's disease.

Keywords: holy basil, network pharmacology, neurodegeneration, active phytochemicals, molecular docking and simulation

Procedia PDF Downloads 101
3820 Comparative Study of Skeletonization and Radial Distance Methods for Automated Finger Enumeration

Authors: Mohammad Hossain Mohammadi, Saif Al Ameri, Sana Ziaei, Jinane Mounsef

Abstract:

Automated enumeration of the number of hand fingers is widely used in several motion gaming and distance control applications and is discussed in several published papers as a starting block for hand recognition systems. An automated finger enumeration technique should not only be accurate but must also respond quickly to a moving-picture input, since image processing software such as Matlab needs to produce results at high computation speeds if the demands of video in motion games or distance control are not to inhibit the program's overall speed. Since automated finger enumeration with minimum error and processing time is desired, a comparative study of two finger enumeration techniques is presented and analyzed in this paper. In the pre-processing stage, various image processing functions were applied to a real-time video input to obtain the final, cleaned, auto-cropped image of the hand to be used by the two techniques. The first technique uses the well-known morphological tool of skeletonization and counts the skeleton's endpoints as fingers. The second technique uses a radial distance method that converts the hand into a one-dimensional representation in order to enumerate the fingers. For both methods, the different steps of the algorithms are explained. Then, a comparative study analyzes the accuracy and speed of both techniques. Through experimental testing in different background conditions, it was observed that the radial distance method was more accurate and more responsive to a real-time video input than the skeletonization method. All test results were generated in Matlab and were based on displaying a human hand in three different orientations on top of a plain-colored background. Finally, the limitations surrounding the enumeration techniques are presented.
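
A minimal sketch of the skeleton-endpoint idea is given below, written with scikit-image and SciPy rather than the Matlab implementation used in the paper. The toy hand mask and the endpoint definition (a skeleton pixel with exactly one neighbour) are illustrative assumptions; a real system would also filter out the wrist endpoint.

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def count_skeleton_endpoints(hand_mask: np.ndarray) -> int:
    """Count endpoints of the morphological skeleton of a binary hand mask.

    An endpoint is a skeleton pixel with exactly one skeleton neighbour;
    fingertips show up as endpoints (the wrist end usually adds one more,
    which a real system would filter out, e.g. by position).
    """
    skeleton = skeletonize(hand_mask.astype(bool))
    neighbours = convolve(skeleton.astype(np.uint8),
                          np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]]),
                          mode="constant")
    return int(np.logical_and(skeleton, neighbours == 1).sum())

# Toy mask: a crude "hand" with three raised strips standing in for fingers.
mask = np.zeros((60, 60), dtype=bool)
mask[35:55, 15:45] = True                       # palm
for col in (18, 28, 38):
    mask[10:35, col:col + 4] = True             # three "fingers"
print(count_skeleton_endpoints(mask))
```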

Keywords: comparative study, hand recognition, fingertip detection, skeletonization, radial distance, Matlab

Procedia PDF Downloads 382
3819 Integration of GIS with Remote Sensing and GPS for Disaster Mitigation

Authors: Sikander Nawaz Khan

Abstract:

Natural disasters like floods, earthquakes, cyclones, volcanic eruptions and others cause immense losses of property and lives every year. The current status and actual loss information of natural hazards can be determined, and predictions of the next probable disasters can be made, using different remote sensing and mapping technologies. The Global Positioning System (GPS) calculates the exact position of damage. It can also communicate with wireless sensor nodes embedded in potentially dangerous places. GPS provides precise and accurate locations and other related information, such as speed, track, direction and distance of a target object, to emergency responders. Remote sensing makes it possible to map damage without physical contact with the target area. Now, with the addition of more remote sensing satellites and other advancements, early warning systems are used very efficiently. Remote sensing is being used at both local and global scales. High Resolution Satellite Imagery (HRSI), airborne remote sensing and space-borne remote sensing are playing a vital role in disaster management. Early on, Geographic Information Systems (GIS) were used to collect, arrange, and map spatial information, but they now have the capability to analyze spatial data. This analytical ability of GIS is the main cause of its adoption by different emergency service providers like police and ambulance services. The full potential of these so-called 3S technologies cannot be realized when they are used alone. The integration of GPS and other remote sensing techniques with GIS has opened new horizons in the modeling of earth science activities. Several remote sensing cases, including the Indian Ocean tsunami in 2004, the Mount Mangart landslides and the Pakistan-India earthquake in 2005, are described in this paper.

Keywords: disaster mitigation, GIS, GPS, remote sensing

Procedia PDF Downloads 481