Lattice swelling and modulus change in a helium-implanted tungsten alloy: X-ray micro-diffraction, surface acoustic wave measurements, and multiscale modelling
Using X-ray micro-diffraction and surface acoustic wave spectroscopy, we measure lattice swelling and elastic modulus changes in a W-1%Re alloy implanted with 3110 appm of helium. The observed lattice expansion of a fraction of a percent gives rise to an order-of-magnitude larger reduction in the surface acoustic wave velocity. A multiscale model combining elasticity theory, molecular dynamics, and density functional theory is used to interpret the observations. The measured lattice swelling is consistent with the relaxation volumes of the self-interstitial and helium-filled vacancy defects that dominate the microstructure of the helium-implanted material. Molecular dynamics simulations confirm the elasticity model for swelling. The elastic properties of the implanted surface layer also change owing to these defects. The reduction in surface acoustic wave velocity predicted by density functional theory calculations agrees remarkably well with the experimental observations.
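The elasticity argument above can be sketched numerically: to first order, the volumetric lattice swelling is the sum over defect species of concentration (per atom) times relaxation volume (in units of the atomic volume). The relaxation volumes and concentrations below are illustrative placeholders of a plausible magnitude for tungsten, not the paper's fitted values.

```python
# Minimal sketch of the linear-elasticity swelling estimate:
#   dV/V = sum_i c_i * Omega_rel_i
# where c_i is the defect concentration in atomic fraction and Omega_rel_i
# is the defect relaxation volume in units of the atomic volume Omega_0.

# Illustrative relaxation volumes (units of Omega_0); treat as assumptions,
# not the values used in the study.
REL_VOL = {
    "self_interstitial": 1.7,   # SIAs relax the lattice strongly outward
    "he_filled_vacancy": 0.3,   # helium partly offsets the vacancy contraction
}

def volumetric_swelling(concentrations_appm):
    """Return dV/V given defect concentrations in appm (atomic parts per million)."""
    return sum(REL_VOL[name] * c * 1e-6 for name, c in concentrations_appm.items())

# Hypothetical defect populations for a ~3000 appm He implantation:
dv_over_v = volumetric_swelling({"self_interstitial": 1500,
                                 "he_filled_vacancy": 3000})
print(f"dV/V = {dv_over_v:.2e}")  # a fraction of a percent, as measured
```

With these placeholder numbers the estimate lands at a few times 10⁻³, i.e. the "fraction of a percent" scale reported in the abstract.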
Tungsten is the main candidate material for plasma-facing armour components in future fusion reactors. Bombardment with energetic fusion neutrons causes collision cascade damage and defect formation. The interaction of defects with helium, produced by transmutation and injected from the plasma, modifies defect retention and behaviour. Here we investigate the residual lattice strains caused by different doses of helium ion implantation into tungsten and tungsten-rhenium alloys. Energy- and depth-resolved synchrotron X-ray micro-diffraction uniquely permits the measurement of lattice strain with sub-micron 3D spatial resolution and ~10⁻⁴ strain sensitivity. Increasing the helium dose from 300 appm to 3000 appm increases the volumetric strain by only ~2.4 times, indicating that defect retention per injected helium atom is ~3 times higher at low helium doses. This suggests that defect retention is not a simple function of implanted helium dose, but depends strongly on material composition and the presence of impurities. Conversely, analysis of W-1wt% Re alloy samples and of different crystal orientations shows that both the presence of rhenium and the crystal orientation have a comparatively small effect on defect retention. These insights are key for the design of armour components in future reactors, where it will be essential to account for irradiation-induced dimensional change when predicting component lifetime and performance.
Tungsten is the main candidate material for plasma-facing armour components in future fusion reactors. In service, fusion neutron irradiation creates lattice defects through collision cascades. Helium, injected from the plasma, aggravates the damage by increasing defect retention. Both effects can be mimicked using helium ion implantation. In a recent study of tungsten implanted with 3000 appm of helium (W-3000He), we hypothesised helium-induced irradiation hardening, followed by softening during deformation. The hypothesis was founded on observations of a large increase in hardness, substantial pile-up and slip-step formation around nano-indents, and Laue diffraction measurements of localised deformation beneath indents. Here we test this hypothesis by implementing it in a crystal plasticity finite element (CPFE) formulation and simulating nano-indentation of W-3000He at 300 K. The model considers thermally activated dislocation glide through helium-defect obstacles, whose barrier strength is derived as a function of defect concentration and morphology. Only one fitting parameter is used for the simulated helium-implanted tungsten: the defect removal rate. The simulation captures the localised large pile-up remarkably well and predicts confined fields of lattice distortion and geometrically necessary dislocation density beneath indents that agree quantitatively with previous Laue measurements. Strain localisation is further confirmed by high-resolution electron backscatter diffraction and transmission electron microscopy measurements on cross-section lift-outs from the centres of nano-indents in W-3000He.
We present experimental results and numerical finite element analysis describing surface swelling due to the creation of buried graphite-like inclusions in diamond substrates subjected to MeV ion implantation. Numerical predictions are compared with experimental data for MeV proton and helium implantations performed with scanning ion microbeams. Swelling values are measured in both cases by white-light interferometric profilometry. The simulations are based on a model that accounts for the through-thickness variation of mechanical parameters in the material as a function of ion type, fluence, and energy. Surface deformation profiles and internal stress distributions are analyzed, and the numerical results adequately fit the experimental data. The results allow us to draw conclusions about structural damage mechanisms in diamond for different MeV ion implantations.
Developing a comprehensive understanding of how neutron irradiation modifies material properties is important for the design of future fission and fusion power reactors. Self-ion implantation is commonly used to mimic neutron irradiation damage; however, an interesting question concerns the effect of ion energy on the resulting damage structures. The reduction in the thickness of the implanted layer as the implantation energy is reduced poses a significant quandary: does one attempt to match the primary knock-on atom energy produced during neutron irradiation, or implant at a much higher energy so that a thicker damage layer is produced? Here we address this question by measuring the full strain tensor for two implantation energies, 2 MeV and 20 MeV, in self-ion-implanted tungsten, a critical material for the first wall and divertor of fusion reactors. Samples implanted at 2 MeV and 20 MeV are shown to exhibit similar lattice swelling. Multi-reflection Bragg coherent diffractive imaging (MBCDI) shows that the implantation-induced strain is in fact heterogeneous at the nanoscale, suggesting a non-uniform distribution of defects, an observation not fully captured by micro-beam Laue diffraction. At the surface, MBCDI and high-resolution electron backscatter diffraction (HR-EBSD) strain measurements agree quite well in terms of this clustering/non-uniformity of the strain distribution. However, MBCDI reveals that the heterogeneity at greater depths in the sample is much larger than at the surface. This combination of techniques provides a powerful method for detailed investigation of the microstructural damage caused by ion bombardment and, more generally, of strain-related phenomena in micro-volumes that are inaccessible by any other technique.
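Once the full strain tensor is measured, the quantity compared between the 2 MeV and 20 MeV samples, the lattice swelling, follows directly: to first order the volumetric strain is the trace of the strain tensor, and the remainder is the deviatoric (shape-change) part. A minimal sketch, using an illustrative tensor rather than measured data:

```python
import numpy as np

# Illustrative small-strain tensor (dimensionless); placeholder values of the
# ~10^-4 order typical of implantation-induced lattice strain, not measured data.
strain = np.array([
    [3.0e-4, 1.0e-5, 0.0],
    [1.0e-5, 2.0e-4, 0.0],
    [0.0,    0.0,    2.5e-4],
])

# Volumetric strain (relative lattice swelling): dV/V ~ eps_xx + eps_yy + eps_zz
volumetric_strain = np.trace(strain)

# Deviatoric part: what remains after removing the isotropic (swelling) component
deviatoric = strain - (volumetric_strain / 3.0) * np.eye(3)

print(f"dV/V = {volumetric_strain:.2e}")
print(f"trace of deviatoric part = {np.trace(deviatoric):.1e}")  # zero by construction
```

This is also why a deviatoric-only measurement (as in some Laue setups) cannot by itself give the swelling: the trace of the deviatoric tensor is zero by construction, so the hydrostatic component must be recovered separately.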
Micro-Laue diffraction and simultaneous rainbow-filtered micro-diffraction were used to accurately measure the full strain tensor and the lattice orientation distribution at the sub-micron scale in highly strained, suspended Ge micro-devices. A numerical approach for obtaining the full strain tensor from the deviatoric strain measurement alone is also demonstrated and used for faster full-strain mapping. We performed measurements on a series of micro-devices under either uniaxial or biaxial stress and found excellent agreement with numerical simulations. This demonstrates the strong potential of Laue micro-diffraction for the investigation of highly strained micro-devices.