/*! \page page_rdtesting R&D Testing Procedures
All defined R&D test procedures are listed here. These tests are meant as a tool
for Ettus R&D to enable faster and more reliable development. Note that these
tests are not a replacement for manufacturing or production tests and should not
be treated as such. Instead, they are meant to catch common failure modes during
development. As a result, test definitions are fairly lightweight.
\section rdtesting_phase Phase Alignment Tests
tbd
\section rdtesting_gpsdo GPSDO Tests
| Test Code | Device | Peripherals | Manual Test Procedure | Automatic Test Procedure |
|------------------|-----------|-------------------|------------------------------|---------------------------|
| GPS-X310-TCXO-v1 | USRP X310 | Jackson Labs TCXO | \ref rdtesting_gpsdo_manual | \ref rdtesting_gpsdo_auto |
| GPS-X310-OCXO-v1 | USRP X310 | Jackson Labs OCXO | \ref rdtesting_gpsdo_manual | \ref rdtesting_gpsdo_auto |
| GPS-X300-TCXO-v1 | USRP X300 | Jackson Labs TCXO | \ref rdtesting_gpsdo_manual | \ref rdtesting_gpsdo_auto |
| GPS-X300-OCXO-v1 | USRP X300 | Jackson Labs OCXO | \ref rdtesting_gpsdo_manual | \ref rdtesting_gpsdo_auto |
| GPS-B200-TCXO-v1 | USRP B200 | Jackson Labs TCXO | \ref rdtesting_gpsdo_manual | \ref rdtesting_gpsdo_auto |
| GPS-B210-TCXO-v1 | USRP B210 | Jackson Labs TCXO | \ref rdtesting_gpsdo_manual | \ref rdtesting_gpsdo_auto |
\subsection rdtesting_gpsdo_recommendations Recommendations
For cursory testing, not all tests within a device family are required (e.g.,
testing only the OCXO on any X-Series device and the TCXO on any B-Series
device is sufficient).
The following tests are recommended for a minimum test (N stands for the latest
version of this test):
- One of GPS-X310-OCXO-vN or GPS-X300-OCXO-vN
- One of GPS-B210-TCXO-vN or GPS-B200-TCXO-vN
\subsection rdtesting_gpsdo_requirements Requirements
All of these tests require a device that is GPSDO capable (e.g., X3x0, B2x0,
N2x0). For those devices that have a separate GPS component (such as the Jackson
Labs GPSDOs), this component is also required (called the "peripheral" in the
following).
\subsection rdtesting_gpsdo_manual GPSDO: Manual Test Procedure
1. Without connecting the peripheral to the device, run `uhd_usrp_probe` on the
   device and verify that the absence of a GPSDO is correctly reported. No
   warnings or errors may be printed.
2. Power down the device and connect the peripheral. This and the following
   tests are run with the peripheral connected: Run `uhd_usrp_probe` and verify
   that the GPSDO is correctly reported. The GPSDO must be reported as found,
   and no warnings or errors may be printed.
3. Without connecting the GPS antenna input, run `query_gpsdo_sensors`. To pass,
   it must report the GPSDO as found and lock to the external reference, but
   then report that it is not locked to GPS. In case of success, the tool
   reports a valid GPS time and a string such as "GPS and UHD Device time are
   aligned".
4. Connect a GPS antenna to the input and make sure it is in a position to
receive GPS satellite data. Confirm that GPS lock is reported using
`query_gpsdo_sensors` within 20 minutes of connecting the antenna.
The tool `query_gpsdo_sensors` will print a string such as "GPS Locked" in
case of success.
All of these tests must pass for a 'pass' validation.
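The following shell sketch illustrates one possible way to step through the
probe and sensor checks. The device address is a placeholder and must be
adjusted for the unit under test; B-Series devices can typically be found
without specifying an address:

    # Hypothetical device args -- adjust for the unit under test.
    ARGS="addr=192.168.10.2"
    # Step 1: peripheral disconnected; expect the lack of a GPSDO to be
    # reported cleanly, with no warnings or errors.
    uhd_usrp_probe --args "$ARGS"
    # Steps 2-4: power down, connect the peripheral, power up, then re-probe
    # and query the sensors.
    uhd_usrp_probe --args "$ARGS"
    query_gpsdo_sensors --args "$ARGS"   # no antenna: ref lock, but no GPS lock
    query_gpsdo_sensors --args "$ARGS"   # antenna connected: expect GPS lock within 20 minutes
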
\subsection rdtesting_gpsdo_auto GPSDO: Automatic Test Procedure
tbd
\section rdtesting_devtest Devtests
| Test Code | Device | Peripherals | Manual Test Procedure | Automatic Test Procedure |
|---------------------|---------------|-------------|-------------------------------|-----------------------------|
| DEVTEST-X310-XG-v1 | USRP X310 | None | \ref rdtesting_devtest_manual | \ref rdtesting_devtest_auto |
| DEVTEST-X310-HG-v1 | USRP X310 | None | \ref rdtesting_devtest_manual | \ref rdtesting_devtest_auto |
| DEVTEST-X300-XG-v1 | USRP X300 | None | \ref rdtesting_devtest_manual | \ref rdtesting_devtest_auto |
| DEVTEST-X300-HG-v1 | USRP X300 | None | \ref rdtesting_devtest_manual | \ref rdtesting_devtest_auto |
| DEVTEST-E310-SG1-v1 | USRP E310-SG1 | None | \ref rdtesting_devtest_manual | \ref rdtesting_devtest_auto |
| DEVTEST-E310-SG3-v1 | USRP E310-SG3 | None | \ref rdtesting_devtest_manual | \ref rdtesting_devtest_auto |
| DEVTEST-B200-v1 | USRP B200 | None | \ref rdtesting_devtest_manual | \ref rdtesting_devtest_auto |
| DEVTEST-B210-v1 | USRP B210 | None | \ref rdtesting_devtest_manual | \ref rdtesting_devtest_auto |
| DEVTEST-B200m-v1 | USRP B200mini | None | \ref rdtesting_devtest_manual | \ref rdtesting_devtest_auto |
| DEVTEST-B205m-v1 | USRP B205mini | None | \ref rdtesting_devtest_manual | \ref rdtesting_devtest_auto |
The devtests are hardware tests built into the UHD make system. They can be run
directly from the build directory and require no configuration.
Devtests are designed to always run, regardless of the actual device
configuration. This means, by definition, that devtests cannot require special
cabling, specific daughtercards, etc.
Note: The actual devtests can change, since they're part of the code. This does
not require a version bump on the test code.
\subsection rdtesting_devtest_requirements Requirements
Devtests are only defined for some devices. When running a devtest, all
peripherals must be disconnected (e.g., no daughterboards on the X-Series, no
GPSDOs on the B- and X-Series).
\subsection rdtesting_devtest_manual Devtest: Manual Test Procedure
### X3x0 procedure
1. Make sure no peripherals are connected to the device (no daughterboards, no
   GPSDO, front-panel GPIO unconnected).
2. When testing the HG image, run a test once for each connection (1 GigE and
10 GigE). When testing the XG image, a test on either connection (SFP0 or
SFP1) is sufficient. In both cases, also test via PCIe.
3. When the device is connected, simply run `make test_x3x0` from the command
   line in the build directory (see the sketch after this list). If multiple
   devices are connected, they will all be tested; there is no requirement to
   connect only a single device at a time (devtest runs sequentially anyway).
4. Devtest must report no failures for a 'pass' validation.
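The following sketch shows what a full HG run could look like from the build
directory; switching between 1 GigE, 10 GigE, and PCIe is a manual re-cabling
step between runs, and the connection labels in the comments are only examples:

    # Run the devtest once per connection type, re-cabling between runs.
    make test_x3x0    # e.g., connected via 1 GigE
    make test_x3x0    # e.g., connected via 10 GigE
    make test_x3x0    # e.g., connected via PCIe
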
### B2xx procedure
Note: The test codes with an 'm' suffix refer to B200mini and B205mini,
respectively.
1. Make sure no peripherals are connected to the device (no GPSDO if applicable,
   GPIO pins unconnected).
2. Test once via USB3, once via USB2.
3. Simply run `make test_b2xx` from the build directory (see the sketch after
   this list).
4. Devtest must report no failures for a 'pass' validation.
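A minimal sketch of a B2xx run, repeated once per USB connection (assuming the
current directory is the UHD build directory):

    # Confirm the device enumerates on the current USB connection, then run devtest.
    uhd_find_devices
    make test_b2xx
    # Re-plug the device on the other USB standard (USB3 vs. USB2) and repeat.
    uhd_find_devices
    make test_b2xx
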
### E310 procedure
1. Make sure GPIO pins are unconnected.
2. Tests need to be run natively on the device. If the build environment is
available on the device, running `make test_e3xx` is sufficient.
3. In general, there is no build environment on the device (e.g., when doing a
   typical sshfs mount of an environment). In this case, copy the contents of
   the devtest directory onto the device and run the following command (the
   environment variables need to point to the location of the devtest code, the
   location of the UHD examples such as benchmark_rate, and where you want to
   store log files, respectively; see the sketch after this list):
$DEVTEST_DIR/run_testsuite.py --src-dir $DEVTEST_DIR \
--devtest-pattern e3xx \
--build-type na \
--build-dir $EXAMPLES_DIR \
--device-filter e3x0 \
--log-dir $LOG_DIR
4. Devtest must report no failures for a 'pass' validation.
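As a sketch, the environment variables could be set as follows before invoking
`run_testsuite.py` as shown in step 3; the paths are hypothetical and depend on
where the devtest code and the UHD examples actually live on the device:

    # Hypothetical locations -- adjust to the actual paths on the device.
    export DEVTEST_DIR=/home/root/devtest
    export EXAMPLES_DIR=/usr/lib/uhd/examples
    export LOG_DIR=/home/root/devtest_logs
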
\subsection rdtesting_devtest_auto Devtest: Automatic Test Procedure
As all these tests can be run unsupervised, they can be run automatically given
the correct device setup. The return code of the tests can be used to check for
pass/fail conditions (return code 0 means 'pass').
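For example, a wrapper could rely solely on the return code (target name as in
the manual procedures above):

    # Return code 0 means 'pass'; anything else means 'fail'.
    make test_b2xx && echo "devtest: PASS" || echo "devtest: FAIL"
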
\section rdtesting_fpga_testbenches FPGA: Testing through Simulations
Test benches provide a faster way to verify the design through simulations.
| Test Code | Device | Peripherals | Manual Test Procedure | Automatic Test Procedure |
|------------------|-----------|-------------------|------------------------------|---------------------------|
| FPGATB-v1 | None | None | \ref rdtesting_fpga_testbenches_manual | \ref rdtesting_fpga_testbenches_auto |
\subsection rdtesting_fpga_testbenches_requirements Requirements
These tests are simulations and do not require any device. Vivado 2015.4 should be installed.
\subsection rdtesting_fpga_testbenches_manual Manual Test Procedure
1. Go to the appropriate directory in the FPGA repository, depending on which
   test bench needs to be run:
   1. NoC block test benches:
      Most of the NoC blocks have a test bench written in SystemVerilog that
      provides stimuli to the noc_block to verify it. The test bench for a
      block resides in `<fpga-dir>/usrp3/lib/rfnoc/*_tb`.
   2. Unit test benches:
      A few sub-blocks, such as noc_shell and sine_tone, are used within the
      bigger NoC blocks and have their own test benches. These reside in
      `<fpga-dir>/usrp3/lib/sim/rfnoc/*`.
   3. Radio test bench:
      The radio test bench resides in
      `<fpga-dir>/usrp3/lib/radio/noc_block_radio_core_tb/`.
   4. Device-specific test benches:
      IP specific to a device has test benches in
      `<fpga-dir>/usrp3/top/x300/sim/*`, e.g., the DMA and PCIe test benches.
2. Set up the environment by running
   `source <fpga-dir>/usrp3/top/<device>/setupenv.sh`.
3. Go to the test bench directory and run the test bench with `make xsim` or
   `make vsim`.
4. All of these tests must report no failures for a 'pass' validation. Example
   test bench output:
========================================================
TESTBENCH STARTED: noc_block_skeleton
========================================================
[TEST CASE 1] (t=000000000) BEGIN: Wait for Reset...
[TEST CASE 1] (t=000001002) DONE... Passed
[TEST CASE 2] (t=000001002) BEGIN: Check NoC ID...
Read Skeleton NOC ID: 1234000000000000
[TEST CASE 2] (t=000001238) DONE... Passed
[TEST CASE 3] (t=000001238) BEGIN: Connect RFNoC blocks...
Connecting noc_block_tb (SID: 1:0) to noc_block_skeleton (SID: 0:0)
Connecting noc_block_skeleton (SID: 0:0) to noc_block_tb (SID: 1:0)
[TEST CASE 3] (t=000005457) DONE... Passed
[TEST CASE 4] (t=000005457) BEGIN: Write / readback user registers...
[TEST CASE 4] (t=000006888) DONE... Passed
[TEST CASE 5] (t=000006888) BEGIN: Test sequence...
[TEST CASE 5] (t=000007403) DONE... Passed
========================================================
TESTBENCH FINISHED: noc_block_skeleton
- Time elapsed: 7500 ns
- Tests Expected: 5
- Tests Run: 5
- Tests Passed: 5
Result: PASSED
========================================================
Failing tests can be debugged by inspecting the waveform in the Vivado GUI, which can be launched by running `make GUI=1 xsim`. More details on debugging can be found at https://kb.ettus.com/Debugging_FPGA_images.
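As an illustration, running a single NoC block test bench in XSim could look
like the following; the block name is taken from the example output above and
is hypothetical:

    # Set up the simulation environment (once per shell).
    source <fpga-dir>/usrp3/top/x300/setupenv.sh
    # Go to the test bench of interest and run it.
    cd <fpga-dir>/usrp3/lib/rfnoc/noc_block_skeleton_tb
    make xsim    # or 'make vsim' for a ModelSim run
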
\subsection rdtesting_fpga_testbenches_auto Automatic Test Procedure
Go to `<fpga-dir>/usrp3/` and run `build.py xsim all`. All tests should report 'PASS'.
\section rdtesting_defining Defining R&D Tests
Tests can be added any time to define procedures for pass/fail validation. Any
test must include the following:
- An unambiguous test code. This code consists of three characters that
identify the test, a short description of the devices required, and a version
suffix. Example: `GPS-X310-OCXO-v1` is a GPS-related test, requires an X310
and an OCXO to run, and is version 1 of this test.
- A manual testing procedure. This must unambiguously define a set of tasks,
  and clearly identify whether a test has passed or failed. Tests do not
  require any defined outcome other than 'pass' and 'fail'.
- Optional, but highly recommended: an automatic test procedure. This must
  consist of a command, script, or set of commands that can be executed
  automatically and that reports a failure condition by returning a non-zero
  return code.
When authoring test procedures, a basic understanding of USRP operation on the
part of the test operator may be assumed. Descriptions should be as short as
possible while still describing, unambiguously, how to reach a pass/fail
conclusion.
Test procedures may be updated at any time. If this happens, a new test code
must be generated, with the version number increased. Old test codes are
considered deprecated (if there exists a version 2 of a test, version 1 should
not be run any more).
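For example, an automatic test procedure could be wrapped as simply as the
following, where `run_gps_x310_ocxo_test.sh` is a hypothetical script
implementing the test:

    # Map the return code onto the pass/fail outcome of the test.
    if ./run_gps_x310_ocxo_test.sh; then
        echo "GPS-X310-OCXO-vN: PASS"
    else
        echo "GPS-X310-OCXO-vN: FAIL"
    fi
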
*/
// vim:ft=doxygen: